repack Git Command Guide

The git repack command combines all unpacked objects in a repository into packs and can reorganize existing packs into more efficient structures. It helps optimize repository storage, reduce disk usage, and improve performance.

Terminal window
git repack [-a] [-A] [-d] [-f] [-F] [-l] [-n] [-q] [-b] [-m]
[--window=<n>] [--depth=<n>] [--threads=<n>] [--keep-pack=<pack-name>]
[--write-midx] [--name-hash-version=<n>] [--path-walk]
Option                    Description
-a                        Pack everything into a single pack (without it, repack is incremental)
-A                        Like -a, but unreachable objects in old packs are left loose instead of deleted
-d                        Remove redundant packs and loose objects after repacking
Option                    Description
-f                        Recompute deltas instead of reusing existing ones (--no-reuse-delta)
-F                        Like -f, but also recompress every object (--no-reuse-object)
-m                        Synonym for --write-midx
--window=<n>              Delta search window size (default 10)
--depth=<n>               Maximum delta chain depth (default 50)
--threads=<n>             Number of threads for delta searching
Option                    Description
-q, --quiet               Suppress progress output
-n                        Do not run git update-server-info after repacking
-l                        Pass --local to pack-objects: ignore objects borrowed from alternate object stores
-b                        Write a reachability bitmap index (only effective with -a, -A, or -m)
Option                    Description
--keep-pack=<name>        Exclude the named pack from repacking
--write-midx              Write a multi-pack index of the non-redundant packs
--name-hash-version=<n>   Version of the name-hash function used to group objects for delta compression
--path-walk               (Experimental) walk objects path-by-path to find better delta candidates
Pack Files:
├── Collections of compressed Git objects
├── Stored as .pack and .idx pairs in .git/objects/pack/
├── Use delta compression to save space
├── Enable efficient data transfer
└── Improve repository performance
Pack Structure:
├── .pack file: Compressed object data
├── .idx file: Index for quick object lookup
├── Bitmap index: Accelerates clone operations
└── Multi-pack index: Manages multiple packs
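A quick way to see these pieces on disk (paths assume a standard non-bare repository):
Terminal window
# List pack/index pairs plus any .bitmap or multi-pack-index files
ls -lh .git/objects/pack/
# Summarize a pack's contents via its index
git verify-pack -v .git/objects/pack/pack-*.idx | tail -5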
Repack Workflow:
1. Identify loose objects in .git/objects/
2. Find existing pack files
3. Create new pack with optimal compression
4. Update indexes and references
5. Remove redundant packs (with -d)
6. Update repository statistics
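The workflow's effect is easiest to observe with git count-objects before and after (a minimal sketch; the counts depend on your repository):
Terminal window
# Loose-object and pack counts before repacking
git count-objects -v
# Pack everything and drop redundant packs and loose objects
git repack -a -d
# Afterwards "count" should drop to 0 and "packs" to 1
git count-objects -v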
Delta Compression:
├── Base object + differences = delta
├── Reduces storage for similar objects
├── Window size controls search scope
└── Depth limits delta chain length
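Delta chains can be inspected with git verify-pack, whose trailing summary reports how many objects sit at each chain depth (read-only, safe to run anywhere):
Terminal window
# Summary lines look like "chain length = 2: 340 objects"
git verify-pack -v .git/objects/pack/pack-*.idx | grep "chain length"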
Optimization Goals:
├── Minimize total repository size
├── Maximize compression efficiency
├── Improve access performance
├── Reduce network transfer time
└── Maintain repository integrity
Trade-offs:
├── -a: Complete repack (slower, optimal size)
├── Incremental: Faster, may leave suboptimal packs
├── -f: Recompute deltas (slower, often smaller)
└── --depth: Balance size vs. speed
Terminal window
# Incremental repack (default)
git repack
# Full repack with cleanup
git repack -a -d
# Aggressive repack with optimization
git repack -a -d -f
# Repack with bitmap for faster clones
git repack -a -d -b
Terminal window
# Pack loose objects into a single pack
git repack -a
# Repack and remove redundant packs and loose objects
git repack -d
# Recompute deltas instead of reusing existing ones
git repack -f
# Quiet operation
git repack -q
Terminal window
# Optimize for size (slower)
git repack -a -d -f --window=250 --depth=250
# Optimize for speed
git repack -a -d --window=10 --depth=50
# Multi-threaded repack
git repack -a -d --threads=4
# Recompress every object from scratch
git repack -a -d -F
Terminal window
# Complete repository optimization
complete_repack() {
    echo "Performing complete repository repack..."
    # Repack all objects, recompute deltas, and write a bitmap
    # (bitmaps require packing into a single pack, hence -a)
    git repack -a -d -f -b
    # Clean up reflogs
    git reflog expire --expire=30.days --all
    # Aggressive garbage collection
    git gc --aggressive --prune=now
    # Verify repository
    git fsck --full
    echo "Repository optimization complete"
}
complete_repack
Terminal window
# Handle large repositories efficiently
large_repo_repack() {
    echo "Repacking large repository..."
    # Full repack with multiple threads and a wider delta search
    git repack -a -d -f --threads=8 --window=500 --depth=500
    # Create multi-pack index
    git repack --write-midx
    # Monitor disk usage
    du -sh .git/objects/pack/
    echo "Large repository repack complete"
}
large_repo_repack
Terminal window
# Repack specific pack files
selective_repack() {
    local keep_pack="$1"
    echo "Selective repack keeping $keep_pack..."
    # Repack everything except the named pack
    git repack -a -d --keep-pack="$keep_pack"
    # Verify the kept pack is still present
    ls -la .git/objects/pack/ | grep "$keep_pack"
    echo "Selective repack complete"
}
selective_repack "large-project.pack"
Terminal window
# Incremental repack maintenance
incremental_maintenance() {
    echo "Performing incremental maintenance..."
    # Check for loose objects
    loose_count=$(git count-objects -v | awk '/^count:/ {print $2}')
    if [ "$loose_count" -gt 1000 ]; then
        echo "Found $loose_count loose objects - repacking..."
        git repack -a -d
    else
        echo "Only $loose_count loose objects - skipping repack"
    fi
    # Clean up old packs once a month
    if [ "$(date +%d)" = "01" ]; then
        echo "Monthly pack cleanup..."
        # -A keeps unreachable objects loose so gc can prune them later
        git repack -A -d
    fi
    echo "Incremental maintenance complete"
}
incremental_maintenance
Terminal window
# Configure repack behavior
git config repack.useDeltaBaseOffset true # Use delta base offsets
git config repack.writeBitmaps true       # Write bitmap indexes
git config pack.window 10                 # Delta search window
git config pack.depth 50                  # Delta chain depth
git config pack.threads 1                 # Default threads
# Configure pack compression
git config core.compression 6             # Compression level
git config pack.compression 6             # Pack compression
git config pack.deltaCacheSize 256m       # Delta cache size
# Configure GC integration
git config gc.auto 6700                   # Loose-object threshold for auto gc
git config gc.autoPackLimit 50            # Pack-count threshold for auto gc
Terminal window
# Always backup before major repacks
git bundle create backup.bundle --all
# Check repository status
git status
git fsck
# Monitor disk space
df -h .
du -sh .git/
# Test after repack
git log --oneline -5
git status
Terminal window
# Choose repack strategy based on situation:
# - Daily development: git repack (incremental)
# - Weekly maintenance: git repack -a -d
# - Monthly cleanup: git repack -A -d (keep unreachable objects loose for gc)
# - Performance optimization: git repack -a -d -f --window=250
# - Large repos: git repack -a -d --threads=8
# General rules:
# - Use -a -d for complete optimization
# - Use -f to recompute deltas when existing packs are poorly compressed
# - Use -b for repositories that are cloned often
# - Monitor impact on repository size and performance
#!/bin/bash
# Automated repository maintenance with repack
automated_maintenance() {
    local repo_path="${1:-.}"
    local max_pack_size="${2:-100M}"
    cd "$repo_path" || exit 1
    echo "Starting automated maintenance for $(pwd)"
    # Check repository health
    if ! git status >/dev/null 2>&1; then
        echo "Not a git repository"
        exit 1
    fi
    # Count loose objects
    loose_objects=$(git count-objects -v | awk '/^count:/ {print $2}')
    # Total size of existing packs in bytes (0 if there are none)
    total_pack_size=$(du -bc .git/objects/pack/*.pack 2>/dev/null | tail -1 | cut -f1)
    total_pack_size=${total_pack_size:-0}
    # Decide on repack strategy
    if [ "$loose_objects" -gt 1000 ] || [ "$total_pack_size" -gt "$(numfmt --from=iec "$max_pack_size")" ]; then
        echo "Repacking repository (loose: $loose_objects, packs: $(numfmt --to=iec "$total_pack_size"))"
        # Full repack with cleanup, recomputing deltas
        git repack -a -d -f --threads=4
        # Create bitmap if the repository is widely shared
        if [ "$(git branch -r | wc -l)" -gt 5 ]; then
            git repack -a -d -b
        fi
        # Update statistics
        echo "Repack complete. New pack sizes:"
        ls -lh .git/objects/pack/*.pack | head -5
    else
        echo "Repository optimization not needed (loose: $loose_objects, packs: $(numfmt --to=iec "$total_pack_size"))"
    fi
    # Clean up old reflogs
    git reflog expire --expire=30.days --all
    # Final garbage collection
    git gc --quiet
    echo "Automated maintenance complete"
}
automated_maintenance "/path/to/repo" "500M"
Terminal window
# Repack in CI/CD pipelines
ci_repack() {
    echo "CI/CD repository repack..."
    # Only repack on schedule or when explicitly requested
    if [ "$CI_SCHEDULE" = "weekly" ] || [ "$FORCE_REPACK" = "true" ]; then
        # Full repack with limited threads for CI runners, plus a bitmap
        # for faster subsequent clones
        git repack -a -d -b --threads=2
        # Clean up
        git gc --quiet --prune=now
        # Fast integrity check (skips the full object scan)
        git fsck --connectivity-only
        echo "CI repack complete"
    else
        echo "Skipping repack - not scheduled"
    fi
}
ci_repack
Terminal window
# Manage repack across multiple repositories
multi_repo_repack() {
    local repos_dir="$1"
    local max_age="${2:-7}" # days
    echo "Multi-repository repack for $repos_dir"
    find "$repos_dir" -name ".git" -type d -mtime "-$max_age" | while read -r git_dir; do
        repo_dir=$(dirname "$git_dir")
        echo "Processing $repo_dir..."
        (
            cd "$repo_dir" || exit
            # Repack only when loose objects exist
            if git count-objects -v | grep -q "^count: [1-9]"; then
                echo " Repacking..."
                git repack -a -d -q
            else
                echo " No repack needed"
            fi
        )
    done
    echo "Multi-repository repack complete"
}
multi_repo_repack "/projects" 3
Terminal window
# Optimize slow repacks
optimize_repack_performance() {
    echo "Optimizing repack performance..."
    # Use all available cores
    git repack -a -d --threads="$(nproc)"
    # Reduce the delta search window for speed
    git repack -a -d --window=10 --depth=50
    # Temporarily use faster (lighter) compression
    git config core.compression 1
    git repack -a -d
    # Reset compression
    git config core.compression 6
    echo "Performance optimization complete"
}
optimize_repack_performance
Terminal window
# Handle disk space constraints during repack
low_disk_repack() {
    echo "Repack with low disk space..."
    # Available space in KB on the current filesystem
    available=$(df . | tail -1 | awk '{print $4}')
    if [ "$available" -lt 1000000 ]; then # Less than ~1GB
        echo "Low disk space detected"
        # Incremental repack only packs loose objects
        git repack
        # Clean up aggressively
        git gc --aggressive --prune=now
        # Remove old reflogs
        git reflog expire --expire=1.day --all
    else
        # Normal full repack
        git repack -a -d
    fi
    echo "Low disk repack complete"
}
low_disk_repack
Terminal window
# Recover from repack corruption
recover_repack_corruption() {
    echo "Attempting to recover from repack corruption..."
    # Check repository status
    git status
    # Verify objects
    git fsck --full
    # Restore refs and objects from a local bundle backup
    # (a bundle is a valid fetch source; unbundle alone does not update refs)
    if [ -f "backup.bundle" ]; then
        git fetch backup.bundle "refs/*:refs/*"
    fi
    # Rebuild from remote (assumes the default branch is main)
    git fetch --all
    git reset --hard origin/main
    # Clean and repack
    git clean -fd
    git repack -a -d
    echo "Recovery attempt complete"
}
recover_repack_corruption
Terminal window
# Handle very large pack files
manage_large_packs() {
    echo "Managing large pack files..."
    # Find packs over 1GB
    find .git/objects/pack -name "*.pack" -size +1G -exec ls -lh {} \;
    # Very large repositories may be better split up
    echo "Consider splitting large repositories or using git submodules"
    # Optimize large pack access
    git config core.deltaBaseCacheLimit 2g
    git config pack.threads 1 # Fewer threads reduce peak memory on large packs
    # Repack with moderate settings
    git repack -a -d --window=50 --depth=100
    echo "Large pack management complete"
}
manage_large_packs
Terminal window
# Handle concurrent repack operations
safe_concurrent_repack() {
    echo "Safe concurrent repack..."
    # Use a lock directory: mkdir is atomic, unlike a test-then-touch
    lockdir=".git/repack.lock"
    if ! mkdir "$lockdir" 2>/dev/null; then
        echo "Repack already in progress"
        exit 1
    fi
    # Remove the lock on exit, even if the repack fails
    trap 'rmdir "$lockdir"' EXIT
    # Perform repack
    git repack -a -d -q
    echo "Concurrent-safe repack complete"
}
safe_concurrent_repack
Terminal window
# Handle memory constraints
memory_conscious_repack() {
    echo "Memory-conscious repack..."
    # Reduce memory usage
    git config pack.deltaCacheSize 64m
    git config core.deltaBaseCacheLimit 512m
    # A single thread and a small delta window keep peak memory low
    git repack -a -d --threads=1 --window=10 --depth=25
    # Clean up memory settings
    git config --unset pack.deltaCacheSize
    git config --unset core.deltaBaseCacheLimit
    echo "Memory-conscious repack complete"
}
memory_conscious_repack
#!/bin/bash
# Enterprise repository optimization
enterprise_repack() {
    local repo_path="$1"
    local maintenance_window="${2:-3600}" # 1 hour default
    echo "Enterprise repack for $repo_path"
    cd "$repo_path" || exit 1
    # Check repository size
    repo_size=$(du -sh .git | cut -f1)
    echo "Repository size: $repo_size"
    # Start timing
    start_time=$(date +%s)
    echo "Starting repack with $(nproc) threads..."
    # Phase 1: Full repack
    timeout "$maintenance_window" git repack -a -d --threads="$(nproc)" ||
        echo "Full repack timed out"
    # Phase 2: Bitmap creation (bitmaps require a single pack, hence -a -d)
    timeout "$((maintenance_window / 4))" git repack -a -d -b ||
        echo "Bitmap creation timed out"
    # Phase 3: Multi-pack index
    timeout "$((maintenance_window / 4))" git repack --write-midx ||
        echo "Multi-pack index creation timed out"
    # Phase 4: Final cleanup
    git gc --quiet --prune=2.weeks
    # Calculate time taken
    end_time=$(date +%s)
    duration=$((end_time - start_time))
    # Report results
    new_size=$(du -sh .git | cut -f1)
    echo "Repack complete in ${duration}s"
    echo "Size change: $repo_size -> $new_size"
    # Verify repository (connectivity-only is the fast fsck mode)
    if git fsck --connectivity-only >/dev/null 2>&1; then
        echo "✓ Repository integrity verified"
    else
        echo "✗ Repository integrity check failed"
    fi
}
enterprise_repack "/enterprise/repo" 7200 # 2 hours
Terminal window
# Team repository maintenance script
team_repo_maintenance() {
    local team_repos="$1"
    echo "Team repository maintenance"
    for repo in "$team_repos"/*/; do
        if [ -d "$repo/.git" ]; then
            echo "Maintaining $repo..."
            (
                cd "$repo" || exit
                # Check maintenance needs (-v output has "count:" and "packs:" lines)
                loose=$(git count-objects -v | awk '/^count:/ {print $2}')
                packs=$(git count-objects -v | awk '/^packs:/ {print $2}')
                if [ "$loose" -gt 100 ] || [ "$packs" -gt 10 ]; then
                    echo " Repacking (loose: $loose, packs: $packs)"
                    # Team-optimized repack
                    git repack -a -d -q --threads=2
                    # Update team statistics
                    echo "$(date),$repo,$loose,$packs" >> /tmp/team-repo-stats.csv
                else
                    echo " Skipping (loose: $loose, packs: $packs)"
                fi
            )
        fi
    done
    echo "Team maintenance complete"
}
team_repo_maintenance "/team/projects"
Terminal window
# Monitor and maintain repository health
repo_health_monitor() {
    local repo_path="$1"
    local alert_threshold="${2:-1000}"
    echo "Monitoring repository health: $repo_path"
    cd "$repo_path" || exit 1
    # Gather statistics
    stats=$(git count-objects -v)
    loose_objects=$(echo "$stats" | awk '/^count:/ {print $2}')
    pack_files=$(echo "$stats" | awk '/^packs:/ {print $2}')
    pack_size=$(echo "$stats" | awk '/^size-pack:/ {print $2}')
    pruneable=$(echo "$stats" | awk '/^prune-packable:/ {print $2}')
    # Check health indicators
    health_issues=0
    if [ "$loose_objects" -gt "$alert_threshold" ]; then
        echo "⚠ High loose objects: $loose_objects"
        health_issues=$((health_issues + 1))
    fi
    if [ "$pack_files" -gt 50 ]; then
        echo "⚠ Many pack files: $pack_files"
        health_issues=$((health_issues + 1))
    fi
    if [ "$pruneable" -gt 0 ]; then
        echo "⚠ Pruneable objects: $pruneable"
        health_issues=$((health_issues + 1))
    fi
    # Perform maintenance if needed
    if [ "$health_issues" -gt 0 ]; then
        echo "Performing maintenance..."
        # Safe repack
        git repack -a -d -q
        # Clean up
        git gc --quiet --prune=2.weeks
        echo "✓ Maintenance completed"
    else
        echo "✓ Repository health good"
    fi
    # Log statistics (size-pack is reported in KiB)
    echo "$(date),$loose_objects,$pack_files,$pack_size,$pruneable,$health_issues" >> repo-health.log
}
repo_health_monitor "/important/repo" 500
Terminal window
# Repack for disaster recovery
disaster_recovery_repack() {
    local backup_source="$1"
    local recovery_repo="$2"
    echo "Disaster recovery repack"
    # Create recovery repository
    git init "$recovery_repo"
    cd "$recovery_repo" || exit 1
    # Restore objects and refs from the bundle backup
    if [ -f "$backup_source" ]; then
        git fetch "$backup_source" "refs/*:refs/*"
    else
        echo "No backup source found"
        exit 1
    fi
    # Optimize recovered repository
    echo "Optimizing recovered repository..."
    # Full repack, keeping unreachable objects loose as a safety net
    git repack -A -d -f --threads=4
    # Rebuild bitmap and multi-pack indexes
    git repack -a -d -b
    git repack --write-midx
    # Verify recovery
    if git fsck --full >/dev/null 2>&1; then
        echo "✓ Recovery successful"
        # Show recovery statistics
        git count-objects -v
        # Clean up
        git gc --aggressive --prune=now
        echo "✓ Repository fully recovered and optimized"
    else
        echo "✗ Recovery verification failed"
        exit 1
    fi
}
disaster_recovery_repack "/backups/repo.bundle" "/recovered/repo"
Terminal window
# Benchmark repack performance (run this on a throwaway clone)
benchmark_repack() {
    local repo_path="$1"
    local iterations="${2:-3}"
    echo "Benchmarking repack performance in $repo_path"
    cd "$repo_path" || exit 1
    # Remember the starting commit so it can be restored afterwards
    orig_head=$(git rev-parse HEAD)
    echo "Running $iterations repack benchmarks..."
    for i in $(seq 1 "$iterations"); do
        echo "Iteration $i:"
        # Time different repack strategies
        echo " Incremental repack:"
        time git repack -q
        echo " Full repack:"
        time git repack -a -d -q
        echo " Optimized repack:"
        time git repack -a -d -f --window=100 --depth=100 -q
    done
    # Restore the original state
    git reset --hard "$orig_head"
    echo "Benchmarking complete"
}
benchmark_repack "/test/repo" 5

What’s the difference between git repack and git gc?

git repack focuses on packing objects into efficient pack files; git gc is broader maintenance that includes repack, pruning, and cleanup of reflogs and other repository data.
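
You can watch gc delegate to repack by enabling command tracing (GIT_TRACE prints the child commands Git runs; the exact output varies by Git version):

Terminal window
# The trace output includes the underlying "git repack" invocation
GIT_TRACE=1 git gc --quiet 2>&1 | grep repack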

When should I use -a and -d together?

Use -a -d for complete repository optimization: -a packs all objects, -d removes redundant packs. Good for weekly or monthly maintenance.

What does the -b option do?

It creates a reachability bitmap index for the pack, which significantly speeds up clone and fetch operations by letting Git quickly determine which objects to send without walking the whole object graph.
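
Whether a bitmap already exists is easy to check, since it is written alongside the pack it covers:

Terminal window
# A .bitmap file appears next to the pack after "git repack -a -d -b"
ls .git/objects/pack/*.bitmap 2>/dev/null || echo "no bitmap index present"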

How do I reduce repository size with repack?

Use git repack -a -d -f --window=250 --depth=250 for maximum compression. Recomputing deltas with a larger search window finds more delta opportunities but takes longer.

Is repack dangerous?

No, repack is safe and doesn't delete objects. It creates new pack files and only removes old ones with -d after the new packs are complete.

What’s the impact of repack on repository performance?

Repack can temporarily slow operations during execution but generally improves performance by creating more efficient pack files and bitmap indexes.

How often should I repack?

For active development repos: weekly with git repack -a -d. For large repos: monthly with full optimization. Use git gc --auto for automatic maintenance.

What are loose objects, and why pack them?

Loose objects are individual files in .git/objects/. Packing combines them into compressed pack files for better storage efficiency and performance.
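
Loose objects are visible directly on disk as hash-named files (a read-only check; the pack/ and info/ subdirectories are excluded because they don't hold loose objects):

Terminal window
# Each loose object lives at .git/objects/<2-char prefix>/<38-char suffix>
find .git/objects -type f -not -path "*/pack/*" -not -path "*/info/*" | head -3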

Can I repack a remote repository?

No, repack only works on local repositories. Remote repositories are maintained by the server hosting them.

How do I know when a repack is needed?

Use git count-objects -v to see the loose object count. If there are over 1000 loose objects or many pack files, a repack is beneficial.

What’s the difference between incremental and full repack?

Incremental (default) only packs new loose objects; full repack (-a) repacks all objects for maximum optimization but takes longer.

Can repack run alongside other Git operations?

Repack can be resource-intensive. Use locking mechanisms or run it during low-usage periods to avoid contending with other Git operations.

How do I monitor repack progress?

Use git repack without -q to see progress. For large repos, monitor with top/htop or watch .git/objects/pack/ for new files.

How much free disk space does repack need?

Repack needs temporary space for the new packs. Ensure free space of roughly twice the repository size, or use incremental repack and clean up old pack files first.

Can I recover objects after a repack?

Repack doesn't delete objects immediately. Use git fsck --unreachable to find old objects, or restore from a backup if needed.

How does repack affect clone speed?

With -b (bitmap), clones are much faster because Git can quickly determine which objects to send. Without a bitmap, clones must read entire pack indexes.

What’s multi-pack index and when to use it?

A multi-pack index (--write-midx) lets Git look up objects across multiple pack files efficiently. Useful for repos with many packs or frequent incremental repacks.
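
To confirm a multi-pack index is present and healthy, check for the file and run the plumbing-level verifier:

Terminal window
# The MIDX lives alongside the packs it covers
test -f .git/objects/pack/multi-pack-index && git multi-pack-index verify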

  1. Repository Size Optimization: Reduce disk usage by efficiently packing objects
  2. Performance Improvement: Create optimized pack files for faster Git operations
  3. Maintenance Automation: Regular cleanup of loose objects and redundant packs
  4. Clone Speed Enhancement: Generate bitmap indexes for faster repository cloning
  5. Storage Efficiency: Maximize compression through delta encoding and optimal packing
  6. Repository Health Management: Maintain optimal repository structure and performance