Commit efe4abd

Jim Meyering authored and Junio C Hamano committed
Run "git repack -a -d" once more at end, if there's 1MB or more of not-packed data.
Although I converted upstream coreutils to git last month, I just reconverted coreutils once again, as a test, and ended up with a git repository of about 130MB (contrast with my packed git repo of size 52MB). That was because there were a lot of commits (but < 1024) after the final automatic "git-repack -a -d". Running a final git-repack -a -d && git-prune-packed cut the final repository size down to the expected size.

So this looks like an easy way to improve git-cvsimport: just run "git repack ..." at the end if there's more than some reasonable amount of not-packed data. My choice of 1MB is a little arbitrary. I wouldn't mind missing the minimal repo size by 1MB. At the other end of the spectrum, it's probably not worthwhile to pack everything when the total repository size is less than 1MB.

Here's the patch:

Signed-off-by: Jim Meyering <jim@meyering.net>
Signed-off-by: Junio C Hamano <junkio@cox.net>
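The heuristic described above can be sketched as a standalone shell snippet (a hedged illustration, not part of the patch itself; the sample `git count-objects` output line is made up for demonstration):

```shell
#!/bin/sh
# Sketch of the heuristic: parse "git count-objects" output and repack
# once more when over 1MB (1024 KB) of not-packed data has accumulated.
# The hard-coded sample line stands in for: line=$(git count-objects)
line="347 objects, 2048 kilobytes"
kb=$(printf '%s\n' "$line" \
  | sed -n 's/^[0-9][0-9]* objects, \([0-9][0-9]*\) kilobytes$/\1/p')
if [ -n "$kb" ] && [ "$kb" -gt 1024 ]; then
    # In the real script this is where "git repack -a -d" would run.
    echo "more than 1MB not packed: would run git repack -a -d"
fi
```

The patch below does the same thing in Perl, inside git-cvsimport's commit loop.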
1 parent faa1bbf commit efe4abd

File tree

1 file changed, 10 insertions(+), 0 deletions(-)

git-cvsimport.perl

Lines changed: 10 additions & 0 deletions
```diff
@@ -876,6 +876,16 @@ sub commit {
 }
 commit() if $branch and $state != 11;
 
+# The heuristic of repacking every 1024 commits can leave a
+# lot of unpacked data. If there is more than 1MB worth of
+# not-packed objects, repack once more.
+my $line = `git-count-objects`;
+if ($line =~ /^(\d+) objects, (\d+) kilobytes$/) {
+	my ($n_objects, $kb) = ($1, $2);
+	1024 < $kb
+		and system("git repack -a -d");
+}
+
 foreach my $git_index (values %index) {
 	if ($git_index ne '.git/index') {
 		unlink($git_index);
```
