
I have a repository full of zip files; re-compressing these files would be a waste of time.

I've tried setting core.compression = 0 on both the remote and the local copy, without success:

git config core.compression 0
git config core.loosecompression 0

git pull still shows:

remote: Counting objects: 23, done.
remote: Compressing objects: ...
  • Have you measured the difference in performance? I wouldn't worry about time spent compressing already-compressed data; networks are likely much slower than your CPU. Commented Aug 18, 2011 at 3:55

3 Answers


The time problem I had was caused by delta compression.

The solution for me was

echo '*.zip -delta' > .gitattributes
git gc
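
To double-check that the attribute actually applies, git check-attr can be used (file.zip below is just a placeholder path):

git check-attr delta -- file.zip
# prints "file.zip: delta: unset" when delta compression is disabled for that path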

I will quote this excellent response from Re: serious performance issues with images, audio files, and other "non-code" data:

Git does spend a fair bit of time in zlib for some workloads, but it should not create problems on the order of minutes.

For pushing and pulling, you're probably seeing delta compression, which can be slow for large files.

 core.compression 0   # Didn't seem to work.

That should disable zlib compression of loose objects and objects within packfiles. It can save a little time for objects which won't compress, but you will lose the size benefits for any text files.

But it won't turn off delta compression, which is what the "compressing..." phase during push and pull is doing. And which is much more likely the cause of slowness.

 pack.window 0

It sets the number of other objects git will consider when doing delta compression. Setting it low should improve your push/pull times. But you will lose the substantial benefit of delta-compression of your non-image files (and git's meta objects). So the "-delta" option above for specific files is a much better solution.

 echo '*.jpg -delta' >> .gitattributes
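
If you do want to experiment with pack.window despite that caveat, it is set like any other config key (shown only as a sketch; 0 tells git to consider no other objects when looking for deltas):

git config pack.window 0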

Also, consider repacking your repository, which will generate a packfile that will be re-used during push and pull.
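
For example, a straightforward repack (these flags are a reasonable default rather than a quote from the original message; -a repacks everything into fresh packfiles and -d deletes the now-redundant old ones):

git repack -a -d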

Note that the settings have to be made on the repo you are fetching/pulling from, not the one you are fetching/pulling to.

  • Suggest using ">>", since ">" will overwrite anything else you might already have in your .gitattributes file: echo '*.zip -delta' >> .gitattributes
    – scottgwald
    Commented Nov 9, 2015 at 16:37

The "Compressing objects" line means it is doing the packing work. That includes diffing the trees and so on. It is not "compressing" in the sense of core.compression.

  • @Doud You don't. Git packs things. This is good and desirable behaviour.
    – user229044
    Commented Aug 18, 2011 at 4:32
  • @J-16 SDiZ When I serve my git folder via http, git fetch/pull doesn't pack anything. Maybe you should try to serve your house via http.
    – hdorio
    Commented Aug 21, 2011 at 9:45
  • @Doud, indeed, the old (non-smart) http doesn't pack stuff. But it (almost always) uses more bandwidth: it (almost always) sends old commits you don't need.
    – J-16 SDiZ
    Commented Aug 21, 2011 at 10:26
  • FYI, the "Compressing objects" line is the call to git pack-objects and has nothing to do with "diffing the trees and stuff"; also, core.compression and pack.compression will affect the packing work.
    – hdorio
    Commented Aug 22, 2011 at 20:16
  • So how do I disable packing the whole git repo on git clone? I have no problems with bandwidth.
    – Reishin
    Commented Aug 29, 2017 at 6:24

This is useful if your remote server has very limited RAM. SSH to the remote server, go to the repository, then run:

git config --add core.bigFileThreshold 1

This will turn off delta compression. If you want to undo it:

git config --unset core.bigFileThreshold
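
Either way, you can check what the repository currently has configured (this prints nothing once the setting has been unset):

git config --get core.bigFileThreshold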
