
I am a web developer working together with a graphic designer. I would like him to be able to check out the latest versions of my project's repos from GitHub, so he can view and test them locally on his machine.

While the setup generally works, we have the problem that I prefer to put all media files in my .gitignore, as they are large and tend to change often during development, and I don't want to clutter my git history and repo.
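(For context, the ignore rules are just blanket patterns for the media types, something like the following, with the extensions being examples:)

    # .gitignore (example patterns)
    *.png
    *.jpg
    *.mp4
    *.psd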

Now, this obviously leads to the problem that he cannot check out my project as it stands, since all the media files are missing.

My question therefore: Is there a way to include these files somehow via git, but have them not included in version control? So they sync along for the moment, but I don't keep any old versions of them?

Or, alternatively, is there a good way to sync only my media files (via Google Drive etc., maybe) and sync everything else via git and GitHub? Of course, the media files could be spread across any directories and subdirectories of my project.

  • It's called git-lfs.
    – phd
    Commented Jul 23, 2019 at 18:57
  • I have never used git lfs, but doesn’t it put large files under version control as well?
    – mdomino
    Commented Jul 23, 2019 at 19:10

1 Answer


As @phd pointed out, git-lfs is probably your simplest and best option.

git-lfs will track the files in your git repository. However, when you clone the repository, by default you only download the latest version of each LFS-tracked file, whereas your normal git history includes every version of every file (in compressed form). In terms of simplicity, this is probably the best option, and it ends up being the best of both worlds: you keep large binary files under control in your git repository, but you don't have every version locally, only what you need.
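As a rough sketch (the file patterns are just examples, adjust them to your project), the basic setup looks like this:

    # install the LFS hooks once per machine
    git lfs install

    # track media files by pattern; this records the patterns in .gitattributes
    git lfs track "*.png" "*.mp4" "*.psd"

    # commit .gitattributes so your designer's clone uses the same tracking rules
    git add .gitattributes
    git commit -m "Track media files with git-lfs"

Note that any patterns you track this way would also need to be removed from your .gitignore, since the files themselves now get committed (as small LFS pointer files).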

Edit:

Remote repo size concerns: This is a valid concern. With github.com charging for bandwidth and total LFS storage, this could become problematic. git-lfs can be configured to use an alternate LFS remote, which may be a good option. GitHub does seem to support this, although I can only find instructions for their enterprise software here
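For example (the server URL is purely a placeholder), the LFS endpoint can be set per repository in a .lfsconfig file that you commit alongside the code:

    # point LFS at an alternate server instead of GitHub's built-in storage
    git config -f .lfsconfig lfs.url "https://lfs.example.com/my-project"

    # commit the config so everyone who clones the repo uses the same LFS server
    git add .lfsconfig
    git commit -m "Use external LFS server"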

Deploying git repo size: If you are referring to the LFS-specific server, yes, its size would become very large over time. Per this open issue, there does not appear to be an officially supported way to deal with this (yet). I would think you could manually remove the data from the remote server and re-push only the objects you want, but this is not something I have experience with.

If you are referring to a production server that does not host the LFS server, then size should not be a concern. When you check out the latest version, LFS will only hold onto the most recent versions of the LFS files (this is configurable).
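As a sketch of that configuration (the retention window below is just an example value):

    # treat only LFS objects referenced by roughly the last 3 days of commits as recent
    git config lfs.fetchrecentcommitsdays 3

    # delete local copies of older LFS objects that are already safely on the remote
    git lfs prune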

Edit 2:

One other option you have is to use a cloud storage service like AWS S3. You could upload the binary files there and keep a script in your repository that downloads the files you need. This solution is a bit more self-managed, and you need to be careful to set up authentication properly and not check the API key into your repository.
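A minimal sketch of such a script, assuming the AWS CLI is installed and configured, and with the bucket name and prefix as placeholders:

    #!/bin/sh
    # fetch-media.sh - pull the project's media files from S3 into ./media
    # (bucket/prefix are placeholders; credentials come from `aws configure`
    # or environment variables, never from a key committed to the repo)
    aws s3 sync "s3://my-project-assets/media/" "./media/"

Your designer would then run this script after cloning or pulling, instead of getting the media through git.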

  • My concern is not really having all the files' history locally – locally I usually have enough storage not to care. My concern is generally blowing up a git repository to many GB, so that I won't be able to host it on the free GitHub plans anymore. I don't want to work in a way where I need to be really careful not to replace my images and videos all too often.
    – mdomino
    Commented Jul 23, 2019 at 21:37
  • Also, when I deploy from git to my server, I would end up with a huge repo on my server, wouldn't I? Or would git-lfs take care of only putting the final version of the large files on my server?
    – mdomino
    Commented Jul 23, 2019 at 21:39
