I have all my homework archived in a single folder on my home PC (a Mac), and that folder is mirrored on my personal account at university (Linux). I'd like to keep the two in sync, and I'm looking for better ways to do it. The main difficulty: I work in the folder both at home and at university (though fortunately never simultaneously).
Currently, the folder is also mirrored on my VPS (Linux), to which my home PC rsyncs all changes every hour via a cron job. In my home folder at university I have a script that rsyncs changes down from the VPS to the university system (an NFS mount from a central server) when I invoke it, and another that rsyncs changes back up to the VPS. The download script is also on my home PC, for fetching changes made while at university.
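For reference, the hourly upload from the home PC is just a cron entry along these lines (the hostname and paths here are made up, not my actual setup):

```
# m  h  dom mon dow  command
  0  *  *   *   *    rsync -az ~/homework/ vps.example.com:homework/
```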
The scripts take a --delete flag that they pass straight on to rsync, which I use when I've deleted files at some point (otherwise rsync never deletes anything on the receiving side, for very good reasons). This system works pretty well for keeping my files in sync between home and university, but it's a pain to always have to invoke the scripts manually at university. (I could run them automatically, but I'd still have to pull down changes, and I'm worried about conflicts on the VPS, so I'd rather have only one endpoint auto-sync its changes.)
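To illustrate, the --delete pass-through looks something like this minimal sketch (the function name, host, and paths are hypothetical, and the `echo` stands in for actually running rsync):

```shell
# Hypothetical sketch of the download wrapper; the remote host and
# paths are invented. The function echoes the rsync command it would
# run -- the real script would drop the `echo`.
sync_down() {
    src="vps.example.com:homework/"   # assumed remote path on the VPS
    dest="$HOME/homework/"            # assumed local mirror
    extra="${1:-}"                    # optionally "--delete", passed straight through
    echo rsync -az $extra "$src" "$dest"
}
```

Invoked as `sync_down` for a normal pull, or `sync_down --delete` to also mirror deletions.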
I've thought about using git for this, with my VPS acting as a git server. That would solve the divergence issues (I recently lost a few files to overzealous use of --delete), but I'm not sure git can comfortably handle a ~600 MB repo with 3000+ files. (Don't ask.) Are there other methods/tools for doing this effectively? Or is it worth my time to write a FUSE filesystem that just passes through to the native filesystem but logs changes on the way (and maybe syncs everything up in some magical fashion)?
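To get a feel for whether the git workflow would fit, it can be tried end-to-end on one machine; this sketch substitutes a temporary directory for the VPS (all paths are invented, and over SSH the clone URL would instead be something like `vps.example.com:homework.git`):

```shell
set -eu
tmp=$(mktemp -d)

# "VPS": a bare repository acting as the central copy
git init --quiet --bare "$tmp/homework.git"

# "home" clone: commit some work and push it up
git clone --quiet "$tmp/homework.git" "$tmp/home" 2>/dev/null
echo notes > "$tmp/home/hw1.txt"
git -C "$tmp/home" add hw1.txt
git -C "$tmp/home" -c user.email=me@example.com -c user.name=me \
    commit --quiet -m "add hw1"
git -C "$tmp/home" branch -M main          # pin the branch name
git -C "$tmp/home" push --quiet origin main
git -C "$tmp/homework.git" symbolic-ref HEAD refs/heads/main

# "university" clone: gets the same history, deletions included
git clone --quiet "$tmp/homework.git" "$tmp/uni" 2>/dev/null
```

Unlike rsync --delete, deletions here are just commits in the history, so nothing is lost irrecoverably if one side gets it wrong.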