I have a directory containing many files. Together, these files take up several gigabytes of space. I'd like to compress this directory.
But compressing the directory into a single file will make that file difficult to move around, so I'd like to have several files.
To do this, I could use:

    tar cvzf - dir/ | split --bytes=200MB - sda1.backup.tar.gz.

However, I'm worried that I would then need all of the backup files in order to recover any of the data. I would much prefer that each file be its own independent unit containing a part of the source data.
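If I understand correctly, restoring anything from those pieces would first mean reassembling the whole stream, with something like (filenames just matching the prefix above):

    cat sda1.backup.tar.gz.* | tar xzvf -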
One way I can think of to do this would be to write a script that calculates the size of each input file and greedily appends files to a list until a maximum size is reached. That list of files is then tarred and a new list is begun; this repeats until every file has been assigned to an archive. Each resulting tar can then be extracted independently.
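A rough sketch of what I have in mind (assuming bash and GNU coreutils; the 200 MB budget, the dir/ path, and the backup.partN.tar.gz names are just placeholders I picked):

    #!/bin/bash
    # Greedily pack files from dir/ into independent ~200 MB tar.gz archives.
    max=$((200 * 1000 * 1000))      # size budget per archive, in bytes
    batch=()                        # files collected for the current archive
    size=0                          # running size of the current batch, in bytes
    n=0                             # archive counter

    flush() {
        [ ${#batch[@]} -eq 0 ] && return
        n=$((n + 1))
        tar czf "backup.part$n.tar.gz" "${batch[@]}"
        batch=()
        size=0
    }

    while IFS= read -r -d '' f; do
        s=$(stat -c %s "$f")        # file size in bytes (GNU stat)
        # start a new archive if adding this file would exceed the budget
        if [ ${#batch[@]} -gt 0 ] && [ $((size + s)) -gt "$max" ]; then
            flush
        fi
        batch+=("$f")
        size=$((size + s))
    done < <(find dir/ -type f -print0)
    flush                           # archive whatever is left over

Each backup.partN.tar.gz produced this way would be a complete, standalone archive, which is the behaviour I'm after.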
This is not a duplicate of other questions because I am specifically asking how to perform this operation in such a way that every part of the total archive is itself a valid archive, so that any file can be recovered without having to combine multiple archives.
Is there a utility that does such a thing?