
I have a script that periodically backs up a directory with the command "tar -czvf [name] [directory]". The problem is that the script has recently been putting a lot of stress on the server (Minecraft SMP) and tends to lag players while it runs; lately a backup has been taking nearly five minutes.

So I need to know: is there a way to control the gzip compression level at the same time that tar archives and backs up the files?

I understand that I could tar the files first and then gzip them separately at a different compression level afterwards, but that would not work here because the script names the files with the current server time, which sometimes changes between the two commands.

Any insight? Thanks ahead of time.

  • This isn't a programming question; it might be a better fit for the Super User or Unix & Linux Stack Exchange sites. Commented Jun 4, 2011 at 5:23
  • I apologize; I only posted it here because it was part of a shell script. I figured out my problem, though: the simple fix was to put GZIP=-[compression level] immediately before the tar command, like so: GZIP=-[compression level] tar -czvf [name].tar.gz [directory]
    – Alex Bennett
    Commented Jun 4, 2011 at 6:41
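
For reference, the fix described in the comment above might look like the following with concrete values filled in (the archive name and the world directory are placeholders, not taken from the original script). Note that newer gzip releases document the GZIP environment variable as obsolescent, so the approaches in the answers below are more future-proof:

GZIP=-1 tar -czvf backup-$(date +%Y%m%d%H%M%S).tar.gz world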

3 Answers


Doing it in two steps is probably more portable. If you need a timestamp in the name, capture it once and reuse it:

# capture the timestamp once so both steps use the same name
filename=/what/ever/backup-$(date +%Y%m%d%H%M%S).tar
tar cvf "$filename" ...
gzip -1 "$filename"

I'd also suggest you look into nice and ionice. They could help you lessen the effects of the backups on server responsiveness.
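
As a rough sketch of how that might combine with the snippet above (the priority values are just examples, and ionice's idle class only has an effect with I/O schedulers that support it, such as CFQ/BFQ on Linux):

nice -n 19 ionice -c 3 tar cvf "$filename" ...
nice -n 19 ionice -c 3 gzip -1 "$filename"

nice -n 19 gives the process the lowest CPU priority, and ionice -c 3 puts its disk access in the idle class, so the backup only uses whatever CPU and I/O the Minecraft server isn't using.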

  • I actually solved it another way, using GZIP=-[compression level] tar -czvf [name].tar.gz [directory] with 1 as the compression level, but I'll definitely give your method a try to make the backup names more uniform, and hopefully lessen the lag even more by changing the priorities, which is something that never crossed my mind before. Thanks a ton.
    – Alex Bennett
    Commented Jun 4, 2011 at 6:54

Maybe a little late, but this could help others...

If you are backing up similar data regularly, consider making only an uncompressed copy at first (so basically just a plain tar file) and compressing several backups together later. For a Minecraft server you probably make a few backups every day; you can schedule a job that compresses those backups together with a strong algorithm once a day or so, running it when there aren't many people on the server.

That compression pass will take longer, but the resulting archive will be really small, because most of the world files don't change at all between backups. And if you have spare capacity on some of the cores (which is fairly likely), the compression can use it and cause less lag for the rest of the server, so the frequent backups themselves stay cheap.
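
A minimal sketch of that schedule, assuming made-up paths, a /backups staging directory, and xz as the strong multi-threaded compressor (none of which come from the answer above), might be:

# fast, uncompressed snapshot; run this every few hours
tar cf /backups/world-$(date +%Y%m%d%H%M%S).tar /srv/minecraft/world

# nightly job: bundle the day's snapshots into one heavily compressed archive,
# at low priority and using spare cores (-T0)
nice -n 19 ionice -c 3 tar cf - /backups/world-$(date +%Y%m%d)*.tar | xz -9 -T0 > /backups/daily-$(date +%Y%m%d).tar.xz

Because the snapshots are mostly identical, compressing them together in a single stream gives the compressor a chance to exploit that redundancy (to the extent its dictionary window allows), which is what keeps the final archive small.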


I often do something like this, so that when the tar process is done I don't have to remember to gzip separately; it's all done in one line:

tar cvf - "$nameOfDirOrFileToBeBackedUp" | gzip -"$compressionLevel" > "$backupLocation/$nameOfDirOrFileToBeBackedUp.tar.gz"

This method works on older versions of tar that don't support gzip (Solaris 10 still doesn't).
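
Purely as an illustration, the placeholders might be filled in like this (the names and paths below are hypothetical):

nameOfDirOrFileToBeBackedUp=world
compressionLevel=1
backupLocation=/backups

With those values, the pipeline writes /backups/world.tar.gz in a single pass, using gzip's fastest setting so the CPU hit stays low.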
