
I'm currently attempting to use tar and split to get DVD-sized archives of my Time Machine backup database. It's going to work this time, as I have a partition on my external drive large enough to hold two copies of the massive Backups.backupdb folder, but I want to know if there is a smaller and faster method.

What I'd like to do: Create 4.7GB archives of my folder one by one, and write each to a disc in turn. The end result would be a set of DVDs whose contents could be copied back to a hard drive and then extracted to recreate my original 100+GB folder. Also, if the archive could retain its integrity even if one (or more) of these discs were destroyed, that would be awesome. The entire process should never require more than 4.7GB of scratch space at a time. Even better would be piping the data straight into my disc-burning program and never writing anything to the hard disk. Can I really pipe tar into split into drutil (or another disc-burning app), and expect this to work over a couple dozen DVDs, with all those interruptions?

What I'm doing now: I run tar to make the archive, then I split its output into a bunch of smaller archives:

sudo tar cvjsp /Volumes/BackupDisk/Backups.backupdb/ | split -d -b 4480m - Backups.backupdb.tar.bz2.

This gives me a bunch of files (Backups.backupdb.tar.bz2.00, Backups.backupdb.tar.bz2.01, ...), ready to be written to DVDs. Together they take up as much space as the compressed archive would (of course), but the data now exists both as these parts and as the original backups, which adds up to almost double the size of my backup set and requires me to have lots of extra space. I've got it for now, but this won't last much longer...
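
Restoring would then just be a matter of copying the parts back off the DVDs, concatenating them in order, and extracting. Something like this should do it (untested, but the shell glob sorts the numeric suffixes back into order):

cat Backups.backupdb.tar.bz2.* | sudo tar xvjpf -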

Any ideas? How do you back up hard drives to optical media?

Update: Started the process early this morning, it was still working when I had to leave for work, and I came back to this: (screenshot no longer available)

  • I just answered a very similar question, superuser.com/questions/177823/…. It's not quite what you need, but it may be useful anyway.
    – Catherine
    Commented Aug 19, 2010 at 11:34
  • @Whitequark - And I just answered that question myself, linking to this one! Feel free to post your script here as well. If you don't, I'll probably end up modifying it to do DVD writes, and post it here if it works... Commented Aug 19, 2010 at 11:38

2 Answers


Save yourself time and grief and pick up an external HDD to back up to instead of (or in addition to) optical media.

  • HDD mentioned, +1. RESTORING stuff is the most important part of taking backups, and backing up an HD to several pieces of optical media doesn't sound like a fool-proof and/or easy method when it's time to restore something important. Commented Aug 19, 2010 at 7:53
  • I have an external hard drive. This is about the "In addition to" part of your answer. Commented Aug 19, 2010 at 11:19
  • @Janne - Reliability is another important part of taking backups, and backing up to a couple of magnetic plates with a disk head spinning dangerously close to the data is much less reliable than a piece of burnt plastic read by a laser at a distance. Also, if there's a splittable backup format which can be recovered if one of the disks is destroyed, I'd love to hear about it. Commented Aug 19, 2010 at 11:27
  • @reemrevnivek - So buy another external drive which you don't keep on all the time - just for your weekly/whatever backups. Or if you're desperate for optical media you can use the new hard-drive to store the temporary files, also negating your problem. Honestly, you could futz around with unreliable on-the-fly burning etc. or just throw $100 at the problem and be done with it.
    – sml
    Commented Aug 20, 2010 at 4:13

Since the script from my other answer was suitable for this question too, I've adapted it to write the parts to DVD.

#!/bin/bash
# (c) whitequark 2010

set -e

if [ $# != 1 ]; then
  echo "Usage: $0 <filename>"
  echo "  This script will split the file into multiple DVD-sized parts,"
  echo "  starting from the end, truncating the original file as it goes,"
  echo "  and burning each part to DVD."
  echo "  Use at your own risk."
  exit 1
fi

filename=$1
partsize=4700000000

# NB: stat -c and truncate --size below use GNU coreutils syntax
# (on OS X, the g-prefixed tools from a coreutils port would be needed)
size=$(stat -c '%s' "${filename}")
parts=$(($size / $partsize))

do_split() {
  _part=$1
  _size=$2

  _partname="${filename}.$(printf '%04d' $_part)"

  echo "Splitting part $_part"
  echo "Part offset: $(($partsize * ($_part - 1))) bytes"
  # dd's skip= is counted in blocks of bs, so part N starts at (N-1)*partsize
  dd if="${filename}" of="${_partname}" \
      count=1 bs=$partsize skip=$(($_part - 1))
  echo "Truncating source file"
  truncate "${filename}" --size="-$_size"

  growisofs -Z /dev/dvd -R -J -dvd-compat "${_partname}"
}

lastsize=$(($size % $partsize))
if [ $lastsize != 0 ]; then
  do_split $(($parts + 1)) $lastsize
fi

for i in $(seq $parts -1 1); do
  do_split $i $partsize
done

rm "${filename}"
  • What I'd like to do is to burn DVDs of a big folder, without needing space for all the DVDs at the same time. Looking more closely at your script, it looks like it may not work as well for my application. I start out with a folder, which I'd like to still have when the program is done running. Both my hypothetical method and your method require that I have space for one full part in addition to my current archive, which is fine. Commented Aug 19, 2010 at 14:08
  • dd looks promising, though - I think I'll similarly use the bs, skip, and count options to get a part of the disk, burn the disk, and then delete the part. I was barking up the wrong tree with split. Commented Aug 19, 2010 at 14:10
  • Something like (pseudoscript) tar [opts] | myscript.sh, where myscript.sh is something like for i in size/4.7GB; do \ dd [your opts] of=part.tar."$i" | mkisofs [] | wodim []; rm part.tar.$i; echo "insert new disk"; read; done. I'll turn this into real bash when I get home to my Linux box tonight; a rough sketch of this idea appears after these comments. Commented Aug 19, 2010 at 14:22
  • I updated that script to write parts to DVD. I'll think about integrating with tar, there may be some troubles with pipes&buffering.
    – Catherine
    Commented Aug 19, 2010 at 14:35
  • Speaking of troubles with pipes and buffering, take a look at the update to my question. More work is required... Commented Aug 19, 2010 at 23:27
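
To make the pseudoscript from the comments above concrete, a pipe-fed variant could look roughly like this (an untested sketch; the chunk size, /dev/dvd, and the mkisofs/wodim pair are assumptions, and growisofs as used in the script above could be substituted):

#!/bin/bash
# Rough, untested sketch: read the tar stream from stdin in DVD-sized chunks,
# burn each chunk to a DVD, then delete it before reading the next one.
# Example use: sudo tar cvjsp /Volumes/BackupDisk/Backups.backupdb/ | ./burn-parts.sh
set -e

chunk_mb=4480    # same chunk size as the split invocation in the question
i=0

while true; do
  part="part.tar.$(printf '%02d' $i)"
  # GNU dd: read at most one chunk from the pipe; iflag=fullblock avoids short reads
  dd of="$part" bs=1M count=$chunk_mb iflag=fullblock 2>/dev/null
  if [ ! -s "$part" ]; then
    rm -f "$part"    # the pipe is drained; discard the empty chunk and stop
    break
  fi
  # prompt on the terminal, not on stdin, since stdin still carries the archive data
  echo "Insert a blank DVD and press Enter to burn $part" > /dev/tty
  read -r < /dev/tty
  mkisofs -R -J -o "$part.iso" "$part"
  wodim -v dev=/dev/dvd "$part.iso"
  rm "$part" "$part.iso"
  i=$(($i + 1))
done

Note that building the intermediate .iso temporarily needs space for each chunk twice; burning the chunk directly with growisofs (as the answer's script does) would avoid that.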

