I think X-Istence is on the right track, but there are a few more improvements you can make to this strategy. First, use:

$ pg_dump --schema-only ...

to dump the tables, sequences, etc. and place this file under version control. You'll use it to track the schema compatibility changes between your branches.
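
For example, something along these lines (the database name mydb and the db/ directory are just placeholders for this sketch; substitute your own):

$ pg_dump --schema-only --no-owner mydb > db/schema.sql
$ svn add db/schema.sql
$ svn commit -m "Track the current database schema"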

Next, perform a data dump for the set of tables that contain configuration required for your application to operate, such as form defaults and other non-user-modifiable data (you should probably skip user data). You can do this selectively by using:

$ pg_dump --table=... (or --exclude-table=...)

This is a good idea because the repo can get really clunky once a full data dump reaches 100 MB or more. A better approach is to back up only the minimal set of data you need to test your app. If your default data is very large, though, this may still cause problems.
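
As a rough sketch, assuming your configuration lives in tables called form_defaults and app_settings (made-up names, use your own), a data-only dump of just those tables could look like:

$ pg_dump --data-only --column-inserts --table=form_defaults --table=app_settings mydb > db/seed_data.sql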

If you absolutely need to place full backups in the repo, consider doing it in a branch outside of your source tree. An external backup system that records the matching svn revision is likely best for this, though.
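
If you go the external route, even a simple wrapper that stamps the dump with the working copy revision keeps the backup and the code in sync (paths and names here are only illustrative):

$ REV=$(svnversion /path/to/working/copy)
$ pg_dump -Fc mydb > /backups/mydb-r$REV.dump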

Also, I suggest using plain-text dumps rather than binary formats for revision purposes (for the schema at least), since text is easier to diff. You can always compress the larger files to save space prior to checking them in.
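
For instance, the schema file can stay as plain text in the repo while a bulkier seed data dump gets gzipped first (file names carried over from the sketches above):

$ gzip -9 db/seed_data.sql
$ svn add db/seed_data.sql.gz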

Finally, have a look at the postgres backup documentation if you haven't already. The way you're commenting on backing up 'the database' rather than a dump makes me wonder if you're thinking of file system based backups (see section 23.2 for caveats).
