DB Scalability

PostgreSQL Database Backup Takes Too Long

During a PostgreSQL training course, a participant raised the problem that a pg_dump of their database takes too long. The database contains little more than BLOBs (PDF files). It currently holds only 500 MB, but there are 45 GB of archived files (from roughly the last 10 years), and new files will be added daily, so at least 50 GB (older files could be removed) will end up in the database. A test dump with pg_dump takes approximately 3 minutes for 500 MB on the given system, which means 50 GB would take somewhere around 5 hours (a 100-fold increase). That is definitely too long, as the backup needs to run every night and additional file system backups (with IBM Tivoli) have to be performed as well.
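For reference, a test dump along these lines could be timed with pg_dump's directory format; the database name, output path, and job count below are illustrative assumptions, not taken from the original setup:

    # Timed test dump in directory format.
    # -j parallelizes the dump across tables (PostgreSQL 9.3 or later).
    # -Z 0 skips compression, since PDF content rarely compresses further.
    # "archive_db" and the output path are placeholders for this sketch.
    time pg_dump -Fd -j 4 -Z 0 -f /backup/archive_db.dump archive_db

Note that -j parallelizes per table, so it only speeds things up if the PDFs are spread across several tables rather than stored in a single one.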