Maintenance at the end of the year


what are the options for maintaining a duplicate backup?

I would like to do maintenance on my backup jobs over Christmas and New Year. Several servers are hardly used during this time (local NAS, SharePoint V2, S3, FTP).

  1. Delete all versions older than half a year (DELETE --keep-time=6M), assuming I am sure that the last six months are enough for me

  2. Delete old data at the backend (COMPACT --threshold=0)

  3. Delete the local database and rebuild it (in my tests it only took a few hours with 500 GB on SharePoint V2); back up the local database first

  4. Running VACUUM (--auto-vacuum=true --auto-vacuum-interval=0m) is not necessary, because the database was newly created in step three. Did I understand that correctly?

  5. Complete check of the backups (TEST --full-remote-verification).
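The steps above could be sketched roughly as the following CLI session. This is only a sketch, not a tested procedure: the storage URL, passphrase, and database path are placeholders, and option spellings should be checked against the Duplicati version in use.

```shell
# Placeholders (adjust to your job): storage URL and local job database.
export PASSPHRASE="your-backup-passphrase"
URL="s3://mybucket/backup"
DB="$HOME/.config/Duplicati/job-database.sqlite"

# 1. Delete all versions older than six months
duplicati-cli delete "$URL" --keep-time=6M --dbpath="$DB"

# 2. Compact the backend, repacking any volume containing wasted space
duplicati-cli compact "$URL" --threshold=0 --dbpath="$DB"

# 3. Rebuild the local database (keep a copy first, just in case)
cp "$DB" "$DB.bak"
rm "$DB"
duplicati-cli repair "$URL" --dbpath="$DB"

# 4. No VACUUM needed: step 3 just wrote a fresh database file.

# 5. Thoroughly verify a sample of the remote files
duplicati-cli test "$URL" --full-remote-verification --dbpath="$DB"
```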

What is your opinion?

Many greetings

Personally I don’t plan on doing anything special. I am using a retention policy that automatically reduces the frequency of retained backups the older they get.

I like your test at #3 though - I think it’s a good check. Instead of deleting the local database, I might choose to rename it (just in case). But good to confirm that a full database recreation is working as expected, and doesn’t take an inordinate amount of time. On my systems (all running a recent Canary version), database recreation takes at most like 5 or 10 minutes.

#5 might be interesting but it would definitely be time consuming, especially if the backup data is remote.

#5 --full-remote-verification is "full" only in the sense of being more thorough with the same sample set.
The TEST command defaults to 1 sample set of dlist, dindex, and dblock files. If you want all the files checked, you can say all, which could definitely take a while… In any event, the test command is aimed at the integrity of the remote files.
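For illustration, the difference is just the sample-count argument; the storage URL and database path below are placeholders:

```shell
# Default: verify 1 sample set of dlist/dindex/dblock files
duplicati-cli test "s3://mybucket/backup" --full-remote-verification --dbpath="$HOME/.config/Duplicati/job-database.sqlite"

# "all": download and verify every remote file (can take a long time)
duplicati-cli test "s3://mybucket/backup" all --full-remote-verification --dbpath="$HOME/.config/Duplicati/job-database.sqlite"
```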

How to test / verify your backups talks about this method of testing.

Minimal footprint restore test is a feature request for a test aimed more fully at verifying that restores work. For now, the best test of that is an actual restore (general advice for any backup software), preferably done on a different system using direct restore, to simulate disaster recovery. This proves the database is recreatable from the destination (at least to the extent needed), and also prevents the optimization of using existing source file blocks in the restore. The latter can also be avoided using the option --no-local-blocks=true.
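A restore test of that kind might look like the sketch below. The URL, passphrase, and target folder are placeholders; on a fresh system with no local job database, this forces everything to come from the backend:

```shell
# Disaster-recovery style restore test on a different machine:
# no local job database, and --no-local-blocks=true ensures no blocks
# are taken from existing local source files.
duplicati-cli restore "s3://mybucket/backup" "*" \
  --restore-path=/tmp/restore-test \
  --no-local-blocks=true \
  --passphrase="your-backup-passphrase"
```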