Help me set up the best backup job strategy for years of photo images?

It’s no problem to create a single backup job for your complete photo archive.

Deduplication doesn’t work across multiple backup tasks. Creating a single backup job will save remote storage space if (parts of) source files are identical.
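To illustrate why a single job saves space, here is a toy Python sketch of block-level deduplication (this is only the general idea, not Duplicati's actual implementation; the file names and sizes are made up):

```python
import hashlib

def dedup_store(files: dict[str, bytes], block_size: int = 100_000) -> dict:
    """Toy block store: identical blocks across all files are stored once."""
    store = {}  # block hash -> block data
    for _name, data in files.items():
        for i in range(0, len(data), block_size):
            block = data[i:i + block_size]
            store[hashlib.sha256(block).hexdigest()] = block
    return store

# Two "files" with identical content end up as a single stored block,
# but only because both are processed by the same store (same backup job).
files = {"a.jpg": b"x" * 100_000, "copy_of_a.jpg": b"x" * 100_000}
print(len(dedup_store(files)))  # 1
```

If the two files were backed up by separate jobs (separate stores), the identical block would be uploaded and kept twice.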

For better performance, you can increase the block size with the advanced option --blocksize. If you set the block size to 500KB, the number of blocks will be reduced by 80% (the default block size is 100KB). Queries to the local database will be much faster, the local database will be smaller, and recreating the local database will take less time. Note that a larger block size makes deduplication a bit less efficient, but I expect the effect to be minimal when backing up a photo collection.
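The arithmetic behind that 80% figure can be sketched in a few lines of Python (the 1 TB archive size is just an assumed example; this ignores compression and deduplication):

```python
def block_count(archive_bytes: int, block_bytes: int) -> int:
    # Rough estimate: total source data divided by the block size
    return archive_bytes // block_bytes

archive = 1_000_000_000_000                     # example: ~1 TB of photos
blocks_default = block_count(archive, 100_000)  # 100KB default blocksize
blocks_larger = block_count(archive, 500_000)   # 500KB blocksize

print(blocks_default, blocks_larger)  # 10000000 2000000
```

Five times larger blocks means one fifth the blocks to track, so the local database has 80% fewer block entries to index and query.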

To reduce the number of files at the backend, you can increase the dblock size (Upload volume size in the last step of the Add backup wizard, or advanced option --dblock-size). Setting it to 500MB will reduce the number of remote files by 90%. Note that to restore anything, you have to download all complete remote volumes that contain (parts of) the files you want to restore.
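Again as a rough sketch, assuming a 1 TB backup and ignoring compression, dindex/dlist files, and overhead (Duplicati's default upload volume size is 50MB):

```python
import math

def volume_count(backup_bytes: int, dblock_bytes: int) -> int:
    # Approximate number of remote dblock files for a given volume size
    return math.ceil(backup_bytes / dblock_bytes)

backup = 1_000_000_000_000                        # example: ~1 TB backup
vols_default = volume_count(backup, 50_000_000)   # 50MB default volumes
vols_larger = volume_count(backup, 500_000_000)   # 500MB volumes

print(vols_default, vols_larger)  # 20000 2000
```

The trade-off: with 500MB volumes, restoring even a single photo means downloading at least one full 500MB volume.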

More information about choosing sizes can be found here:

Use this to get an indication of the effect of changing these settings in your situation: