Local archive and backup verification scheme help, new user

Hi there,
Little background:
I have 6+ TB of video/audio/project files that I need to move off my editing system. In the past, I would offload files to bare drives with TeraCopy over USB 3, make some notes, and place them on a shelf. Admittedly not a great system: very time-intensive and hard to restore from. While searching for an alternative to what I have used in other facilities (Retrospect), I came across Duplicati, and after running some tests and tweaking some settings I have come up with what I think is a pretty good base backup job. Below is what I am using for large media files that rarely change, e.g. camera footage. I use the default settings for docs and project files, which change every hour.

--compression-module=zip
--dblock-size=2GB
--no-encryption=true
--blocksize=250MB
--full-remote-verification=false
--full-block-verification=false
--backup-test-percentage=100
--exclude-files-attributes="system,temporary"
--disable-module=console-password-input
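
For anyone curious, the equivalent command line would look something like this (the destination URL and source path are made up for illustration; I actually enter these through the GUI's advanced options screen):

Duplicati.CommandLine.exe backup "file://E:\ArchiveBackup" "D:\Media\CameraFootage" ^
  --compression-module=zip --dblock-size=2GB --no-encryption=true --blocksize=250MB ^
  --full-remote-verification=false --full-block-verification=false --backup-test-percentage=100 ^
  --exclude-files-attributes="system,temporary" --disable-module=console-password-input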

The reason for my post:
I have been experimenting with different ways to verify the backups and am having a hard time making it bend to my will. I tried --full-remote-verification and --full-block-verification but wasn't able to see any related activity in the logs. Then I read about --backup-test-percentage, which works great, except that it verifies 100% of the backup EVERY TIME I run the job, no matter how much data has changed between backups.

The desired behavior:
For the NEW data from each backup job to be verified against the original data after it has been written to disk, not the entire backup from the beginning of time. The reason is that once the job is over, I delete all the files from my RAID, leaving the copy on the slower HDD for "safekeeping".

Looking for any guidance or suggestions.
BTW- I am using the GUI.

Thank you for your time.

Welcome to the forum @MxFilms

--backup-test-percentage is a scaled-with-backup-size version of --backup-test-samples, which takes a semi-random sample with more attention paid to less-sampled items. See Backup Test block selection logic details.

Add backup-test-percentage in addition to backup-test-samples [$25] #3296 is the feature request for this.

If you're asking for verification of uploaded files, you might try --list-verify-uploads (which seems little used). It looks like it should at least check file existence and size, but it won't actually download files to check their data. There is also always (unless you turn it off) a check of the list of all uploaded files (even old ones) at the end of each backup.
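
As a sketch (destination URL and source path made up for illustration), that would just be an extra advanced option on the backup job:

Duplicati.CommandLine.exe backup "file://E:\ArchiveBackup" "D:\Media\CameraFootage" --list-verify-uploads=true

In the GUI, you can add the same option on the job's Options screen under advanced options.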

If "original data" refers to your source files, that's a step farther removed, and that type of test doesn't exist either, though there's probably a feature request somewhere… For now, manual restore tests are the way to go.
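
A minimal sketch of such a test (paths made up; --restore-path redirects the files to a scratch location instead of the original spot, and --no-local-blocks should force the data to come from the backup rather than from matching local files):

Duplicati.CommandLine.exe restore "file://E:\ArchiveBackup" "*" --restore-path="T:\restore-test" --no-local-blocks=true

You can then hash-compare the restored tree against the originals while you still have them on the RAID.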

Restore checks the hash of each restored file against the original database record, so it "should" be good even if you no longer have that particular version (or any version) of the source file in the original location.

Is the RAID the source file system that's being backed up with Duplicati as well as file-copied to the slower HDD? That shouldn't have any impact on verification, except that any problem found may mean going back to the copy.

I recommend against using Duplicati as a “backup-then-delete” archive, but a second copy makes it safer.

What is Duplicati backing up to? If the destination is accessible locally, --upload-verification-file will let you verify the data with one of the DuplicatiVerify.* scripts in the Duplicati utilities directory, but it will be a read-everything verification.
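
As a rough sketch of that workflow (destination path made up; script details may differ between versions): turn on --upload-verification-file so the backup writes a duplicati-verification.json file to the destination, then point the script at that folder, e.g.

python DuplicatiVerify.py "E:\ArchiveBackup"

and it will rehash the files there and compare them against the recorded values, which means reading the entire backup.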