Hi there,
Little background:
I have 6+ TB of video/audio/project files that I need to move off my editing system. In the past, I would offload files to bare drives using TeraCopy over USB 3, make some notes, and put them on a shelf. Admittedly not a great system: very time-intensive and hard to restore from. While searching for an alternative to what I have used in other facilities (Retrospect), I came across Duplicati, and after running some tests and tweaking some settings I have come up with what I think is a pretty good base backup job. Below are the options I am using for large media files that rarely change, e.g. camera footage. I use the default settings for docs and project files, which change every hour.
--compression-module=zip
--dblock-size=2GB
--no-encryption=true
--blocksize=250MB
--full-remote-verification=false
--full-block-verification=false
--backup-test-percentage=100
--exclude-files-attributes="system,temporary"
--disable-module=console-password-input
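For reference, this is roughly the same job driven from the Duplicati command line via Python. This is only a sketch: I actually run it as a GUI job, and the executable path, destination URL, and source path below are placeholders for my setup.

```python
import subprocess

# Sketch of the same backup job run through Duplicati's CLI.
# All three paths below are placeholders, not my real layout.
cmd = [
    r"C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe",
    "backup",
    "file://D:/ArchiveDrive/CameraFootage",  # placeholder destination
    r"E:\RAID\CameraFootage",                # placeholder source
    "--compression-module=zip",
    "--dblock-size=2GB",
    "--no-encryption=true",
    "--blocksize=250MB",
    "--full-remote-verification=false",
    "--full-block-verification=false",
    "--backup-test-percentage=100",
    "--exclude-files-attributes=system,temporary",
    "--disable-module=console-password-input",
]
subprocess.run(cmd, check=True)
```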
The reason for my post:
I have been experimenting with different ways to verify the backups and am having a hard time making it bend to my will. I tried --full-remote-verification and --full-block-verification but wasn't able to see any activity in the logs. Then I read about --backup-test-percentage, which works great, except that it verifies 100% of the backup EVERY TIME I run the job, no matter how little data has changed between backups.
What I want to happen:
I want the NEW data from each backup job to be verified against the original data after it is written to disk, not the entire backup from the beginning of time. The reason is that once the job is over, I delete all the files from my RAID, keeping them for "safekeeping" only on slower HDDs.
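To make that concrete, this is the kind of out-of-band check I had in mind and could script myself if Duplicati can't do it natively: do a test restore of just the new folder, then compare hashes against the originals before deleting them from the RAID. A minimal Python sketch (all paths are placeholders for my setup):

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Stream a file through SHA-256 so multi-GB media files don't load into RAM."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            h.update(chunk)
    return h.hexdigest()

def compare_trees(source: Path, restored: Path) -> list[str]:
    """Compare every file under `source` against the same relative path under `restored`."""
    mismatches = []
    for src_file in source.rglob("*"):
        if not src_file.is_file():
            continue
        restored_file = restored / src_file.relative_to(source)
        if not restored_file.exists():
            mismatches.append(f"MISSING: {restored_file}")
        elif sha256_of(src_file) != sha256_of(restored_file):
            mismatches.append(f"HASH MISMATCH: {src_file}")
    return mismatches

if __name__ == "__main__":
    # Placeholder paths: the new footage still on the RAID, and a test restore
    # of the same folder pulled back out of the Duplicati backup.
    problems = compare_trees(Path(r"E:\RAID\CameraFootage\2024-05-shoot"),
                             Path(r"T:\TestRestore\2024-05-shoot"))
    print("\n".join(problems) if problems else "All files verified, safe to delete from RAID.")
```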
Looking for any guidance or suggestions.
BTW, I am using the GUI.
Thank you for your time.