I don’t think this feature exists, and my apologies if I overlooked it. Could an optional check be added so that if too many files have changed (maybe a user-defined percentage), the backup job errors out?
I have first-hand experience with local files being “locked”/encrypted and held for ransom, and I wouldn’t want those files to overwrite the last known good backup when the nightly backup job runs. This isn’t perfect, but I think it’s a decent way to help combat ransomware for disaster-recovery backups. Thoughts?
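A minimal sketch of what such a check could look like. This is hypothetical, not Duplicati code; it assumes the previous scan’s sizes and modification times are already available as a dict keyed by path:

```python
import os

def too_many_changes(paths, previous, threshold=0.25):
    """Return True if the fraction of changed files exceeds `threshold`.

    `previous` maps path -> (size, mtime) recorded at the last backup.
    A file counts as changed if it is new or its size/mtime differ.
    """
    if not paths:
        return False
    changed = 0
    for path in paths:
        stat = os.stat(path)
        prev = previous.get(path)
        if prev is None or prev != (stat.st_size, stat.st_mtime):
            changed += 1
    return changed / len(paths) > threshold
```

The job scheduler could call this before copying and abort with an error when it returns True.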
Sounds like a good idea overall.
Not sure how to implement it, though: file changes are not discovered until after each file has been scanned, so by the time the percentage of changes is high enough to trigger the stop, the backup will already have started backing up the destroyed files.
If this sounds OK for your scenario, then I think it is trivial to implement.
Hmmm, I figured the source was scanned completely before copying begins. Maybe some kind of pre-scan/check? Maybe something that integrates with the existing option for testing the backup without copying files? Just throwing out ideas.
Problem is that touching a large number of files is going to be slow. And since you have to compare each file to its previous version to see if there are any changes, this is essentially the work the backup process does anyway.
If the goal is simply encryption detection, then most files will likely have changed size. This happens because encryption works with fixed-size blocks, padding the input data to fit the block size and writing the entire block back. This check is faster, but you still need to touch many files in advance and look up their previous sizes.
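As a sketch of that size-only pre-scan, assuming the previous sizes could be exported from the local database (all names here are made up for illustration). Only `stat()` is needed, so no file contents are read:

```python
import os

def size_changed_fraction(previous_sizes):
    """Fraction of known files whose on-disk size differs from the record.

    `previous_sizes` maps path -> size recorded at the last backup.
    """
    if not previous_sizes:
        return 0.0
    changed = 0
    for path, old_size in previous_sizes.items():
        try:
            if os.stat(path).st_size != old_size:
                changed += 1
        except FileNotFoundError:
            changed += 1  # deleted files count as changed too
    return changed / len(previous_sizes)
```

Mass encryption would push this fraction close to 1.0, while a normal day’s edits should keep it small.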
Another approach could be to stop the backup once the threshold is exceeded and mark the backup “bad” somehow, so no new backups will run (this should be manually clearable for false positives).
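One simple way to persist such a “bad” flag would be a marker file that the scheduler checks before starting a new run. Everything here (names, paths) is hypothetical:

```python
import os

MARKER = "backup-suspended.flag"

def suspend_backups(job_dir, reason):
    """Write the marker file; its contents record why the job was stopped."""
    with open(os.path.join(job_dir, MARKER), "w") as f:
        f.write(reason)

def backups_allowed(job_dir):
    """Scheduler check: refuse to start while the marker exists."""
    return not os.path.exists(os.path.join(job_dir, MARKER))

def clear_suspension(job_dir):
    """The manual override for false positives."""
    try:
        os.remove(os.path.join(job_dir, MARKER))
    except FileNotFoundError:
        pass
```
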
A different option could be using the Duplicati Portal, which includes anomaly detection for file changes and can alert you if the threshold has been exceeded. It is a little rough in the current version, as you cannot set the threshold yourself, but for your case it should work.
is a similar feature request with more discussion.
Duplicati doesn’t overwrite previous backups when source files change. It makes a new backup version.
Just be sure not to set your retention policy to something super short, like only keeping the latest version.
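For instance, a time-bucketed retention policy keeps older versions around even after many new backups run. The example below uses Duplicati 2’s `--retention-policy` advanced option; the exact bucket values are just an illustration, so check the options documentation for the precise syntax:

```
--retention-policy="1W:1D,4W:1W,12M:1M"
```

Read as: for the first week keep one version per day, for the first four weeks one per week, and for the first twelve months one per month. Even if ransomware-damaged backups start arriving, older clean versions survive for months.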
At some point the ransomware might get into Duplicati’s database or temporary files and kill the backup.
This isn’t a big deal as long as the destination files are still intact, so don’t back up to a destination that is easily wiped.