Ransomware protection

Hello,

Ransomware is a common problem these days, and backup software doesn't really offer a last line of defense against it.

As Duplicacy is indexing all files before backup anyway, wouldn't ransomware protection be a good feature? If more than x% of files have changed versus the last backup, fail with an error code and don't back up?

Yves

Welcome!

I disagree. Backups - the right kind of backups - are what allow people and companies to recover from ransomware (without paying the ransom). To be clear, it must be a true backup - not file synchronization. Versioning and retention are key, and the backup data needs to be stored somewhere that ransomware can't discover and/or doesn't have permission to access. (Cloud storage accounts are great for this, local disk targets not so much.)

I am aware of enterprise backup systems that have features to flag anomalies like this. The backup is still performed, because there's no added risk in completing the backup, but an alert is raised that something may need to be looked at more closely.

It might be cool for Duplicati to have this feature. I think it should still complete the backup though, in case it’s a false alarm.

By the way Duplicacy is a different backup program. :slight_smile:

Welcome to the forum @Yves_Smolders

To my understanding, it's not. Duplicati backs up while it's still finding files; you wouldn't want to wait before uploading…

Ransomware Detection with SyncBack describes a percentage-of-files-changed approach, along with another approach based on a change in a file of your choice that should NOT change, similar to this topic. That approach would let you write the check yourself using run-script-before to do whatever you like, including not backing up.
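If you went the script route, a pre-backup canary check might look something like the sketch below. This is purely illustrative, not anything Duplicati ships: the canary path and hash are placeholders, and I'm assuming the run-script-before-required variant, which as I read the scripting docs aborts the operation when the script exits non-zero.

```python
#!/usr/bin/env python3
# Hypothetical canary check for Duplicati's run-script-before-required.
# If the canary file is missing or its contents changed, exit non-zero
# so the backup is aborted. Path and hash below are placeholders.
import hashlib
import sys
from pathlib import Path

CANARY = Path.home() / "Documents" / "do-not-touch.txt"  # must NEVER change
EXPECTED_SHA256 = "replace-with-the-known-good-hash"

def main() -> int:
    try:
        digest = hashlib.sha256(CANARY.read_bytes()).hexdigest()
    except FileNotFoundError:
        # Deleted or renamed (e.g. given a ransomware extension): abort.
        print(f"Canary {CANARY} is missing - aborting backup", file=sys.stderr)
        return 1
    if digest != EXPECTED_SHA256:
        print(f"Canary {CANARY} changed - possible ransomware", file=sys.stderr)
        return 1
    return 0  # canary intact, let the backup proceed

if __name__ == "__main__":
    sys.exit(main())
```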

I worry about that plan, because I don't think the ransomware is guaranteed to clobber the chosen file first…
Ransomware that gets to either the Duplicati program or its database is likely to halt the backup at that point. Admittedly there is a risk of a messed-up backup (which would be unfortunate) if such a hard stop leaves things in a bad state.

According to the comparison chart, the feature is present even in SyncBack Free, which lacks versioning. That's probably where it's most useful. True versioned backups could just restore from an earlier version, provided the ransomware didn't get to the backup (which seems a bigger risk if the backup goes to ordinary files).

There are some other posts where people are busy figuring out the best way to protect their backup files.

I can see the value of a feature that skips the backup when ransomware has been detected, so that your backup won't be contaminated with files you do not want in it.
However, you would probably want the backup to run even while the machine is being overrun by ransomware, because you might save files that have changed since the last backup but haven't been corrupted by the ransomware yet…

A few years ago our company was attacked by ransomware. Our sales database was about 60GB, and because of its sheer size the ransomware had not yet encrypted the file when we discovered the infection. If we had been too late, I would have liked my backup software to have backed up our database.

BTW I never understood why the suggested anti-ransomware check could not be added as an option.
That way people could choose how Duplicati should react when it detects ransomware.

We live in a world of individual choices now!
Would you like it if your supermarket decided you were only allowed to eat peanut butter on your slices of bread? :grin: :grin:

*** the above is not an attack, but I believe in giving people choices and working towards solutions instead of talking about why not to solve a problem ***

With Duplicati it won't really contaminate your backup data. Because of the deduplication, blocks are tracked for each backup version independently. If Duplicati backed up ransomware files and you wanted to "undo" that backup, you could simply delete that specific backup version.

How would this be implemented? How ransomware operates varies greatly: each strain seems to use its own added extension, its own "decryption instruction" filename, etc.

I guess one idea is to somehow detect an anomalous amount of data being modified, or an anomalous number of file additions/deletions. It isn’t really clear-cut.
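Just to make the file-count idea concrete, a naive version of such a check could look something like this sketch. It is not Duplicati code; the source path, manifest location, and 25% threshold are all made up, and real detection would need to be much smarter:

```python
#!/usr/bin/env python3
# Sketch of the "x% of files changed" heuristic: compare a snapshot of
# (size, mtime) per file against the previous run and flag an anomaly
# if too large a fraction was added, deleted, or modified.
import json
import sys
from pathlib import Path

SOURCE = Path.home() / "Documents"              # hypothetical source folder
MANIFEST = Path.home() / ".file-manifest.json"  # state from the previous run
THRESHOLD = 0.25                                # flag if >25% of files differ

def snapshot(root: Path) -> dict:
    """Map each regular file under root to [size, mtime]."""
    manifest = {}
    for p in root.rglob("*"):
        if p.is_file():
            st = p.stat()
            manifest[str(p)] = [st.st_size, st.st_mtime]
    return manifest

def main() -> int:
    current = snapshot(SOURCE)
    if MANIFEST.exists():
        previous = json.loads(MANIFEST.read_text())
        added = sum(1 for k in current if k not in previous)
        deleted = sum(1 for k in previous if k not in current)
        changed = sum(1 for k, v in current.items()
                      if k in previous and previous[k] != v)
        ratio = (added + deleted + changed) / max(len(previous), 1)
        if ratio > THRESHOLD:
            # Alert (or abort, per the debate above); keep the old baseline.
            print(f"Anomaly: {ratio:.0%} of files differ since the last run",
                  file=sys.stderr)
            return 1
    MANIFEST.write_text(json.dumps(current))
    return 0

if __name__ == "__main__":
    sys.exit(main())
```

Whether an exit code like that should abort the backup or just raise a warning is exactly the trade-off discussed above.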