First off, I didn’t know what category to use for this post. The moderators should feel free to move it to a better-suited one.
The point of this post is to get input from people to help me determine if Duplicati is protecting my specific system or not.
I have a Linux server with multiple SMB shares. All the other computers on the network sync and replicate their data to these SMB shares.
I installed Duplicati on the Linux server to back up the contents of the SMB shares to a dedicated backup drive in the server. This drive is not shared and is only accessible by the server.
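For context, a minimal sketch of what such a job could look like from the command line, assuming Duplicati’s CLI and made-up example paths (the web UI works just as well):

```shell
#!/bin/sh
# Hedged sketch: back up the SMB share roots to the local, non-shared drive.
# SOURCES and TARGET are assumed example paths, not the poster's real layout.
SOURCES="/srv/samba"                          # assumed root of the shared data
TARGET="file:///mnt/backupdrive/duplicati"    # assumed mount point of the dedicated drive

# Only attempt the backup if the CLI is actually installed.
if command -v duplicati-cli >/dev/null 2>&1; then
    duplicati-cli backup "$TARGET" "$SOURCES" --passphrase="change-me"
fi
```

The passphrase here is a placeholder; in practice it would come from a file readable only by root.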
My understanding is that if I’m ever a victim of ransomware, it will most likely affect one of the client computers since those are the ones receiving emails and actively visiting websites etc. When that happens, the ransomware program will start encrypting all the files in that computer as well as any remote files accessible by that computer such as the SMB shares.
However, since the client computers do not have access to the backup drive in the server, it seems that the data in the backup files created by Duplicati prior to the infection should not be affected by the ransomware.
If this is true, then I should be able to use these old backup files to recover my data. Sure, I would lose the most recent version of my data, but the bulk of it should be recoverable.
Is my analysis correct?
Or am I missing something?
If it’s connected there is potential to be affected, and even if it’s not connected, depending on how it’s used, there’s still potential to be affected, just less so.
The whole thing is: if the ransomware is given what it needs (each variant needs different things), then it can do its thing. The less it’s given, the harder it is.
What happens if what you’re backing up is infected? It goes right to your backup drive. It may not be able to do anything there, which is a good starting point.
Can it gain access to the server from the client computers? If it cannot gain that access, it is kept from reaching the backups. That depends on a bunch of things. If it’s too easy, like no username and password on the server (just an open connection), then it will have a clear, unobstructed path.
Also, if you do something risky on the server itself, the server could end up being the entry point. There are also security flaws and user mistakes that could open it up.
And it would be better not to get infected in the first place. That almost entirely depends on what you do on the internet. E.g. the Duplicati forum, GitHub, etc. are 99.999% unlikely to ever give you ransomware, but other places most certainly can. The user is the first line of defence.
Even with a virus, an infected file just sitting there is just sitting there. If it’s run, there’s an issue.
Duplicati backup files are just data files, and wouldn’t be run short of heavy social engineering…
If one gets trashed (e.g. due to ransomware on the server), you lose the ability to restore from it.
You would probably do better getting opinions from experts who actually study this area in detail.
There are people who ask about how ransomware spreads. The combination described below, encrypted file contents plus a virus stuck in front to be run, isn’t common, but it’s a nice invention.
Thank you all for your responses.
It appears everyone agrees that as long as the server is not the one getting the ransomware AND the client computers cannot discover or access the target drive for the backups, the data in the old backup files should be safe from such a hypothetical ransomware attack.
Everything is password protected. The client computers do not have access to the server (at least not to the root account), only to the SMB shares. The backup drive is not shared and is therefore not discoverable from the network. The only way a client computer could access the backup files is if it SSHed into the server and logged in with the root credentials.
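To illustrate the point in smb.conf terms: the backup mount simply never appears in any share definition, so Samba never exports it (share names and paths below are made up):

```ini
# /etc/samba/smb.conf — illustrative fragment, not the poster's real config
[global]
    map to guest = Never          ; require valid credentials, no guest access

[documents]
    path = /srv/samba/documents   ; exported to clients
    read only = no
    valid users = @smbusers

# No section points at /mnt/backupdrive, so it is never exported.
# The effective share list can be checked with: testparm -s
```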
I thought about not having the drive mounted all the time, as you suggest: run a script to mount the drive before the backup job and another script to unmount it when the job finishes.
But I thought it would overcomplicate my setup. Besides, while this is an additional layer of protection, it is not a very effective one: when the backup drive is mounted, the client computer with the ransomware will also be active, so if that computer can discover/access the backup drive, the ransomware could still encrypt the old backup files.
The real protection comes from preventing the client computers from discovering/accessing the backup drive.
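For what it’s worth, the two scripts would be tiny. A hedged sketch, assuming a drive labelled `backup` and Duplicati’s `--run-script-before-required` / `--run-script-after` options (check your version’s docs for the exact names):

```shell
#!/bin/sh
# Hedged sketch of the mount/unmount hooks.
# The device label and mount point are assumptions for illustration.
BACKUP_DEV="/dev/disk/by-label/backup"   # assumed filesystem label of the dedicated drive
BACKUP_MNT="/mnt/backupdrive"            # assumed mount point, accessible by root only

# Hook for --run-script-before-required: a non-zero exit aborts the backup job.
pre_backup() {
    mount "$BACKUP_DEV" "$BACKUP_MNT"
}

# Hook for --run-script-after: take the drive offline again between runs.
post_backup() {
    umount "$BACKUP_MNT"
}
```

In practice each hook would be its own small script file passed to the corresponding option; as the thread notes, this only helps if the ransomware can’t reach the server itself.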
@ts678 I’m not big enough to be a target of social engineering just yet (maybe someday). Thanks for the link, it was a good read that also confirmed my conclusions.
Yes and no. “Should be” is about right, but it depends on the attack. If you don’t keep up with security updates and the ransomware targets a flaw where it can get in through SSH or an open port, and the server is discoverable on the network or it otherwise gains access to the SSH connection, then there’s still a chance.
“All depends” is closer to what I’d go with. SSH-only access is a good minimum to be at, though. The chances should be cut a lot compared to something like a publicly browsable server on the network or a drive permanently mounted on the server. But if you left a gaping hole in the server security, even unknowingly, all of that might not help.
As the recent Western Digital issue highlights: security needs to be maintained, or it’s Swiss cheese and not worth relying on.
This is true, as long as the backup drive doesn’t get compromised itself. For a private environment that’s already good protection, especially since the server at least doesn’t run Windows (which would make it an easy second target).
This is a good starting point. Depending on how much security you want, the next step is to avoid having the backup drive permanently connected to the backup machine. If you want to be sure, I’d recommend doing additional backups to a device which is not always connected to that machine, like an external drive. This saves your data from other dangers too. Remember that ransomware is an important but not the only danger: for example, an electrical short circuit or overvoltage would destroy the data on a second disk in the server if you don’t have an additional offline copy.
I personally do offline backups once per month using a dedicated backup PC. This machine doesn’t have any remote interface like SSH, VNC or anything else, just a minimal basic desktop with the required tools for doing backups. It connects to the servers which should be backed up, like my NAS server. So there is only a one-way connection, which reduces the attack surface massively. In case of an infection elsewhere, it’s extremely unlikely that this machine gets compromised.
And to make sure that my backups can’t get destroyed while a backup is running, I have two backup drives; at least an older state is still there if that happens. It may be rare, but since a full backup takes some time and ALL my data is stored there, I don’t want to take the risk.
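A simple way to alternate between the two drives automatically is to pick one by month parity. A minimal sketch, assuming hypothetical drive labels `backup-a` and `backup-b`:

```shell
#!/bin/sh
# Hedged sketch: choose one of two backup drives by month parity.
# The /dev/disk/by-label names are assumptions for illustration.
pick_drive() {
    # $1: month number (1..12, no leading zero)
    if [ $(( $1 % 2 )) -eq 0 ]; then
        echo "/dev/disk/by-label/backup-a"   # even months
    else
        echo "/dev/disk/by-label/backup-b"   # odd months
    fi
}

# Strip any leading zero so the arithmetic doesn't treat "08" as octal.
month=$(date +%m)
pick_drive "${month#0}"
```

The output would then feed the mount step of the backup script, so each drive only ever holds every other month’s run.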
Having more backups is never wrong either, if you think about other risks again. This already helped me with a problem where a sync client destroyed single files by syncing them as 0-byte files. It first happened on files I don’t use often, so it took some time before I noticed. By then, my main backups had already been overwritten with the 0-byte versions. Luckily I had a much older backup on my second drive, which let me recover the file.
Just keep in mind that THE prerequisite is that no computer with write access to the backup location is ever infected. This is of course especially true for the one used to create the backup. If it were infected, there is a high chance that your remote backup would be lost as well.
Duplicati needs to know the credentials to access the remote storage location. Unfortunately it stores them in a way that allows them to be recovered. There are quite a few trojans which check such locations. That is why (interactive) programs like FileZilla and WinSCP now have a master password.