We want to replace our backup solution.
Our current solution works like this:
In our company, each user has a laptop with a database application on it. The databases contain personal data and are encrypted because the laptops can easily be stolen.
When the database app is started, a backup of the database starts automatically. Unfortunately, the backup is neither encrypted nor incremental. During the backup process, a temporary file is created for each table in the database. Those temporary files are deleted after they are added to an encrypted archive, which is then automatically uploaded to a private ownCloud server. The encrypted archive is usually larger than 2 GB, so we need a lot of bandwidth and a lot of time to upload it.
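To make the flow concrete, here is a minimal sketch of the process described above (file names are placeholders, and the encryption step is omitted because the stdlib `zipfile` module has no AES support): one temp file per table, all of them bundled into a single archive, temps deleted afterwards.

```python
import os
import tempfile
import zipfile

def backup_database(table_files, archive_path):
    # Sketch of the current process: dump each table to its own temporary
    # file, add every temp file to one archive, then delete the temps.
    # (The real process also encrypts the archive before upload.)
    temp_paths = []
    try:
        for name, data in table_files.items():
            fd, path = tempfile.mkstemp(suffix=".tbl")
            with os.fdopen(fd, "wb") as f:
                f.write(data)
            temp_paths.append((name, path))
        with zipfile.ZipFile(archive_path, "w", zipfile.ZIP_DEFLATED) as zf:
            for name, path in temp_paths:
                zf.write(path, arcname=name)
    finally:
        for _, path in temp_paths:
            os.remove(path)  # temp files are deleted after archiving
    return archive_path
```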
So, we want to replace our backup solution with one that needs less bandwidth and less time to finalise the backup.
If we use Duplicati, we can't back up the encrypted archive, because (at least in theory) each daily encrypted archive will differ significantly from the previous one. We would probably need to back up the temporary files instead and delete them after the backup.
In this scenario, can we gain time and/or bandwidth with Duplicati over our current solution?
Will Duplicati be confused by the fact that each file is recreated daily?
Can we find other scenarios for our backup process?
Hello and welcome!
Yep, I agree with this. Would be better to let Duplicati back up the database dumps. Hopefully the deduplication will work well and you’ll save on bandwidth, but you’ll need to test to find out if it saves time overall.
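Here's a toy illustration (not Duplicati's actual format) of why deduplication works on the raw dumps but not on a freshly encrypted archive: identical plaintext blocks hash the same, while encrypting with a new random key each day makes every block unique, so nothing deduplicates between days.

```python
import hashlib
import os

BLOCK = 1024

def unique_blocks(data):
    # Split data into fixed-size blocks and count distinct block hashes --
    # roughly how block-based deduplication decides what must be re-uploaded.
    hashes = {hashlib.sha256(data[i:i + BLOCK]).hexdigest()
              for i in range(0, len(data), BLOCK)}
    return len(hashes)

def keystream(key, length):
    # Hash-counter keystream standing in for a real cipher; any secure
    # cipher with a fresh key/IV has the same effect on deduplication.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def toy_encrypt(data, key):
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

dump = b"A" * (10 * BLOCK)           # 10 identical plaintext blocks
day1 = toy_encrypt(dump, os.urandom(32))
day2 = toy_encrypt(dump, os.urandom(32))

print(unique_blocks(dump))           # plaintext dump: 1 unique block
print(unique_blocks(day1 + day2))    # two encrypted archives: 20 unique blocks
```

The same logic applies across days: unchanged table dumps produce the same blocks tomorrow, so only changed blocks get uploaded.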
The fact that the files are recreated every day isn’t a problem.
I’d give it a shot and see how it works for you.
We use only Windows clients.
I read the manual. There is a paragraph:
Duplicati can make backups of files that are opened by other processes. For Windows, a snapshot of the file system is created using Volume Shadowcopy Services (VSS), LVM is used on Linux Systems. To be able to create a VSS snapshot, Duplicati needs C++ run-time components for Visual Studio 2015 to be installed and must be run with administrator privileges.
Our database app uses a separate file for each table (~150 files). Each table is opened only when needed. Can I assume that, using this feature, I can back up my database folder without the need for a separate dump? i.e., in my scenario, can I skip the step where the temporary files are created?
VSS can let you back up open/locked files, but be aware that they may not be in an “application consistent” state. It depends - does the database engine include VSS support? (What database engine is it, anyway?)
If “application consistent” snapshot cannot be achieved then you’ll end up with a “crash consistent” backup. The files will be backed up, but they may not be in an ideal state. You’d want to do some testing. Or ask the database engine vendor if they support backing up the live data files using VSS.
Thank you again!
The database is nexusDB (nexusdb.com). They recommend a database backup. So, we will keep the step that creates the temporary files and we will back up those temporary files. Probably we will use the Duplicati command-line tool to do this step. We will start some tests.
I have another question:
Can I “save” the same backup to three destinations? For example, I want to save the backup locally (in another folder), to an SFTP account, and (from time to time) to a USB stick. I can't find such functionality in the manual!
Duplicati doesn’t support sending backups to multiple destinations simultaneously.
You could set up three backup jobs, each targeting a different destination. They will operate independently of each other. Personally I am not a fan of this approach.
The other (better) option in my opinion is to back up with only one job, maybe locally. Then use a different tool to synchronize this backup data to other locations as needed.
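As a sketch of that second approach (in practice a real tool like robocopy, rsync, or rclone would do this, with deletion handling, retries, and verification), a one-way mirror of the local backup folder to a second location boils down to:

```python
import os
import shutil

def mirror(src, dst):
    # One-way sync: copy files from src to dst that are new or changed.
    # Only the core idea -- real sync tools also propagate deletions,
    # retry failed transfers, and verify what was copied.
    copied = []
    for root, _dirs, files in os.walk(src):
        rel = os.path.relpath(root, src)
        target_dir = os.path.join(dst, rel)
        os.makedirs(target_dir, exist_ok=True)
        for name in files:
            s = os.path.join(root, name)
            d = os.path.join(target_dir, name)
            if not os.path.exists(d) or os.path.getmtime(s) > os.path.getmtime(d):
                shutil.copy2(s, d)  # copy2 preserves the modification time
                copied.append(name)
    return copied
```

Because Duplicati's backup files on the destination never change once written (new versions add new files), this kind of simple mirror moves only the new data each run.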
I appreciate your suggestions. Your suggestion about three backup jobs looks very interesting.
In this scenario, I need:
- First, start my database backup process
- After the database backup is finished, start the Duplicati backup job
- After the Duplicati backup job is finished, delete the temporary files, because they aren't encrypted
Can I create such type of automation with Duplicati?
I intend to use an SFTP server as the destination. If that server fails and I replace it with a new one, will Duplicati be able to reinitialise the backup automatically?
Also, for the SFTP server, do you recommend ZFS, Btrfs, or another file system?
I ask about file systems because I know that USB sticks and HDD/SSD drives are not reliable over time, and I can't be sure that a backup file will be kept unaltered for a long time on an HDD/SSD or a USB device. Does Duplicati have file-recovery functionality if I try to restore files from an altered backup?
Duplicati supports running scripts before and/or after the backup job, so it’s very flexible. I think it can do what you’re looking for.
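As one way to wire the three steps together, a small wrapper script could chain them (the command names and paths below are placeholders, not real commands). Alternatively, Duplicati's `--run-script-before` and `--run-script-after` advanced options can trigger the dump and the cleanup from within the backup job itself.

```python
import pathlib
import subprocess

def run_chain(dump_cmd, backup_cmd, temp_dir, runner=subprocess.run):
    """Run the dump, then the Duplicati job, then delete the plain-text dumps."""
    runner(dump_cmd, check=True)    # 1. database app creates the table dumps
    runner(backup_cmd, check=True)  # 2. Duplicati backs up the dump folder
    for f in pathlib.Path(temp_dir).glob("*"):
        f.unlink()                  # 3. remove the unencrypted temp files

# Placeholder invocation -- the real names depend on your environment:
# run_chain(
#     ["DatabaseApp.exe", "/backup"],
#     ["Duplicati.CommandLine.exe", "backup",
#      "ssh://backupserver/database-backups", r"C:\DatabaseBackup\temp"],
#     r"C:\DatabaseBackup\temp",
# )
```

With `check=True`, a failing dump or backup step raises an exception before the temp files are deleted, so a failed backup never destroys the only copy of the dumps.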
Would you lose all the backup data?
If not, Duplicati might not even notice. If you do lose backup data, Duplicati would notice the files missing and complain. If you are ok with losing the backup data, you can simply delete the local job database to have Duplicati start over.
Both ZFS and Btrfs are great filesystems for avoiding silent corruption of data. Can’t go wrong with either, in my opinion. Cloud storage is also extremely durable.
If any back-end files are modified or corrupted, it will affect the reliability of your restores, and some data may not be recoverable. Check this out; it walks you through a simulated disaster where some files are corrupted: Disaster Recovery - Duplicati 2 User's Manual
This is exactly what I need.
I will start some tests next week.
Thank you for your effort!