Duplicati for business backup

Hi everyone

  1. Can Duplicati be used for business purposes, i.e. for backing up business files?
  2. As far as I can see, Duplicati does not make backups the way other backup tools do. The backup at the destination is compressed and can’t be downloaded without Duplicati, right?
  3. If it can’t (per the previous question), how can I be sure that I will be able to download the backup and that nothing will go wrong?
  4. Is there any other way to retrieve the backup files if the normal way fails for some reason?
  5. Has anyone had a case where they failed to download their backup and lost it?

Thank you
  1. Sure, why not.
  2. “Other tools” is vague. The backup can be downloaded without Duplicati, and decompressed and decrypted, but that’s pretty much where it ends. Then you’ve got the data, but it’s in small unsorted blocks. If the data is important it can be recovered from that stage; if not, it’s better to forget the process. It’s just like recovering data from a hard drive after the drive metadata is lost: doable if required, but in normal terms “impossible”.
  3. Having multiple copies and automated, scheduled, systematic restore testing (a minimal restore-test sketch follows this list). Without that, there’s no way to be sure with any software or method.
  4. Basically no, as mentioned in point 2.
  5. Yes, nothing new there. Yet it happens with other tools as well; only the rate differs. It may be due to the underlying storage, the network, a software bug, logic corruption, etc.
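
On the restore-testing point, one simple way is a scheduled restore of a known file into a scratch folder, followed by a compare against the live copy. A minimal sketch with the command-line tool, assuming placeholder values for the storage URL, passphrase and paths (adjust all of them to your setup):

  # Restore one known file from the latest backup version (version 0)
  # into a scratch folder, then diff it against the live copy.
  Duplicati.CommandLine.exe restore <storage-url> "C:\Data\important.docx" --passphrase=<passphrase> --restore-path="C:\RestoreTest" --version=0

Running something like that on a schedule, and checking the result, is what systematic restore testing means in practice.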

To add:

2: “other backup tools” confused me too. Most don’t just copy files directly. Got names? Sync tools would, however you lose a lot, such as the ability to get at old versions at all, or efficient storage of versions. The Features page shows several things that would not be possible with direct file copy, however there are definitely some tradeoffs.

4: Paths beyond the “normal way” include downloading directly from the destination using any tool it supports, and Duplicati.CommandLine.RecoveryTool.exe, which is sometimes needed if the backup has been damaged. The Independent restore program is a Python script that doesn’t use any of the normal Duplicati code.
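
For reference, the RecoveryTool path looks roughly like this, going from memory of the manual’s disaster-recovery steps, so treat the exact arguments as an assumption and check the tool’s own help output:

  # 1. Pull the raw dblock/dindex/dlist files from the destination to a local folder
  Duplicati.CommandLine.RecoveryTool.exe download <storage-url> D:\recovery --passphrase=<passphrase>
  # 2. Build an index over the downloaded blocks
  Duplicati.CommandLine.RecoveryTool.exe index D:\recovery
  # 3. Restore files from the indexed blocks
  Duplicati.CommandLine.RecoveryTool.exe restore D:\recovery

Because it works on the destination files directly, it can sometimes pull data out even when a normal restore or database recreate fails.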

Agree with the prior answers to 3 and 5. There are likely more stable backup programs than Duplicati (which is still in beta), but there are no perfect ones. I feel uneasy looking at this list of questions, at all the complex issues it’s not considering. Possibly some further study of how to choose a backup program for your specific needs would be helpful.

Hi, all.
First of all, sorry for my English; it is not my native language.
I had no intention of belittling the program, and maybe my questions sounded like that, but it wasn’t the intention. I know the program is good, but what worries me, unlike with other paid programs (Iperius Backup, CloudBerry, GoodSync…), is that Duplicati, for example on Backblaze, stores compressed files, and if one of those files gets corrupted or accidentally deleted, I can no longer download any of the backed-up documents. Or am I wrong?
The other backup programs listed above back up files directly, so if one document gets damaged I can still download the others straight from Backblaze, and the damage is not so great.

Have a look at the manual: Introduction - Duplicati 2 User's Manual

Online backup verification
Duplicati is built to work with simple storage systems. Many providers offer compatible storages and often at cheap prices. As a downside of this, some storage systems may store corrupted data. Most people only notice the corruption when they attempt to restore files they have lost and restoring fails. To avoid that Duplicati regularly downloads a random set of backup files, restores their content and checks their integrity. That way you can detect problems with your online storage before you run into trouble.

Thank you for the clarification

Do you mean Duplicati does it automatically in the background?

AFAIK it is done every time you run a backup, for a portion of the data.


@T-6 is correct that it’s automatic. The manual says more, and note that it’s not in the background:

Verifying backend files

At the end of each backup job, Duplicati checks the integrity by downloading a few files from the backend. The contents of these files are checked against what Duplicati expects them to be. This procedure increases the reliability of the backup files, but backups take a bit longer to complete and use some download bandwidth.

backup-test-samples defaults to 1, but you can set it far higher, or use this option instead:

  --backup-test-percentage (Integer): The percentage of samples to test after
    a backup
    After a backup is completed, some (dblock, dindex, dlist) files from the
    remote backend are selected for verification. Use this option to specify
    the percentage (between 0 and 100) of files to test. If the
    backup-test-samples option is also provided, the number of samples tested
    is the maximum implied by the two options. If the no-backend-verification
    option is provided, no remote files are verified.
    * default value: 0

The TEST command explains what a sample is – typically 3 files in 1 sample set, but it may be fewer.
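
As an illustration (the storage URL and source path are placeholders), raising the sample count or percentage is just a matter of adding the option to the backup job, and the same verification can be run on demand with the TEST command:

  # Verify 5 sample sets after each backup instead of the default 1
  Duplicati.CommandLine.exe backup <storage-url> "C:\Data" --backup-test-samples=5
  # Or verify a percentage of the remote files after each backup
  Duplicati.CommandLine.exe backup <storage-url> "C:\Data" --backup-test-percentage=10
  # Run verification by itself, testing 10 sample sets
  Duplicati.CommandLine.exe test <storage-url> 10

In the GUI, the same option names go under the job’s advanced options.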

There is also a complete file listing on every backup, to check remote file names and sizes, and this theoretically will catch most problems of damage on the remote, avoiding the need to download files to look.

A fine but important point is that this does not do or replace doing file restore tests. It checks that the backup files look as the local database says they should, and it also self-checks the local database.

A damaged or deleted destination file is a legitimate concern for any space-efficient backup format that does things like saving only changes rather than entire files. Loss of a single remote file may affect many source files. The AFFECTED command shows which versions and files are impacted. In contrast, keeping many separate complete file copies takes more space, while making it even less likely that damage to a single version of a single file will cause wider damage.
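
For example, if the destination reports a damaged file, you can pass its name to the command (the file name below is a made-up placeholder in the usual dblock naming pattern):

  # List which backup versions and source files depend on a given remote file
  Duplicati.CommandLine.exe affected <storage-url> duplicati-b0123456789abcdef.dblock.zip.aes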

Duplicati’s database keeps records not only of the destination files but also of the source files that were backed up, and when a restore is done, the final step is to make sure the right file content was restored (or you’ll be told).

So the database checks on things that may go wrong, but the question then is what happens if things go wrong with the database itself, which can sometimes happen. Generally, issues are either repairable with various tools, or the database is recreated from the destination. Fixing a database that’s gone wrong can vary in its pain level.
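
The command-line versions of those recovery steps look roughly like this (placeholder URL and database path; the GUI has matching Repair and Recreate buttons under the job’s Database menu):

  # Try to fix inconsistencies between the local database and the destination
  Duplicati.CommandLine.exe repair <storage-url> --dbpath="C:\path\to\job.sqlite"
  # If the database is beyond repair: move the .sqlite file aside and run the
  # same repair again, and the database is recreated from the destination files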

So from a reliability point of view, and especially if you will keep only one backup copy (which is dangerous), and even more so if you delete the original after making the remote copy (something I never suggest), the copy-the-file-lots-of-times approach probably wins on reliability, provided there’s some check that the file actually made it to the destination. Duplicati checks that the specialized files it uploads show up correctly in the remote file listing; it’s not simply “assumed”.

So these are some of the tradeoffs, and we haven’t even started on the other things you may want in a backup. Ultimately, of course, it’s your business needs that need to be met, and it’s your choice how to meet them.