Incremental Differential Backups that include removal of data from source? Is there an easy way OR what if I do it manually?

The Overview says:

Duplicati is not:

  • A file synchronization program.

Duplicati is a block based backup solution.

(If it were a sync program, it could mirror deletions; backup programs like Duplicati work differently.
A particular point-in-time backup version will of course match the source as it was at that moment, but that’s exactly what it’s supposed to do.)
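To illustrate what “block based” means for deletions, here is a minimal, hypothetical sketch (this is not Duplicati’s actual implementation; block size, names, and data structures are invented for the demo). Each backup version records which content-addressed blocks make up each file, so deleting a file from the source only means newer versions stop referencing its blocks — older versions can still restore it:

```python
import hashlib

BLOCK_SIZE = 4  # tiny for the demo; real tools use much larger blocks

def make_backup(store, files):
    """Record one point-in-time version: split each file into blocks,
    store blocks by content hash, and map filenames to hash lists."""
    version = {}
    for name, data in files.items():
        hashes = []
        for i in range(0, len(data), BLOCK_SIZE):
            block = data[i:i + BLOCK_SIZE]
            h = hashlib.sha256(block).hexdigest()
            store.setdefault(h, block)  # deduplicated block store
            hashes.append(h)
        version[name] = hashes
    return version

def restore(store, version, name):
    """Reassemble a file from the blocks referenced by one version."""
    return b"".join(store[h] for h in version[name])

store = {}
v1 = make_backup(store, {"a.txt": b"hello world"})
v2 = make_backup(store, {})  # a.txt was deleted from the source

# The newest version no longer lists a.txt...
assert "a.txt" not in v2
# ...but the older version still restores it from the block store.
assert restore(store, v1, "a.txt") == b"hello world"
```

The point of the sketch: the deleted file keeps existing inside the backup as long as some retained version references its blocks.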

“Drop”, “Delete”, and “Don’t keep” (after a specific number of backups) all control the same thing: which backup versions are retained and which get deleted.

Correct, because you still have a backup version holding the file, though you might have to go find it.
Note that “persist” doesn’t mean a file of that name is sitting on the destination; the data is stored in a special backup format.
That’s why I’m trying to steer you away from relying on this: long-term format support is a risk for long-term archiving.
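A hedged sketch of how a keep-N retention rule interacts with a file deleted from the source (illustrative only, not Duplicati’s code; the version list and `prune` helper are invented for the demo):

```python
def prune(versions, keep):
    """Keep only the newest `keep` versions (list is ordered oldest first)."""
    return versions[-keep:]

# Each version maps filenames to contents (heavily simplified).
versions = [
    {"a.txt": "old"},   # version 1: a.txt exists in the source
    {"a.txt": "old"},   # version 2: still exists
    {},                 # version 3: a.txt deleted from the source
    {},                 # version 4: still gone
]

kept = prune(versions, keep=3)
# a.txt is absent from the newest versions, but version 2 still holds it...
assert any("a.txt" in v for v in kept)
# ...until enough newer backups push it past the retention window.
assert not any("a.txt" in v for v in prune(versions, keep=2))
```

This is why a tight retention setting plus deleted source files eventually means the file is gone everywhere.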

Duplicati will not intentionally delete it, but your data risks permanent, irrecoverable loss if Duplicati fails.
This is why I’m trying very hard to discourage you from deleting the original files. What if you want them in 20 years?

Archival storage should use extremely standard formats and not rely on any particular software package. While Duplicati does offer a Python script to restore without Duplicati, Python itself also changes periodically.

Although cloud storage always carries some risk of the vendor going away, Google seems pretty safe, until someone gets into your account and deletes everything, or your credit card fails and your data is deleted with it.

There’s really no substitute for actual backups (multiple file copies) if you truly seek to avoid losing data.

I have no product recommendations. Duplicati is a backup program, and it is poorly suited to archive-and-delete usage.
The Add a global search for all versions of backup feature request would make finding files a little simpler, but I still think you’d have a hard time finding any software that guarantees perfect archiving forever…

Records retention is essential to businesses (and sometimes required by law), so I suspect costly software exists for it. Periodic migrations may be required as technologies change over the decades. Time frames matter too.

From a restore point of view, you should (until something breaks or too much time passes) be able to get the source files back from the backup files thanks to “Keep all backups”. I still advise against doing it this way: “should” is not always “will”, and if you hang around the forum, you can see that things do sometimes go bad. Duplicati in its current Beta form works very well for many people, but giving it your only copy forever? Risky.

If “brick” means the weird breakage that happens when a disk fills, Duplicati can suffer such breaks too.
In terms of “safely”, I don’t think your plan is particularly safe, except that off-site storage protects against drive loss. Only you can decide how much risk you want to take with your data. I’ve laid out my thinking extensively…

As a side note on Google: if you use Google Drive and allow it to sync the remote Duplicati backup, that adds further local storage usage, but it can be configured. Google Cloud Storage is a different Google offering.

If your computer is teetering on the verge of collapse, you can try some lower-risk things to free up space, such as buying a USB hard drive to temporarily offload files while you work out a long-term plan.
You could also upload some big files to Google using its web UI or some GUI client; such software exists.
