Backup files forever


A special customer requirement is: back up all files and store them forever. In other words, if someone deletes a file, we should still be able to find that file in the backup and restore it years later. The customer also wants to back up all versions of a file and to restore a specific version of a file years later.
So Duplicati should back up all files but never delete a file, not even an old version.

Question: is this possible with Duplicati?


“Yes, BUT.” Duplicati by default saves all backup versions forever, so they just build up and occupy space. One can set retention rules to reduce them, but if a rule deletes the only version that held a short-lived file, it’s gone. There’s also no great UI for searching for deleted files. If one knows the name, the find command can help. If one knows the timeframe when the file existed, and what folder it was in, one can look through old versions in search of it.

Duplicati is not an archiver of the kind one might use for mandatory records retention. Beyond not having the right UI, forever is a long time, and programs designed for it may have provisions for data retrieval in some distant future. This is the sort of thing serious long-term archivers may consider.
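To illustrate the find-by-name approach from the command line, here is a sketch. The storage URL and filename pattern are placeholders, the binary name varies by platform (`duplicati-cli` on Linux, `Duplicati.CommandLine.exe` on Windows), and the version number is made up for the example:

```shell
# List every backup version that contains a file matching the pattern
# (URL and pattern below are illustrative placeholders).
duplicati-cli find "file:///mnt/backups/mybackup" "*report.xlsx" --all-versions=true

# Once the right version is identified (say version 42), restore the file
# from that specific version into a scratch directory:
duplicati-cli restore "file:///mnt/backups/mybackup" "*report.xlsx" \
  --version=42 --restore-path=/tmp/recovered
```

This only works if one already knows (part of) the filename; there is no built-in way to browse all deleted files across all versions at once.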

Digital Preservation at the Library of Congress

The Independent restore program can restore your backup without any Duplicati code being runnable, but the ability to rummage through old versions for long-ago-deleted files would have to be added.

So the “yes” is that Duplicati’s intention is to allow restore of deleted files and of specific old versions of files.

There are lots of “but …”. Duplicati is currently beta, and the idea of using a beta product for forever-saves doesn’t fit well. Sometimes issues arise, and sometimes the solution is to delete a broken version, or to start the backup fresh. Either one risks losing your old versions. I personally think Duplicati is better for short-term backups.

There are also the UI issues mentioned. There have been some requests for UI enhancements to allow an easier all-at-once view of versions, or maybe even special handling of deleted files. None of that is done yet. Duplicati is also not a continuous backup, so if one backs up daily, a file that was created and deleted within the same day might not be in any backup. I don’t know if that’s important, but I mention it just in case it is…

Basically, I don’t personally think it’s currently a good fit for this use. You might prefer a special-purpose tool aimed at archiving (especially if there is any intention of deliberately deleting the originals), from an established company with a well-proven product whose design and planning are focused on forever.


Working in the healthcare industry in the US, there are certain files that are required to be kept for 7 or 10 years. That takes a good chunk of money for a professional solution with multiple tiers of backups, plenty of storage for the current files, and plenty of extra space for potential future files.
The primary backup system saves the files to at least 2 locations on-site but in different areas of the primary building/property, another location off-site but still somewhat local (within a 1-hour drive), and another off-site location at a distance (in case of major destruction from a natural disaster like a hurricane wiping out hundreds of miles).

Then you can have secondary and tertiary backup solutions for the same data using other options, and this is where Duplicati comes in. Duplicati should never be the primary for critical information.

“Duplicati by default saves all backup versions forever” … is this good practice? We are setting up users to back up to their Google Drives every 6 hours as a “secondary” backup system for their laptops/desktops, where they technically have unlimited space, so space isn’t an issue. Could the process of restoring files become ridiculous, say, if there are 5+ years of backups? Would it be better to limit this to about a year, or would it not make that much of a difference? Thanks.
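For what it’s worth, the retention choice is just a backup option, so either policy is easy to express. A sketch of the relevant advanced options (option names as in Duplicati’s advanced options; the values here are illustrative, not recommendations):

```shell
# Keep every version forever: simply set no retention option (the default).

# Cap retention at roughly a year of backups:
#   --keep-time=1Y

# Or thin out old versions instead of dropping them outright, e.g.:
#   --retention-policy="1W:1D,4W:1W,12M:1M"
#   (within 1 week keep one backup per day, within 4 weeks one per week,
#    within 12 months one per month; older versions are deleted)
```

Note that a thinning policy like the last one can still discard the only version that held a briefly existing file, which matters if “find anything ever deleted” is the goal.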