Will Duplicati Delete Newly-Ignored Directories?

I have a directory that has been in my backup configuration for a long time, but I now want to change my configuration so that Duplicati ignores that directory. Once I make that change, will Duplicati go back and delete that directory (and its files) from all previous backups? If not, how can I retroactively delete that directory and its files from my prior backups?

Thanks in advance.

No, it will not do this automatically. You can do it with the PURGE command, but use it with caution: try a dry run first, then maybe run without dry-run on a single backup version. If that looks good, you can run it against all backup versions.
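For CLI users, that staged approach might look roughly like the sketch below. The storage URL and source path are placeholders, and the exact option set can vary by version, so check `duplicati-cli help purge` before running anything:

```shell
# Hypothetical storage URL and source path -- substitute your own.
URL="b2://mybucket/backup"

# 1. Dry run first: report what would be purged without changing anything.
duplicati-cli purge "$URL" "C:\backup source\old-dir\*" --dry-run

# 2. Looks right? Purge for real, but only in the newest version (version 0).
duplicati-cli purge "$URL" "C:\backup source\old-dir\*" --version=0

# 3. If that went well, run it against all backup versions.
duplicati-cli purge "$URL" "C:\backup source\old-dir\*"
```

The same commands can be run from the GUI Commandline page by switching the command to `purge` and adjusting the arguments.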

Do you purge much? I pretty much never do, and I also sometimes fight losing battles with the Filters syntax.
The scariness of a purge dry run is, in my view, more a UX issue than actual bugs, but I’m reluctant to dive into all of it.

The main reason for the lack of detail is that dry-run information comes out at the DryRun log level, not as regular output.
That output is a bit more reassuring (hopefully not falsely reassuring, since it’s just a log level) for CLI users, e.g.:

<purge command stuff> --console-log-level=dryrun
  Listing remote folder ...
[Dryrun]:   Purging file C:\backup source\sub1\I386\length1.txt (1 bytes)
[Dryrun]: Would write files to remote storage
[Dryrun]: Would upload file duplicati-20210718T193954Z.dlist.zip (350 bytes) and delete file duplicati-20210718T193953Z.dlist.zip, removing 2 files
[Dryrun]:   Purging file C:\backup source\sub1\I386\length1.txt (1 bytes)
[Dryrun]: Would write files to remote storage
[Dryrun]: Would upload file duplicati-20210718T220905Z.dlist.zip (350 bytes) and delete file duplicati-20210718T220904Z.dlist.zip, removing 2 files

I guess GUI users use GUI Commandline, or the live log, or use `--log-file=<path> --log-file-log-level=DryRun`.
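For the log-file route, the options would be appended to the purge arguments (in GUI Commandline or on the CLI). The log path here is hypothetical; the option names come from Duplicati's standard advanced options:

```shell
# Appended to a purge invocation; writes dry-run detail to a file
# instead of (or in addition to) the console.
--dry-run --log-file="C:\logs\purge-dryrun.log" --log-file-log-level=DryRun
```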

No, almost never. I did several purge operations in the past when I was changing my backups from a single large one to multiple smaller ones. I wanted to keep the backup history of my main data files, so I ended up keeping those directories in the main backup job. I purged the VM, picture, music, etc. files from the main backup job history and then created new backup jobs for those other files (where I didn’t care about the history as much).

The purge operation worked really well, but I was quite careful: I kept a copy of the back-end data and job database just in case something went wrong.

Do you purge much?

Never, unless it’s a major failure: for example, a lot of data getting backed up unintentionally, or something being backed up that should never have been backed up in the first place for infosec reasons.

Otherwise the data will expire when it expires, and that’s the way of life. It’s even OK according to the GDPR.

Btw, today I restored some stuff; gotta make yet another donation to Duplicati. It was easier than recovering the EFS-encrypted data from a damaged SSD. Still, just for sport, I did recover a few files from the drive as an exercise, confirming that the process works if required.


I see no reason to purge. The old stuff will age out in about a year. And I’m not running out of storage.

Wonder why OP wants this to happen?

Every user of any backup regime should do this about quarterly, just to make sure they can. Well done, @Sami_Lehtinen!

Depends on retention settings. In my case I keep some versions indefinitely and wanted to purge the data that I moved to a different backup set.

Some people might find that a backup uses more space than they would like, so they trim it back.
Some of my storage is pay-by-size, and some is fixed-size, although one could pay for a bigger size.

Visualize backup data usage
Backend quota is close to being exceeded

Unchecking or excluding doesn’t purge old versions; that’s a manual task with a rather difficult UI, IMO.
I wonder what to do about a risky, maybe rarely-used operation with a bad UI, given few dev resources?

I “think” the source folder chooser is implemented with filters, but manual filters can be dangerous.
Getting them right can be hard, and turning a flaky filter loose on a purge may risk disaster.
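One way to reduce that risk is to rehearse a filter against the live source tree before letting a purge anywhere near it. Duplicati’s test-filters command walks the source and reports what the filters include or exclude, without touching any backup data. The path and filter below are placeholders:

```shell
# Substitute your real source path and filter expression.
# Each file under the source is listed with its include/exclude decision,
# so a flaky filter shows itself here rather than during a purge.
duplicati-cli test-filters "C:\backup source" --exclude="*\I386\*"
```

If the report matches what you expect, the same filter expression is much safer to reuse in the job configuration or a purge.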