Purge files older than x months

On an existing backup, I want to do a manual purge for several files older than xx months. For a dry run, I just tried it with the find command first.

Duplicati.CommandLine.exe find file://M:\xxx\ "desktop.ini" --dbpath="M:\xxx.sqlite" --time=-6M --all-versions=true --dry-run=true --debug-output=true

Duplicati.CommandLine.exe purge file://M:\xxx\ "desktop.ini" --dbpath="M:\xxx.sqlite" --time=-6M --all-versions=true --dry-run=true --debug-output=true

The find command seems to find all backup snapshots older than 6 months. However, I get a strange error message:
Der Wert "-6M", der an --time ausgegeben wird, stellt keine gültige Zeit dar
(the value "-6M" given to --time is not a valid time)

Does -6M mean everything older than 6 months?

Furthermore, I get strange differences between the find and purge commands with exactly the same arguments.

find: besides the error message listed above, it finds these files, as expected, in 2 backups older than 6 months.

purge: gives the text "Listing remote folder …" and then nothing else. No useful output. Shouldn't the dry-run option list all files it is going to purge?

Shouldn’t I be able to use the same option on both commands?


I don’t trust this documentation because my results differ, but here it is:

Duplicati.CommandLine.exe help time

Duplicati supports absolute and relative dates and times:

  now --> The current time

  1234567890 --> A timestamp, seconds since 1970.

  "2009-03-26T08:30:00+01:00" --> An absolute date and time. You can also use
  the local date and time format of your system like e.g. "01-14-2000" or "01
  jan. 2004".

  Y, M, D, W, h, m, s --> Relative date and time: year, month, day, week,
  hour, minute, second. Example: 2M10D5h is now + 2 months + 10 days + 5
  hours.

In my limited testing, it seemed like find with a calendar date worked. A purge is more dangerous.
What are you trying to do? Do you have some other backup of (presumably still useful) older files?

Yes, but shouldn't find and purge behave the same way?

What I want to do: I have smart retention, but I want a shorter retention time on selected paths, for example the folder where I store my raw photos before I sort, select, and archive them. The pictures and videos I have deleted don't need to waste space for 10 years.

I want to purge these folders after 6 months. Since a retention policy only exists for the whole backup, I was hoping to be able to purge manually.

(Yes, 2 separate backup jobs with 2 retention policies are also an option, but then I would lose deduplication when I move a file from the "working folder" to the "archive".)

Is this what’s called “working folder” later, as implied by being before archiving?

What backs up that archive? Duplicati is intended for backups, not for archiving.
There’s not a great UI for finding files, and don’t count on backup going 10 years.
If you have Duplicati do both folders, space use doesn’t shrink from purging one.

Although I’m not sure this use case is workable, can you step-by-step it further?
You’re in a corner that seems little used/understood. A purge goof is dangerous.

I’d also note that the time syntax doesn’t say anything about negative times used.
The purge documentation says “Selects a specific version to purge from.”, which
doesn’t match the topic title, but it now sounds like old versions (not files) may do.
I haven’t tested that, but if it doesn’t work for you, then I guess try some other way.
Depending on your work flow, maybe dates (if they work here) would be enough…

There are backup programs that deduplicate across different backups, and likely
allow different retention of different backups, but the same issue exists where the
space can’t be released until the last backup lets it go (or at least I’d assume so).

Here it says the time can be negative as well:

But maybe I have to use now-2M … I’ll have to try in the evening.
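One way to sidestep the relative-time parsing question entirely, since the help text says absolute dates are accepted, is to compute an absolute cutoff date first and pass that to `--time`. A minimal sketch, assuming GNU `date` is available (e.g. via Git Bash or WSL on a Windows machine); the path and filter below are placeholders, not the real ones from this backup:

```shell
# Compute an absolute date 6 months in the past (GNU date syntax).
cutoff=$(date -d "6 months ago" +%Y-%m-%d)
echo "$cutoff"

# The computed date could then replace the relative -6M value, e.g.:
# Duplicati.CommandLine.exe find file://M:/backup/ "desktop.ini" --time=$cutoff --dry-run=true
```

Whether Duplicati then treats that date as "everything at or before this time" is exactly the question in this thread, so a `find`/`--dry-run` check first is still essential.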

Although I’m not sure this use case is workable, can you step-by-step it further?

It is easier than it looks

I want to have 2 retention policies on the same backup job. Since this is not possible, I am thinking about a manual purge.

On my local disk, I have some folders which are backed up with a single backup job.

Folder A: raw data from camera / phone
Folder B: my picture collection.

All my raw pics and videos go into Folder A, let's say on a daily basis. Every now and then, I sort them, delete the majority of the pictures, and move the good pictures (and videos) to Folder B.

Both folders shall be backed up with one job, but the backup of Folder A shall be purged after 6 months. Why: the pictures I chose to delete in Folder A shall be removed from the backup after 6 months to save space.

I don't want to use Duplicati as an archive. I just want some folders to have a shorter retention time. So the idea is to use purge from time to time, point it at the path of Folder A, and purge everything older than 6 months.
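In sketch form, that periodic maintenance could look like the following. Everything here is a hypothetical placeholder (backend URL, database path, folder, cutoff date), and it assumes purge accepts a path wildcard as its filter argument the way the find examples earlier do; the command is only assembled and printed, so the dry-run list can be inspected before anything is actually run:

```shell
# Hypothetical values; substitute your own backend URL, database path, and folder.
BACKEND="file://M:/backup/"
DBPATH="M:/backup.sqlite"
CUTOFF="2024-06-01"   # an absolute cutoff date roughly 6 months back

# Step 1: assemble the dry-run purge, restricted to Folder A by a path filter.
CMD="Duplicati.CommandLine.exe purge $BACKEND C:/FolderA/* --dbpath=$DBPATH --time=$CUTOFF --dry-run=true"
echo "$CMD"

# Step 2: only if the listed files are the expected ones, run the same
# command again without --dry-run=true to purge for real.
```

Given the warnings elsewhere in this thread about purge being dangerous, testing this against a throwaway backup of test data first seems prudent.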

Thanks for clarifying. It sounds like files that didn’t make the collection get a grace period then get purged.
If you want to keep working with this, in theory, the --dry-run option may be helpful, but test on test data.
No promises or further forecasts on what syntax (if any) will get you the type of outcome that you’re after.

If you’re willing to change backup programs, this workflow might fit one that deduplicates across backups.
Files that made the collection stay in its folder B backup. Files that didn’t move age out of folder A backup.
Because files are already deleted from A by then, and were not moved to B, deduplication will purge them.

I think Duplicacy can do such deduplication, but the GUI version is payware. I think Kopia can too, but it's new.