Select files with a creation/modification date less than X months

Hi,

I want to make a back-up:

  1. Daily, back up all files in the path with a creation or modification date < 3 months
  2. Weekly, purge everything older than 3 months

Can I achieve this with a RegEx, and if so, how?

TIA

Greetings/
_______/Rein

No, creation or modification dates are not selection criteria.
This was explained in this two-day-old thread, along with possible workarounds:

Thanks, I missed that post. I was examining the forum for some days now. Sorry.
I think I’ve to make a feature request.

Anyone can request features, which almost by definition means there are too many to actually get done.
Backup dates have come up before, but I don’t know if I’ve heard of a purge request before. Did you read:

Features

Incremental backups
Duplicati performs a full backup initially. Afterwards, Duplicati updates the initial backup by adding the changed data only. That means, if only tiny parts of a huge file have changed, only those tiny parts are added to the backup. This saves time and space and the backup size usually grows slowly.

The flip side of small changes per version is that, for changed files, purging old versions frees little space. There are lots of subtleties to this, so please describe your use case and objective well if you want this.
I’d offer to change the category here to Feature, except the current topic title is only half of the request.

This is for a photographer who promises her clients to keep all photos for a limited time. She does not have much backup space available.

I indicated that an FTP mirror using Robocopy would be a better choice, but she does not want it. She is afraid of the command line :wink:

Photos probably don’t deduplicate well when edited, so one question would be the workflow involved: what timestamps transfer in from the camera, whether edits need only the edited photos or require the older originals to still be around, or whether the promise allows edit cutoffs as well.

Although this person almost certainly takes better care of cameras than I do, I once had a camera that could stamp the date visibly on photos. I probably have some photos with a very ancient date from when the battery got changed and the camera forgot the current date. Regardless, purging by date seems potentially risky.

What does she do to clean up the source area? At some point that’s going to face the space problem. Cleaning up at source will eventually clean up at backup, as deleted file purge is set by retention rules. There is no extra space added by a new backup that sees an unchanged photo, as those deduplicate.

I “think” Windows File Explorer could do this, but it might take some study to find the exact procedure. Alternatively a script could be written (maybe by someone else), and a button provided to run the cleanup.
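For example, a rough and untested sketch using the built-in forfiles command (the folder path and the 90-day stand-in for “3 months” are just assumptions):

```
rem Preview: list everything under the photo folder last modified more than 90 days ago
forfiles /P "C:\Photos" /S /D -90 /C "cmd /c echo @path"

rem Once the list looks right, change echo to del to actually clean up the source
rem forfiles /P "C:\Photos" /S /D -90 /C "cmd /c del @path"
```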

A very similar script could be used with the Duplicati Scripting options to drive the purge to do what is wished; a rough sketch follows after the links below.

Using Duplicati from the Command Line
Example Scripts
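Here’s a rough, untested sketch of what such a batch file might look like. The source folder, target URL, Duplicati install path and the 90-day cutoff are all assumptions, and the job’s usual options (passphrase, dbpath, etc.) would need to be appended to both commands:

```
@echo off
rem Hypothetical values -- substitute the real job's source, target and options.
set SOURCE=C:\Photos
set TARGET=ftp://backup.example.com/photos
set DUPLICATI="C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe"

rem Daily: back up everything (unchanged photos deduplicate, so they add no space).
%DUPLICATI% backup %TARGET% "%SOURCE%"

rem Weekly: collect files last modified more than ~3 months (90 days) ago ...
forfiles /P "%SOURCE%" /S /D -90 /C "cmd /c echo @path" > "%TEMP%\old-photos.txt"

rem ... and hand each one to purge. Keep --dry-run until the output looks right.
for /f "usebackq delims=" %%F in ("%TEMP%\old-photos.txt") do (
  %DUPLICATI% purge %TARGET% "%%~F" --dry-run
)
```

Purging one file per call like this is slow; it is only meant to show the shape of the idea, and keeping --dry-run until the output has been checked is strongly advised.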

What I’d prefer not to hear is that she deletes the source immediately after backup, relying on Duplicati as a time-limited file archiver. It’s not intended for archiving, but as an additional backup copy, just in case.

So I guess the question is how much trouble happens if the promise is broken. That determines the needs.
I’m thinking this must not be doing archive-and-delete, otherwise your robocopy plan would lose…

Any mirror approach prevents getting back different versions of a file (including, sometimes, the one from before the malware encrypted it), so versions are good whether or not the workflow meant to create different files.

One can do a little better with some other sync programs, for example look at --backup-dir in rclone. You also get lots of destination choices, including some remote ones without limits besides the budget.
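For instance (the remote name and paths are made up), something along these lines keeps files that a sync would otherwise delete or overwrite:

```
rem Anything the sync would delete or replace on the destination is moved aside, not lost
rclone sync "C:\Photos" remote:photos --backup-dir remote:photos-archive
```

People commonly point --backup-dir at a dated subfolder (e.g. remote:photos-archive/2024-06) so each run keeps its own set.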

As mentioned in the other post, both robocopy and rclone have an age limit, but the details are scarce.
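From their documentation the age filters look roughly like this, with 90 days standing in for “3 months”, though I haven’t tested how they interact with each tool’s other options:

```
rem Robocopy: copy only files modified within the last 90 days
robocopy "C:\Photos" "D:\Backup\Photos" /S /MAXAGE:90

rem rclone: the same idea -- skip anything older than 90 days
rclone copy "C:\Photos" remote:photos --max-age 90d
```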

Why doesn’t she get more?

You could put a button on the desktop, or set up a scheduled task. One question is what’s the easiest way to get the data back. Depending on the workflow, e.g. the time of the source delete, Duplicati can have difficulty finding exactly which version has the thing you’re looking for. It’s even harder if you want to do a compare.
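A scheduled task that runs such a script weekly is one line (the script path here is hypothetical):

```
schtasks /Create /TN "Weekly photo purge" /TR "C:\Scripts\purge-old-photos.bat" /SC WEEKLY /D SUN /ST 20:00
```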

Thank you for the comprehensive answer. I will consult her and get back to you.

She has 2TB in the cloud for backup.