I’m using Duplicati to back up image backups of my OpenMediaVault installation over the internet to another server. That is a huge amount of data, because I keep 4 weeks of backups on my local drive.
That’s why I want to tell Duplicati to back up only the most recent of the local backups to the remote server.
For that I would need a filter like “exclude files older than 7 days, for example” — but I can only find an option to exclude files larger than a given size in MB.
How can I handle this problem? Or could this be submitted as a feature request?
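Until such a time-based filter exists, one possible workaround is a small pre-backup step that builds a list of recently modified files, which could then be fed to the backup job as an include list. This is only a sketch under assumptions (the `recent_files` helper and the 7-day cutoff are illustrative, not a Duplicati feature):

```python
import os
import time

def recent_files(root, max_age_days=7):
    """Yield paths under `root` whose modification time is within
    the last `max_age_days` days (hypothetical pre-backup filter)."""
    cutoff = time.time() - max_age_days * 86400
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            # Keep only files modified after the cutoff timestamp.
            if os.path.getmtime(path) >= cutoff:
                yield path
```

The resulting list would still need to be handed to the backup tool in whatever include-list format it accepts, which this sketch does not cover.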
I agree, I have also been looking for such a filter. In my case, I would back up the large amount of data to a local hard drive every now and then, and back up the new files daily to cloud storage. So a filtering function that can be used to back up only the newer files would be great!
A feature request seems reasonable, and somewhat oddly I don’t see one (in spite of support requests). Between requests and issues there’s an enormous backlog, but sometimes extra volunteers will pop up.
A technical question, which probably doesn’t affect image backups, is the effect on deduplication when a file ages out of the filter and is later modified, e.g. by appending some text. Normally only the changed blocks get uploaded, per the block-based deduplication the backup does, but file churn (aging away, then coming back) may interfere…
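To illustrate the block-based deduplication point above: if a file is split into fixed-size blocks and each block is hashed, appending text leaves the existing blocks’ hashes unchanged, so only the new tail needs uploading. A minimal sketch (the 4-byte block size is purely illustrative; real block sizes are configurable and much larger):

```python
import hashlib

def block_hashes(data, block_size=4):
    """Hash `data` in fixed-size blocks, as block-based dedup does."""
    return [hashlib.sha256(data[i:i + block_size]).hexdigest()
            for i in range(0, len(data), block_size)]

before = block_hashes(b"abcdefgh")          # 2 blocks: "abcd", "efgh"
after = block_hashes(b"abcdefghij")          # same file with "ij" appended
# All original blocks hash identically, so only the new tail block
# would need uploading.
assert after[:len(before)] == before
```

Note that if the file’s size is not a multiple of the block size, appending rewrites the last (partial) block too; and if the file has aged out of the backup entirely, the blocks may no longer be retained to deduplicate against, which is the churn concern raised above.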