Plausible plan?

Deciding on a cloud provider at this time, but as we know, once you hit 4-5 TB it gets spendy. Is it plausible to have Duplicati do a full backup of that 4.5 TB every two weeks or so to a local NAS or SFTP server, throw it in a fireproof safe, and have Duplicati do incrementals to the cloud in between?
My thought is that my data seems to expand by about 10-15 GB per month, which opens up the options for cloud storage.

Filter for new files is a feature request that would ease this, but requests are backlogged. It links to some possible scripting workarounds (an example script is here). As for the fireproof safe, consider the temperatures it allows inside during a fire. Some people use an off-site location instead, which presumably won't catch fire at the same time as the main one.
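The scripted workaround amounts to selecting files modified since some cutoff and feeding that list to the cloud job. A minimal sketch (this is not the forum's script; the `/srv/media` path and the 7-day window are hypothetical illustrations):

```python
import os
import time

def files_newer_than(root, cutoff_epoch):
    """Return paths under root whose modification time is after cutoff_epoch."""
    new_files = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if os.path.getmtime(path) > cutoff_epoch:
                new_files.append(path)
    return new_files

# Example: everything changed in the last 7 days (path is hypothetical).
if os.path.isdir("/srv/media"):
    week_ago = time.time() - 7 * 24 * 3600
    for path in files_newer_than("/srv/media", week_ago):
        print(path)
```

The output could then be turned into include filters or a source list for the cloud-bound backup job.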

The full/incremental terminology doesn’t quite fit Duplicati, due to deduplication: a new file version may share blocks with older ones.
If you split the backup, be sure to keep each backup job tied to its own backup destination. Don’t mix files.

Basically, you could have an ongoing infrequent backup which every now and then picks up newer changes.
The online job backs up more recent changes to files, and drops old data as it ages out due to the file filtering…

The math of expansion doesn’t quite work unless it’s all new files. Existing files that grow by appends need the entire file backed up: a changed file is represented as a full set of file blocks, even when only a few were added at a time.
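To make that concrete, here is a rough worked example with hypothetical numbers matching the 10-15 GB/month figure above:

```python
# Rough arithmetic for a date-filtered cloud job (all numbers hypothetical).
monthly_growth_gb = 10      # new data added per month
existing_file_gb = 100      # size of one big file that grows by appends

# Case 1: growth is all brand-new files; a new-file filter catches exactly them.
all_new_upload_gb = monthly_growth_gb

# Case 2: growth is an append to one big existing file; a date filter
# re-includes the whole file, not just the appended tail. (Deduplication may
# still shrink the actual upload if the file's old blocks remain in that
# backup, but the whole file must be read and represented again.)
appended_upload_gb = existing_file_gb + monthly_growth_gb

print(all_new_upload_gb)    # 10
print(appended_upload_gb)   # 110
```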

Also note that space reclaimed from deletions is not returned instantly. Compact has to run, and it has some adjustable settings.

Plan is plausible, but awkward.

Thank you TS.
My experience with backup software is that it would set the archive bit, and that would be cleared on any change. My data is a media server, and little gets changed; things just get added now and then.
FreeNAS can make a snapshot of the OS stuff, and I can send that to the cloud. I suppose that I could track the additions to the server.

I have not looked: can Duplicati back up a given date range?
Thank you for the input; I see that it would be awkward. My intent was to back up all media every 2-4 weeks and the new stuff to the cloud once or twice a week. There would be little loss if I were to lose something in between those weekly backups to the cloud.

No, that’s why there’s a feature request and a script for now. Duplicati doesn’t use archive bits itself, preferring file timestamps to detect files that might have changed and need to be looked over.

It would be less awkward if backup by a date range existed, but even then it’s a bit awkward, since files can constantly flip in and out of the new-file backup based on their dates. Sometimes a file’s old data is still in the backup for reuse; other times, compact may have recycled it.
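The flipping is just a consequence of comparing modification times against a moving cutoff. A tiny sketch (the dates and the 30-day window are hypothetical):

```python
from datetime import datetime, timedelta

def in_new_file_backup(mtime, now, window_days=30):
    """A date-filtered job only sees files modified within the window."""
    return now - mtime <= timedelta(days=window_days)

now = datetime(2019, 6, 1)
f_mtime = datetime(2019, 5, 10)          # modified 22 days ago

print(in_new_file_backup(f_mtime, now))                         # True
print(in_new_file_backup(f_mtime, now + timedelta(days=20)))    # False: aged out
# If the file is later touched, it hops back in; whether its old blocks are
# still reusable depends on whether compact has recycled them by then.
```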

Using S3 storage with Duplicati is cheap. I am using Backblaze, and it is around $1 per 100 GB per month.
