I’ve done some searching and so far I haven’t turned up what I’m looking for - if it turns out I just haven’t looked hard enough, please feel free to point me in the right direction and I’ll do more reading.
What I’m trying to accomplish with the filter setup is a blend of two jobs: an initial backup of a series of folders to one drive, and then a second job that catches just the changes (basically an incremental) from those folders to a second drive. I have been trying to do this by date (run the initial job once, then have the second job include only files modified after a specified date), but I haven’t figured out how to write an attribute filter that does this. Is there a cheat sheet, or additional documentation I can use to educate myself on filter writing?
Duplicati doesn’t work that way. In theory you could run the second backup to the original location and then have a post-backup script move all the new (timestamp-wise) destination A files to destination B, but that would essentially break your backup.
Remember, Duplicati doesn’t back up files, it backs up file BLOCKS. So the initial backup will have all the contents of all your files (chopped up into little blocks) saved in the destination. This is how it handles de-duplication to make multiple backups smaller than multiple copies of the source files.
When the second backup runs it only uploads the file BLOCKS that have changed (not the entire file). When restoring a changed file, Duplicati restores the most recent blocks for that file, so it might pull a few blocks from today’s backup files and all the rest from yesterday’s backup files. In other words - the files from a single job run (other than the first one) likely won’t give you much useful file content on their own.
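To make the block idea concrete, here’s a minimal Python sketch of block-based deduplication. The block size, hashing scheme, and in-memory store are illustrative simplifications, not Duplicati’s actual implementation:

```python
import hashlib

BLOCK_SIZE = 4  # tiny for the demo; Duplicati's real default is 100 KiB

def backup(data: bytes, stored: dict) -> tuple:
    """Split data into fixed-size blocks and 'upload' only the blocks
    whose hash isn't already at the destination. Returns the list of
    block hashes for this version and the count of new uploads."""
    version, uploaded = [], 0
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        h = hashlib.sha256(block).hexdigest()
        if h not in stored:
            stored[h] = block  # "upload" only the new block
            uploaded += 1
        version.append(h)
    return version, uploaded

store = {}
v1, up1 = backup(b"aaaabbbbcccc", store)   # initial backup: 3 new blocks
v2, up2 = backup(b"aaaabbbbdddd", store)   # changed file: only 1 new block
restored = b"".join(store[h] for h in v2)  # restore mixes old and new blocks
print(up1, up2, restored)                  # 3 1 b'aaaabbbbdddd'
```

Note how restoring version 2 needs blocks uploaded by both runs - which is why a later job’s destination files aren’t a self-contained incremental.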
What is it that you are trying to accomplish with this full-then-incrementals-only backup process?
Rotating media. Contents of the folders will update weekly and we wanted to back those up without having to recall media from off site, but we don’t need to backup everything all over again, just after a certain date. Then every quarter a new full will be done.
Rotating offsite media is a good policy, well chosen. So it sounds like you are:
- keeping one media offsite (and unchanged) for up to a quarter, meaning its contents get up to 3 months stale
- keeping one media onsite (and weekly active) so it eventually contains an initial backup followed by 8 incremental backups
- swapping the media each quarter, at which time the freshly-local media gets “wiped” and a fresh full backup is created
Unfortunately, that’s not really how Duplicati works. You could get something similar by creating two backup jobs (one for each media) that have a weekly schedule and “Keep this number of backups” set to “until they are older than 9 weeks”.
Then just leave one drive plugged in for 3 months (job #1 will run as scheduled) after which swap to the second drive (job #2 will run as scheduled).
What will happen is that as long as a drive is plugged in, Duplicati will keep no more than 8 weeks of “versions” for a file. Since the files only change weekly, even if the backup runs every day Duplicati will see no changes on 6 of the 7 days and won’t back anything up.
When the remote drive is brought back and plugged in Duplicati will see that basically ALL file versions are over 8 weeks old, delete them all, and do a fresh backup of everything.
If for any reason a drive stays plugged in for MORE than 8 weeks, it will still work just fine, again keeping only the versions of the files from the most recent 8 weeks.
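Here’s a hedged Python sketch of that retention behavior - a simplified model of the “keep backups until they are older than 9 weeks” setting, not Duplicati’s actual pruning code:

```python
from datetime import date, timedelta

RETENTION = timedelta(weeks=9)  # "until they are older than 9 weeks"

def prune(versions, today):
    """Keep only backup versions still inside the retention window."""
    return [v for v in versions if today - v <= RETENTION]

start = date(2024, 1, 1)
# weekly backup runs during a quarter of onsite use
onsite = [start + timedelta(weeks=w) for w in range(13)]

# While the drive is local, old versions roll off each week:
print(len(prune(onsite, start + timedelta(weeks=12))))   # 10 versions remain

# After a quarter offsite, every version has aged out, so the
# next run effectively starts over with a fresh full backup:
print(len(prune(onsite, start + timedelta(weeks=26))))   # 0 versions remain
```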
Of course, if your remote drive were accessible over the internet in any way you could just have Duplicati back up directly to it, and then you wouldn’t need to physically move drives back and forth…
Oh - and I should have asked: if you’re using Windows, there’s an advanced option (I think it’s --alternate-destination-marker) you’ll likely want to use; it helps Duplicati find the right drive even if Windows changes the drive letter.