Backup to Google Drive - Retention Policy

I recently installed Duplicati to back up disk images created by another software package to Google Drive. The software I use to create the disk images takes care of any file versioning: it makes a full backup every 90 days and incremental backups every day in between. Given the large amount of data involved, what I need Duplicati to do is upload only the newly created files, not do a full upload of everything in the folder every day.

For example, on day 1 the backup software creates a full disk image, and I would like Duplicati to back up the entire image. On day 2, the backup software creates an incremental image, so the local folder contains two files: the full image from day 1 and the incremental image from day 2. I would like Duplicati to back up only the incremental image from day 2, since the full disk image from day 1 is already on the cloud storage.

How do I set it up to do so?

Appreciate the help.

Welcome to the forum @abracadabra11

If Duplicati put the full disk image in the cloud, it knows that and won’t upload it again unless the file changes.

Assuming your images are just created and left alone, there’s nothing special to set up. Just back up the folder.

That expected behavior is not occurring. Duplicati appears to be trying to upload the entire directory every time, so instead of uploading just the 3 GB incremental file, it attempts to upload all the files.

Do I need to pull up the log file to work through troubleshooting?

You could set up a --log-file at --log-file-log-level=verbose, or use the live log (About → Show logs → Live → Verbose) to see what’s going on for given files via the “Checking file for changes” output. An example is here with commentary on how it “should” work. Also, just use File Explorer or whatever and check the modification times.
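As a quick sketch of that last check, a few lines of Python can dump each file’s modification time for comparison against the “Checking file for changes” log lines (the folder path below is just an example, not from your setup):

```python
# List name and modification time for each file in a folder, to compare
# against what Duplicati's verbose log reports for those files.
import os
import datetime

def list_mtimes(folder):
    """Print the modification timestamp of every file in `folder`."""
    for name in sorted(os.listdir(folder)):
        path = os.path.join(folder, name)
        if os.path.isfile(path):
            mtime = datetime.datetime.fromtimestamp(os.path.getmtime(path))
            print(f"{mtime:%Y-%m-%d %H:%M:%S}  {name}")

folder = r"D:\DiskImages"  # example path; point at your image folder
if os.path.isdir(folder):
    list_mtimes(folder)
```

If the full image’s timestamp never changes between runs but Duplicati still rescans or re-uploads it, that narrows down where the problem is.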

You could certainly test some smaller files in a folder to see whether you get the same “upload all files every time” behavior, which would be a serious violation of the way things should work. If a file’s modification time changes, a scan of the file can happen to look for changes, but unchanged data from the file should not get uploaded.
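A minimal sketch of setting up such a test (folder name, file count, and sizes are all arbitrary examples): create a handful of small files, run the backup twice without touching them, and compare the upload statistics between the two job logs.

```python
# Create a small test folder so a backup job can be run against it twice.
# On the second run, with no files modified, almost nothing new should
# be uploaded.
import os

def make_test_folder(folder, count=5, size=1024):
    """Create `count` files of `size` random bytes each; return their names."""
    os.makedirs(folder, exist_ok=True)
    for i in range(count):
        with open(os.path.join(folder, f"test_{i}.bin"), "wb") as f:
            f.write(os.urandom(size))
    return sorted(os.listdir(folder))

names = make_test_folder("duplicati_test")  # example folder name
print(names)
```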

Statistics on all of this are in your backup job log. Your values will differ, but here’s the general idea of that:

 "ModifiedFiles": 9,
  "ExaminedFiles": 448,
  "OpenedFiles": 13,
  "AddedFiles": 4,
  "SizeOfModifiedFiles": 66201655,
  "SizeOfAddedFiles": 1164494,
  "SizeOfExaminedFiles": 474400640,
  "SizeOfOpenedFiles": 67366149,
...
      "BytesUploaded": 33761799,

Basically, it normally examines all files present, opens only the ones that look changed, and uploads just the changed data it finds.