I started using Duplicati recently and I’m really liking it. I use Google Workspace Enterprise, which gives me unlimited storage but with a 750GB daily upload limit per user. Yesterday, when a backup started and I hit that limit, it just stopped and left files on the remote, causing errors about unknown blocks or something, and I had to reset the database and delete those files. Is there a way I can limit how much one of my backups uploads and continue it over, let’s say, a course of 3 days?
Is there a feature to set a limit across the instance?
Maybe the error isn’t related to the upload cap.
I got this popup after another failure: One or more errors occurred. (Could not find file “/upload/dup-f04b555c-e68a-45f0-a06b-a64ab109951a” (Could not find file “/upload/dup-f04b555c-e68a-45f0-a06b-a64ab109951a”) (One or more errors occurred. (The remote server returned an error: (403) Forbidden.)))
I have the Volume Size set to 100MB; before it was 1GB. I’m using the asynchronous upload folder option, pointed at a 1TB M.2 drive that is mounted as a volume at /upload/. I’m running Duplicati in Docker.
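For reference, a setup like the one described above can be sketched as a Docker volume mapping. The image name, port, and host paths here are assumptions, not taken from the post; adjust them to your environment:

```shell
# Sketch only: image name, port, and host paths are assumptions.
# /mnt/m2 stands in for the 1TB M.2 drive used as the upload staging area.
docker run -d --name duplicati \
  -p 8200:8200 \
  -v /path/to/config:/config \
  -v /mnt/m2:/upload \
  lscr.io/linuxserver/duplicati

# Then, in the backup job's advanced options, point the staging area at the mount:
#   --asynchronous-upload-folder=/upload
```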
Welcome to the forum!
Sounds like it is the upload cap. That’s usually how the problem manifests: a 403 Forbidden error on upload.
There isn’t a way to tell Duplicati to upload only X per day, but you can do it manually by performing your initial backup in stages. Select some fraction of your data to back up and let that run complete. Then add more data to the selection and run another backup. Repeat until all of your data is selected in the backup job.
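If you prefer to script the staged approach, it can be sketched with Duplicati’s command-line client. The target URL, `authid`, and source paths below are placeholders for your own setup, not values from this thread:

```shell
# Day 1: initial backup of a first subset, sized to stay under the 750GB daily cap.
# The googledrive:// URL and authid are placeholders for your own destination.
duplicati-cli backup "googledrive://Backups/server?authid=..." /data/photos

# Day 2: add another source folder to the SAME job and run again;
# only the newly selected data is uploaded.
duplicati-cli backup "googledrive://Backups/server?authid=..." /data/photos /data/documents
```

Once every folder you want is in the source list and the backup has completed, subsequent runs are incremental and should normally stay well under the cap.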