Am I doing something wrong? It’s re-uploading my entire backup

Hi all,

Long story short, I have roughly 5TB of media that I want to back up to Azure Blob Storage. I set everything up with large chunk sizes of 5GB, since this is strictly for DR purposes and it's mostly large files. I brought my computer to a friend's house with Google Fiber and successfully uploaded the entire 5TB of data over the course of the weekend.

I THOUGHT everything was finally perfect, up until I added another 50GB to the backup… when the backup starts, it tells me it's uploading the entire 5TB again… eeeek!

At first I thought maybe it was because I have Smart Backup Retention selected, so I cancelled the backup, verified the files, then changed it to retain just a single copy. I started the backup again… 5TB again!

What am I doing wrong? This isn’t going to work for me if I can’t get incremental syncs to work. :frowning:

Thanks in advance…

It's not. Duplicati just doesn't do a very good job of checking through the pieces of the upload set that haven't changed, or of telling the user about it. So basically, depending on how exactly Duplicati chooses the order in which it parses through existing files to check for new or changed data, it might upload the new 50GB and then the remaining "5TB" will suddenly drop to zero; though if the new 50GB is spread across various paths, it may go in chunks.
If you want to check this for yourself more easily, add roughly 1GB to the original backup set (big enough that it takes a bit of time to upload, but small enough that you can sit and watch the whole thing) and observe the behavior of the progress bar as it runs.
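To make the idea concrete, here's a toy sketch of how block-based deduplication behaves (this is not Duplicati's actual code, just the general principle under simple assumptions: fixed-size chunks and a hash lookup standing in for the remote store). The scan walks all the data, but only chunks whose hash hasn't been seen before actually get uploaded, which is why the "total" can look like the whole backup while very little is really transferred:

```python
import hashlib

# Illustrative only: fixed-size chunking with a dict standing in for
# the remote block store. Duplicati's real implementation differs.
CHUNK_SIZE = 4  # tiny for demo purposes

def backup(data: bytes, remote: dict) -> int:
    """Split data into chunks; upload only unseen ones. Returns bytes uploaded."""
    uploaded = 0
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in remote:   # scanning is cheap; upload only new blocks
            remote[digest] = chunk
            uploaded += len(chunk)
    return uploaded

remote_store = {}
first = backup(b"AAAABBBBCCCC", remote_store)       # initial backup: all new
second = backup(b"AAAABBBBCCCCDDDD", remote_store)  # re-run after adding data
print(first, second)  # 12 4
```

On the second run the scanner still reads all 16 bytes, but only the 4 new bytes go over the wire; the progress display deciding to count the scanned-but-skipped portion is what makes it look like a full re-upload.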

Ok, thank you, that makes me feel a lot better. It’s still uploading the new data, but I should find out soon.

I appreciate the response