HELP?! Trying to use Duplicati, but something's always broken!

Now it’s my turn to apologize - stupid family holidays! :slight_smile:

A source of 640G shouldn’t be an issue - and the backup being larger than the source likely means you’ve got multiple versions of some files and/or you’re backing up non-compressible data.

Database recreates can be slow - and yes, multiple-day kind of slow. We know this is an issue and are working to improve it, but we’re not near any sort of release yet. :frowning:

Until that time comes, one thing to consider is breaking your backup into multiple jobs. This keeps each job’s database smaller (and thus faster), and if something does happen to one of them you’ll have less to rebuild.

As for why you’re having so many issues, is it possible you’re storing the Duplicati databases IN the Docker container? If so, this could cause problems like:

  • running out of space in the container (potentially corrupting the database due to failed writes)
  • database being deleted when container is updated (linuxserver containers seem to have updates just about every week!)
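If that’s what’s going on, mounting the config directory from the host should avoid both problems. Here’s a rough sketch - the host path `/opt/duplicati-config` and the source path are just examples for illustration (linuxserver-style containers generally keep their state, including the Duplicati databases, under `/config`):

```shell
# Sketch only - adjust names, paths, and ports for your setup.
# Mounting a host directory over /config keeps the Duplicati
# databases OUTSIDE the container, so they survive container
# updates and re-creates, and aren't limited by container space.
docker run -d \
  --name duplicati \
  -p 8200:8200 \
  -v /opt/duplicati-config:/config \
  -v /path/to/your/data:/source:ro \
  linuxserver/duplicati
```

With that in place, updating the container (pulling a new image and re-creating it) leaves the databases on the host untouched.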

Oh, and there was a bug found in the official Duplicati container. It really only affected certain destinations (they weren’t even available to be chosen as a destination), so I’m pretty sure that’s not an issue you’re having. But if you want the latest version, consider using 2.0.4.2 experimental (though I believe a beta with the fix should be coming out shortly, if you’d prefer to wait).