Duplicati Fails while executing Backup from Docker

I am running Duplicati on an unRAID server as a Docker container and am attempting to back up approximately 1.05TB of data to a local share as well as to a Google Drive account.

I have now attempted to back up 3 times in the last 3 days (1x to Google Drive and 2x local). Each time it appears to fail: although I am getting these messages (see bottom), the backup appears to continue running for a while and then stops without warning.

I am not sure whether it is broken or I am not setting it up correctly.

Please advise.

Thank you!

These are all of the errors that I have gotten since I set this up 3 days ago:
https://pastebin.com/raw/DHCK9mwt

Hi @lankanmon, welcome to the forum!

I am running Duplicati in an unRAID Docker, but am only backing up a small amount of data and only to a local share (on the unRAID box) and haven’t had any issues since the initial setup was completed.

It sounds like you may have had at least one successful backup run to one of your destinations, but it's a bit unclear from the errors you provided which destination(s) were throwing which errors.

I’d recommend we start by focusing on a single destination. Once we get that working we can look at the other one.

For your Local Share backup, have you gotten any backup to work? More specifically, is there anything backed up that you care about or is it OK to wipe it and start over?

For that Local Share backup, what Volume size (dblock) are you using? It's possible your Docker mappings are pointing to paths that don't have enough storage to handle the temp file requirements or the job sqlite database size.
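
For reference, here is roughly how my own unRAID container is mapped (just a sketch - I'm assuming the linuxserver.io Duplicati image, and the host paths, PUID/PGID and port are placeholders from my setup, so adjust to yours). The main thing is that /config, where the job sqlite databases live, points somewhere with room to grow:

    # rough unRAID example only - image name, port and host paths are assumptions
    docker run -d --name=duplicati \
      -p 8200:8200 \
      -e PUID=99 -e PGID=100 \
      -v /mnt/user/appdata/duplicati:/config \
      -v /mnt/user/data:/source:ro \
      -v /mnt/user/backups:/backups \
      linuxserver/duplicati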


These are usually associated with the user cancelling a backup - did you do that at all on the 9th, 10th or 11th?

Mar 9, 2018 9:45 PM: Error in worker
System.Threading.ThreadAbortException

Mar 9, 2018 9:45 PM: Failed while executing "Backup" with id: 1
System.Threading.ThreadAbortException

Mar 10, 2018 12:25 PM: Error in worker
System.Threading.ThreadAbortException

Mar 10, 2018 12:25 PM: Failed while executing "Backup" with id: 1
System.Threading.ThreadAbortException

Mar 11, 2018 1:55 PM: Failed while executing "Backup" with id: 2
Duplicati.Library.Interface.CancelException: Cancelled

This usually happens when SOME Duplicati files from successful backups are deleted from the destination.

Mar 11, 2018 9:05 AM: Failed while executing "Backup" with id: 1
Duplicati.Library.Interface.UserInformationException: Found 229 files that are missing from the remote storage, please run repair

Mar 11, 2018 2:00 PM: Failed while executing "Backup" with id: 1
Duplicati.Library.Interface.UserInformationException: Found 229 files that are missing from the remote storage, please run repair

Mar 11, 2018 3:48 PM: Failed while executing "Backup" with id: 1
Duplicati.Library.Interface.UserInformationException: Found 229 files that are missing from the remote storage, please run repair

This usually happens when ALL Duplicati files from successful backups are deleted from the destination OR the destination has been changed (such as pointing to a new folder).

Mar 11, 2018 3:49 PM: Failed while executing "Repair" with id: 1
Duplicati.Library.Interface.UserInformationException: No files were found at the remote location, perhaps the target url is incorrect?

These usually happen when the user has started a database repair but it got interrupted somehow (user cancelled, power outage, out of disk space, Docker container updated/recycled, etc.).

Mar 11, 2018 3:49 PM: Failed while executing "Backup" with id: 1
Duplicati.Library.Interface.UserInformationException: The database was attempted repaired, but the repair did not complete. This database may be incomplete and the backup process cannot continue. You may delete the local database and attempt to repair it again.

Mar 11, 2018 3:49 PM: Failed while executing "Backup" with id: 1
Duplicati.Library.Interface.UserInformationException: The database was attempted repaired, but the repair did not complete. This database may be incomplete and the backup process cannot continue. You may delete the local database and attempt to repair it again.

Hello @JonMikelV, thank you for your detailed response.

I had 2 backup plans, one to a local share and the other to a google drive account. Both of which from the same source of about 1.05TB.

After I had it fail 3 times, I created another backup of some test files, about 1.89GB in size, to Google Drive. This backup ran for about 22 minutes and was successful (and I can see the files in the Google Drive folder). <- This is probably the one that you see as successful.

I did also delete the destination files a couple of times, each time after a failure (in the hope of starting fresh).
The first time I did that, I did not realize that there was an SQLite DB, so I ran the backup again after deleting the files and it threw those errors (since the files were missing). Since then, I have deleted the DB files as well and tried again (this was the third attempt, which also failed).

The volume size I am using for both local and remote is 100 GByte, as I thought this would be a good balance between too many files and one single large file. I was also advised to try 700 GByte for the Google Drive backup, as that is Google's daily upload limit (but I have not gotten to that point yet).

I also do not have any issues with space as far as I know. I have about 5TB free at this point, and even a 1:1 backup should not be bigger than 1.05TB. I have noticed a significant increase in RAM usage, though. My average is about 40% of 32GB; currently it is running at 68% with Duplicati backing up, and yesterday it reached 80-90% at some point. Is this normal for Duplicati, or could there be a memory leak?

At this point, I am trying to understand why it stops working sometime overnight and fails. In the morning when I check, it says "Last successful run: Never". I was hoping the log would include why it is stopping, but all the errors appear to be at times when I have visually confirmed it to be working. I am going to keep looking into this as I really want to use Duplicati.

Thank you for all of your help!

Assuming you really did set "Volume size" to 100GB (not 100MB), I suspect this is the source of your problem. Since "Volume size" is the size of each archive file being uploaded, that means Duplicati is trying to create 100GB zip files. It also means that your entire 1.05TB source will end up in about 11 destination files.

While functionally this should work, it probably requires a lot of RAM and temp space to process such large compressed files.

Again I’m curious if you really mean 700GB or 700MB. If it’s GB then I’d like to know where you got that advice so we can find out why it was suggested.

Note that by default, when Duplicati is done running a backup, it downloads a random archive file for verification. This means that with a 100GB or even 700GB "Volume size", even if your backup was only 5MB (maybe only a small file changed), Duplicati could end up downloading 100GB or 700GB just for testing purposes.

While it’s possible there’s a memory leak, my guess is that your settings are such that they need a lot of memory.

I'd suggest a test backup with a radically smaller "Volume size" to see if that works any better. Even a size as low as 500MB (which is more suitable for a local backup) would only result in about 2,500 files. My backup uses the default 50MB "Volume size" and after retention-policy cleanup is DOWN to about 4,800 files…
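
If it's easier to experiment outside the web UI, the "Volume size" field corresponds to the --dblock-size option on the command line. This is only a sketch - the destination URL, source path and passphrase below are placeholders, and I'm assuming the duplicati-cli wrapper is available inside your container:

    # hypothetical small test job with a 50MB volume size (placeholder paths/passphrase)
    duplicati-cli backup file:///backups/test-job /source/photos \
      --dblock-size=50mb \
      --passphrase=CHANGEME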

Yeah, I was using 100 GByte. I think I must have misunderstood the advice of someone from an IRC channel who pointed me towards Duplicati.

They told me the main reason they chose it was because of a daily upload limit that Google Drive has. This may have been about something else, though, and I may have misinterpreted it as a recommendation for volume size, since Duplicati allows GByte volume size options.

I have deleted everything and have started from scratch with the default 50Mbytes.

I will see how this works out and get back to you.

Thank you!

Good luck! :-)  

Hi @JonMikelV,
So I figured out what was causing the issue. It was CA Backup/Restore (the app that backs up Docker appdata). It turns out that every morning at 6 AM it was turning off all the Docker containers to do its routine backup. I currently have that backup disabled, as a failure occurred when I tried to exclude Duplicati from CA Backup/Restore (probably unrelated).

Since then, I have been running the local backup with the new volume setting, and it has been running fine for the last 4 days (doing the initial backup). However, this morning I noticed that it just said "Finishing backup", and it stayed that way throughout the day. When I checked just a few minutes ago, it said "Starting" and showed "Last successful run: Never". Does that mean the last 4 days were wasted? Can it not resume from wherever it stopped?

Edit: I also checked the target directory and there are 27,843 AES files of 49.9 to 50.0MiB each, so there are files that contain the backup; it just did not finish.

I also tried to run "Verify files" to fix the DB, but that gives the error "Failed to connect: Insertion failed because the database is full", even though there is plenty of space on the drive that the /temp/ path is on (same share as the destination location).

If you run with '--verbose=true', does it give extra info when the "database is full" error happens?
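
That "database is full" message comes from sqlite and typically means the disk holding the database (or its temp area) is out of space, so it's worth double checking where those paths actually point inside the container. If you'd rather test from the command line than the web UI, something like this might show more detail. Just a sketch - I'm assuming the GUI's "Verify files" maps to the CLI test command, and the URL, database path and temp path below are placeholders:

    # hypothetical verification run with extra logging (placeholder paths)
    duplicati-cli test file:///backups/test-job all \
      --dbpath=/config/XXXXXXXXXX.sqlite \
      --tempdir=/backups/tmp \
      --verbose=true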

@JonMikelV,

Okay, so here is an update (this has been a long couple of days):

I learned that Duplicati stores an SQLite file in the config folder which can grow to be multiple gigabytes. Since I am running this in Docker, it created this in my appdata folder, which is limited in size. Once the folder grew to about 20GB it ran out of space and started wreaking havoc elsewhere on the server (on other Docker containers).

I decided to start fresh. I completely removed Duplicati and reinstalled it, but this time I placed the config in a dedicated share so it can grow freely. This should solve both problems I was having, including the one with the CA Backup script causing Duplicati to shut down (since the config is no longer in the appdata folder, the container will no longer need to be shut down for safety), so I can resume my daily backups. I will just have to back up the Duplicati config separately (let me know if you have any ideas on how).
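
For reference, the only change to the container setup was the config mapping, which now points at a dedicated share instead of appdata (the paths here are just examples from my setup):

    # old mapping (limited appdata share)
    -v /mnt/user/appdata/duplicati:/config
    # new mapping (dedicated share with room to grow)
    -v /mnt/user/duplicati-config:/config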

I have just restarted the backup fresh; it should take a few days, so I will keep you updated.

That was my initial suspicion, I just didn’t word it as well as you did. :slight_smile:

I've gone ahead and flagged your post as the solution and updated the topic title to indicate it's Docker related - let me know if you disagree.
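
As for backing up the Duplicati config separately, a simple approach is to copy the config share somewhere else on a schedule, outside of Duplicati itself. Just a sketch (the container name and paths are placeholders, and I'm assuming you stop the container first so the sqlite files aren't copied mid-write):

    # hypothetical nightly copy of the Duplicati config share
    docker stop duplicati
    rsync -a --delete /mnt/user/duplicati-config/ /mnt/user/backups/duplicati-config/
    docker start duplicati

You can also export each job's configuration to a file from the job menu in the web UI, which is handy for keeping a copy of the settings themselves.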

Alright, Thanks!

I will update if there is anything else.

I also have some feedback for Docker/unRAID users, if there is a place for me to share it.

If it’s instructions on how to do something you could make a How-To post, otherwise you could just do an Uncategorized post and give it the docker tag (at the top where the topic is edited).

Okay, I will take some time to collect my thoughts and see where it should go. I am going to wait until the backup finishes first, since I don't even know if I have it set up correctly. 800GB to go (local).

@JonMikelV So, it has been running fine for 8 days now, and there was only about 30GB left last night.

Somehow, this morning, I saw that it had failed with the error:
https://pastebin.com/aRxWEULM

Any ideas? This time it is no longer stored in the appdata dir, so that can't be the issue.

System.NotSupportedException: Attempted to write a stream that is larger than 4GiB without setting the zip64 option
  at SharpCompress.Writers.Zip.ZipWriter+ZipWritingStream.Write (System.Byte[] buffer, System.Int32 offset, System.Int32 count) [0x00034] in <20afbe34b18d4bdda049db0f59cd5db0>:0 
  at System.IO.StreamWriter.Flush (System.Boolean flushStream, System.Boolean flushEncoder) [0x0007e] in <f56c876907e742b0aa586f051fcce845>:0 
  at System.IO.StreamWriter.Write (System.Char value) [0x00014] in <f56c876907e742b0aa586f051fcce845>:0 
  at Newtonsoft.Json.Utilities.JavaScriptUtils.WriteEscapedJavaScriptString (System.IO.TextWriter writer, System.String s, System.Char delimiter, System.Boolean appendDelimiters, System.Boolean[] charEscapeFlags, Newtonsoft.Json.StringEscapeHandling stringEscapeHandling, Newtonsoft.Json.IArrayPool`1[T] bufferPool, System.Char[]& writeBuffer) [0x00003] in <0db320ea9c284716ba6369641babde67>:0 
  at Newtonsoft.Json.JsonTextWriter.WriteEscapedString (System.String value, System.Boolean quote) [0x00020] in <0db320ea9c284716ba6369641babde67>:0 
  at Newtonsoft.Json.JsonTextWriter.WriteValue (System.String value) [0x00019] in <0db320ea9c284716ba6369641babde67>:0 
  at Duplicati.Library.Main.Volumes.FilesetVolumeWriter.AddFileEntry (Duplicati.Library.Main.FilelistEntryType type, System.String name, System.String filehash, System.Int64 size, System.DateTime lastmodified, System.String metahash, System.Int64 metasize, System.String metablockhash, System.String blockhash, System.Int64 blocksize, System.Collections.Generic.IEnumerable`1[T] blocklisthashes, System.Collections.Generic.IEnumerable`1[T] metablocklisthashes) [0x0006e] in <118ad25945a24a3991f7b65e7a45ea1e>:0 
  at Duplicati.Library.Main.Volumes.FilesetVolumeWriter.AddFile (System.String name, System.String filehash, System.Int64 size, System.DateTime lastmodified, System.String metahash, System.Int64 metasize, System.String metablockhash, System.String blockhash, System.Int64 blocksize, System.Collections.Generic.IEnumerable`1[T] blocklisthashes, System.Collections.Generic.IEnumerable`1[T] metablocklisthashes) [0x00000] in <118ad25945a24a3991f7b65e7a45ea1e>:0 
  at Duplicati.Library.Main.Database.LocalDatabase.WriteFileset (Duplicati.Library.Main.Volumes.FilesetVolumeWriter filesetvolume, System.Data.IDbTransaction transaction, System.Int64 filesetId) [0x00251] in <118ad25945a24a3991f7b65e7a45ea1e>:0 
  at Duplicati.Library.Main.Database.LocalBackupDatabase.WriteFileset (Duplicati.Library.Main.Volumes.FilesetVolumeWriter filesetvolume, System.Data.IDbTransaction transaction) [0x00000] in <118ad25945a24a3991f7b65e7a45ea1e>:0 
  at Duplicati.Library.Main.Operation.BackupHandler.UploadRealFileList (Duplicati.Library.Main.BackendManager backend, Duplicati.Library.Main.Volumes.FilesetVolumeWriter filesetvolume) [0x000e7] in <118ad25945a24a3991f7b65e7a45ea1e>:0 
  at Duplicati.Library.Main.Operation.BackupHandler.Run (System.String[] sources, Duplicati.Library.Utility.IFilter filter) [0x00860] in <118ad25945a24a3991f7b65e7a45ea1e>:0 
  at Duplicati.Library.Main.Controller+<>c__DisplayClass16_0.<Backup>b__0 (Duplicati.Library.Main.BackupResults result) [0x0030f] in <118ad25945a24a3991f7b65e7a45ea1e>:0 
  at Duplicati.Library.Main.Controller.RunAction[T] (T result, System.String[]& paths, Duplicati.Library.Utility.IFilter& filter, System.Action`1[T] method) [0x0014b] in <118ad25945a24a3991f7b65e7a45ea1e>:0 
  at Duplicati.Library.Main.Controller.Backup (System.String[] inputsources, Duplicati.Library.Utility.IFilter filter) [0x00068] in <118ad25945a24a3991f7b65e7a45ea1e>:0 
  at Duplicati.Server.Runner.Run (Duplicati.Server.Runner+IRunnerData data, System.Boolean fromQueue) [0x00454] in <a6c0c2089b9a44ec9be5057a44f12116>:0

Are you using a large dblock (Volume) size?

Oh, and I hope you don’t mind but I copied the pastebin text into your post to help with future forum searches.

Upload volume size is set to 50MByte. Any ideas?

And yeah, no problem with the paste. I will post logs directly on the forum from now on. Some forums do not permit pasting code or logs in the thread, which is why I used Pastebin.

With a 50MB Volume size I'm not sure why a >4GB file is being written… but in theory you can resolve that error by adding --zip-compression-zip64=true:

--zip-compression-zip64 (Boolean): Toggles Zip64 support
The zip64 format is required for files larger than 4GiB, use this flag to
toggle it
* default value: False
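
In the web UI that option can be added on the job's Options page under "Advanced options"; from the command line it's just another flag. A sketch only - the destination URL, source path and passphrase are placeholders:

    # hypothetical backup run with zip64 enabled (placeholder paths/passphrase)
    duplicati-cli backup file:///backups/test-job /source/videos \
      --dblock-size=50mb \
      --zip-compression-zip64=true \
      --passphrase=CHANGEME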

Alright, I will try that out. It's going to suck having to wait 8 more days to see if it works. I am also getting concerned about running the disks at full load for a week. Hopefully it will work this time…

It’s probably too late now, but you could try creating a smaller test job to see if you get the same error and, if so, then try adding the parameter to see if it fixes it. Either way, :crossed_fingers:

@JonMikelV -

Surprisingly, I came to the same conclusion after making that last post. I broke up my backup of a share (which contains family pics & videos, software, etc.) into smaller backup jobs. I just completed the pics one without any issues and am now doing the videos. I did use --zip-compression-zip64=true just to be safe and will use it in all of my backups from now on.

One question: when I tried a restore, it seemed to work correctly, but I got a few errors about permissions (I can't recall them exactly), though the files appear to be restored correctly (file sizes are correct). Could that be because my original drive was using Linux-based permissions and the restore drive was formatted NTFS?

Once this backup is done, I will try to recreate the error.

Also, should I create a new thread for this issue?

Thanks!