Duplicati hangs after backup attempt (OSX) Was: Crashplan Refugee can't get Duplicati to restore

Back from the weekend. The server state properties have only changed slightly, but it’s still hung and hasn’t done anything since Friday.

  • lastEventId : 694
  • lastDataUpdateId : 2
  • lastNotificationUpdateId : 0
  • estimatedPauseEnd : 0001-01-01T00:00:00
  • activeTask : {"Item1":3,"Item2":null}
  • programState : Running
  • lastErrorMessage :
  • connectionState : connected
  • xsfrerror : false
  • connectionAttemptTimer : 0
  • failedConnectionAttempts : 0
  • lastPgEvent : {"BackupID":null,"TaskID":3,"BackendAction":"Get","BackendPath":"duplicati-b5b228fae5a2c4825ad02bb478f23b633.dblock.zip.aes","BackendFileSize":21721229,"BackendFileProgress":21721229,"BackendSpeed":106,"BackendIsBlocking":false,"CurrentFilename":null,"CurrentFilesize":0,"CurrentFileoffset":0,"Phase":"Backup_Complete","OverallProgress":1,"ProcessedFileCount":228,"ProcessedFileSize":26802068,"TotalFileCount":228,"TotalFileSize":26802068,"StillCounting":false}
  • updaterState : Waiting
  • updatedVersion :
  • updateReady : false
  • updateDownloadProgress : 0
  • proposedSchedule : [{"Item1":"1","Item2":"2018-09-17T07:00:00Z"}]
  • schedulerQueueIds : [{"Item1":4,"Item2":"1"},{"Item1":6,"Item2":null},{"Item1":8,"Item2":null}]
  • pauseTimeRemain : 0

Well, this morning I decided to simplify even further. I deleted the Google Drive backup and created a new backup within Duplicati. It took a few attempts because every time it did something, the backend hung and I’d have to restart it. Once I turned off ‘Automatic’, I could start Duplicati without it immediately hanging the backend. Then it complained about missing data and told me to run ‘repair’. I tried that, and also tried ‘repair/recreate’. After that completed successfully, I had to restart and then manually ran my backup. Again, the backend complained that a file was missing and then hung again.

I’m pretty sure Duplicati is just not for me. It probably works fine under Windows and maybe Linux, but this is a house full of Macs. It’s very unfortunate for me because Duplicati ticks almost all of my requirements, and what’s missing is probably scriptable.

I’m happy to continue to debug the problem if someone is interested in giving me instructions but I need to move on to finding a replacement for Crashplan.

2018-09-17 11:43:56Z - Information: Backend event: List - Started: ()
2018-09-17 11:43:56Z - Information: Backend event: List - Completed: (2 bytes)
2018-09-17 11:43:56Z - Warning: Missing file: duplicati-b5b228fae5a2c4825ad02bb478f23b633.dblock.zip.aes
2018-09-17 11:43:56Z - Error: Found 1 files that are missing from the remote storage, please run repair
2018-09-17 11:44:07Z - Information: Backend event: List - Started: ()
2018-09-17 11:44:07Z - Information: Backend event: List - Completed: (2 bytes)
2018-09-17 11:44:07Z - Information: Rebuild database started, downloading 1 filelists
2018-09-17 11:44:07Z - Information: Backend event: Get - Started: duplicati-20180914T125645Z.dlist.zip.aes (13.11 KB)
2018-09-17 11:44:12Z - Information: Backend event: Get - Completed: duplicati-20180914T125645Z.dlist.zip.aes (13.11 KB)
2018-09-17 11:44:13Z - Information: Filelists restored, downloading 1 index files
2018-09-17 11:44:13Z - Information: Backend event: Get - Started: duplicati-ie8fcc4e18498428a9f33aee982d7f330.dindex.zip.aes (29.56 KB)
2018-09-17 11:44:13Z - Information: Backend event: Get - Completed: duplicati-ie8fcc4e18498428a9f33aee982d7f330.dindex.zip.aes (29.56 KB)
2018-09-17 11:44:13Z - Error: Remote file referenced as duplicati-b5b228fae5a2c4825ad02bb478f23b633.dblock.zip.aes, but not found in list, registering a missing remote file
2018-09-17 11:44:13Z - Information: Recreate completed, verifying the database consistency
2018-09-17 11:44:13Z - Information: Recreate completed, and consistency checks completed, marking database as complete
2018-09-17 11:44:32Z - Information: Backend event: List - Started: ()
2018-09-17 11:44:32Z - Information: Backend event: List - Completed: (2 bytes)
2018-09-17 11:44:32Z - Warning: Missing file: duplicati-b5b228fae5a2c4825ad02bb478f23b633.dblock.zip.aes
2018-09-17 11:44:32Z - Error: Found 1 files that are missing from the remote storage, please run repair
2018-09-17 11:44:32Z - Error: Fatal error
Duplicati.Library.Interface.UserInformationException: Found 1 files that are missing from the remote storage, please run repair
at Duplicati.Library.Main.Operation.FilelistProcessor.VerifyRemoteList (Duplicati.Library.Main.BackendManager backend, Duplicati.Library.Main.Options options, Duplicati.Library.Main.Database.LocalDatabase database, Duplicati.Library.Main.IBackendWriter log, System.String protectedfile) [0x001aa] in :0
at Duplicati.Library.Main.Operation.BackupHandler.PreBackupVerify (Duplicati.Library.Main.BackendManager backend, System.String protectedfile) [0x000fd] in :0
at Duplicati.Library.Main.Operation.BackupHandler.Run (System.String sources, Duplicati.Library.Utility.IFilter filter) [0x003c6] in :0

Macs do get used. https://usage-reporter.duplicati.com/ shows about 2000 backups a day from Macs (if I read that right). Windows is the biggest use, with Linux somewhere in between. Here I spot a few Mac users who help out with many things (and I might try to get them involved in this debug session, because I’m getting a bit past my skills).

I hinted earlier at how one might translate numbers to backup job names via URL substitution, and I think what you would plug in (based on this) is the Item2 number for activeTask and schedulerQueueIds, and Item1 for proposedSchedule. That might lead you to a backup (i.e. you’re stuck in task 3 now, and have backup 1 queued to run when/if 3 finishes, due to be scheduled again at 2018-09-17T07:00:00Z), but the work to run is piling up.

Let me be more specific about how I think you can map backup numbers into human-recognizable job names.

If I run “Show log” on a job I have scheduled, the URL is localhost:8200/ngax/index.html#/log/2 (notice the “2” at the end).
The reason I think “2” is a backup id is that “200” gives “Failed to connect: Invalid or missing backup id”. Plugging a valid backup id into the end of whatever URL you get (details may vary) should get you the job name.
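
If URL editing feels fiddly, the same information is available from the small REST API behind the web UI. A hedged sketch (this assumes the default port 8200 and no UI password set; the /api/v1 paths are from poking at my own install, so details may vary by version):

# Dump the raw server state (the same JSON as the property list above):
curl -s "http://localhost:8200/api/v1/serverstate"
# Fetch the definition of backup id 2, which should include its name:
curl -s "http://localhost:8200/api/v1/backup/2"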

Though you could still try to map backup numbers to names (to see if what I just said makes any sense), you have queued tasks that don’t seem to have an associated backup number, and I’m not sure how to see them.

Before leaving this train of thought for a bit: do you have anything that runs after the backup, e.g. reports by email or something? The earlier logging seemed to show you getting to the end of the backup, and yet the backup still seems to be active.

You can push the automatic scheduling aside by taking a web UI job through Export --> As Command-line and then running that directly. I think it puts the mono in for you; if it gets that wrong, peel off the mono and use /usr/bin/duplicati-cli instead. You can add additional logging options as desired. What I hope happens is that when the backup is done, you return to your shell prompt without any mysterious hangs or follow-on activities like the ones the web UI seems to be getting into…
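
As a rough sketch of what the exported command looks like (the destination URL, paths, and sqlite filename here are all placeholders - your own export supplies the real ones; --console-log-level=Information is the extra logging I would add):

# Placeholder export; substitute the values from your own Export --> As Command-line:
mono /Applications/Duplicati.app/Contents/Resources/Duplicati.CommandLine.exe backup "file:///Volumes/NAS/duplicati-test" /Users/you/testdata --backup-name=localtest --dbpath=/Users/you/.config/Duplicati/XXXXXXXXXX.sqlite --passphrase=mumble --disable-module=console-password-input --console-log-level=Information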

Perhaps what you can do is have the web UI make the simplest possible backup to an existing local folder, not let the web UI run it, and export it to run manually. If that works, try something more complex. And thank you - sometimes going through the debugging is what it takes to get to the answer. If you have several machines, you could of course pick one for this chase (as long as we’re up for it) and one for testing other backup options.

Having just made a pitch for debugging, I’m also suggesting a new job because I don’t want to also be debugging whatever led to the repair. In test mode we have the luxury of picking one issue at a time, e.g. the hang.

Let me see if @JonMikelV has anything to add, at least on OSX (which I don’t run, so can’t say much about).

Hello @hpeyerl, welcome to the forum - and thanks for being willing to debug!

I’ve got a Mac running Duplicati (since 2.0.2.something) and have only ever had issues with my local USB backup when I forgot to attach the drive - my SFTP process works fine.

I don’t recall seeing an export of your job settings, but one of your posts from 7d ago had something about downloading over 1,000M. Do you have a custom --dblock-size (Upload volume size)?

The “log in use” error was likely due to trying to include the log file in the backup itself. Are you running as a daemon / root, or just as the logged-in user?

The local test backup suggestion from @ts678 is good, as that can isolate Duplicati from possible network issues. There is a known bug where losing the connection while uploading a file causes stalling similar to what you’re seeing.

Oh, and yes - the --console-log-level setting does do something with the GUI if you use the GUI Commandline option. Of course it’s still scrolling through the browser, so I feel it works best with a targeted console log filter level. Note that I believe all the console-log stuff came in AFTER the 2.0.3.3 beta (but I might be misremembering).

You remembered correctly, but it’s confusing because it came in at 2.0.3.2 in the canary sequence. How-to.

This reminds me that there’s a chance 2.0.3.11 is better. If so, Beta users can maybe wait for the next Beta.

The checklist of items to be fixed is shrinking. I’m not sure if the repair issue is a regression or something that was never great.

I woke up late today so didn’t get any debugging time. Just thought I’d drop a note to say I’ve seen the further instructions and will get to them tomorrow morning.

I’m not sure why we’re trying to find the backup name. I appreciate the information but I’m not sure what to do with it. I only ever have one backup configured.

The last backup I configured that I reported on above was a small 25MB folder and I backed it up to a locally mapped folder on my NAS.

I don’t see anything after the backup. No email reports (though that is configured). But yes, the backup appears to be completing and then the backend hangs, so presumably there’s some internal thread that has gotten wedged and the main thread is blocked waiting for it. I’m only guessing at the internal structure of the code, but that’s what it smells like.

JonMikelV: In the early debugging, I set --dblock-size to 1GB based on something I read in the docs somewhere. For the more recent testing, I lowered it to 100MB. And yes, I surmised that the error on duplicati.log was because it was open for write. I didn’t think that error was salient but included it for the sake of completeness. I’m just logged in as myself, and I launch Duplicati either from Finder or, for the sake of debugging, from the shell in /Applications/Duplicati.app/Contents/Resources with "./duplicati".

I’ll hopefully be back with more results tomorrow morning.

Thanks.

I wondered if email (or similar) was configured to run because of Duplicati web interface hangs on “Finished!”, which is also why I’m suggesting any test backup be as simple as possible initially, before adding complexity.

The backup name only matters if it helps with debugging. And I could have done the below analysis earlier. Sorry.
Discovering what’s going on (even without names) was the objective, because I didn’t know what you had set up, yet something seemed like it might have been getting in your way and making you wait. I wanted to know. This was also a chance to see if internal ID #s can successfully be mapped into job names using URL editing. The desire arises from time to time, but the current “Server State properties” web output isn’t human-friendly.

proposedSchedule : [{"Item1":"1","Item2":"2018-09-17T07:00:00Z"}] would be a scheduled job.
schedulerQueueIds : [{"Item1":4,"Item2":"1"} says the job is already in the queue to run (backlog).
activeTask : {"Item1":3,"Item2":null} seems to say it’s stuck on an internal task that’s not a backup. Unfortunately I haven’t yet found a way to turn task numbers into anything that would actually help debugging.

--dblock-size advice can be found at Choosing sizes in Duplicati, although I doubt it’s related to the current hangs.

I only configured the email later as an experiment to see if it being unconfigured was the cause of the hanging. I’ll certainly unconfigure it tomorrow but I suspect it’s not the cause of the problem.

[Skip to the embarrassing conclusion by going down to “SOLVED”]

This morning I tried to re-run the backup to the NAS folder that I’d done the other day, but it complained the database was in need of repair. I tried ‘repair’ and ‘delete and recreate’ a few times but kept getting errors, so eventually I just blew away the ~/.config/Duplicati/ sqlite file corresponding to that backup. Then I exported the job to command line and ran it in a terminal. It ran successfully to completion a number of times. So I tried ‘run now’ in the GUI, and it ran to completion successfully a couple of times. This is something I’d never seen before, so that was encouraging. I had removed the --send-mail-to= option when I did that, which was interesting, but I was dubious that it was related.

So then I created a new backup, but to my Google Drive - same options as the local NAS backup that was successful, only a different destination. I exported to command line again, ran it in a terminal, and it ran successfully to completion:

mono /Applications/Duplicati.app/Contents/Resources/Duplicati.CommandLine.exe backup "googledrive://Backups/test3?authid=ce0f6fb86a37fe5d93e0333c81fc6a52%3Ap_N0m"'!'"7Us7s6Oz9kFnZaR" /Users/hpeyerl/Software/ --backup-name=googledrivetest --dbpath=/Users/hpeyerl/.config/Duplicati/LKPUZURPWS.sqlite --encryption-module=aes --compression-module=zip --dblock-size=100MB --passphrase=mumble --log-level=Information --log-file=duplicati.log --default-filters=OSX --disable-module=console-password-input
Backup started at 9/19/2018 6:22:40 AM
Checking remote backup …
Listing remote folder …
Scanning local files …
228 files need to be examined (25.56 MB)
0 files need to be examined (0 bytes)
removing temp files, as no data needs to be uploaded
Checking remote backup …
Listing remote folder …
removing file listed as Temporary: duplicati-bd49f09fe351e4f75b6beb11ad8d7d056.dblock.zip.aes
removing file listed as Temporary: duplicati-i729e9c4547dd4a39ad31d43e66370272.dindex.zip.aes
Verifying remote backup …
Remote backup verification completed
Downloading file (13.11 KB) …
Downloading file (29.56 KB) …
Downloading file (20.71 MB) …
Duration of backup: 00:00:08
Remote files: 3
Remote size: 20.76 MB
Total remote quota: 1.00 TB
Available remote quota: 353.27 MB
Files added: 0
Files deleted: 0
Files changed: 0
Data uploaded: 0 bytes
Data downloaded: 20.76 MB
Backup completed successfully!

Encouraged, I ran it again but added a --send-mail-to and the job did not complete.

mono /Applications/Duplicati.app/Contents/Resources/Duplicati.CommandLine.exe backup "googledrive://Backups/test3?authid=mumble" /Users/hpeyerl/Software/ --backup-name=googledrivetest --dbpath=/Users/hpeyerl/.config/Duplicati/LKPUZURPWS.sqlite --encryption-module=aes --compression-module=zip --dblock-size=100MB --passphrase=mumble --log-level=Information --log-file=duplicati.log --default-filters=OSX --disable-module=console-password-input --send-mail-to="hpeyerl+duplicati@beer.org"
Backup started at 9/19/2018 6:23:19 AM
Checking remote backup …
Listing remote folder …
Scanning local files …
228 files need to be examined (25.56 MB)
0 files need to be examined (0 bytes)
removing temp files, as no data needs to be uploaded
Checking remote backup …
Listing remote folder …
removing file listed as Temporary: duplicati-b036751bac0534fe9830cb979173bf7d4.dblock.zip.aes
removing file listed as Temporary: duplicati-ic523c59ec8194685944e0069a7ed8997.dindex.zip.aes
Verifying remote backup …
Remote backup verification completed
Downloading file (13.11 KB) …
Downloading file (29.56 KB) …
Downloading file (20.71 MB) …

It hung at this point for >10 minutes.

However, while it was in that state, I went to the web interface and clicked ‘Run now’ on the same job (with no email configured) and it ran to completion, while the aforementioned command-line job was still hung. I was able to do that a few times, including via CommandLine within the browser (not to be confused with the above command-line runs, which were done in a terminal session):

Backup started at 9/19/2018 6:37:26 AM
Checking remote backup …
Listing remote folder …
Scanning local files …
224 files need to be examined (15.77 MB)
0 files need to be examined (0 bytes)
removing temp files, as no data needs to be uploaded
Checking remote backup …
Listing remote folder …
removing file listed as Temporary: duplicati-bedc3ac53d1004f1c8178be5ca112a9ff.dblock.zip.aes
removing file listed as Temporary: duplicati-i45dea23f32a94a7dbdc068ea89b015a1.dindex.zip.aes
Verifying remote backup …
Remote backup verification completed
Downloading file (13.11 KB) …
Downloading file (29.56 KB) …
Downloading file (20.71 MB) …
Duration of backup: 00:00:08
Remote files: 3
Remote size: 20.76 MB
Total remote quota: 1.00 TB
Available remote quota: 353.27 MB
Files added: 0
Files deleted: 0
Files changed: 0
Data uploaded: 0 bytes
Data downloaded: 20.76 MB
Backup completed successfully!
Return code: 1

So this is new, because I’ve not been able to do that before. The only real change is the email, which is curious. For the sake of completeness, I re-ran the same backup again in the browser CommandLine, with the --send-mail-to= option, and curiously it hung in the same way as the run that was hung in the terminal.

SOLVED
After this result, I dug into the mail thing and realized that I hadn’t set up all the rest of the options. I do remember, when initially configuring Duplicati, I just somehow turned on email status reports, and the default recipient address was some bogus default that I changed to my own. In the back of my mind I wondered how it would know which mail server to use, etc., but assumed it would just use my local mail client to do the sending… This morning I noticed there were additional options, including the SMTP server URL, etc. After setting that and my mail server auth info, I was able to re-run the job; it completed successfully and I got an email.
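
For anyone else who hits this: a working email setup needs the full set of send-mail options, not just the recipient. Roughly this (the server, credentials, and addresses below are placeholders):

--send-mail-url="smtp://mail.example.com:587/?starttls=when-available"
--send-mail-username="myuser"
--send-mail-password="mumble"
--send-mail-from="duplicati@example.com"
--send-mail-to="hpeyerl+duplicati@beer.org"

With only --send-mail-to set and no server URL, the run hung exactly as described above.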

Backup started at 9/19/2018 6:48:38 AM
Checking remote backup …
Listing remote folder …
Scanning local files …
228 files need to be examined (25.56 MB)
0 files need to be examined (0 bytes)
removing temp files, as no data needs to be uploaded
Checking remote backup …
Listing remote folder …
removing file listed as Temporary: duplicati-b1f9f896518c841a08447c8d5f6d95038.dblock.zip.aes
removing file listed as Temporary: duplicati-ib129573459c247b8b74362b0a689d6a3.dindex.zip.aes
Verifying remote backup …
Remote backup verification completed
Downloading file (13.11 KB) …
Downloading file (29.56 KB) …
Downloading file (20.71 MB) …
Email sent successfully using server: smtp://mumble:587/?starttls=when-available
Duration of backup: 00:00:09
Remote files: 3
Remote size: 20.76 MB
Total remote quota: 1.00 TB
Available remote quota: 353.27 MB
Files added: 0
Files deleted: 0
Files changed: 0
Data uploaded: 0 bytes
Data downloaded: 20.76 MB
Backup completed successfully!

We can see that the mail gets sent after the point where the other backups hung. I was able to reproduce the same hang in the web browser by incorrectly configuring the mail server.

That left me to wonder how to reproduce what I originally saw when I installed Duplicati, because I don’t remember going into “advanced options” at the time, and I thought that “send mail” was already there and populated with a bogus address. So I killed the Duplicati process, blew away $HOME/.config/Duplicati, and restarted Duplicati… I was in a clean configuration, but I was not able to find the pre-populated default mail recipient that I remembered from when I first installed Duplicati. I also browsed through GitHub looking for the change but didn’t find it.

In summary, I believe an incompletely configured email setup in the options was the entire cause of the problem. It would have been helpful if Duplicati had said something like “Emailing status report” prior to attempting to send the mail, but that’s about it. I guess it would also be useful to have some sanity checking in the email config and/or a timeout. The rest is my fault.

Thanks for all of your help. I also now understand better how Duplicati is put together.

Just a brief update to close this thread. Duplicati is now working fine with my full 160GB backup to Google Drive and sending me automated mails every day.

Thanks for the help.

Completely agree! :smiley:

(And glad you figured it out!)