Duplicati Backup Files deleted during first backup

Hello guys,

I am trying to use Duplicati as a backup program within my family’s network across 4 locations.
Every location has a small (Linux) server to store the files within the home network, and every server should push a backup of its data to a bigger server. So in case of a failure in one or more of the home networks, the files are safe on the bigger (backup) server.
The 4 locations are connected to the bigger server via VPN, so I can store the data via Samba, SSH, …; for this backup I used SSH.

With 3 of the 4 this seems to work just fine, and even the restore works as it should, but with the last one (just 1 Mbit upload speed) the backup of about 80 GB failed.

Facts:

  • it took about 8 days (fine so far)
  • the bigger destination server holds about 70 GB of Duplicati files (due to compression, fine so far)
  • the Duplicati web interface of the smaller server reports that it deleted 21 files and suggests running a repair
  • I ran the repair with the “--rebuild-missing-dblock-files” option
  • the repair failed, 11 files are still missing

So, what happened? How can I rescue the backup? Is Duplicati maybe not the right tool for my purpose?

thanks
mac

    1. Febr. 2019 13:11: list
    1. Febr. 2019 23:59: list
    14. Febr. 2019 08:40: delete duplicati-b50b6375edffb4740a3af98cdc4554014.dblock.zip.aes
    14. Febr. 2019 08:40: delete duplicati-bd43e925259994b7f95a765b071942c78.dblock.zip.aes
    14. Febr. 2019 08:40: delete duplicati-bdb54b57f1dd14d0194bfd5a330ee1a5a.dblock.zip.aes
    14. Febr. 2019 08:40: delete duplicati-b913a909572e3454c93ba592f6623b67c.dblock.zip.aes
    14. Febr. 2019 08:40: delete duplicati-bb637e6a866134985a9744a84a0cec14b.dblock.zip.aes
    14. Febr. 2019 08:40: delete duplicati-be6367301c09b474b9edaf16988eacd5b.dblock.zip.aes
    14. Febr. 2019 05:11: delete duplicati-b17596617ef624f89a57a522e0952e0e3.dblock.zip.aes
    14. Febr. 2019 05:11: delete duplicati-b1f9066d978664320a83df7644d69b737.dblock.zip.aes
    14. Febr. 2019 05:11: delete duplicati-bbda01d07828447469f51b118be465223.dblock.zip.aes
    14. Febr. 2019 05:11: delete duplicati-be0f824b25ba943318e3b01edfbe81de5.dblock.zip.aes
    14. Febr. 2019 05:11: delete duplicati-b196a6593893e4aa0ac9e7840e37ff309.dblock.zip.aes
    14. Febr. 2019 05:11: delete duplicati-bb0e9396f551a4953950129ed10c968d0.dblock.zip.aes
    14. Febr. 2019 05:11: delete duplicati-b175742496b684c4c8dc69660ccec9c38.dblock.zip.aes
    14. Febr. 2019 05:11: list
    14. Febr. 2019 05:10: put duplicati-bb637e6a866134985a9744a84a0cec14b.dblock.zip.aes
      up to here, just "put"s.

It’s hard for me to speculate on what happened, but one fundamental rule is to make sure each computer using Duplicati is backing up to a UNIQUE location. They should never mix their backup data in the same destination folder. This can be as simple as making individual subfolders at the target location for each computer.

hello Rod,

thanks for your reply.
The small servers are using Duplicati with unique backup locations/folders on the bigger server.

Is there a possibility to see why this error occurred? Otherwise I will have to try it again or search for another tool.
Until this incident, Duplicati seemed just right and mature.

The possibility gets better if you’re willing to help by either looking over or posting logs and databases.

There won’t likely be any instant fix, but there might be one later, and meanwhile perhaps a better chance at a workaround. Unfortunately, the repair attempts (though totally reasonable to do) have added to the history needing investigation. Meanwhile I’m not even clear on what’s posted now. I see 13 deletes in a row after puts.

Seeing the names of more puts would be helpful; for example, normally puts send dblock and dindex files, followed by a dlist at the end. That would help guess where things were just before the deletes began. Normally, for a reproducible problem, one sets up more logs (to avoid guessing), but I think we can get further information from default logging before trying that. I don’t want a series of 8-day runs just yet…

On deletes: though I see 13, the text mentions 21. I assume the red notification is the last bullet with 11 missing files, produced after the repair? Possibly bullet 3 was about an earlier one that wasn’t caught?

I’d like to know more about the above, but perhaps there’s no more info. To comment on the red notification:

There is a total mismatch between the missing files and the deleted files (at least according to names). That would be another thing to look for in Job → Show log → Remote. When were 11 missing files put originally onto the destination, and when were they deleted? Does your SFTP server keep any file logs?

Also look closely at Job → Show log → General and About → Show log → Stored for any more clues.

Viewing the log files of a backup job
Viewing the Duplicati Server Logs

Creating a bug report and posting it is one alternative to posting job logs. That’s where log info is stored, along with other useful information. It’s supposed to be sanitized, to protect privacy. Not sure it’s perfect. You can inspect it with DB Browser for SQLite, if you like, or not post it at all if you don’t feel comfortable.

Above is the kind of work that helps figure that out. We sort of know how it ended, but not how it got there, unless we’re lucky enough that someone can recognize the issue from clues posted so far, and jumps in.

Hello,

the general log is empty; the remote log is attached as text in the remote zip file:
remote.zip (64.9 KB)

The log under About is attached as text in the About zip file:
About.zip (2.6 KB)

I will inspect the DB file. How can I upload this one? It is over 100 MB.

Thanks in advance.
mac

The following error looks like upload in the SSHNET library didn’t finish quickly enough.
Looking at its source code, it looks like there’s one 30 second timeout used very widely.
Unlike the http timeout, it’s not for total transfer, but times out smaller pieces of it.
Duplicati can set SSHNET timeout higher, but it can’t overcome a serious network failure.

  14. Febr. 2019 05:10: Failed while executing “Backup” with id: 1

    ---> (Inner Exception #1) System.AggregateException: One or more errors occurred. ---> Renci.SshNet.Common.SshOperationTimeoutException: Session operation has timed out

    at Renci.SshNet.SftpClient.UploadFile (System.IO.Stream input, System.String path, System.Action`1[T] uploadCallback) [0x00000] in <502177f7126d48e4be01a83c8cdb4c79>:0
    at Duplicati.Library.Backend.SSHv2.Put (System.String remotename, System.IO.Stream stream) [0x00012] in <92780ed9b8c74a0ab4415f69daf23b10>:0
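
As a rough analogy (not Duplicati or SSH.NET code, just a hypothetical Python sketch): a per-operation timeout behaves like a socket timeout, which applies to each individual read or write rather than to the whole transfer.

    # Rough analogy only -- hypothetical sketch, not Duplicati or SSH.NET code.
    # A per-operation timeout fires when a single read/write stalls that long;
    # the transfer as a whole can take much longer without tripping it.
    import socket

    sock = socket.create_connection(("example.com", 80), timeout=30)
    sock.settimeout(30)  # applies to EACH send/recv, not to the total transfer
    sock.sendall(b"GET / HTTP/1.0\r\nHost: example.com\r\n\r\n")
    while chunk := sock.recv(4096):  # a slow multi-minute download still works,
        pass                         # but one recv() stalling > 30 s times out
    sock.close()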

The deletions seen look somewhat familiar to me. Duplicati is cleaning up files that had
upload failures. Retries are done until this file uploads or the retry limit is reached.
When a file finally uploads, the ones that didn’t upload are not needed, so are deleted.

What’s a little non-obvious is that the retry is done under a different name, to avoid bugs in
some servers. You can see a retry in Job --> Show log --> Remote by clicking on the line
to see if it has the same Size and Hash as the previous put. If so, it’s probably the same.
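
If clicking through the lines gets tedious, the same check can be scripted against the local job database. This is a minimal sketch under stated assumptions (please verify with DB Browser for SQLite, since names can differ between versions): a RemoteOperation table with Operation, Path, and a JSON Data column carrying Size and Hash for puts; the database path is a placeholder.

    # Minimal sketch: group remote "put" operations by (Size, Hash) to spot
    # retries that were uploaded under different file names.
    # Assumptions: table RemoteOperation with columns Operation, Path, Data,
    # where Data is JSON such as {"Size": ..., "Hash": "..."} for puts.
    import json
    import sqlite3
    from collections import defaultdict

    DB_PATH = "/path/to/job-database.sqlite"  # placeholder: your job's database

    groups = defaultdict(list)
    con = sqlite3.connect(DB_PATH)
    for path, data in con.execute(
            "SELECT Path, Data FROM RemoteOperation WHERE Operation = 'put'"):
        try:
            info = json.loads(data)
            groups[(info.get("Size"), info.get("Hash"))].append(path)
        except (TypeError, ValueError):
            pass  # some rows may not carry Size/Hash data
    con.close()

    # Any (Size, Hash) pair seen more than once is probably the same volume
    # retried under a new name.
    for (size, digest), names in groups.items():
        if digest is not None and len(names) > 1:
            print(f"Size {size}, Hash {digest}:")
            for name in names:
                print(f"  {name}")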

14. Febr. 2019 08:40: delete duplicati-b50b6375edffb4740a3af98cdc4554014.dblock.zip.aes
14. Febr. 2019 08:40: delete duplicati-bd43e925259994b7f95a765b071942c78.dblock.zip.aes
14. Febr. 2019 08:40: delete duplicati-bdb54b57f1dd14d0194bfd5a330ee1a5a.dblock.zip.aes
14. Febr. 2019 08:40: delete duplicati-b913a909572e3454c93ba592f6623b67c.dblock.zip.aes
14. Febr. 2019 08:40: delete duplicati-bb637e6a866134985a9744a84a0cec14b.dblock.zip.aes
14. Febr. 2019 08:40: delete duplicati-be6367301c09b474b9edaf16988eacd5b.dblock.zip.aes
14. Febr. 2019 05:11: delete duplicati-b17596617ef624f89a57a522e0952e0e3.dblock.zip.aes
14. Febr. 2019 05:11: delete duplicati-b1f9066d978664320a83df7644d69b737.dblock.zip.aes
14. Febr. 2019 05:11: delete duplicati-bbda01d07828447469f51b118be465223.dblock.zip.aes
14. Febr. 2019 05:11: delete duplicati-be0f824b25ba943318e3b01edfbe81de5.dblock.zip.aes
14. Febr. 2019 05:11: delete duplicati-b196a6593893e4aa0ac9e7840e37ff309.dblock.zip.aes
14. Febr. 2019 05:11: delete duplicati-bb0e9396f551a4953950129ed10c968d0.dblock.zip.aes
14. Febr. 2019 05:11: delete duplicati-b175742496b684c4c8dc69660ccec9c38.dblock.zip.aes
14. Febr. 2019 05:11: list

Above were the list to see what’s there, and the deletes to remove the now-excess files.
Below I’ll take each delete and match it up to the put (or series of puts) uploading it.
I don’t have full data, but repeated puts of the same kind in a row are a sign of retries.
The try # is a guess. I expected 5, got 6. You can check Size and Hash to confirm a retry.

14. Febr. 2019 08:40: delete duplicati-b50b6375edffb4740a3af98cdc4554014.dblock.zip.aes
14. Febr. 2019 05:10: put duplicati-bb637e6a866134985a9744a84a0cec14b.dblock.zip.aes
14. Febr. 2019 05:08: put duplicati-be6367301c09b474b9edaf16988eacd5b.dblock.zip.aes
14. Febr. 2019 05:06: put duplicati-b50b6375edffb4740a3af98cdc4554014.dblock.zip.aes (try 4?)

14. Febr. 2019 08:40: delete duplicati-bd43e925259994b7f95a765b071942c78.dblock.zip.aes
14. Febr. 2019 05:10: put duplicati-bb637e6a866134985a9744a84a0cec14b.dblock.zip.aes
14. Febr. 2019 05:08: put duplicati-be6367301c09b474b9edaf16988eacd5b.dblock.zip.aes
14. Febr. 2019 05:06: put duplicati-b50b6375edffb4740a3af98cdc4554014.dblock.zip.aes
14. Febr. 2019 05:04: put duplicati-bd43e925259994b7f95a765b071942c78.dblock.zip.aes (try 3?)

14. Febr. 2019 08:40: delete duplicati-bdb54b57f1dd14d0194bfd5a330ee1a5a.dblock.zip.aes
14. Febr. 2019 05:10: put duplicati-bb637e6a866134985a9744a84a0cec14b.dblock.zip.aes
14. Febr. 2019 05:08: put duplicati-be6367301c09b474b9edaf16988eacd5b.dblock.zip.aes
14. Febr. 2019 05:06: put duplicati-b50b6375edffb4740a3af98cdc4554014.dblock.zip.aes
14. Febr. 2019 05:04: put duplicati-bd43e925259994b7f95a765b071942c78.dblock.zip.aes
14. Febr. 2019 05:02: put duplicati-bdb54b57f1dd14d0194bfd5a330ee1a5a.dblock.zip.aes (try 2?)

14. Febr. 2019 08:40: delete duplicati-b913a909572e3454c93ba592f6623b67c.dblock.zip.aes
14. Febr. 2019 05:10: put duplicati-bb637e6a866134985a9744a84a0cec14b.dblock.zip.aes
14. Febr. 2019 05:08: put duplicati-be6367301c09b474b9edaf16988eacd5b.dblock.zip.aes
14. Febr. 2019 05:06: put duplicati-b50b6375edffb4740a3af98cdc4554014.dblock.zip.aes
14. Febr. 2019 05:04: put duplicati-bd43e925259994b7f95a765b071942c78.dblock.zip.aes
14. Febr. 2019 05:02: put duplicati-bdb54b57f1dd14d0194bfd5a330ee1a5a.dblock.zip.aes
14. Febr. 2019 04:55: put duplicati-b913a909572e3454c93ba592f6623b67c.dblock.zip.aes (try 1?)

14. Febr. 2019 08:40: delete duplicati-bb637e6a866134985a9744a84a0cec14b.dblock.zip.aes
14. Febr. 2019 05:10: put duplicati-bb637e6a866134985a9744a84a0cec14b.dblock.zip.aes (try 6?)

14. Febr. 2019 08:40: delete duplicati-be6367301c09b474b9edaf16988eacd5b.dblock.zip.aes
14. Febr. 2019 05:10: put duplicati-bb637e6a866134985a9744a84a0cec14b.dblock.zip.aes
14. Febr. 2019 05:08: put duplicati-be6367301c09b474b9edaf16988eacd5b.dblock.zip.aes (try 5?)

14. Febr. 2019 05:11: delete duplicati-b17596617ef624f89a57a522e0952e0e3.dblock.zip.aes
14. Febr. 2019 04:16: put duplicati-bf52956e8b83a46d9875d38d78b3d99c3.dblock.zip.aes
14. Febr. 2019 04:10: put duplicati-b17596617ef624f89a57a522e0952e0e3.dblock.zip.aes

14. Febr. 2019 05:11: delete duplicati-b1f9066d978664320a83df7644d69b737.dblock.zip.aes
13. Febr. 2019 04:17: put duplicati-ba9798c78f9e943fb8ce701cb2bb15fc1.dblock.zip.aes
13. Febr. 2019 04:09: put duplicati-b1f9066d978664320a83df7644d69b737.dblock.zip.aes

14. Febr. 2019 05:11: delete duplicati-bbda01d07828447469f51b118be465223.dblock.zip.aes
12. Febr. 2019 04:18: put duplicati-b453ef39c07b3411db8dcbc6ecbc80608.dblock.zip.aes
12. Febr. 2019 04:17: put duplicati-bbda01d07828447469f51b118be465223.dblock.zip.aes

14. Febr. 2019 05:11: delete duplicati-be0f824b25ba943318e3b01edfbe81de5.dblock.zip.aes
11. Febr. 2019 04:19: put duplicati-bfd4a5a0f1b0c4b6ab9febe2fdd5b1684.dblock.zip.aes
11. Febr. 2019 04:17: put duplicati-be0f824b25ba943318e3b01edfbe81de5.dblock.zip.aes

14. Febr. 2019 05:11: delete duplicati-b196a6593893e4aa0ac9e7840e37ff309.dblock.zip.aes
10. Febr. 2019 04:21: put duplicati-b708b181f7f33469bb6230acd15cc3c72.dblock.zip.aes
10. Febr. 2019 04:18: put duplicati-b196a6593893e4aa0ac9e7840e37ff309.dblock.zip.aes

14. Febr. 2019 05:11: delete duplicati-bb0e9396f551a4953950129ed10c968d0.dblock.zip.aes
9. Febr. 2019 04:22: put duplicati-b5ef6f2fabd3a4ab6b4f6c8ae0be287a8.dblock.zip.aes
9. Febr. 2019 04:22: put duplicati-bb0e9396f551a4953950129ed10c968d0.dblock.zip.aes

14. Febr. 2019 05:11: delete duplicati-b175742496b684c4c8dc69660ccec9c38.dblock.zip.aes
8. Febr. 2019 04:23: put duplicati-b968ec355103a46d8ba3296f088c649ae.dblock.zip.aes
8. Febr. 2019 04:22: put duplicati-b175742496b684c4c8dc69660ccec9c38.dblock.zip.aes

The list of 11 missing files in the red notification names the first 11 dblock files put.
I’m not sure why they’re missing unless there are some deletes or loss not in the log.
Perhaps you could look on the destination to see whether they’re present but not seen.
Or perhaps the database bug report could be saved somewhere, then shared using a link.
Or you could open it with a DB browser and look at the RemoteOperation table yourself.
Just having the logs so far has been helpful to get some idea of what’s going on here.
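
Along the same lines, here is a minimal sketch of that RemoteOperation lookup for the missing files, with the same caveats as the earlier sketch (table and column names may differ between versions; the path and the file name are placeholders to replace with the names from the red notification).

    # Minimal sketch: print the put/delete history of specific remote files,
    # straight from the job database's RemoteOperation table.
    import sqlite3
    from datetime import datetime, timezone

    DB_PATH = "/path/to/job-database.sqlite"  # placeholder
    MISSING = {
        # replace with the file names listed in the red notification
        "duplicati-b17596617ef624f89a57a522e0952e0e3.dblock.zip.aes",
    }

    con = sqlite3.connect(DB_PATH)
    rows = con.execute(
        "SELECT Timestamp, Operation, Path FROM RemoteOperation "
        "WHERE Operation IN ('put', 'delete') ORDER BY Timestamp")
    for ts, op, path in rows:
        if path in MISSING:
            # Timestamp is assumed to be Unix seconds (UTC)
            when = datetime.fromtimestamp(ts, tz=timezone.utc)
            print(f"{when:%d. %b %Y %H:%M}  {op:6}  {path}")
    con.close()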

6. Febr. 2019 17:18: put duplicati-b6d794bdcd97440feba9cd0d5e476c31a.dblock.zip.aes
6. Febr. 2019 17:26: put duplicati-b62533d2e7a66458da80d20db40d81046.dblock.zip.aes
6. Febr. 2019 17:34: put duplicati-bbcad4bcca418428fb86baead0f144634.dblock.zip.aes
6. Febr. 2019 17:43: put duplicati-b1dbe80be69824eef8fab8fdfabf9e47e.dblock.zip.aes
6. Febr. 2019 17:55: put duplicati-b58ba16b053d740e5bda4735fc9fa05e9.dblock.zip.aes
6. Febr. 2019 18:07: put duplicati-bc5b60f2014a44c26950ee38bbbd24b8d.dblock.zip.aes
6. Febr. 2019 18:19: put duplicati-bfaea8494ed4c4ed9b2987c437707d54f.dblock.zip.aes
6. Febr. 2019 18:31: put duplicati-bd268419b8e844447b2390906ab1fbb66.dblock.zip.aes
6. Febr. 2019 18:42: put duplicati-b1c324a1584a3475f8dd954b3b2d489df.dblock.zip.aes
6. Febr. 2019 18:50: put duplicati-bdcd0a52c58634500af8c96847541a9bc.dblock.zip.aes
6. Febr. 2019 19:02: put duplicati-b5ab5fde04647474ba95c95d2645e28b2.dblock.zip.aes

Assuming these are the default 50MB dblock files that take about 8 minutes, that’s an
uplink speed of maybe 900 kilobits/second. Is that what you’re expecting? You did say
1 megabit, so maybe that’s expected. The problem is that it might also be unreliable.
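
For reference, a quick back-of-the-envelope version of that estimate (assuming the default 50 MB dblock size and roughly 8 minutes between the puts above):

    # Rough uplink estimate from the put timestamps above.
    # Assumption: default 50 MB dblock files, uploaded about every 8 minutes.
    gap_seconds = 8 * 60

    for label, dblock_bytes in [("50 * 10^6 bytes", 50_000_000),
                                ("50 MiB", 50 * 1024 * 1024)]:
        kbits_per_second = dblock_bytes * 8 / gap_seconds / 1000
        print(f"{label}: ~{kbits_per_second:.0f} kbit/s")
    # Prints roughly 833 and 874 kbit/s -- in the ballpark of the ~900 kbit/s
    # guessed above on a nominal 1 Mbit uplink.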

The occasional retries early on are slightly worrisome, but the fatal end is more so.
You could also run ping tests to see if packets are being lost. That can slow things.
Raising --ssh-operation-timeout to 10 minutes (probably 10m) might allow more retries
at the TCP level (if the network goes bad) while reducing the retries Duplicati must do.

C:\Program Files\Duplicati 2>Duplicati.CommandLine.exe help ssh-operation-timeout
  --ssh-operation-timeout (Timespan): Sets the operation timeout value
    Use this option to manage the internal timeout for SSH operations. If this
    options is set to zero, the operations will not time out
    * default value: 0

C:\Program Files\Duplicati 2>

I’m not sure I agree with the “will not time out” part. Your backup didn’t agree either.

Unfortunately, I doubt I can lead you through making repair fix everything up nicely.
There was an effort to rewrite that code to improve it. I’m not sure where it stands.

Unless someone else jumps in with confidence, it might be best to repeat that backup.

I think you could probably wipe out the destination and the job database and restart.
Probably doing an Export in advance would be best in case you need to Import instead.
Please also test and tune your network as well as possible to avoid a failure midway.

The initial backup is possibly a special case, but after a normal backup interruption, the
next backup should resume after a scan that checks the backup status so it can continue.

Seems like I have to redo the backup. I added the --ssh-operation-timeout option and will start the backup in the morning.

I am of the same opinion: the initial backup is a special case due to the “high” volume and the low internet speed. With an uplink speed of about 1 Mbit, an “effective” upload speed of 0.9 Mbit is within my expectations.
The uplink is unreliable because the ISP forces a reconnect cycle every 24 hours. I hope the --ssh-operation-timeout will do the trick here.

Thanks for your very detailed analysis; I will give an update next week :wink:

If it fails again, is it possible to save the initial backup locally, transfer it via an external drive, and change the destination directory after I copy the files to the new destination?

You can adjust as you like; however, by my understanding there will be a limit, because the TCP level backs off (retries happen further and further apart) and eventually gives up (maybe OS-dependent). If you need to survive that Internet reconnect, you can also set --number-of-retries and --retry-delay, however setting high values might delay problem recognition when the Internet goes out (and stays out) for some other reason.

I’m not sure I follow the idea, but the initial backup is fairly indistinguishable from the initial backup plus later ones, which just add additional dblock and dindex files with the changes, and a dlist at the end to list all of the files. You can pull that apart at first just by destination file dates, but retention policy and compacting can mix things.

You could store the initial backup destination files, database, and configuration export at the remote to save a little time over a complete start-from-scratch if you wish. It depends on how many of the initial files remain…

Moving complete backups is more usual, so you can back up to something portable, then carry it to the remote. Just edit the job to point at the new destination after the files are there, and it shouldn’t have any worry about the move.
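
If you go that route, a quick sanity check before pointing the job at the new destination could look like this. It is a minimal sketch with placeholder paths (mount or adjust as needed) that only compares file names and sizes between the portable copy and the folder on the big server.

    # Minimal sanity check before switching the job's destination:
    # compare file names and sizes between the portable copy and the
    # destination folder on the big server. Paths are placeholders.
    import os

    SRC = "/mnt/portable/duplicati-backup"   # files carried on the external drive
    DST = "/mnt/bigserver/duplicati-backup"  # same files after copying over

    def listing(path):
        return {e.name: e.stat().st_size for e in os.scandir(path) if e.is_file()}

    src, dst = listing(SRC), listing(DST)
    missing = sorted(set(src) - set(dst))
    mismatched = sorted(n for n in src.keys() & dst.keys() if src[n] != dst[n])

    print("missing on destination:", missing or "none")
    print("size mismatches:", mismatched or "none")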

I hope it’s not failing a lot, but there’s definitely a chance it could fail, so it’s useful to have some plans for it.

You could also artificially reduce the size of your back-up by excluding some (or most) files for the initial run. Then you can remove exclusions on subsequent runs until you have everything you want in the back-up.
