Lost the database. Still have local files and access to remote files. How do I get things running again?

I’ve done numerous searches but can’t seem to find the right forum topic or help article.

In short, the file system holding my database died. Fortunately, my external hard drive with all my files is working great. I also still have access to the remote backup (via SSH).

I’m back at a point where my Duplicati has been reinstalled, and I’m at the Duplicati main screen (index.html) with it ready to accept settings for the first time.

Is there a way I can tell Duplicati “Here are my existing files on my external hard drive, and here is the backup, can you please rebuild the database and go from here?” Do I do it through “Add a new backup” and somehow point it to an existing backup?

If you saved an export of your job config, import it. Otherwise you’ll have to recreate your backup job manually, including all settings like encryption password, target, source, etc. Don’t schedule it to run automatically at this point.

After the job is set up, click on it on the main screen to expand options. Click the blue Database link and then click the Recreate button. This will scan the backup files and regenerate the local database.

Hopefully you are running 2.0.5.1, as earlier betas had some bugs that could cause this process to take much longer than it should.

Good luck!

Thanks! I was able to recreate the backup job exactly as it was. Once I did that, a notification popped up that it had found thousands of files not in its database and gave me a repair option. Six hours into the repair, it’s about 3/4 done (for a 1.5 TB backup).

I’m confused. How was “exactly as it was” determined, and what (if anything) else led to that notification?

Do you have the text? “Found {0} remote files that are not recorded in local storage, please run repair” would be worrisome right after a Recreate, which I’d think would ordinarily record all the remote files that it saw.

I’m also hoping “recreate the backup job” also includes “Recreate” of the database, or your backup could be disappearing right now. Are you seeing any size reduction or folder timestamp change on the remote?

How was “exactly as it was” determined, and what (if anything) else led to that notification?

From memory and notes I scribbled in various places. I didn’t have that many special settings, only 4 or so.

Do you have the text? “Found {0} remote files that are not recorded in local storage, please run repair”

Yes, it was 8701 I believe, and I double-checked: the server side had exactly that many .aes files. It then offered me three choices, one of which was repair, so I did it. The repair then seemed to process correctly: I saw the status bar indicate it was rebuilding the database, the green status line slowly moved, and I saw no issues. I figured that was the correct procedure.

I’m also hoping “recreate the backup job” also includes “Recreate” of the database, or your backup could be disappearing right now. Are you seeing any size reduction or folder timestamp change on the remote?

I specifically set up my remote end so that nobody can delete, append, or modify files except root on the server side. I set the client-side Duplicati’s settings similarly to avoid problems with this setup. This is my security mechanism, so that if anyone hacks my end and obtains the credentials for the server side, they can’t log into the server and destroy my backup.

However, I have a new problem.

I just checked the results. Restore took 8 hours, no errors. My scheduled backup then ran afterward. I got 2 errors:

2020-06-15 09:01:12 -06 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-b07b0a29e5d134cc69d31309e2904b210.dblock.zip.aes

2020-06-15 09:15:08 -06 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-b26c1bc68055f487e86f36859bba0ce29.dblock.zip.aes

It seems it took over 6 hours to run the backup and produce those 2 errors. Bleh… Any ideas from anyone?

On a reattempt, I got a much more detailed error message:

{
  "DeletedFiles": 0,
  "DeletedFolders": 0,
  "ModifiedFiles": 0,
  "ExaminedFiles": 113427,
  "OpenedFiles": 0,
  "AddedFiles": 0,
  "SizeOfModifiedFiles": 0,
  "SizeOfAddedFiles": 0,
  "SizeOfExaminedFiles": 996095516696,
  "SizeOfOpenedFiles": 0,
  "NotProcessedFiles": 0,
  "AddedFolders": 0,
  "TooLargeFiles": 0,
  "FilesWithError": 0,
  "ModifiedFolders": 0,
  "ModifiedSymlinks": 0,
  "AddedSymlinks": 0,
  "DeletedSymlinks": 0,
  "PartialBackup": false,
  "Dryrun": false,
  "MainOperation": "Backup",
  "CompactResults": null,
  "VacuumResults": null,
  "DeleteResults": null,
  "RepairResults": null,
  "TestResults": {
    "MainOperation": "Test",
    "VerificationsActualLength": 5,
    "Verifications": [
      {
        "Key": "duplicati-b07b0a29e5d134cc69d31309e2904b210.dblock.zip.aes",
        "Value": [
          {
            "Key": "Error",
            "Value": "File length is invalid"
          }
        ]
      },
      {
        "Key": "duplicati-b26c1bc68055f487e86f36859bba0ce29.dblock.zip.aes",
        "Value": [
          {
            "Key": "Error",
            "Value": "File length is invalid"
          }
        ]
      },
      {
        "Key": "duplicati-20191118T100742Z.dlist.zip.aes",
        "Value": []
      },
      {
        "Key": "duplicati-i19fbd785e868445e8cd6790aa8f96dfd.dindex.zip.aes",
        "Value": []
      },
      {
        "Key": "duplicati-b23543a0128c84464b5a11fbf5605d646.dblock.zip.aes",
        "Value": []
      }
    ],
    "ParsedResult": "Success",
    "Version": "2.0.5.1 (2.0.5.1_beta_2020-01-18)",
    "EndTime": "2020-06-15T18:23:04.078949Z",
    "BeginTime": "2020-06-15T17:56:05.035603Z",
    "Duration": "00:26:59.0433460",
    "MessagesActualLength": 0,
    "WarningsActualLength": 0,
    "ErrorsActualLength": 0,
    "Messages": null,
    "Warnings": null,
    "Errors": null,
    "BackendStatistics": {
      "RemoteCalls": 15,
      "BytesUploaded": 0,
      "BytesDownloaded": 219938775,
      "FilesUploaded": 0,
      "FilesDownloaded": 3,
      "FilesDeleted": 0,
      "FoldersCreated": 0,
      "RetryAttempts": 8,
      "UnknownFileSize": 8,
      "UnknownFileCount": 1,
      "KnownFileCount": 8825,
      "KnownFileSize": 877229435996,
      "LastBackupDate": "2020-06-15T03:00:00-06:00",
      "BackupListCount": 167,
      "TotalQuotaSpace": 0,
      "FreeQuotaSpace": 0,
      "AssignedQuotaSpace": -1,
      "ReportedQuotaError": false,
      "ReportedQuotaWarning": false,
      "MainOperation": "Backup",
      "ParsedResult": "Success",
      "Version": "2.0.5.1 (2.0.5.1_beta_2020-01-18)",
      "EndTime": "0001-01-01T00:00:00",
      "BeginTime": "2020-06-15T17:31:14.061439Z",
      "Duration": "00:00:00",
      "MessagesActualLength": 0,
      "WarningsActualLength": 0,
      "ErrorsActualLength": 0,
      "Messages": null,
      "Warnings": null,
      "Errors": null
    }
  },
  "ParsedResult": "Error",
  "Version": "2.0.5.1 (2.0.5.1_beta_2020-01-18)",
  "EndTime": "2020-06-15T18:23:04.322979Z",
  "BeginTime": "2020-06-15T17:31:14.061424Z",
  "Duration": "00:51:50.2615550",
  "MessagesActualLength": 33,
  "WarningsActualLength": 0,
  "ErrorsActualLength": 2,
  "Messages": [
    "2020-06-15 11:31:14 -06 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started",
    "2020-06-15 11:48:37 -06 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  ()",
    "2020-06-15 11:48:51 -06 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (8.62 KB)",
    "2020-06-15 11:55:57 -06 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  ()",
    "2020-06-15 11:56:02 -06 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (8.62 KB)",
    "2020-06-15 11:56:05 -06 - [Information-Duplicati.Library.Main.Operation.TestHandler-MissingRemoteHash]: No hash or size recorded for duplicati-b07b0a29e5d134cc69d31309e2904b210.dblock.zip.aes, performing full verification",
    "2020-06-15 11:56:05 -06 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b07b0a29e5d134cc69d31309e2904b210.dblock.zip.aes ()",
    "2020-06-15 11:57:55 -06 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Retrying: duplicati-b07b0a29e5d134cc69d31309e2904b210.dblock.zip.aes ()",
    "2020-06-15 11:58:06 -06 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b07b0a29e5d134cc69d31309e2904b210.dblock.zip.aes ()",
    "2020-06-15 12:02:18 -06 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Retrying: duplicati-b07b0a29e5d134cc69d31309e2904b210.dblock.zip.aes ()",
    "2020-06-15 12:02:28 -06 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b07b0a29e5d134cc69d31309e2904b210.dblock.zip.aes ()",
    "2020-06-15 12:06:42 -06 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Retrying: duplicati-b07b0a29e5d134cc69d31309e2904b210.dblock.zip.aes ()",
    "2020-06-15 12:06:53 -06 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b07b0a29e5d134cc69d31309e2904b210.dblock.zip.aes ()",
    "2020-06-15 12:07:47 -06 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Retrying: duplicati-b07b0a29e5d134cc69d31309e2904b210.dblock.zip.aes ()",
    "2020-06-15 12:07:57 -06 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b07b0a29e5d134cc69d31309e2904b210.dblock.zip.aes ()",
    "2020-06-15 12:08:50 -06 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Failed: duplicati-b07b0a29e5d134cc69d31309e2904b210.dblock.zip.aes ()",
    "2020-06-15 12:08:50 -06 - [Information-Duplicati.Library.Main.Operation.TestHandler-MissingRemoteHash]: No hash or size recorded for duplicati-b26c1bc68055f487e86f36859bba0ce29.dblock.zip.aes, performing full verification",
    "2020-06-15 12:08:50 -06 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b26c1bc68055f487e86f36859bba0ce29.dblock.zip.aes ()",
    "2020-06-15 12:09:46 -06 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Retrying: duplicati-b26c1bc68055f487e86f36859bba0ce29.dblock.zip.aes ()",
    "2020-06-15 12:09:56 -06 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b26c1bc68055f487e86f36859bba0ce29.dblock.zip.aes ()"
  ],
  "Warnings": [],
  "Errors": [
    "2020-06-15 12:08:50 -06 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-b07b0a29e5d134cc69d31309e2904b210.dblock.zip.aes",
    "2020-06-15 12:14:19 -06 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-b26c1bc68055f487e86f36859bba0ce29.dblock.zip.aes"
  ],
  "BackendStatistics": {
    "RemoteCalls": 15,
    "BytesUploaded": 0,
    "BytesDownloaded": 219938775,
    "FilesUploaded": 0,
    "FilesDownloaded": 3,
    "FilesDeleted": 0,
    "FoldersCreated": 0,
    "RetryAttempts": 8,
    "UnknownFileSize": 8,
    "UnknownFileCount": 1,
    "KnownFileCount": 8825,
    "KnownFileSize": 877229435996,
    "LastBackupDate": "2020-06-15T03:00:00-06:00",
    "BackupListCount": 167,
    "TotalQuotaSpace": 0,
    "FreeQuotaSpace": 0,
    "AssignedQuotaSpace": -1,
    "ReportedQuotaError": false,
    "ReportedQuotaWarning": false,
    "MainOperation": "Backup",
    "ParsedResult": "Success",
    "Version": "2.0.5.1 (2.0.5.1_beta_2020-01-18)",
    "EndTime": "0001-01-01T00:00:00",
    "BeginTime": "2020-06-15T17:31:14.061439Z",
    "Duration": "00:00:00",
    "MessagesActualLength": 0,
    "WarningsActualLength": 0,
    "ErrorsActualLength": 0,
    "Messages": null,
    "Warnings": null,
    "Errors": null
  }
}

Fortunately, it looks like Duplicati handles the database recreation on a new backup successfully. I tested by moving my DB (basically equivalent to a delete, but more reversible), then ran a backup with 12 existing files on the destination. It complained about them and gave me a button on its complaint popup for repair, and repair recreated the database without deleting the remote. On a newly configured backup, the Database screen offers only a blue Repair button, but with no DB present, Repair recreates the database.

[screenshot]

The REPAIR command describes the dual behavior of Repair. The way people get in trouble with Duplicati is by restoring a stale DB from an image backup or such. Repair aligns things by deleting new remote files.

Tries to repair the backup. If no local database is found or the database is empty, the database is re-created with data from the storage.

So maybe your anti-damage precautions weren’t required, and maybe you just didn’t push the blue button.

“Failed to process file duplicati-xxx.dindex.zip.aes” is the most recent explanation, but you already got some additional error information posted. Now the question is: how old are those files, and how correct are they?

Test files are sampled, so they might not be from that backup. You’d have to check dates on files or something; however, the more direct approach to getting error details is mentioned in the link in the paragraph above.

You can also sample other files to see how widespread the issue is by using the Verify files button for the job. Watching About --> Show log --> Live --> Retry will let you see any issues. Here’s an example after temporarily turning a dblock file into an empty file:

[screenshot]

However, I’m not sure how yours would have the wrong size so soon after the DB recreate (via Repair) read the sizes.

I really appreciate the help you’ve given so far. (Edit: See next post; the issue with these two files is unrelated to the database restore.)

I did push the left button of the three.

The two files are the last two backed up, on June 2nd. I was able to reconnect to the server on June 14th (made a test.txt to verify), and then I did a new round of backups. It seems it’s backing up more stuff now, but these two June 2nd files are causing the problems.

They couldn’t have changed on the server; the server forbids reopening a file for modification once it has been created…

Trying that now. So far, it’s at: “Jun 15, 2020 4:50 PM: The operation Test has started”. I’ll post an update when it’s done.

Wait! I know more. These two files are a separate and unrelated issue from the database restore!

I use Duplicati-Monitoring, and sure enough, I’ve got the log files from June 3rd in there.

Failed: Permission denied
Details: Renci.SshNet.Common.SftpPermissionDeniedException: Permission denied at Duplicati.Library.Main.BackendManager.Delete (System.String remotename, System.Int64 size, System.Boolean synchronous) [0x0005c] in <8f1de655bd1240739a78684d845cecc8>:0 
... and then ...
Log data:
2020-06-05 03:20:06 -06 - [Warning-Duplicati.Library.Main.BackendManager-DeleteFileFailure]: Failed to recover from error deleting file duplicati-b07b0a29e5d134cc69d31309e2904b210.dblock.zip.aes
System.NullReferenceException: Object reference not set to an instance of an object at Duplicati.Library.Main.BackendManager.ThreadRun () [0x003e3] in <8f1de655bd1240739a78684d845cecc8>:0
2020-06-05 03:20:07 -06 - [Error-Duplicati.Library.Main.Operation.BackupHandler-FatalError]: Fatal error
Renci.SshNet.Common.SftpPermissionDeniedException: Permission denied at Duplicati.Library.Main.BackendManager.Delete (System.String remotename, System.Int64 size, System.Boolean synchronous) [0x0005c] in <8f1de655bd1240739a78684d845cecc8>:0 at Duplicati.Library.Main.Operation.FilelistProcessor.RemoteListAnalysis


That’s one of the two files. Now I just have to figure out why that June 2nd error occurred and how to fix it.

I guess the Permission denied makes sense for a delete attempt, but I wonder why a delete was tried? When trying to stop deletes, Options screen 5 Backup retention should be set to “Keep all backups” and --no-auto-compact should be enabled. It’s good that you got a look at some history on those two files; however, I don’t think we’ve seen a log showing what size “File length is invalid” thinks they should be, although we do have your June 2 listing showing what size they actually are. An alternative to getting the log is --upload-verification-file: then either browse duplicati-verification.json to see what it thinks the files’ lengths should be, or (if the server has Python) verify files with /usr/lib/duplicati/utility-scripts/DuplicatiVerify.py.

The settings have always been as shown below, so I’m baffled why a delete was attempted.

I have an idea. Suppose that on June 2nd, in the middle of an upload, something went wrong, such as the server side glitching and flipping its file system to read-only. That could leave the files incomplete. Would Duplicati try to delete those files in that situation, since it can’t use them?

[screenshot]

Here you go:

{"ID":8651,"Name":"duplicati-b26c1bc68055f487e86f36859bba0ce29.dblock.zip.aes","Hash":null,"Size":68031642,"Type":0,"State":3,"DeleteGracePeriod":"0001-01-01T00:00:00Z"}
{"ID":7609,"Name":"duplicati-b07b0a29e5d134cc69d31309e2904b210.dblock.zip.aes","Hash":null,"Size":67998903,"Type":0,"State":3,"DeleteGracePeriod":"0001-01-01T00:00:00Z"}

The sizes in duplicati-verification.json match the sizes of the files on the server exactly. Curiously, the hashes are null. Also, the server is a shell-less chroot jail, so I can’t run DuplicatiVerify.py over there.
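If you want to script that cross-check on the client, here’s a minimal sketch. The entry fields come from the two lines above; the real duplicati-verification.json wraps entries in a larger structure, so the line-by-line parsing here is an assumption to adapt:

```python
import json

# Toy input: verification entries as quoted above (one JSON object per line).
# The real duplicati-verification.json nests these in a larger document.
entries_text = """\
{"ID":8651,"Name":"duplicati-b26c1bc68055f487e86f36859bba0ce29.dblock.zip.aes","Hash":null,"Size":68031642,"Type":0,"State":3,"DeleteGracePeriod":"0001-01-01T00:00:00Z"}
{"ID":7609,"Name":"duplicati-b07b0a29e5d134cc69d31309e2904b210.dblock.zip.aes","Hash":null,"Size":67998903,"Type":0,"State":3,"DeleteGracePeriod":"0001-01-01T00:00:00Z"}
"""

# Sizes actually observed on the server, e.g. copied from an SFTP listing.
actual_sizes = {
    "duplicati-b26c1bc68055f487e86f36859bba0ce29.dblock.zip.aes": 68031642,
    "duplicati-b07b0a29e5d134cc69d31309e2904b210.dblock.zip.aes": 67998903,
}

def size_mismatches(entries_text, actual_sizes):
    """Return (name, expected, actual) for every entry whose size disagrees."""
    bad = []
    for line in entries_text.splitlines():
        entry = json.loads(line)
        actual = actual_sizes.get(entry["Name"])
        if actual is not None and actual != entry["Size"]:
            bad.append((entry["Name"], entry["Size"], actual))
    return bad

print(size_mismatches(entries_text, actual_sizes))  # [] here: sizes agree
```

With your data this prints an empty list, matching what you observed by eye.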

What next? Suppose I’m able to manually delete those two files on the server; would Duplicati fix itself from there, especially since it was trying to delete them in the first place? If those two files do contain needed backup data, is there a way to instruct Duplicati to just re-back up anything that is now missing? I guess I’m really asking: is there a way I can see which client-side files those two .zip.aes files are backing up?

Yes, although I forget exactly when it happens. Each upload attempt gets a new file name, so there should be no confusion about which files had an upload error of some sort, thus calling for a delete of the residue. The renaming was added to work around an Apache WebDAV issue, but it impedes a never-delete scheme.

You should be able to see all actions on the destination with the live log or --log-file at Information level or above. Retry level is better at showing the data for every retry, as opposed to only the final failure after retries are exhausted.

The dindex for a dblock looks like it holds the dblock hash and size in a file having the dblock file name, e.g.

"volumehash":"A2dgqxZq4slbtqbsQecjwP0Sw70WSK5jt91wfxiJ9wc=","volumesize":1505

and in the normal case, every dblock has a dindex. In your case, a possible partial dblock might not, but its size might still be obtained from a directory listing. Someone who knows the recreate code may know more.
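As a sketch of that pairing, assuming a “vol/&lt;dblock name&gt;” entry layout inferred from the snippet above (this is an assumption, not a verified description of the real dindex format):

```python
import io
import json
import zipfile

# Build a toy dindex-style zip in memory. In this sketch, each "vol/<name>"
# entry records the hash and size of one dblock volume (assumed layout).
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr(
        "vol/duplicati-b23543a0128c84464b5a11fbf5605d646.dblock.zip.aes",
        json.dumps({"volumehash": "A2dgqxZq4slbtqbsQecjwP0Sw70WSK5jt91wfxiJ9wc=",
                    "volumesize": 1505}),
    )

def dblocks_described(zip_bytes):
    """Map each dblock name referenced by a dindex to its (hash, size)."""
    described = {}
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as z:
        for name in z.namelist():
            if name.startswith("vol/"):
                info = json.loads(z.read(name))
                described[name[len("vol/"):]] = (info["volumehash"],
                                                 info["volumesize"])
    return described

print(dblocks_described(buf.getvalue()))
```

A dblock that appears in no dindex’s “vol/” entries would be invisible to a recreate that relies only on dindex files.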

Another code-level question is whether this odd situation might cause a size error instead of a hash error, basically misreporting the problem. Maybe there’s a special case to suppress the hash error for null hash.

For your workaround, I suppose you could add a prefix to the suspect files to hide them instead of deleting them (since we’re still speculating on what happened), but I’m not sure if Duplicati would then see them as missing files…
Doing a DB Recreate that never saw them might clear that up, but I’m not sure how to avoid a recurrence.

One deeper question is whether there’s an error reporting bug, but it would be best to see the directory list Duplicati saw (described earlier – click on the list operation in the remote log, and see what size it saw).

Another question is whether there’s a workaround for this potential delete, and what if file can’t be deleted? People certainly try such setups either for protection or for cold storage. Forum research may be possible.

Although it may be hard, the ideal is to find steps to reproduce without special equipment and file an Issue. Some developer might then work on it, although there’s a rather enormous backlog, and few developers…

Back to what to try now about the problem files. If you still had an old DB or enough logs you could see the history better, but it sounds like you don’t. Does your SFTP server have any logs that show upload history?

Lacking logs, if you have remote timestamps or can sort by time, you might try looking at files somewhat after the troublesome ones to test if the troublesome ones are truncated versions of ones uploaded later. Adjust for your upload speed, and the fact that parallel uploads might have sent some other files earlier…

What OS is this? Looking in the DB with something like DB Browser for SQLite (a.k.a. sqlitebrowser) can show whether your dblock and dindex files are mismatched. Actually, I suppose a text editor or anything that can count could do the same in duplicati-verification.json: if there are no false positives, there’s one dindex per dblock.

The DB does have one useful table, IndexBlockLink, which pairs up IndexVolumeId and BlockVolumeId. Looking into that and the Remotevolume table, you might find that your two files aren’t paired up with an index file, and possibly aren’t needed. You’d also want to see the Block table having no VolumeID pointing to the two dblocks.
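A sketch of those lookups against a stand-in schema (only the columns the queries need; the “Blocks” type label and exact column set are assumptions, as the real tables have more columns):

```python
import sqlite3

# Stand-in schema: only the columns the queries below need. Table and column
# names (IndexBlockLink.IndexVolumeId/BlockVolumeId, Block.VolumeID) follow
# the description above; the 'Blocks'/'Index' type labels are assumptions.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE Remotevolume (ID INTEGER PRIMARY KEY, Name TEXT, Type TEXT);
CREATE TABLE IndexBlockLink (IndexVolumeId INTEGER, BlockVolumeId INTEGER);
CREATE TABLE Block (ID INTEGER PRIMARY KEY, Hash TEXT, VolumeID INTEGER);
INSERT INTO Remotevolume VALUES
  (1, 'duplicati-b23543a0128c84464b5a11fbf5605d646.dblock.zip.aes', 'Blocks'),
  (2, 'duplicati-b07b0a29e5d134cc69d31309e2904b210.dblock.zip.aes', 'Blocks'),
  (3, 'duplicati-i19fbd785e868445e8cd6790aa8f96dfd.dindex.zip.aes', 'Index');
INSERT INTO IndexBlockLink VALUES (3, 1);  -- only the first dblock is paired
INSERT INTO Block VALUES (1, 'abc=', 1);   -- and only it holds recorded blocks
""")

-- = None  # (unused; SQL comments above are inside executescript)
# dblock volumes with no paired dindex and no Block rows pointing at them
# are candidates for being unneeded leftovers.
orphans = con.execute("""
    SELECT Name FROM Remotevolume
    WHERE Type = 'Blocks'
      AND ID NOT IN (SELECT BlockVolumeId FROM IndexBlockLink)
      AND ID NOT IN (SELECT VolumeID FROM Block)
""").fetchall()
print(orphans)  # the suspect b07b... dblock shows up as an orphan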

Creating a bug report and posting a link to it would let others (maybe including me) do such a look for you. There’s sanitization to try to protect privacy of paths (passwords are never in this DB), but it’s not perfect.

The above tries to test that the two files are irrelevant; however, if they’re partial files and you find the full one, the AFFECTED command run from the Command screen (after some adjusting) can show its source files.

A harder approach is to see if the files decrypt in AES Crypt or Duplicati’s CLI SharpAESCrypt.exe. If not, that’s a hint that the file is partial. If a file decrypts, the zip is full of blocks, each with a filename of the block’s SHA-256 hash value in Base64, URL-style with special-character substitutions. You could search for the safe characters in the DB Block table’s Hash column, then look up the ID in the BlocksetEntry table, then take the BlocksetID to the File table to see what’s using the block, but that’s a lot of manual work… Best if affected can do the use-tracing for you.
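A small sketch of that naming relationship, assuming the DB Hash column is standard Base64 and the zip entry name is the URL-safe variant with “+/” replaced by “-_” (padding handling is also an assumption):

```python
import base64
import hashlib

# Assumed relationship: DB Block.Hash stores the block's SHA-256 in standard
# Base64, while the zip entry name uses the URL-safe alphabet ('+/' -> '-_').
def entry_name_to_db_hash(entry_name: str) -> str:
    """Convert a URL-safe Base64 zip entry name back to the DB Hash form."""
    return entry_name.replace("-", "+").replace("_", "/")

digest = hashlib.sha256(b"example block contents").digest()
db_hash = base64.b64encode(digest).decode()             # as stored in Block.Hash
entry_name = base64.urlsafe_b64encode(digest).decode()  # as named inside the zip

print(entry_name_to_db_hash(entry_name) == db_hash)  # True
```

So given a zip entry name, a couple of character substitutions produce the string to search for in the Hash column.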

EDIT:

I suppose you could just feed affected the two file names directly, to see if anything pops out from that.

Ah, maybe this is it then.

I was told the server’s file system recently switched itself to read-only mode. Given that I wasn’t able to back up anything new after June 2nd, I bet June 2nd is a good guess for when that happened. Once we got it back to a read/write file system, backups resumed (with those two files raising errors).

I like that term, “cold storage”. It seems a current drawback is that Duplicati can’t properly handle partial files from interrupted uploads.

I suppose the best approach to reproducing the bug would be: 1) start a new cold-storage backup, 2) disconnect the network cable mid-backup, 3) see what happens. I suppose a possible fix may be for Duplicati to recognize such residue files as unrelated to the backup and ignore them.

No logs server side. Client was Ubuntu 18, now it’s Ubuntu 20.

I ran the affected command. For the two problem .aes files, I got: “No files are affected”. It then listed some recent entries for them from the log files I do have.

For my other .aes files, it’s working great. So since those two files are just useless, I’m going to try to rename them with a .old suffix and see what happens.

It’s not an original term, and it isn’t exactly what you’re doing, but there are similarities: one wants to write data and then leave it unchanged, without activities like version thinning or compact that require changing old data.

Cold Storage in the Cloud: Comparing AWS, Google, Microsoft

It handles them fine if delete is permitted (which is the typical case), but maybe you meant in your case.

It will probably see them as extraneous files, and delete them if you allow it. The reason I suggested a prefix specifically is that the prefix is how Duplicati recognizes its files. There’s a probably little-used --prefix option that lets you change the prefix if you absolutely have to mix different Duplicati backups in one folder.

EDIT:

I didn’t quite follow the whole test, but if you didn’t do it yet, you can try a normal dblock to test the method.

Yes, it didn’t like my suffix. :) Switched to a prefix. I then had a series of failures (which I’ve described below, since you seem to love details).

I’ve since moved the client’s Duplicati databases to another folder and created my backup job from scratch, and hopefully a full clean repair will do the trick now that the two problematic files no longer start with the prefix “duplicati”.


My series of errors.

After manually renaming the 2 files to start with “old.” and rerunning the backup from the web interface, I got this error:

“The backup storage destination is missing data files. You can either enable --rebuild-missing-dblock-files or run the purge command to remove these files. The following files are missing: duplicati-b07b0a29e5d134cc69d31309e2904b210.dblock.zip.aes, duplicati-b26c1bc68055f487e86f36859bba0ce29.dblock.zip.aes”

So I ran the purge command (with --no-auto-compact and --ssh-fingerprint) and got this:

Failed to load process type Duplicati.Library.Common.IO.VssBackupComponents assembly /usr/lib/duplicati/Duplicati.Library.IO.dll, error message: Could not load type of field 'Duplicati.Library.Common.IO.VssBackupComponents:_vssBackupComponents' (1) due to: Could not load file or assembly 'AlphaVSS.Common, Version=1.4.0.0, Culture=neutral, PublicKeyToken=959d3993561034e3' or one of its dependencies. => Could not load type of field 'Duplicati.Library.Common.IO.VssBackupComponents:_vssBackupComponents' (1) due to: Could not load file or assembly 'AlphaVSS.Common, Version=1.4.0.0, Culture=neutral, PublicKeyToken=959d3993561034e3' or one of its dependencies.

Enter encryption passphrase:
Listing remote folder …
Missing file: duplicati-b07b0a29e5d134cc69d31309e2904b210.dblock.zip.aes
Missing file: duplicati-b26c1bc68055f487e86f36859bba0ce29.dblock.zip.aes
Found 2 files that are missing from the remote storage, please run repair

ErrorID: MissingRemoteFiles
Found 2 files that are missing from the remote storage, please run repair

So I tried a command line repair

root@UDOO:~# /usr/lib/duplicati/Duplicati.CommandLine.exe repair ssh://[location redacted] [ssh credentials redacted] --dbpath=/root/.config/Duplicati/MQWYFKCHQC.sqlite
Failed to load process type Duplicati.Library.Common.IO.VssBackupComponents assembly /usr/lib/duplicati/Duplicati.Library.IO.dll, error message: Could not load type of field 'Duplicati.Library.Common.IO.VssBackupComponents:_vssBackupComponents' (1) due to: Could not load file or assembly 'AlphaVSS.Common, Version=1.4.0.0, Culture=neutral, PublicKeyToken=959d3993561034e3' or one of its dependencies. => Could not load type of field 'Duplicati.Library.Common.IO.VssBackupComponents:_vssBackupComponents' (1) due to: Could not load file or assembly 'AlphaVSS.Common, Version=1.4.0.0, Culture=neutral, PublicKeyToken=959d3993561034e3' or one of its dependencies.

Enter encryption passphrase:
Failed to load process type Duplicati.Library.Common.IO.VssBackupComponents assembly /usr/lib/duplicati/Duplicati.Library.IO.dll, error message: Could not load type of field 'Duplicati.Library.Common.IO.VssBackupComponents:_vssBackupComponents' (1) due to: Could not load file or assembly 'AlphaVSS.Common, Version=1.4.0.0, Culture=neutral, PublicKeyToken=959d3993561034e3' or one of its dependencies. => Could not load type of field 'Duplicati.Library.Common.IO.VssBackupComponents:_vssBackupComponents' (1) due to: Could not load file or assembly 'AlphaVSS.Common, Version=1.4.0.0, Culture=neutral, PublicKeyToken=959d3993561034e3' or one of its dependencies.

Enter encryption passphrase:
Listing remote folder …

ErrorID: MissingDblockFiles
The backup storage destination is missing data files. You can either enable --rebuild-missing-dblock-files or run the purge command to remove these files. The following files are missing: duplicati-b07b0a29e5d134cc69d31309e2904b210.dblock.zip.aes, duplicati-b26c1bc68055f487e86f36859bba0ce29.dblock.zip.aes
Error: SQLite error
no such table: main.BlocklistHash
Database is NOT upgraded.

I attempted to put in --rebuild-missing-dblock-files through the web interface, but upon trying to save, it reported that the database was read-only. So I rebooted. Now the web client won’t come up at all. So I’m attempting to start over: recreate the job from scratch and repair from scratch. Hope this one works.

Fixing Duplicati when things go wrong can be challenging… 2.0.5.1 breaks less, but is far from perfect.
Some of what you hit are known issues (some even fixed). Others are expected, or a mystery to me…

The “missing data files” isn’t too surprising. Although nobody’s looked at the DB detail, the files being in duplicati-verification.json says they’re known, and if they’re known, they might be missed if they vanish.

About --> Changelog shows where --rebuild-missing-dblock-files was likely added. The release note is:

2.0.3.10_canary_2018-08-30

Removed automatic attempts to rebuild dblock files as it is slow and rarely finds all the missing pieces (can be enabled with --rebuild-missing-dblock-files ).

I’m surprised that the purge command is involved with this, as it’s normally used to purge unwanted source files.
Still, the author of the message is an expert. More at Duplicati stalls during “verifying backend data” #2335.
FWIW, I’ve never had much luck with --rebuild-missing-dblock-files, but I can’t guarantee it never works.

The “Could not load” errors are probably the issue below, which is fixed, but not yet in any Beta release:

VSS warning on non-Windows platforms #4149 (I’m a bit worried fix aimed at just AlphaVSS.Common)

Do not remove Alpha VSS dependencies in installer packages #4178

I’m back:

Turns out that deleting the two problem dblock files wasn’t enough. Two dindex files exist as well, and Duplicati told me about them during my repair #2. They’re what informs Duplicati, during the database recreation, about all the dblock files in use. So I had to remove the two dindex files as well. (I first copied those two dindex files to the client, used the SharpAESCrypt.exe tool to decrypt them, looked inside, and verified they only referred to the bad dblock files.)

Once the 2 dblock and 2 dindex files were gone, I did a full repair again (repair #3). Now I’m back. All in all, this experience helped me feel much more comfortable with database recreation.