Direct Recreate of Database

A few weeks ago I posted that I could not get my backups to run to completion, and I never got it working. That is sad, as I had used Duplicati successfully for years; it only stopped working when I switched computers recently. I really think the project fits what I wanted. I donated then (a couple of weeks ago) and donated again this morning, because what you do matters. That said, because I can't get Duplicati to work for me (it's a sophisticated piece of software, so maybe my more mundane requirements are best met by something else), I have been trying to download old backups. (I probably don't need them, but I'm never sure, and I have space to restore them to.) When I try to use Restore "Direct from backup files", the job runs for two or three days and then I get warnings, but nothing is restored.

Here are the warnings (it says there were 1602 warnings, but this is all it shows me):

  • 2021-02-24 19:41:47 -05 - [Warning-Duplicati.Library.Main.Database.LocalRecreateDatabase-MismatchInBlocklistHashCount]: Mismatching number of blocklist hashes detected on blockset 2099402. Expected 0 blocklist hashes, but found 1
  • 2021-02-24 19:41:47 -05 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileEntryProcessingFailed]: Failed to process file-entry: /home/jim/development/apps/apoxeia/tmp/cache/assets/sprockets/v3.0/0c/0c37rgwu7lJtl1DOw9Z291JjEwWvXLhPvhoTIrLfdpo.cache
  • 2021-02-24 19:41:47 -05 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileEntryProcessingFailed]: Failed to process file-entry: /home/jim/development/apps/apoxeia/tmp/cache/assets/sprockets/v3.0/0c/0c37rgwu7lJtl1DOw9Z291JjEwWvXLhPvhoTIrLfdpo.cache
  • 2021-02-24 19:41:47 -05 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileEntryProcessingFailed]: Failed to process file-entry: /home/jim/development/apps/apoxeia/tmp/cache/assets/sprockets/v3.0/0c/0c37rgwu7lJtl1DOw9Z291JjEwWvXLhPvhoTIrLfdpo.cache
  • 2021-02-24 19:41:47 -05 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileEntryProcessingFailed]: Failed to process file-entry: /home/jim/development/apps/apoxeia/tmp/cache/assets/sprockets/v3.0/0c/0c37rgwu7lJtl1DOw9Z291JjEwWvXLhPvhoTIrLfdpo.cache
  • 2021-02-24 19:41:47 -05 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileEntryProcessingFailed]: Failed to process file-entry: /home/jim/development/apps/apoxeia/tmp/cache/assets/sprockets/v3.0/0c/0c37rgwu7lJtl1DOw9Z291JjEwWvXLhPvhoTIrLfdpo.cache
  • 2021-02-24 19:41:47 -05 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileEntryProcessingFailed]: Failed to process file-entry: /home/jim/development/apps/apoxeia/tmp/cache/assets/sprockets/v3.0/0c/0c37rgwu7lJtl1DOw9Z291JjEwWvXLhPvhoTIrLfdpo.cache
  • 2021-02-24 19:41:47 -05 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileEntryProcessingFailed]: Failed to process file-entry: /home/jim/development/apps/apoxeia/tmp/cache/assets/sprockets/v3.0/0c/0c37rgwu7lJtl1DOw9Z291JjEwWvXLhPvhoTIrLfdpo.cache
  • 2021-02-24 19:41:47 -05 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileEntryProcessingFailed]: Failed to process file-entry: /home/jim/development/apps/apoxeia/tmp/cache/assets/sprockets/v3.0/0c/0c37rgwu7lJtl1DOw9Z291JjEwWvXLhPvhoTIrLfdpo.cache
  • 2021-02-24 19:41:47 -05 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileEntryProcessingFailed]: Failed to process file-entry: /home/jim/development/apps/apoxeia/tmp/cache/assets/sprockets/v3.0/0c/0c37rgwu7lJtl1DOw9Z291JjEwWvXLhPvhoTIrLfdpo.cache
  • 2021-02-24 19:41:47 -05 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileEntryProcessingFailed]: Failed to process file-entry: /home/jim/development/apps/apoxeia/tmp/cache/assets/sprockets/v3.0/0c/0c37rgwu7lJtl1DOw9Z291JjEwWvXLhPvhoTIrLfdpo.cache
  • 2021-02-24 19:41:47 -05 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileEntryProcessingFailed]: Failed to process file-entry: /home/jim/development/apps/apoxeia/tmp/cache/assets/sprockets/v3.0/97/97Li-ymdGH9fEsvns-Q5MSW0Mlzn1wMkF04HAgpNTXc.cache
  • 2021-02-24 19:41:47 -05 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileEntryProcessingFailed]: Failed to process file-entry: /home/jim/development/apps/apoxeia/tmp/cache/assets/sprockets/v3.0/97/97Li-ymdGH9fEsvns-Q5MSW0Mlzn1wMkF04HAgpNTXc.cache
  • 2021-02-24 19:41:47 -05 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileEntryProcessingFailed]: Failed to process file-entry: /home/jim/development/apps/apoxeia/tmp/cache/assets/sprockets/v3.0/97/97Li-ymdGH9fEsvns-Q5MSW0Mlzn1wMkF04HAgpNTXc.cache
  • 2021-02-24 19:41:47 -05 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileEntryProcessingFailed]: Failed to process file-entry: /home/jim/development/apps/apoxeia/tmp/cache/assets/sprockets/v3.0/97/97Li-ymdGH9fEsvns-Q5MSW0Mlzn1wMkF04HAgpNTXc.cache
  • 2021-02-24 19:41:47 -05 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileEntryProcessingFailed]: Failed to process file-entry: /home/jim/development/apps/apoxeia/tmp/cache/assets/sprockets/v3.0/97/97Li-ymdGH9fEsvns-Q5MSW0Mlzn1wMkF04HAgpNTXc.cache
  • 2021-02-24 19:41:47 -05 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileEntryProcessingFailed]: Failed to process file-entry: /home/jim/development/apps/apoxeia/tmp/cache/assets/sprockets/v3.0/97/97Li-ymdGH9fEsvns-Q5MSW0Mlzn1wMkF04HAgpNTXc.cache
  • 2021-02-24 19:41:47 -05 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileEntryProcessingFailed]: Failed to process file-entry: /home/jim/development/apps/apoxeia/tmp/cache/assets/sprockets/v3.0/97/97Li-ymdGH9fEsvns-Q5MSW0Mlzn1wMkF04HAgpNTXc.cache
  • 2021-02-24 19:41:47 -05 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileEntryProcessingFailed]: Failed to process file-entry: /home/jim/development/apps/apoxeia/tmp/cache/assets/sprockets/v3.0/97/97Li-ymdGH9fEsvns-Q5MSW0Mlzn1wMkF04HAgpNTXc.cache
  • 2021-02-24 19:41:47 -05 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileEntryProcessingFailed]: Failed to process file-entry: /home/jim/development/apps/apoxeia/tmp/cache/assets/sprockets/v3.0/97/97Li-ymdGH9fEsvns-Q5MSW0Mlzn1wMkF04HAgpNTXc.cache

And here is the log:
{
  "MainOperation": "Repair",
  "RecreateDatabaseResults": {
    "MainOperation": "Repair",
    "ParsedResult": "Success",
    "Version": "2.0.5.1 (2.0.5.1_beta_2020-01-18)",
    "EndTime": "2021-02-27T09:59:04.19588Z",
    "BeginTime": "2021-02-24T12:16:33.976032Z",
    "Duration": "2.21:42:30.2198480",
    "MessagesActualLength": 0,
    "WarningsActualLength": 0,
    "ErrorsActualLength": 0,
    "Messages": null,
    "Warnings": null,
    "Errors": null,
    "BackendStatistics": {
      "RemoteCalls": 8767,
      "BytesUploaded": 0,
      "BytesDownloaded": 57580975713,
      "FilesUploaded": 0,
      "FilesDownloaded": 8757,
      "FilesDeleted": 0,
      "FoldersCreated": 0,
      "RetryAttempts": 8,
      "UnknownFileSize": 0,
      "UnknownFileCount": 0,
      "KnownFileCount": 0,
      "KnownFileSize": 0,
      "LastBackupDate": "0001-01-01T00:00:00",
      "BackupListCount": 0,
      "TotalQuotaSpace": 0,
      "FreeQuotaSpace": 0,
      "AssignedQuotaSpace": 0,
      "ReportedQuotaError": false,
      "ReportedQuotaWarning": false,
      "MainOperation": "Repair",
      "ParsedResult": "Success",
      "Version": "2.0.5.1 (2.0.5.1_beta_2020-01-18)",
      "EndTime": "0001-01-01T00:00:00",
      "BeginTime": "2021-02-24T12:16:33.876956Z",
      "Duration": "00:00:00",
      "MessagesActualLength": 0,
      "WarningsActualLength": 0,
      "ErrorsActualLength": 0,
      "Messages": null,
      "Warnings": null,
      "Errors": null
    }
  },
  "ParsedResult": "Warning",
  "Version": "2.0.5.1 (2.0.5.1_beta_2020-01-18)",
  "EndTime": "2021-02-27T10:00:08.620943Z",
  "BeginTime": "2021-02-24T12:16:33.876952Z",
  "Duration": "2.21:43:34.7439910",
  "MessagesActualLength": 17541,
  "WarningsActualLength": 1602,
  "ErrorsActualLength": 0,
  "Messages": [
    "2021-02-24 07:16:33 -05 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Repair has started",
    "2021-02-24 07:16:34 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started: ()",
    "2021-02-24 07:19:21 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed: (16.56 KB)",
    "2021-02-24 07:20:20 -05 - [Information-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-RebuildStarted]: Rebuild database started, downloading 520 filelists",
    "2021-02-24 07:20:20 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-20180829T050000Z.dlist.zip.aes (57.86 MB)",
    "2021-02-24 07:20:27 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-20180829T050000Z.dlist.zip.aes (57.86 MB)",
    "2021-02-24 07:20:27 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-20180830T191911Z.dlist.zip.aes (57.98 MB)",
    "2021-02-24 07:20:34 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-20180830T191911Z.dlist.zip.aes (57.98 MB)",
    "2021-02-24 07:25:17 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-20180901T133622Z.dlist.zip.aes (57.24 MB)",
    "2021-02-24 07:25:23 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-20180901T133622Z.dlist.zip.aes (57.24 MB)",
    "2021-02-24 07:28:37 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-20180902T050000Z.dlist.zip.aes (57.27 MB)",
    "2021-02-24 07:28:43 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-20180902T050000Z.dlist.zip.aes (57.27 MB)",
    "2021-02-24 07:32:25 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-20180903T050000Z.dlist.zip.aes (57.19 MB)",
    "2021-02-24 07:32:34 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-20180903T050000Z.dlist.zip.aes (57.19 MB)",
    "2021-02-24 07:35:54 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-20180904T050000Z.dlist.zip.aes (57.07 MB)",
    "2021-02-24 07:36:01 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-20180904T050000Z.dlist.zip.aes (57.07 MB)",
    "2021-02-24 07:39:27 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-20180905T050000Z.dlist.zip.aes (57.42 MB)",
    "2021-02-24 07:39:33 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-20180905T050000Z.dlist.zip.aes (57.42 MB)",
    "2021-02-24 07:42:45 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-20180906T050000Z.dlist.zip.aes (57.35 MB)",
    "2021-02-24 07:42:52 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-20180906T050000Z.dlist.zip.aes (57.35 MB)"
  ],
  "Warnings": [
    "2021-02-24 19:41:47 -05 - [Warning-Duplicati.Library.Main.Database.LocalRecreateDatabase-MismatchInBlocklistHashCount]: Mismatching number of blocklist hashes detected on blockset 2099402. Expected 0 blocklist hashes, but found 1",
    "2021-02-24 19:41:47 -05 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileEntryProcessingFailed]: Failed to process file-entry: /home/jim/development/apps/apoxeia/tmp/cache/assets/sprockets/v3.0/0c/0c37rgwu7lJtl1DOw9Z291JjEwWvXLhPvhoTIrLfdpo.cache",
    "2021-02-24 19:41:47 -05 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileEntryProcessingFailed]: Failed to process file-entry: /home/jim/development/apps/apoxeia/tmp/cache/assets/sprockets/v3.0/0c/0c37rgwu7lJtl1DOw9Z291JjEwWvXLhPvhoTIrLfdpo.cache",
    "2021-02-24 19:41:47 -05 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileEntryProcessingFailed]: Failed to process file-entry: /home/jim/development/apps/apoxeia/tmp/cache/assets/sprockets/v3.0/0c/0c37rgwu7lJtl1DOw9Z291JjEwWvXLhPvhoTIrLfdpo.cache",
    "2021-02-24 19:41:47 -05 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileEntryProcessingFailed]: Failed to process file-entry: /home/jim/development/apps/apoxeia/tmp/cache/assets/sprockets/v3.0/0c/0c37rgwu7lJtl1DOw9Z291JjEwWvXLhPvhoTIrLfdpo.cache",
    "2021-02-24 19:41:47 -05 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileEntryProcessingFailed]: Failed to process file-entry: /home/jim/development/apps/apoxeia/tmp/cache/assets/sprockets/v3.0/0c/0c37rgwu7lJtl1DOw9Z291JjEwWvXLhPvhoTIrLfdpo.cache",
    "2021-02-24 19:41:47 -05 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileEntryProcessingFailed]: Failed to process file-entry: /home/jim/development/apps/apoxeia/tmp/cache/assets/sprockets/v3.0/0c/0c37rgwu7lJtl1DOw9Z291JjEwWvXLhPvhoTIrLfdpo.cache",
    "2021-02-24 19:41:47 -05 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileEntryProcessingFailed]: Failed to process file-entry: /home/jim/development/apps/apoxeia/tmp/cache/assets/sprockets/v3.0/0c/0c37rgwu7lJtl1DOw9Z291JjEwWvXLhPvhoTIrLfdpo.cache",
    "2021-02-24 19:41:47 -05 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileEntryProcessingFailed]: Failed to process file-entry: /home/jim/development/apps/apoxeia/tmp/cache/assets/sprockets/v3.0/0c/0c37rgwu7lJtl1DOw9Z291JjEwWvXLhPvhoTIrLfdpo.cache",
    "2021-02-24 19:41:47 -05 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileEntryProcessingFailed]: Failed to process file-entry: /home/jim/development/apps/apoxeia/tmp/cache/assets/sprockets/v3.0/0c/0c37rgwu7lJtl1DOw9Z291JjEwWvXLhPvhoTIrLfdpo.cache",
    "2021-02-24 19:41:47 -05 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileEntryProcessingFailed]: Failed to process file-entry: /home/jim/development/apps/apoxeia/tmp/cache/assets/sprockets/v3.0/0c/0c37rgwu7lJtl1DOw9Z291JjEwWvXLhPvhoTIrLfdpo.cache",
    "2021-02-24 19:41:47 -05 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileEntryProcessingFailed]: Failed to process file-entry: /home/jim/development/apps/apoxeia/tmp/cache/assets/sprockets/v3.0/97/97Li-ymdGH9fEsvns-Q5MSW0Mlzn1wMkF04HAgpNTXc.cache",
    "2021-02-24 19:41:47 -05 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileEntryProcessingFailed]: Failed to process file-entry: /home/jim/development/apps/apoxeia/tmp/cache/assets/sprockets/v3.0/97/97Li-ymdGH9fEsvns-Q5MSW0Mlzn1wMkF04HAgpNTXc.cache",
    "2021-02-24 19:41:47 -05 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileEntryProcessingFailed]: Failed to process file-entry: /home/jim/development/apps/apoxeia/tmp/cache/assets/sprockets/v3.0/97/97Li-ymdGH9fEsvns-Q5MSW0Mlzn1wMkF04HAgpNTXc.cache",
    "2021-02-24 19:41:47 -05 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileEntryProcessingFailed]: Failed to process file-entry: /home/jim/development/apps/apoxeia/tmp/cache/assets/sprockets/v3.0/97/97Li-ymdGH9fEsvns-Q5MSW0Mlzn1wMkF04HAgpNTXc.cache",
    "2021-02-24 19:41:47 -05 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileEntryProcessingFailed]: Failed to process file-entry: /home/jim/development/apps/apoxeia/tmp/cache/assets/sprockets/v3.0/97/97Li-ymdGH9fEsvns-Q5MSW0Mlzn1wMkF04HAgpNTXc.cache",
    "2021-02-24 19:41:47 -05 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileEntryProcessingFailed]: Failed to process file-entry: /home/jim/development/apps/apoxeia/tmp/cache/assets/sprockets/v3.0/97/97Li-ymdGH9fEsvns-Q5MSW0Mlzn1wMkF04HAgpNTXc.cache",
    "2021-02-24 19:41:47 -05 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileEntryProcessingFailed]: Failed to process file-entry: /home/jim/development/apps/apoxeia/tmp/cache/assets/sprockets/v3.0/97/97Li-ymdGH9fEsvns-Q5MSW0Mlzn1wMkF04HAgpNTXc.cache",
    "2021-02-24 19:41:47 -05 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileEntryProcessingFailed]: Failed to process file-entry: /home/jim/development/apps/apoxeia/tmp/cache/assets/sprockets/v3.0/97/97Li-ymdGH9fEsvns-Q5MSW0Mlzn1wMkF04HAgpNTXc.cache",
    "2021-02-24 19:41:47 -05 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileEntryProcessingFailed]: Failed to process file-entry: /home/jim/development/apps/apoxeia/tmp/cache/assets/sprockets/v3.0/97/97Li-ymdGH9fEsvns-Q5MSW0Mlzn1wMkF04HAgpNTXc.cache"
  ],
  "Errors": [],
  "BackendStatistics": {
    "RemoteCalls": 8767,
    "BytesUploaded": 0,
    "BytesDownloaded": 57580975713,
    "FilesUploaded": 0,
    "FilesDownloaded": 8757,
    "FilesDeleted": 0,
    "FoldersCreated": 0,
    "RetryAttempts": 8,
    "UnknownFileSize": 0,
    "UnknownFileCount": 0,
    "KnownFileCount": 0,
    "KnownFileSize": 0,
    "LastBackupDate": "0001-01-01T00:00:00",
    "BackupListCount": 0,
    "TotalQuotaSpace": 0,
    "FreeQuotaSpace": 0,
    "AssignedQuotaSpace": 0,
    "ReportedQuotaError": false,
    "ReportedQuotaWarning": false,
    "MainOperation": "Repair",
    "ParsedResult": "Success",
    "Version": "2.0.5.1 (2.0.5.1_beta_2020-01-18)",
    "EndTime": "0001-01-01T00:00:00",
    "BeginTime": "2021-02-24T12:16:33.876956Z",
    "Duration": "00:00:00",
    "MessagesActualLength": 0,
    "WarningsActualLength": 0,
    "ErrorsActualLength": 0,
    "Messages": null,
    "Warnings": null,
    "Errors": null
  }
}

That may be the same as here, where I asked for information but never got it. I keep asking people on that bug.

This unfortunately can lead to complicated problems, which are hard to debug, especially without more info.

The current issue is off to a better start, but the one-line summaries in the log omit the important lines that come later.

I think the warnings are truncated in this log, which again gives a taste but not the full view. If you like, use the --log-file=<path> and --log-file-log-level options to produce a log file which you can post in part or in full.
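As a sketch, here is what that might look like with your earlier command (the backend URL, auth ID, and paths are placeholders copied from this thread, not verified values):

```shell
# Hypothetical example: the same direct restore, but writing a verbose log
# to a file that survives the run and can be posted later.
duplicati-cli restore "googledrive://X230_Backups" "*" \
  --authid="XXXXX" \
  --restore-path="/media/jim/1gb_back" \
  --log-file=/home/jim/duplicati-restore.log \
  --log-file-log-level=verbose
```

The log file can then be followed in another terminal with `tail -f /home/jim/duplicati-restore.log`.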

Alternatively, if the usual methods don't work and you want a look at your data without ever continuing with the old backup, Recovering by using the Duplicati Recovery tool is more tolerant of issues. You get whatever it can get…

On Linux, with the recovery tool and this command:
duplicati-cli restore googledrive://X230_Backups * --authid="XXXXX" --restore-path="/media/jim/1gb_back"

I get this error:
Failed to load process type Duplicati.Library.Common.IO.VssBackupComponents assembly /usr/lib/duplicati/Duplicati.Library.IO.dll, error message: Could not load type of field 'Duplicati.Library.Common.IO.VssBackupComponents:_vssBackupComponents' (1) due to: Could not load file or assembly 'AlphaVSS.Common, Version=1.4.0.0, Culture=neutral, PublicKeyToken=959d3993561034e3' or one of its dependencies. => Could not load type of field 'Duplicati.Library.Common.IO.VssBackupComponents:_vssBackupComponents' (1) due to: Could not load file or assembly 'AlphaVSS.Common, Version=1.4.0.0, Culture=neutral, PublicKeyToken=959d3993561034e3' or one of its dependencies.

Any suggestions?

That should be harmless noise, fixed in Canary and the next Beta. Did the restore do what you wanted?

If it bothers you, you can grab a .zip file of your release, seemingly 2.0.5.1_beta_2020-01-18.
Extract AlphaVSS.Common.dll to your Duplicati installation, which might be in /usr/lib/duplicati.

I stopped the restore to wait on the answer. I'll start it up again. It seems to download file after file to /tmp for days, deleting those files as it goes, with no files being restored and no errors or other telemetry. Using the tool, the server seems not to be involved, as I understand it, but in any case there is nothing in the server logs either. Maybe I backed up with an earlier version of Duplicati? Would that matter?

Only in that earlier versions had more bugs. Other than that, a newer version should be able to restore a backup made by an earlier one.

Are you talking about Restoring files using the Recovery Tool? Downloading all remote files using the Recovery Tool is a completely distinct earlier step; then you build the index, and then you restore.
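For orientation, the three Recovery Tool steps sketched as commands (the mono path, backend URL, auth ID, and local folder are assumptions; check the Disaster Recovery article for exact usage on your system):

```shell
# Sketch of the Recovery Tool sequence on Linux. Paths and URL are placeholders.
# 1. Download (and decrypt) every remote file into a local folder:
mono /usr/lib/duplicati/Duplicati.CommandLine.RecoveryTool.exe download \
  "googledrive://X230_Backups" /media/jim/1gb_back/dl --authid="XXXXX"
# 2. Build an index over the downloaded files:
mono /usr/lib/duplicati/Duplicati.CommandLine.RecoveryTool.exe index /media/jim/1gb_back/dl
# 3. Restore from the indexed local copy:
mono /usr/lib/duplicati/Duplicati.CommandLine.RecoveryTool.exe restore /media/jim/1gb_back/dl
```

Because the download step lands everything in a folder you choose, nothing depends on /tmp surviving between steps.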

If you're running duplicati-cli, that's not the recovery tool. That script runs Duplicati.CommandLine.exe, which is the command-line version of Duplicati. These run independently of the server, so they don't share databases with the server unless you specifically use --dbpath to point to the path shown on the Database page.
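For illustration, a sketch of pointing at the server job's database (the .sqlite path here is made up; the real one is shown on that job's Database page):

```shell
# Hypothetical: reuse the server job's existing database so duplicati-cli
# doesn't rebuild a temporary one. The .sqlite filename is an example only.
duplicati-cli restore "googledrive://X230_Backups" "*" \
  --authid="XXXXX" \
  --restore-path="/media/jim/1gb_back" \
  --dbpath="/home/jim/.config/Duplicati/EXAMPLEDB.sqlite"
```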

If you're trying for some reason to match the settings of a GUI job, Export As Command-line can help with that.

What’s the goal at the moment? I gave some earlier advice and then there was a gap of about a month.

Thanks.

I’m using this command in an attempt to restore:

duplicati-cli restore googledrive://X230_Backups * --authid="XXXXX" --restore-path="/media/jim/1gb_back"

I've read the documentation, but it seems to download file after file to /tmp for days, deleting those files as it goes, with no files being restored and no errors or other telemetry about what is going on.

Your choice, but keep in mind that Duplicati.CommandLine.RecoveryTool.exe might work if this won’t.

Because duplicati-cli is very similar to your original test, I'd guess it will eventually have similar results; however, one improvement may be that a Warning may include some details instead of just a single line.

You're also making extra work for the program by not telling it via --dbpath where the existing database is.
On the other hand, it's not clear whether you have a database matching the backup, and maybe you don't want one?

Here’s what a small restore set up similar to your run says:

Restore started at 4/2/2021 7:38:12 PM
  Listing remote folder ...
  Downloading file (722 bytes) ...
  Downloading file (822 bytes) ...
  Downloading file (608 bytes) ...
  Downloading file (609 bytes) ...
  Downloading file (568 bytes) ...
Checking remote backup ...
  Listing remote folder ...
Checking existing target files ...
  1 files need to be restored (1 bytes)
Scanning local files for needed data ...
  Downloading file (677 bytes) ...
  0 files need to be restored (0 bytes)
Verifying restored files ...
Restored 1 (1 bytes) files to C:\tmp

How far in the above sequence did you get? Note there are two sections that potentially download files.

There's no remote telemetry, but you can increase --console-log-level. At information level, the output is:

Restore started at 4/2/2021 7:37:38 PM
The operation Restore has started
No local database, building a temporary database
Backend event: List - Started:  ()
  Listing remote folder ...
Backend event: List - Completed:  (8 bytes)
Rebuild database started, downloading 2 filelists
Backend event: Get - Started: duplicati-20210327T201704Z.dlist.zip (722 bytes)
  Downloading file (722 bytes) ...
Backend event: Get - Completed: duplicati-20210327T201704Z.dlist.zip (722 bytes)
Backend event: Get - Started: duplicati-20210330T170115Z.dlist.zip (822 bytes)
  Downloading file (822 bytes) ...
Backend event: Get - Completed: duplicati-20210330T170115Z.dlist.zip (822 bytes)
Filelists restored, downloading 3 index files
Backend event: Get - Started: duplicati-i1e9d2c47aa6545cc9b97b39268d3ef2c.dindex.zip (608 bytes)
  Downloading file (608 bytes) ...
Backend event: Get - Completed: duplicati-i1e9d2c47aa6545cc9b97b39268d3ef2c.dindex.zip (608 bytes)
Backend event: Get - Started: duplicati-i70c4d9c4f4064b73b5a15fea08828f6b.dindex.zip (609 bytes)
  Downloading file (609 bytes) ...
Backend event: Get - Completed: duplicati-i70c4d9c4f4064b73b5a15fea08828f6b.dindex.zip (609 bytes)
Backend event: Get - Started: duplicati-i9a89c2f06366401291b4c2b10355c6d2.dindex.zip (568 bytes)
  Downloading file (568 bytes) ...
Backend event: Get - Completed: duplicati-i9a89c2f06366401291b4c2b10355c6d2.dindex.zip (568 bytes)
Recreate completed, verifying the database consistency
Recreate completed, and consistency checks completed, marking database as complete
Checking remote backup ...
Backend event: List - Started:  ()
  Listing remote folder ...
Backend event: List - Completed:  (8 bytes)
Searching backup 0 (3/30/2021 5:01:15 PM) ...
Checking existing target files ...
  1 files need to be restored (1 bytes)
Scanning local files for needed data ...
1 remote files are required to restore
Backend event: Get - Started: duplicati-bd20f1bf7cdb749188ad7a967e71cee13.dblock.zip (677 bytes)
  Downloading file (677 bytes) ...
Backend event: Get - Completed: duplicati-bd20f1bf7cdb749188ad7a967e71cee13.dblock.zip (677 bytes)
  0 files need to be restored (0 bytes)
Verifying restored files ...
Restored 1 (1 bytes) files to C:\tmp

The above makes it clearer that the first set of downloads is building a temporary database (due to no --dbpath).
These are dlist and dindex files. After the DB is built, the actual restore happens, and it downloads dblock files.
Sometimes dblock files are needed to build the temporary database, and sometimes all of them will download while Duplicati attempts to find missing information. One hint of a dblock file is its large size, 50 MB by default.
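If you have a listing of the remote filenames, you can count each type from Duplicati's naming convention (the filenames below are the ones quoted in the logs in this thread, just for illustration):

```shell
# Group remote filenames by type: dlist = backup version lists,
# dindex = index files, dblock = data blocks (the big ~50 MB ones).
printf '%s\n' \
  duplicati-20210327T201704Z.dlist.zip \
  duplicati-20210330T170115Z.dlist.zip \
  duplicati-i1e9d2c47aa6545cc9b97b39268d3ef2c.dindex.zip \
  duplicati-bd20f1bf7cdb749188ad7a967e71cee13.dblock.zip |
  sed -E 's/.*\.(dlist|dindex|dblock)\..*/\1/' | sort | uniq -c
```

On this sample it prints a count per type (1 dblock, 1 dindex, 2 dlist); running it over your real listing shows how many versions (dlist files) your backup has.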

You can get an even better view of things at verbose level. It shows you the database recreation steps:

Restore started at 4/2/2021 8:02:51 PM
The operation Restore has started
No local database, building a temporary database
Backend event: List - Started:  ()
  Listing remote folder ...
Backend event: List - Completed:  (8 bytes)
Rebuild database started, downloading 2 filelists
Backend event: Get - Started: duplicati-20210327T201704Z.dlist.zip (722 bytes)
  Downloading file (722 bytes) ...
Backend event: Get - Completed: duplicati-20210327T201704Z.dlist.zip (722 bytes)
Backend event: Get - Started: duplicati-20210330T170115Z.dlist.zip (822 bytes)
  Downloading file (822 bytes) ...
Processing filelist volume 1 of 2
Backend event: Get - Completed: duplicati-20210330T170115Z.dlist.zip (822 bytes)
Processing filelist volume 2 of 2
Filelists restored, downloading 3 index files
Backend event: Get - Started: duplicati-i1e9d2c47aa6545cc9b97b39268d3ef2c.dindex.zip (608 bytes)
  Downloading file (608 bytes) ...
Backend event: Get - Completed: duplicati-i1e9d2c47aa6545cc9b97b39268d3ef2c.dindex.zip (608 bytes)
Processing indexlist volume 1 of 3
Backend event: Get - Started: duplicati-i70c4d9c4f4064b73b5a15fea08828f6b.dindex.zip (609 bytes)
  Downloading file (609 bytes) ...
Backend event: Get - Completed: duplicati-i70c4d9c4f4064b73b5a15fea08828f6b.dindex.zip (609 bytes)
Processing indexlist volume 2 of 3
Backend event: Get - Started: duplicati-i9a89c2f06366401291b4c2b10355c6d2.dindex.zip (568 bytes)
  Downloading file (568 bytes) ...
Backend event: Get - Completed: duplicati-i9a89c2f06366401291b4c2b10355c6d2.dindex.zip (568 bytes)
Processing indexlist volume 3 of 3
Recreate completed, verifying the database consistency
Recreate completed, and consistency checks completed, marking database as complete
Checking remote backup ...
Backend event: List - Started:  ()
  Listing remote folder ...
Backend event: List - Completed:  (8 bytes)
Searching backup 0 (3/30/2021 5:01:15 PM) ...
Needs to restore 1 files (1 bytes)
Mapping restore path prefix to "C:\backup source\" to "C:\tmp\"
Restore list contains 2 blocks with a total size of 138 bytes
Checking existing target files ...
  1 files need to be restored (1 bytes)
Target file does not exist: C:\tmp\A.txt
Scanning local files for needed data ...
Target file is patched with some local data: C:\tmp\A.txt
1 remote files are required to restore
Backend event: Get - Started: duplicati-bd20f1bf7cdb749188ad7a967e71cee13.dblock.zip (677 bytes)
  Downloading file (677 bytes) ...
Backend event: Get - Completed: duplicati-bd20f1bf7cdb749188ad7a967e71cee13.dblock.zip (677 bytes)
Recording metadata from remote data: C:\tmp\A.txt
Patching metadata with remote data: C:\tmp\A.txt
  0 files need to be restored (0 bytes)
Verifying restored files ...
Testing restored file integrity: C:\tmp\A.txt
Restored 1 (1 bytes) files to C:\tmp

If you were in the GUI, I'd say just go to About → Show log → Live → Verbose to see what's going on, but the command line has to be told up front with options, so if it's mid-run it's rather hard to watch activity.

You’ve put forth an incredible amount of information to help, but I’m not sure that I’m getting closer to a solution.

In short, the restore operation has gone on for hours and downloaded about 90 files of about 60 MB each. After the original "Listing remote folder", there is NO other information other than "Downloading file…" for each downloaded file. As far as I can tell, not a single file has been restored, even though I've asked (as you can see) for a full restore. As to the backup database, no, I don't have one (that's why I was using a direct restore from the web GUI originally). I didn't see an option to build a database, and I don't understand whether I would or would not need or want one. To my way of thinking, if I'm doing a full restore, what is the point of building the database? Shortsighted, perhaps, but I didn't see the option for it anyway.

The downloaded files go to /tmp. There are no dlist files there currently. There is a file that includes the string "journal". The rest of the handful of files in the /tmp directory are just named "dup" with a hash-like string. In the past on Linux systems, /tmp was cleaned on restart. I find it frustrating that the restore downloaded GBs of files but restored nothing. I get that if I only wanted to restore one file and didn't have the database, the file would have to be located, but I'm asking for a full restore; every file should be restored.

On the /tmp thing: I just read (not sure it's true, as I thought it happened only at reboot) that systemd "sweeps" /tmp once a day. That could explain a lot about what is happening to me. Is there an option I can set that puts the temporary files somewhere else? That is, if it's building an index in /tmp that gets cleaned before it comes to the restore stage, might that explain the null results?

Anyway, absent a solution or idea based on the above, I think you're reasonably saying that what I'm doing is the same as what I was doing in the GUI, so the result will be the same: failure. I'll look at the recovery tool documentation again and try to use that, assuming it exists for Linux. Thanks.

Comparing to my output with logging, those are probably dlist files, but you can look in Google Drive.
My backup here is kind of small, but maybe you have one with lots of files, so each dlist comes to 60 MB.

Because Duplicati is oriented around a database: it tells Duplicati what versions there are, what files are in a version, where to find the blocks of the files that need to be put back together for your restore, and so on. Duplicati.CommandLine.RecoveryTool.exe takes a different approach, but it might download more data…

How the backup process works
How the restore process works

Don't rely on the names in /tmp. The generic naming plan is that each file begins with dup-. Duplicati knows what's inside, but it's hard for you to know except by size, or by copying one and looking inside it. Running at an enhanced log level would also give you the remote name, but you can't set that up mid-run.

That would be associated with an SQLite database of the same name without the suffix. If the journal is getting larger as you watch it for a while, you're probably still in the process of building the temporary database.

That starts only after a successful database rebuild; however, your experience with the GUI makes me think you won't get one. You can let this run and see if it gives you 1602 warnings again, or try this a different way.

For best chances of getting everything, use Duplicati.CommandLine.RecoveryTool.exe, which explains:

This tool can be used in very specific situations, where you have to restore data from a corrupted backup. The procedure for recovering from this scenario is covered in Disaster Recovery.

Any full restore will need a lot of downloading. If the goal is just to see what files are present, there might be other ways. You don't seem to be using encryption, so you can just open a dlist .zip file. Inside is a filelist.json file (see the "How" articles above if you like), which is one long line listing files and their info:

[{"type":"Folder","path":"C:\\backup source\\exclude2\\","metahash":"byLC8ETJHhEJjfN9ipe+NSGO83/WAu2NzvNmQMohffw=","metasize":139,"metablockhash":"byLC8ETJHhEJjfN9ipe+NSGO83/WAu2NzvNmQMohffw="},{"type":"File","path":"C:\\backup source\\A.txt","hash":"VZrq0IJk1XldOQlxjN0Fq9SVcuhP5VWQ7vMaiKCP3/0=","size":1,"time":"20201121T225725Z","metahash":"hrBVpOwh+sSaiR+cg+GWpnSsnYPxuOKQ7/lmZBhkYM0=","metasize":137},{"type":"File","path":"C:\\backup source\\exclude2\\empty.gify","hash":"47DEQpj8HBSa+/TImW+5JCeuQeRkm5NMpJWZG3hSuFU=","size":0,"time":"20210323T165612Z","metahash":"mtpO5hgKUEfN5tSeQ/u1XwZUfsL52TGx3vaMFaZWoEQ=","metasize":137}]
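A small sketch of that kind of peek (the real dlist name would be whatever you see on the backend; here a toy archive is built on the spot so the commands are self-contained):

```shell
# For a real, unencrypted backup the one-liner might be:
#   unzip -p duplicati-YYYYMMDDTHHMMSSZ.dlist.zip filelist.json | python3 -m json.tool | less
# Self-contained demonstration with a toy archive:
printf '[{"type":"File","path":"/home/jim/A.txt","size":1}]' > filelist.json
python3 -m zipfile -c toy.dlist.zip filelist.json   # build a zip containing filelist.json
rm filelist.json
python3 - <<'EOF'
import json, zipfile
# Read filelist.json straight out of the zip and list the paths it mentions.
entries = json.loads(zipfile.ZipFile("toy.dlist.zip").read("filelist.json"))
print([e["path"] for e in entries])
EOF
```

For an encrypted (.zip.aes) backup you would have to decrypt the file first before opening it this way.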

The Database management Repair button would be one way, if you wanted to try to set up a job; otherwise a recreate is implied by a restore request without a database. Another option is to explicitly do a repair:

If no local database is found or the database is empty, the database is re-created with data from the storage.

One advantage of doing it this way is that I believe you can say --version=0 for the latest version, which might avoid processing all of your versions. Does it seem plausible that you have over 90 dlist files in Google Drive?

tempdir describes it. Note that for the most complete DB results you might want to use the TMPDIR method.
If you're going to add that option, please raise the logging level as well, and possibly even get a log file using --log-file-log-level=verbose, to get output at the level of my third example above. It also avoids screen clutter… Peek at the file with tail or less if you want the story behind what the screen is showing at its default output.
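A quick way to confirm the redirection works before starting another multi-day run (mktemp just demonstrates that TMPDIR is honored; the duplicati-cli line is a commented placeholder showing where the option would go):

```shell
# Put temporary files on a location that systemd's /tmp sweeping won't touch.
mkdir -p "$PWD/dup-work"                 # example location outside /tmp
f=$(TMPDIR="$PWD/dup-work" mktemp)       # mktemp honors TMPDIR
echo "$f"                                # path should be under dup-work/
# Then something like (placeholder URL/paths, sketch only):
#   TMPDIR="$PWD/dup-work" duplicati-cli restore "googledrive://X230_Backups" "*" \
#     --tempdir="$PWD/dup-work" --restore-path="/media/jim/1gb_back"
```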

Please do consider what you’re trying to do. Troubleshoot original warnings? See file info? Restore files?