Migrating Backup (problem with "PoC" & How to do it properly)

Important: Don’t invest too much time in the error message, as this is primarily about the proper way to do the migration rather than solving the errors.

Hi,
I’ve searched other problems, the HowTos and asked Copilot, but all indicate that migrating from one target to another should be straightforward.
I need to migrate my current 3 backup jobs, which all save to Mega, as Mega keeps locking my account on a regular basis and that’s really annoying.

Just today, I created a new backup config and thought I would try the migration process, as the data currently backed up by it is unimportant.

  • Initially, the backup was to Google Drive and worked.
  • Downloaded the whole folder from there and put it into my Downloads folder.
  • Changed the config of the Backup and ran the backup. No problem, worked.
  • Then I deleted the files in Google Drive and uploaded the current local backup from my Downloads folder back to Google Drive and changed the configuration.
  • After that, I got the error message:
    > Duplicati.Library.Interface.RemoteListVerificationException: Found 24 files that are missing from the remote storage, please run repair
Details of the Backup error message:

Duplicati.Library.Interface.RemoteListVerificationException: Found 24 files that are missing from the remote storage, please run repair
   at Duplicati.Library.Main.Operation.FilelistProcessor.VerifyRemoteList(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, IEnumerable`1 protectedFiles)
   at Duplicati.Library.Main.Operation.BackupHandler.PreBackupVerify(BackendManager backend, String protectedfile)
   at Duplicati.Library.Main.Operation.BackupHandler.<RunAsync>d__20.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at CoCoL.ChannelExtensions.WaitForTaskOrThrow(Task task)
   at Duplicati.Library.Main.Controller.<>c__DisplayClass14_0.<Backup>b__0(BackupResults result)
   at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)
   at Duplicati.Library.Main.Controller.Backup(String[] inputsources, IFilter filter)
   at Duplicati.Server.Runner.Run(IRunnerData data, Boolean fromQueue)

As expected and experienced before, the repair does nothing but suggest the use of --rebuild-missing-dblock-files.

Details of the Repair error message:

Duplicati.Library.Interface.UserInformationException: The backup storage destination is missing data files. You can either enable --rebuild-missing-dblock-files or run the purge command to remove these files. The following files are missing: duplicati-b633345ca1ee044b49be8621d71055de9.dblock.zip.aes, duplicati-b2743fa7ed5aa4047b9f3cc19ba3b9835.dblock.zip.aes, duplicati-bb44a093e5ef74d89820884626df1475c.dblock.zip.aes, duplicati-b91c15f08845d425eba63b1b3da7addec.dblock.zip.aes, duplicati-b77304f19f5bc44f2b22888626336fa7d.dblock.zip.aes, duplicati-bfd61547914bf4375817ed5441f82a413.dblock.zip.aes, duplicati-b7fbea02bae8b4be1979fc8e82fec59cf.dblock.zip.aes, duplicati-ba042b1c94b7040fba78cb6e5d43eca87.dblock.zip.aes, duplicati-b86f083df30a74321a7cae1ec239bebbb.dblock.zip.aes, duplicati-b1862906c9a3449f3b7b3c75e97bd1901.dblock.zip.aes, duplicati-bf5d86421581640e9951b22dd9b9066fc.dblock.zip.aes
   at Duplicati.Library.Main.Operation.RepairHandler.RunRepairRemote()
   at Duplicati.Library.Main.Operation.RepairHandler.Run(IFilter filter)
   at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)
   at Duplicati.Library.Main.Controller.Repair(IFilter filter)
   at Duplicati.Server.Runner.Run(IRunnerData data, Boolean fromQueue)

After trying it a few times, strangely, the number of missing files came down from 24 to 11. Unfortunately, the number doesn’t drop any further when performing more repairs/backups.

The question now isn’t how to solve this problem (as the data is not important) but how to do the migration properly.
Most likely, the 3 backups, going back to 2022, are supposed to be migrated to Google Drive as well.
From what I’ve read, it should be as easy as downloading the data (the .zip.aes files) from Mega, uploading it to Google Drive and then changing the backup target in the backup job. But my experience doesn’t really match that.

Google Drive is a special case. Duplicati can only see its own files unless you get a full-access AuthID via the login option at https://duplicati-oauth-handler.appspot.com/. Google may remove this ability someday.

EDIT:

Enhancing security controls for Google Drive third-party apps was the removal plan. It has not been carried out yet.

Safest plan is to only put files on Google Drive with Duplicati, as then they should always work.


> Duplicati can only see its own files unless you get a full-access AuthID

Yes, did that. Checked the connection; it worked, and Duplicati was able to create the folder for me. So that’s not the problem.

> Safest plan is to only put files on Google Drive with Duplicati, as then they should always work.

Do you mean simply changing the target destination and running a backup, or do I misunderstand you?
Wouldn’t that result in Duplicati - correctly - stating that parts of the backup are missing?

EDIT: Tried that; as expected, the result was:
> Found 24 files that are missing from the remote storage, please run repair

So the question is - how do I get the backed-up .zip.aes files from where they currently are (Mega) to somewhere else (Google Drive in this case)?

I didn’t say it was. That was done with Duplicati, right? If the step below was not, then that’s the problem:

How was upload done? Duplicati can only see files it uploads unless you use full access login.

As described, most “somewhere else” should just work. Google Drive is a special harder case.

  1. Yes, that seems to be the cause of the problem. No, I simply took the files that were stored locally and drag-&-drop uploaded them to Google Drive manually.
  2. That also explains why there already was a “folder A” in Google Drive but Duplicati insisted on creating a new “folder A”.
  3. It doesn’t really explain why the number of missing files dropped from 24 to 11, but let’s disregard that one.

Maybe I didn’t look closely enough at your reply on my phone yesterday, but I didn’t see your link to the page when reading your answer for the first time. Anyway, I found the same link here:

That method worked, but as both you now and the post back then said, that method could be pulled by Google at any time.

That was mentioned in the other thread as well (see below), but unfortunately that method seems to be a bit above my level. I found another thread where someone had to deal with the same problem.

So, as the full-access granting seems to work at the moment (and as the Google Drive account is solely for backup purposes, no data apart from Duplicati’s will be there, so there’s no concern about Duplicati messing with other files), I’ll stick with the full-access workaround for now.

Thank you very much

It also exposes another Google Drive oddity, which is that duplicate names are fine.
This is rather unusual on computer systems, but to Google a name is only an attribute.

I can’t predict what Google will do. They certainly missed their original cutoff target.
Maybe someday they’ll provide a way to “give” files to some user such as Duplicati.
Their original plan involved a file picker, but I don’t know if it can do a folder of files.

Here I am once again - seems like the cutoff has happened by now.
The OAuth page does not offer full access anymore.

Once again, my goal is to upload an existing backup of multiple years to Google Drive.

I’ve read some of the threads here using Duplicati.CommandLine.BackendTool.exe but I can’t really get it to work.

I’ve created a simple test backup of a few KB and downloaded it, which resulted in a .dlist.zip, a .dblock.zip and a .dindex.zip file - so far so good.

Its documentation (or here) is, in my opinion, a bit lacking, so I tested around a bit.
But I only got to the point of:
Command failed: Backend not supported.

Are you aware of any more beginner-friendly documentation, or do you know off the top of your head how to do that?

I’m changing to “C:\Program Files\Duplicati 2”

cd "C:\Program Files\Duplicati 2"

and then attempt to use

.\Duplicati.CommandLine.BackendTool.exe PUT ggoogledrive://Testbackup?authid=myAuthID "D:\tmp\duplicati-20250320T221135Z.dlist.zip"

(Testbackup being the name of the newly created and freshly oAuth-ed backup location in GDrive; myAuthID being the newly oAuth-ed authid)

But the documentation and the examples I found online don’t tell whether to reference the .dlist.zip, .dblock.zip or .dindex.zip file, or all of them, or the folder they are in…

I’m honestly really lost here…

Google Drive Destination (ignore the wrong subtitle saying it’s Dropbox) says:

googledrive://<folder>/<subfolder>?authid=<authid>

Yours looks similar to my reading, except you somehow wrote ggoogledrive.

Fixing that should remove the Backend not supported. error.
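
With the extra “g” removed (folder name and AuthID still your placeholders), the command would read:

.\Duplicati.CommandLine.BackendTool.exe PUT "googledrive://Testbackup?authid=myAuthID" "D:\tmp\duplicati-20250320T221135Z.dlist.zip"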

Your second link is to https://duplicati.commandline.backendtool.xn--exe-jga/ somehow. The first is generic info. It’s a tool. It can be used in many ways. Aside from not telling you what to do in this very new use, is there any problem?

One of my own complaints (which I have to make up for in directions) is where to obtain the URL. I usually suggest GUI Export As Command-line. The full path of the command is also not always obvious, and assumes some knowledge of the installation folder. It might also vary depending on what was installed, e.g. 32 bit Windows programs use a different (x86) folder.

Those are your backup files and are portable to other storage. Move all of the files.

With three, you can do them by hand. You probably have one ready after a typo fix.
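
If you did feel like scripting it, a minimal PowerShell sketch could loop the PUTs (reusing your placeholder folder and AuthID, and assuming the downloaded files sit in D:\tmp):

cd "C:\Program Files\Duplicati 2"
# Sketch: PUT each backup file (dlist/dblock/dindex) one at a time.
# "Testbackup" and "myAuthID" are the placeholders from the earlier command;
# adjust the local path to wherever the downloaded files actually are.
Get-ChildItem "D:\tmp\duplicati-*" | ForEach-Object {
    .\Duplicati.CommandLine.BackendTool.exe PUT "googledrive://Testbackup?authid=myAuthID" $_.FullName
}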

Assuming you don’t want to try to script something, there’s a new tool to try, but it’s only in Canary (test release) so far. I have it here from a .zip install, on the side…

C:\Duplicati\duplicati-2.1.0.111_canary_2025-03-15-win-x64-gui>Duplicati.CommandLine.SyncTool --help
Description:
  Remote Synchronization Tool

  This tool synchronizes two remote backends. The tool assumes that the intent is
  to have the destination match the source.

  If the destination has files that are not in the source, they will be deleted
  (or renamed if the retention option is set).

  If the destination has files that are also present in the source, but the files
  differ in size, or if the source files have a newer (more recent) timestamp,
  the destination files will be overwritten by the source files. Given that some
  backends do not allow for metadata or timestamp modification, and that the tool
  is run after backup, the destination files should always have a timestamp that
  is newer (or the same if run promptly) compared to the source files.

  If the force option is set, the destination will be overwritten by the source,
  regardless of the state of the files. It will also skip the initial comparison,
  and delete (or rename) all files in the destination.

  If the verify option is set, the files will be downloaded and compared after
  uploading to ensure that the files are correct. Files that already exist in the
  destination will be verified before being overwritten (if they seemingly match).


Usage:
  Duplicati.CommandLine.SyncTool <backend_src> <backend_dst> [options]

Arguments:
  <backend_src>  The source backend string
  <backend_dst>  The destination backend string

Options:
  -y, --confirm, --yes               Automatically confirm the operation [default: False]
  -d, --dry-run                      Do not actually write or delete files. If not set here, the global options will be checked [default: False]
  --dst-options <dst-options>        Options for the destination backend. Each option is a key-value pair separated by an equals sign, e.g. --dst-options key1=value1 key2=value2 [default: empty] []
  -f, --force                        Force the synchronization [default: False]
  --global-options <global-options>  Global options all backends. May be overridden by backend specific options (src-options, dst-options). Each option is a key-value pair separated by an equals sign, e.g.
                                     --global-options key1=value1 key2=value2 [default: empty] []
  --log-file <log-file>              The log file to write to. If not set here, global options will be checked [default: ""] []
  --log-level <log-level>            The log level to use. If not set here, global options will be checked [default: Information]
  --parse-arguments-only             Only parse the arguments and then exit [default: False]
  --progress                         Print progress to STDOUT [default: False]
  --retention                        Toggles whether to keep old files. Any deletes will be renames instead [default: False]
  --retry <retry>                    Number of times to retry on errors [default: 3]
  --src-options <src-options>        Options for the source backend. Each option is a key-value pair separated by an equals sign, e.g. --src-options key1=value1 key2=value2 [default: empty] []
  --verify-contents                  Verify the contents of the files to decide whether the pre-existing destination files should be overwritten [default: False]
  --verify-get-after-put             Verify the files after uploading them to ensure that they were uploaded correctly [default: False]
  --version                          Show version information
  -?, -h, --help                     Show help and usage information

You could try that with your three-file backup. The URL for a local folder can be the Windows path (quote if it contains spaces), assuming you have a folder ready to go. Possibly you have other things in D:\tmp? If so, create a new folder with three files.
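
For instance, a cautious first pass could use the dry-run and progress options from the help above (folder name and AuthID below are placeholders):

Duplicati.CommandLine.SyncTool "D:\tmp\testbackup" "googledrive://Testbackup?authid=myAuthID" --dry-run --progress

Once the planned actions look right, run it again without --dry-run.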

EDIT:

https://github.com/duplicati/duplicati/releases usually has a Canary near the top. Current latest one is v2.1.0.111_canary_2025-03-15, so maybe you can get duplicati-2.1.0.111_canary_2025-03-15-win-x64-gui.zip to unzip somewhere.


Thank you so much! I was able to get my small test working.

As the real backup will be hundreds of files, I’ve created a dynamic script for that.

Should be redundant once the new feature ships, but I read that part of your message too late :slight_smile:

Once done, I’ll share the script here and edit my message

Here is the mentioned script, but unfortunately it failed after 590 files, as Google cut me off: Unofficial-Duplicati_Backup_Migration Tool · GitHub


Well, my script was working… until… Google Drive cut my connection after 591 files…

Anyway, if anyone is interested in the script: Unofficial-Duplicati_Backup_Migration Tool · GitHub

Seems like I’ll have a look at the canary feature.
Hopefully that does not have that limitation.

Uploading (590 / 860): duplicati-i565aa5991e924c83a3b0701664720718.dindex.zip.aes...
Uploading (591 / 860): duplicati-i56651e9f8c9047f1853b0fa61bf17a5f.dindex.zip.aes...
Command failed: The remote server returned an error: (403) Forbidden.
System.Net.WebException: The remote server returned an error: (403) Forbidden.
   at Duplicati.Library.Utility.AsyncHttpRequest.AsyncWrapper.GetResponseOrStream()
   at Duplicati.Library.Utility.AsyncHttpRequest.GetResponse()
   at Duplicati.Library.JSONWebHelper.GetResponse(AsyncHttpRequest req, Object requestdata)
   at Duplicati.Library.JSONWebHelper.ReadJSONResponse[T](AsyncHttpRequest req, Object requestdata)
   at Duplicati.Library.JSONWebHelper.GetJSONData[T](String url, Action`1 setup, Action`1 setupbodyreq)
   at Duplicati.Library.Backend.GoogleDrive.GoogleDrive.ListFolder(String parentfolder, Nullable`1 onlyFolders, String name)+MoveNext()
   at System.Collections.Generic.LargeArrayBuilder`1.AddRange(IEnumerable`1 items)
   at System.Collections.Generic.EnumerableHelpers.ToArray[T](IEnumerable`1 source)
   at Duplicati.Library.Backend.GoogleDrive.GoogleDrive.GetFolderIdAsync(String path, Boolean autocreate, CancellationToken cancelToken)
   at Duplicati.Library.Backend.GoogleDrive.GoogleDrive.GetCurrentFolderIdAsync(CancellationToken cancelToken)
   at Duplicati.Library.Utility.Utility.Await[T](Task`1 task)
   at Duplicati.Library.Backend.GoogleDrive.GoogleDrive.List()+MoveNext()
   at Duplicati.Library.Backend.GoogleDrive.GoogleDrive.PutAsync(String remotename, Stream stream, CancellationToken cancelToken)
   at Duplicati.Library.Backend.GoogleDrive.GoogleDrive.PutAsync(String remotename, String filename, CancellationToken cancelToken)
   at Duplicati.Library.Utility.Utility.Await(Task task)
   at Duplicati.CommandLine.BackendTool.Program.Main(String[] _args)
   at Duplicati.Library.Utility.Utility.Await(Task task)
   at Duplicati.CommandLine.BackendTool.Program.Main(String[] _args)

It won’t overcome a Google rejection, but maybe it’s less likely to provoke one…

BTW there’s a 750 GB daily upload limit, but I’m not sure if you hit it in that error.
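
If it’s throttling rather than the daily cap, the SyncTool’s --retry option (default 3, per the help above) might ride out brief rejections; a sketch with placeholders:

Duplicati.CommandLine.SyncTool "C:\local_backup_copy" "googledrive://<folder>?authid=<authid>" --retry 10 --progress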


It did work :slight_smile:
So thank you very much again.

Once again, I don’t like the lack of examples and explanations in the documentation, but as it was really that easy, I managed to figure it out.
Synced the local folder and the Google Drive remote target.

# Sync the local copy of the old backup to the new Google Drive destination
$SyncTool_path = "C:\duplicati-2.1.0.111_canary_2025-03-15-win-x64-gui\Duplicati.CommandLine.SyncTool.exe"
$backend_src   = "C:\folder_of_old_backup"
$backend_dst   = "googledrive://Testbackup_new?authid=44444444444444444%3Aj3s44Y-O6.7-Di4h-l942"

& $SyncTool_path $backend_src $backend_dst

Although it worked in the end, I got this error asking for a repair when attempting a backup to the remote target once the sync was done.

Details below - is this to be expected or might it be helpful to create a separate thread about the errors I encountered while using the new SyncTool?

Duplicati.Library.Interface.RemoteListVerificationException: Found 860 remote files that are not recorded in local storage, please run repair
   at Duplicati.Library.Main.Operation.FilelistProcessor.VerifyRemoteList(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, IEnumerable`1 protectedFiles, Boolean logErrors)
   at Duplicati.Library.Main.Operation.BackupHandler.PreBackupVerify(String backendurl, Options options, BackupResults result)
   at Duplicati.Library.Main.Operation.BackupHandler.RunAsync(String[] sources, IFilter filter, CancellationToken token)
   at CoCoL.ChannelExtensions.WaitForTaskOrThrow(Task task)
   at Duplicati.Library.Main.Operation.BackupHandler.Run(String[] sources, IFilter filter, CancellationToken token)
   at Duplicati.Library.Main.Controller.<>c__DisplayClass17_0.<Backup>b__0(BackupResults result)
   at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)
   at Duplicati.Library.Main.Controller.Backup(String[] inputsources, IFilter filter)
   at Duplicati.Server.Runner.Run(IRunnerData data, Boolean fromQueue)

Had to run a repair, which in turn ran successfully but threw 2 warnings and one error.

Warnings:

2025-03-21 23:45:56 +01 - [Warning-Duplicati.Library.Main.Database.LocalRecreateDatabase-MissingVolumesDetected]: Found 1 missing volumes; attempting to replace blocks from existing volumes
2025-03-21 23:46:05 +01 - [Warning-Duplicati.Library.Main.Database.LocalRecreateDatabase-MissingVolumesDetected]: Found 1 missing volumes; attempting to replace blocks from existing volumes

Errors:

2025-03-21 23:41:56 +01 - [Error-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-MissingFileDetected]: Remote file referenced as duplicati-b6a0e9b27b2a54dc6b8669e31cdc92ebe.dblock.zip.aes by duplicati-i916d6d6e657f49fc86f0a65ab15e8e91.dindex.zip.aes, but not found in list, registering a missing remote file
Full log:
{
  "MainOperation": "Repair",
  "RecreateDatabaseResults": {
    "MainOperation": "Repair",
    "ParsedResult": "Success",
    "Interrupted": false,
    "Version": "2.1.0.5 (2.1.0.5_stable_2025-03-04)",
    "EndTime": "2025-03-21T22:46:06.502576Z",
    "BeginTime": "2025-03-21T22:38:34.4637925Z",
    "Duration": "00:07:32.0387835",
    "MessagesActualLength": 0,
    "WarningsActualLength": 0,
    "ErrorsActualLength": 0,
    "Messages": null,
    "Warnings": null,
    "Errors": null,
    "BackendStatistics": {
      "RemoteCalls": 438,
      "BytesUploaded": 0,
      "BytesDownloaded": 22475425,
      "FilesUploaded": 0,
      "FilesDownloaded": 437,
      "FilesDeleted": 0,
      "FoldersCreated": 0,
      "RetryAttempts": 0,
      "UnknownFileSize": 0,
      "UnknownFileCount": 0,
      "KnownFileCount": 0,
      "KnownFileSize": 0,
      "LastBackupDate": "0001-01-01T00:00:00",
      "BackupListCount": 0,
      "TotalQuotaSpace": 0,
      "FreeQuotaSpace": 0,
      "AssignedQuotaSpace": 0,
      "ReportedQuotaError": false,
      "ReportedQuotaWarning": false,
      "MainOperation": "Repair",
      "ParsedResult": "Success",
      "Interrupted": false,
      "Version": "2.1.0.5 (2.1.0.5_stable_2025-03-04)",
      "EndTime": "0001-01-01T00:00:00",
      "BeginTime": "2025-03-21T22:38:34.4376843Z",
      "Duration": "00:00:00",
      "MessagesActualLength": 0,
      "WarningsActualLength": 0,
      "ErrorsActualLength": 0,
      "Messages": null,
      "Warnings": null,
      "Errors": null
    }
  },
  "ParsedResult": "Error",
  "Interrupted": false,
  "Version": "2.1.0.5 (2.1.0.5_stable_2025-03-04)",
  "EndTime": "2025-03-21T22:46:06.5482299Z",
  "BeginTime": "2025-03-21T22:38:34.4376749Z",
  "Duration": "00:07:32.1105550",
  "MessagesActualLength": 882,
  "WarningsActualLength": 2,
  "ErrorsActualLength": 1,
  "Messages": [
    "2025-03-21 23:38:34 +01 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: Die Operation Repair wurde gestartet",
    "2025-03-21 23:38:34 +01 - [Information-Duplicati.Library.Main.Operation.RepairHandler-RenamingDatabase]: Renaming existing db from C:\\Users\\neele\\AppData\\Local\\Duplicati\\EVTSTWEDDR.sqlite to C:\\Users\\neele\\AppData\\Local\\Duplicati\\EVTSTWEDDR.backup",
    "2025-03-21 23:38:34 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  ()",
    "2025-03-21 23:38:40 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (860 Bytes)",
    "2025-03-21 23:38:45 +01 - [Information-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-RebuildStarted]: Rebuild database started, downloading 13 filelists",
    "2025-03-21 23:38:45 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-20230924T074643Z.dlist.zip.aes (545,64 KB)",
    "2025-03-21 23:38:46 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-20230924T074643Z.dlist.zip.aes (545,64 KB)",
    "2025-03-21 23:38:46 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-20240326T153334Z.dlist.zip.aes (497,59 KB)",
    "2025-03-21 23:38:48 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-20240326T153334Z.dlist.zip.aes (497,59 KB)",
    "2025-03-21 23:38:48 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-20240410T194853Z.dlist.zip.aes (498,37 KB)",
    "2025-03-21 23:38:49 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-20240410T194853Z.dlist.zip.aes (498,37 KB)",
    "2025-03-21 23:38:49 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-20240504T142708Z.dlist.zip.aes (499,00 KB)",
    "2025-03-21 23:38:50 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-20240504T142708Z.dlist.zip.aes (499,00 KB)",
    "2025-03-21 23:38:50 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-20240611T125148Z.dlist.zip.aes (501,53 KB)",
    "2025-03-21 23:38:51 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-20240611T125148Z.dlist.zip.aes (501,53 KB)",
    "2025-03-21 23:38:51 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-20240724T152950Z.dlist.zip.aes (504,01 KB)",
    "2025-03-21 23:38:52 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-20240724T152950Z.dlist.zip.aes (504,01 KB)",
    "2025-03-21 23:38:52 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-20240828T115517Z.dlist.zip.aes (516,47 KB)",
    "2025-03-21 23:38:53 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-20240828T115517Z.dlist.zip.aes (516,47 KB)",
    "2025-03-21 23:38:53 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-20241001T132718Z.dlist.zip.aes (517,00 KB)"
  ],
  "Warnings": [
    "2025-03-21 23:45:56 +01 - [Warning-Duplicati.Library.Main.Database.LocalRecreateDatabase-MissingVolumesDetected]: Found 1 missing volumes; attempting to replace blocks from existing volumes",
    "2025-03-21 23:46:05 +01 - [Warning-Duplicati.Library.Main.Database.LocalRecreateDatabase-MissingVolumesDetected]: Found 1 missing volumes; attempting to replace blocks from existing volumes"
  ],
  "Errors": [
    "2025-03-21 23:41:56 +01 - [Error-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-MissingFileDetected]: Remote file referenced as duplicati-b6a0e9b27b2a54dc6b8669e31cdc92ebe.dblock.zip.aes by duplicati-i916d6d6e657f49fc86f0a65ab15e8e91.dindex.zip.aes, but not found in list, registering a missing remote file"
  ],
  "BackendStatistics": {
    "RemoteCalls": 438,
    "BytesUploaded": 0,
    "BytesDownloaded": 22475425,
    "FilesUploaded": 0,
    "FilesDownloaded": 437,
    "FilesDeleted": 0,
    "FoldersCreated": 0,
    "RetryAttempts": 0,
    "UnknownFileSize": 0,
    "UnknownFileCount": 0,
    "KnownFileCount": 0,
    "KnownFileSize": 0,
    "LastBackupDate": "0001-01-01T00:00:00",
    "BackupListCount": 0,
    "TotalQuotaSpace": 0,
    "FreeQuotaSpace": 0,
    "AssignedQuotaSpace": 0,
    "ReportedQuotaError": false,
    "ReportedQuotaWarning": false,
    "MainOperation": "Repair",
    "ParsedResult": "Success",
    "Interrupted": false,
    "Version": "2.1.0.5 (2.1.0.5_stable_2025-03-04)",
    "EndTime": "0001-01-01T00:00:00",
    "BeginTime": "2025-03-21T22:38:34.4376843Z",
    "Duration": "00:00:00",
    "MessagesActualLength": 0,
    "WarningsActualLength": 0,
    "ErrorsActualLength": 0,
    "Messages": null,
    "Warnings": null,
    "Errors": null
  }
}

Added a new tool for offsite backup was the announcement. At least for this use, more docs sound expected. Help text is often concise (try some). The manual can say more.

So there’s an example of plans for directions on how to use a tool in a certain way, which makes lots of sense when a tool is inspired by a specific usage need.

It sounds like you encountered those, so I wouldn’t immediately blame sync; at a higher level, though, it would be a migration issue to a different storage. Migrating Duplicati to a new machine is covered in the manual. Maybe the new docs will cover storage migration too?

Generally it’s easy: just move all the files and change the job’s Destination setting. However, if you’ve been making new jobs, any new job will have no database describing the expected Destination contents, so it may be surprised to see the files.

If you’re not sure whether your database is the last working one (although I’m unsure whether it was working all that well), check your job logs for signs of history.

If it looks too empty, maybe it’s useless, so use the Database screen to run a Recreate (delete and repair), which maybe you already did, since you got warnings and errors talking about recreate problems. As for the error, it is probably a dindex file intending to index a non-existent (it should exist) dblock.

Sometimes dindex/dblock pairs break, due to either an extra dindex or a lost dblock. Recreate notices these; normal operations don’t. I’d just delete that dindex file. You have a local copy anyway (the one it uploaded from) in case it needs checking.

I’m guessing that the warnings are related to the error, as dindex implied dblock, possibly leading to an unsuccessful attempt to work around the lack of its blocks.

Regardless, the next step is probably to remove that unhelpful dindex, and Recreate.
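
If you’d rather do that from the command line than the Drive web UI, the BackendTool from earlier should manage it; a sketch, assuming DELETE takes the same URL-plus-filename arguments as PUT, and using the dindex name from your error:

cd "C:\Program Files\Duplicati 2"
.\Duplicati.CommandLine.BackendTool.exe DELETE "googledrive://Testbackup_new?authid=<authid>" duplicati-i916d6d6e657f49fc86f0a65ab15e8e91.dindex.zip.aes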


If you have feedback on the new SyncTool, please do create a new topic for it.
But it looks like the errors here are from the regular Duplicati run?