Remote backup task quit working : constraint failed UNIQUE constraint failed: Remotevolume.Name, Remotevolume.State

I have been running three Duplicati backup jobs nightly for six months. The destination is pCloud via WebDAV. They worked fine, with occasional exceptions when the destination was not accessible for whatever reason.

Since Sept. 9th one of them has stopped working, throwing the error "constraint failed UNIQUE constraint failed: Remotevolume.Name, Remotevolume.State". No modifications were made to the configuration.

Running a repair stops with the same error.

I found some threads here but no solution. How can I fix this without losing all remote data (keeping file versions) and without re-uploading all data, which takes 20+ hours?

I will of course provide more logs and detailed information if anybody is willing to help me with this. I have no clue how to fix it.

Duplicati Version: Duplicati - 2.0.6.3_beta_2021-06-17
OS: Win10 PRO 21H2

Log data:

code = Constraint (19), message = System.Data.SQLite.SQLiteException (0x800027AF): constraint failed
UNIQUE constraint failed: Remotevolume.Name, Remotevolume.State
   bei System.Data.SQLite.SQLite3.Reset(SQLiteStatement stmt)
   bei System.Data.SQLite.SQLite3.Step(SQLiteStatement stmt)
   bei System.Data.SQLite.SQLiteDataReader.NextResult()
   bei System.Data.SQLite.SQLiteDataReader..ctor(SQLiteCommand cmd, CommandBehavior behave)
   bei System.Data.SQLite.SQLiteCommand.ExecuteReader(CommandBehavior behavior)
   bei System.Data.SQLite.SQLiteCommand.ExecuteNonQuery(CommandBehavior behavior)
   bei Duplicati.Library.Main.Database.LocalDatabase.UpdateRemoteVolume(String name, RemoteVolumeState state, Int64 size, String hash, Boolean suppressCleanup, TimeSpan deleteGraceTime, IDbTransaction transaction)
   bei Duplicati.Library.Main.Operation.FilelistProcessor.RemoteListAnalysis(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, IEnumerable`1 protectedFiles)
   bei Duplicati.Library.Main.Operation.FilelistProcessor.VerifyRemoteList(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, IEnumerable`1 protectedFiles)
   bei Duplicati.Library.Main.Operation.BackupHandler.PreBackupVerify(BackendManager backend, String protectedfile)
   bei Duplicati.Library.Main.Operation.BackupHandler.<RunAsync>d__20.MoveNext()
--- Ende der Stapelüberwachung vom vorhergehenden Ort, an dem die Ausnahme ausgelöst wurde ---
   bei System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   bei CoCoL.ChannelExtensions.WaitForTaskOrThrow(Task task)
   bei Duplicati.Library.Main.Controller.<>c__DisplayClass14_0.<Backup>b__0(BackupResults result)
   bei Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)
   bei Duplicati.Library.Main.Controller.Backup(String[] inputsources, IFilter filter)
   bei Duplicati.Server.Runner.Run(IRunnerData data, Boolean fromQueue)

Hello

I presume that you are keen to restart your backup as soon as possible, so my advice would be to try deleting the last backup version, as that could remove the underlying problem. To do that, go to the Web UI, select the job, pick 'Command Line', choose the 'Delete' operation, and in the advanced options add the 'version' option with a value of '0'.

If you can save a copy of the job database (NOT Duplicati-server.sqlite) before attempting it, it could help in debugging what happened, whether it's a bug or 'normal' corruption (crash, hardware problem), and in seeing whether Duplicati could handle the problem more gracefully. If you are doing cheesy stuff with your backups, this remark does not apply.

Hello gpatel-fr,
thanks for your help. I saved a copy of the job database and tried to delete the last version, but ended up with the same error:

  Listing remote folder ...

code = Constraint (19), message = System.Data.SQLite.SQLiteException (0x800027AF): constraint failed
UNIQUE constraint failed: Remotevolume.Name, Remotevolume.State
   bei System.Data.SQLite.SQLite3.Reset(SQLiteStatement stmt)
   bei System.Data.SQLite.SQLite3.Step(SQLiteStatement stmt)
   bei System.Data.SQLite.SQLiteDataReader.NextResult()
   bei System.Data.SQLite.SQLiteDataReader..ctor(SQLiteCommand cmd, CommandBehavior behave)
   bei System.Data.SQLite.SQLiteCommand.ExecuteReader(CommandBehavior behavior)
   bei System.Data.SQLite.SQLiteCommand.ExecuteNonQuery(CommandBehavior behavior)
   bei Duplicati.Library.Main.Database.LocalDatabase.UpdateRemoteVolume(String name, RemoteVolumeState state, Int64 size, String hash, Boolean suppressCleanup, TimeSpan deleteGraceTime, IDbTransaction transaction)
   bei Duplicati.Library.Main.Operation.FilelistProcessor.RemoteListAnalysis(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, IEnumerable`1 protectedFiles)
   bei Duplicati.Library.Main.Operation.FilelistProcessor.VerifyRemoteList(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, IEnumerable`1 protectedFiles)
   bei Duplicati.Library.Main.Operation.DeleteHandler.DoRun(LocalDeleteDatabase db, IDbTransaction& transaction, Boolean hasVerifiedBackend, Boolean forceCompact, BackendManager sharedManager)
   bei Duplicati.Library.Main.Operation.DeleteHandler.Run()
   bei Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)
   bei Duplicati.Library.Main.Controller.Delete()
   bei Duplicati.CommandLine.Commands.Delete(TextWriter outwriter, Action`1 setup, List`1 args, Dictionary`2 options, IFilter filter)
   bei Duplicati.CommandLine.Program.ParseCommandLine(TextWriter outwriter, Action`1 setup, Boolean& verboseErrors, String[] args)
   bei Duplicati.CommandLine.Program.RunCommandLine(TextWriter outwriter, TextWriter errwriter, Action`1 setup, String[] args)
Return code: 100

It seems that your local database is in a bad state. Maybe try to save a copy, then recreate it?

Maybe too late now, but an Information level log might have clues. Profiling too, but it’s huge.
About → Show log → Live → Information sometimes says what it’s up to, before surprise fail.

“updates their state” seems to be where it finds a uniqueness issue, probably against current data.
Profiling log would show an UPDATE just before that, but you’d use a log-file and log-file-log-level.

Another approach would be to see if you can spot a duplicate name in the database (even the saved one).
DB Browser for SQLite can do that. If need be, I might be able to offer SQL, or just sort and stare…

If, for example, it had a duplicate Name with different States, then tried to set the same State, it'd error.
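A minimal SQLite sketch of that failure mode (hypothetical file name and a cut-down table, not Duplicati's actual schema or SQL):

CREATE TABLE Remotevolume (Name TEXT, State TEXT, UNIQUE (Name, State));
INSERT INTO Remotevolume VALUES ('duplicati-example.dlist.zip.aes', 'Uploading');
INSERT INTO Remotevolume VALUES ('duplicati-example.dlist.zip.aes', 'Temporary');
-- trying to give the second row the State the first row already has:
UPDATE Remotevolume SET State = 'Uploading'
 WHERE Name = 'duplicati-example.dlist.zip.aes' AND State = 'Temporary';
-- fails with: UNIQUE constraint failed: Remotevolume.Name, Remotevolume.State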

Seems worth a try. It would be hard to have one file name show up twice after recreate (if it works).

Started a delete & rebuild at 6 pm yesterday, after saving the AppData/Local/Duplicati folder. It finished with an error after 14 hours:

Recreated database has missing blocks and 1 broken filelists. 
Consider using "list-broken-files" and "purge-broken-files" to purge broken data from the remote store and the database.

Started list-broken-files with the result:

Completed!
0	: 11.09.2022 03:22:01	(15 match(es))
	E:\Privatdaten\Backup_PrivatDatenVonRyzenLogs\Backup_PrivatDatenVonRyzen_2022_09_09_ 847.log (1,08 KB)
	E:\Privatdaten\Backup_PrivatDatenVonRyzenLogs\Backup_PrivatDatenVonRyzen_2022_09_09_ 931.log (1,08 KB)
	E:\Privatdaten\Backup_PrivatDatenVonRyzenLogs\Backup_PrivatDatenVonRyzen_2022_09_09_1919.log (1,08 KB)
	E:\Privatdaten\thomas\PioProjects\RST2022_Testaufbau2\.pio\build\nodemcu-32s\idedata.json (28,95 KB)
	E:\Privatdaten\thomas\PioProjects\RST2022_Testaufbau2\.vscode\c_cpp_properties.json (53,71 KB)
	E:\Privatdaten\thomas\PioProjects\RST2022_Testaufbau2\.vscode\extensions.json (274 Bytes)
	E:\Privatdaten\thomas\PioProjects\RST2022_Testaufbau2\.vscode\launch.json (1,80 KB)
	E:\Privatdaten\thomas\Rhinoceros\Fenster_Einpressprofil\Fenster_Einpressprofil 001.3dm (142,35 KB)
	E:\Privatdaten\thomas\Rhinoceros\Fenster_Einpressprofil\Fenster_Einpressprofil.3dm (176,84 KB)
	E:\Privatdaten\thomas\Rhinoceros\Fenster_Einpressprofil\Fenster_Einpressprofil.3dmbak (139,64 KB)
	E:\Privatdaten\thomas\Rhinoceros\Fenster_Einpressprofil\Fenster_Einpressprofil.3mf (16,83 KB)
	E:\Privatdaten\thomas\Rhinoceros\Fenster_Einpressprofil\Fenster_Einpressprofil.stl (3,40 KB)
Return code: 0

OK, this is data created on Sep 9th, the day the problems started. Something must have gone wrong during that backup. Not nice that it leads to this error, but however. …

Started purge-broken-files with the result:

Completed!
  Uploading file (13,18 MB) ...
  Deleting file duplicati-20220911T012201Z.dlist.zip.aes ...
Return code: 0

Worked, so far so good, all should be in sync again.

Started Backup Task

Error while running Privatdaten
2022-09-17 13:11:02 +02 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-i5b13500e0c3347dcb3f23e30495e4f75.dindex.zip.aes

At this point I'm losing a bit of confidence in this really great program. Any suggestions?

Task Log and full log:

Time
Start 2022-09-17 12:19:21
End 2022-09-17 13:11:19
Duration 00:51:58
Source files
Examined 113807 (83.38 GB)
Opened 89843 (80.63 GB)
Added 197 (71.38 MB)
Modified 27 (11.09 MB)
Deleted 1
 Test phase 
 Compacting phase 
 Delete phase (old backup versions) 
 Warnings 2 
2022-09-17 12:20:26 +02 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingRemoteHash]: remote file duplicati-bfca3601c65d7412486eab2b8620a9812.dblock.zip.aes is listed as Verified with size 4096 but should be 24813, please verify the sha256 hash "nE+sklcKP2E10/H93nUmEao4mRacK228TG71znudLhQ="
2022-09-17 13:10:20 +02 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingRemoteHash]: remote file duplicati-bfca3601c65d7412486eab2b8620a9812.dblock.zip.aes is listed as Verified with size 4096 but should be 24813, please verify the sha256 hash "nE+sklcKP2E10/H93nUmEao4mRacK228TG71znudLhQ="
 Errors 1 
2022-09-17 13:11:02 +02 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-i5b13500e0c3347dcb3f23e30495e4f75.dindex.zip.aes
Complete log
{
  "DeletedFiles": 1,
  "DeletedFolders": 0,
  "ModifiedFiles": 27,
  "ExaminedFiles": 113807,
  "OpenedFiles": 89843,
  "AddedFiles": 197,
  "SizeOfModifiedFiles": 11624446,
  "SizeOfAddedFiles": 74843242,
  "SizeOfExaminedFiles": 89531491432,
  "SizeOfOpenedFiles": 86576528170,
  "NotProcessedFiles": 0,
  "AddedFolders": 11,
  "TooLargeFiles": 0,
  "FilesWithError": 0,
  "ModifiedFolders": 0,
  "ModifiedSymlinks": 0,
  "AddedSymlinks": 0,
  "DeletedSymlinks": 0,
  "PartialBackup": false,
  "Dryrun": false,
  "MainOperation": "Backup",
  "CompactResults": {
    "DeletedFileCount": 2,
    "DownloadedFileCount": 0,
    "UploadedFileCount": 0,
    "DeletedFileSize": 2829,
    "DownloadedFileSize": 0,
    "UploadedFileSize": 0,
    "Dryrun": false,
    "VacuumResults": null,
    "MainOperation": "Compact",
    "ParsedResult": "Success",
    "Version": "2.0.6.3 (2.0.6.3_beta_2021-06-17)",
    "EndTime": "2022-09-17T11:09:45.3727646Z",
    "BeginTime": "2022-09-17T11:09:37.5813042Z",
    "Duration": "00:00:07.7914604",
    "MessagesActualLength": 0,
    "WarningsActualLength": 0,
    "ErrorsActualLength": 0,
    "Messages": null,
    "Warnings": null,
    "Errors": null,
    "BackendStatistics": {
      "RemoteCalls": 26,
      "BytesUploaded": 88059390,
      "BytesDownloaded": 64067351,
      "FilesUploaded": 6,
      "FilesDownloaded": 3,
      "FilesDeleted": 9,
      "FoldersCreated": 0,
      "RetryAttempts": 5,
      "UnknownFileSize": 0,
      "UnknownFileCount": 0,
      "KnownFileCount": 2531,
      "KnownFileSize": 63844284109,
      "LastBackupDate": "2022-09-17T12:19:21+02:00",
      "BackupListCount": 12,
      "TotalQuotaSpace": 0,
      "FreeQuotaSpace": 0,
      "AssignedQuotaSpace": -1,
      "ReportedQuotaError": false,
      "ReportedQuotaWarning": false,
      "MainOperation": "Backup",
      "ParsedResult": "Success",
      "Version": "2.0.6.3 (2.0.6.3_beta_2021-06-17)",
      "EndTime": "0001-01-01T00:00:00",
      "BeginTime": "2022-09-17T10:19:21.8485653Z",
      "Duration": "00:00:00",
      "MessagesActualLength": 0,
      "WarningsActualLength": 0,
      "ErrorsActualLength": 0,
      "Messages": null,
      "Warnings": null,
      "Errors": null
    }
  },
  "VacuumResults": null,
  "DeleteResults": {
    "DeletedSetsActualLength": 7,
    "DeletedSets": [
      {
        "Item1": 7,
        "Item2": "2022-09-04T03:22:00+02:00"
      },
      {
        "Item1": 6,
        "Item2": "2022-09-05T03:22:00+02:00"
      },
      {
        "Item1": 5,
        "Item2": "2022-09-06T03:22:00+02:00"
      },
      {
        "Item1": 4,
        "Item2": "2022-09-07T03:22:00+02:00"
      },
      {
        "Item1": 3,
        "Item2": "2022-09-08T03:22:00+02:00"
      },
      {
        "Item1": 2,
        "Item2": "2022-09-09T03:22:00+02:00"
      },
      {
        "Item1": 11,
        "Item2": "2022-08-19T03:22:00+02:00"
      }
    ],
    "Dryrun": false,
    "MainOperation": "Delete",
    "CompactResults": {
      "DeletedFileCount": 2,
      "DownloadedFileCount": 0,
      "UploadedFileCount": 0,
      "DeletedFileSize": 2829,
      "DownloadedFileSize": 0,
      "UploadedFileSize": 0,
      "Dryrun": false,
      "VacuumResults": null,
      "MainOperation": "Compact",
      "ParsedResult": "Success",
      "Version": "2.0.6.3 (2.0.6.3_beta_2021-06-17)",
      "EndTime": "2022-09-17T11:09:45.3727646Z",
      "BeginTime": "2022-09-17T11:09:37.5813042Z",
      "Duration": "00:00:07.7914604",
      "MessagesActualLength": 0,
      "WarningsActualLength": 0,
      "ErrorsActualLength": 0,
      "Messages": null,
      "Warnings": null,
      "Errors": null,
      "BackendStatistics": {
        "RemoteCalls": 26,
        "BytesUploaded": 88059390,
        "BytesDownloaded": 64067351,
        "FilesUploaded": 6,
        "FilesDownloaded": 3,
        "FilesDeleted": 9,
        "FoldersCreated": 0,
        "RetryAttempts": 5,
        "UnknownFileSize": 0,
        "UnknownFileCount": 0,
        "KnownFileCount": 2531,
        "KnownFileSize": 63844284109,
        "LastBackupDate": "2022-09-17T12:19:21+02:00",
        "BackupListCount": 12,
        "TotalQuotaSpace": 0,
        "FreeQuotaSpace": 0,
        "AssignedQuotaSpace": -1,
        "ReportedQuotaError": false,
        "ReportedQuotaWarning": false,
        "MainOperation": "Backup",
        "ParsedResult": "Success",
        "Version": "2.0.6.3 (2.0.6.3_beta_2021-06-17)",
        "EndTime": "0001-01-01T00:00:00",
        "BeginTime": "2022-09-17T10:19:21.8485653Z",
        "Duration": "00:00:00",
        "MessagesActualLength": 0,
        "WarningsActualLength": 0,
        "ErrorsActualLength": 0,
        "Messages": null,
        "Warnings": null,
        "Errors": null
      }
    },
    "ParsedResult": "Success",
    "Version": "2.0.6.3 (2.0.6.3_beta_2021-06-17)",
    "EndTime": "2022-09-17T11:09:45.3727646Z",
    "BeginTime": "2022-09-17T11:09:23.8981129Z",
    "Duration": "00:00:21.4746517",
    "MessagesActualLength": 0,
    "WarningsActualLength": 0,
    "ErrorsActualLength": 0,
    "Messages": null,
    "Warnings": null,
    "Errors": null,
    "BackendStatistics": {
      "RemoteCalls": 26,
      "BytesUploaded": 88059390,
      "BytesDownloaded": 64067351,
      "FilesUploaded": 6,
      "FilesDownloaded": 3,
      "FilesDeleted": 9,
      "FoldersCreated": 0,
      "RetryAttempts": 5,
      "UnknownFileSize": 0,
      "UnknownFileCount": 0,
      "KnownFileCount": 2531,
      "KnownFileSize": 63844284109,
      "LastBackupDate": "2022-09-17T12:19:21+02:00",
      "BackupListCount": 12,
      "TotalQuotaSpace": 0,
      "FreeQuotaSpace": 0,
      "AssignedQuotaSpace": -1,
      "ReportedQuotaError": false,
      "ReportedQuotaWarning": false,
      "MainOperation": "Backup",
      "ParsedResult": "Success",
      "Version": "2.0.6.3 (2.0.6.3_beta_2021-06-17)",
      "EndTime": "0001-01-01T00:00:00",
      "BeginTime": "2022-09-17T10:19:21.8485653Z",
      "Duration": "00:00:00",
      "MessagesActualLength": 0,
      "WarningsActualLength": 0,
      "ErrorsActualLength": 0,
      "Messages": null,
      "Warnings": null,
      "Errors": null
    }
  },
  "RepairResults": null,
  "TestResults": {
    "MainOperation": "Test",
    "VerificationsActualLength": 4,
    "Verifications": [
      {
        "Key": "duplicati-i5b13500e0c3347dcb3f23e30495e4f75.dindex.zip.aes",
        "Value": [
          {
            "Key": "Error",
            "Value": "Invalid header marker"
          }
        ]
      },
      {
        "Key": "duplicati-20220227T012914Z.dlist.zip.aes",
        "Value": []
      },
      {
        "Key": "duplicati-iac00f557df8d4f8f85e8674c5a265f51.dindex.zip.aes",
        "Value": []
      },
      {
        "Key": "duplicati-b01650f97bcfa471d87fcd24ab6f928b1.dblock.zip.aes",
        "Value": []
      }
    ],
    "ParsedResult": "Success",
    "Version": "2.0.6.3 (2.0.6.3_beta_2021-06-17)",
    "EndTime": "2022-09-17T11:11:19.6095205Z",
    "BeginTime": "2022-09-17T11:10:20.1618168Z",
    "Duration": "00:00:59.4477037",
    "MessagesActualLength": 0,
    "WarningsActualLength": 0,
    "ErrorsActualLength": 0,
    "Messages": null,
    "Warnings": null,
    "Errors": null,
    "BackendStatistics": {
      "RemoteCalls": 26,
      "BytesUploaded": 88059390,
      "BytesDownloaded": 64067351,
      "FilesUploaded": 6,
      "FilesDownloaded": 3,
      "FilesDeleted": 9,
      "FoldersCreated": 0,
      "RetryAttempts": 5,
      "UnknownFileSize": 0,
      "UnknownFileCount": 0,
      "KnownFileCount": 2531,
      "KnownFileSize": 63844284109,
      "LastBackupDate": "2022-09-17T12:19:21+02:00",
      "BackupListCount": 12,
      "TotalQuotaSpace": 0,
      "FreeQuotaSpace": 0,
      "AssignedQuotaSpace": -1,
      "ReportedQuotaError": false,
      "ReportedQuotaWarning": false,
      "MainOperation": "Backup",
      "ParsedResult": "Success",
      "Version": "2.0.6.3 (2.0.6.3_beta_2021-06-17)",
      "EndTime": "0001-01-01T00:00:00",
      "BeginTime": "2022-09-17T10:19:21.8485653Z",
      "Duration": "00:00:00",
      "MessagesActualLength": 0,
      "WarningsActualLength": 0,
      "ErrorsActualLength": 0,
      "Messages": null,
      "Warnings": null,
      "Errors": null
    }
  },
  "ParsedResult": "Error",
  "Version": "2.0.6.3 (2.0.6.3_beta_2021-06-17)",
  "EndTime": "2022-09-17T11:11:19.6251332Z",
  "BeginTime": "2022-09-17T10:19:21.8485653Z",
  "Duration": "00:51:57.7765679",
  "MessagesActualLength": 66,
  "WarningsActualLength": 2,
  "ErrorsActualLength": 1,
  "Messages": [
    "2022-09-17 12:19:21 +02 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: Die Operation Backup wurde gestartet",
    "2022-09-17 12:19:43 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  ()",
    "2022-09-17 12:19:44 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Retrying:  ()",
    "2022-09-17 12:19:54 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  ()",
    "2022-09-17 12:20:26 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (2,47 KB)",
    "2022-09-17 12:20:26 +02 - [Information-Duplicati.Library.Main.Operation.Backup.RecreateMissingIndexFiles-RecreateMissingIndexFile]: Re-creating missing index file for duplicati-b2a297cd9c6624c7891eb1aa8d731ce0e.dblock.zip.aes",
    "2022-09-17 12:20:27 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-ibd7ac9340ae5429eb03f2e484b9ba345.dindex.zip.aes (781 Bytes)",
    "2022-09-17 12:20:27 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-ibd7ac9340ae5429eb03f2e484b9ba345.dindex.zip.aes (781 Bytes)",
    "2022-09-17 13:08:21 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b1a152d09aad54606a51091cdea438f4a.dblock.zip.aes (49,97 MB)",
    "2022-09-17 13:08:28 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b7f2390bfa6db4877b2cb5c2c497c31d7.dblock.zip.aes (20,75 MB)",
    "2022-09-17 13:08:57 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b7f2390bfa6db4877b2cb5c2c497c31d7.dblock.zip.aes (20,75 MB)",
    "2022-09-17 13:08:58 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-ia4f18baa1e3e422796ba25d28b3585e7.dindex.zip.aes (14,58 KB)",
    "2022-09-17 13:08:58 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-ia4f18baa1e3e422796ba25d28b3585e7.dindex.zip.aes (14,58 KB)",
    "2022-09-17 13:09:12 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b1a152d09aad54606a51091cdea438f4a.dblock.zip.aes (49,97 MB)",
    "2022-09-17 13:09:12 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-ia3fcfd67021349f998a855732cfcd620.dindex.zip.aes (56,48 KB)",
    "2022-09-17 13:09:13 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-ia3fcfd67021349f998a855732cfcd620.dindex.zip.aes (56,48 KB)",
    "2022-09-17 13:09:13 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-20220917T101921Z.dlist.zip.aes (13,19 MB)",
    "2022-09-17 13:09:23 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-20220917T101921Z.dlist.zip.aes (13,19 MB)",
    "2022-09-17 13:09:23 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-StartCheck]: Start checking if backups can be removed",
    "2022-09-17 13:09:23 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-FramesAndIntervals]: Time frames and intervals pairs: 7.00:00:00 / 1.00:00:00, 28.00:00:00 / 7.00:00:00, 365.00:00:00 / 31.00:00:00"
  ],
  "Warnings": [
    "2022-09-17 12:20:26 +02 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingRemoteHash]: remote file duplicati-bfca3601c65d7412486eab2b8620a9812.dblock.zip.aes is listed as Verified with size 4096 but should be 24813, please verify the sha256 hash \"nE+sklcKP2E10/H93nUmEao4mRacK228TG71znudLhQ=\"",
    "2022-09-17 13:10:20 +02 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingRemoteHash]: remote file duplicati-bfca3601c65d7412486eab2b8620a9812.dblock.zip.aes is listed as Verified with size 4096 but should be 24813, please verify the sha256 hash \"nE+sklcKP2E10/H93nUmEao4mRacK228TG71znudLhQ=\""
  ],
  "Errors": [
    "2022-09-17 13:11:02 +02 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-i5b13500e0c3347dcb3f23e30495e4f75.dindex.zip.aes"
  ],
  "BackendStatistics": {
    "RemoteCalls": 26,
    "BytesUploaded": 88059390,
    "BytesDownloaded": 64067351,
    "FilesUploaded": 6,
    "FilesDownloaded": 3,
    "FilesDeleted": 9,
    "FoldersCreated": 0,
    "RetryAttempts": 5,
    "UnknownFileSize": 0,
    "UnknownFileCount": 0,
    "KnownFileCount": 2531,
    "KnownFileSize": 63844284109,
    "LastBackupDate": "2022-09-17T12:19:21+02:00",
    "BackupListCount": 12,
    "TotalQuotaSpace": 0,
    "FreeQuotaSpace": 0,
    "AssignedQuotaSpace": -1,
    "ReportedQuotaError": false,
    "ReportedQuotaWarning": false,
    "MainOperation": "Backup",
    "ParsedResult": "Success",
    "Version": "2.0.6.3 (2.0.6.3_beta_2021-06-17)",
    "EndTime": "0001-01-01T00:00:00",
    "BeginTime": "2022-09-17T10:19:21.8485653Z",
    "Duration": "00:00:00",
    "MessagesActualLength": 0,
    "WarningsActualLength": 0,
    "ErrorsActualLength": 0,
    "Messages": null,
    "Warnings": null,
    "Errors": null
  }
}

It's common for a problem to have its roots in, for example, the backup just before this one having had some failures. Looking at About → Show log → Stored might find something failing on a backup before it "quit working".
If your backups are on a known schedule, you can also put your old DB back in just to check job logs. Failures (unlike warnings) typically log to the other log, but you can still look for where the gaps are…

2022-09-17 12:20:26 +02 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingRemoteHash]: remote file duplicati-bfca3601c65d7412486eab2b8620a9812.dblock.zip.aes is listed as Verified with size 4096 but should be 24813, please verify the sha256 hash “nE+sklcKP2E10/H93nUmEao4mRacK228TG71znudLhQ=”

says that file is mysteriously shorter than expected, and also an oddly even binary size, which typically suggests a filesystem problem, or sometimes a transmission problem. I think SMB can use even sizes. Assuming this is still pCloud and WebDAV, I’m not sure how a 4KiB file could happen. More tests later.

Usually such size errors get noticed by the original database, which knows what the size is supposed to be, however the recreated database (that's what you're on, right?) won't have the old record, so this is odd.

You can determine rather positively whether the file is still good by downloading it, however that's done. Seeing it still at 4096 bytes would be suspicious, but if AES Crypt can't decrypt it, that means it's bad now. Duplicati's bundled SharpAESCrypt.exe can do the same decryption test if you'd rather use the CLI.

If a file is truly useless now, you can delete it and do the list and purge of broken files again to clean up.

2022-09-17 13:11:02 +02 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-i5b13500e0c3347dcb3f23e30495e4f75.dindex.zip.aes

That needs more of an error message. The one-line summary is just a summary, but the details below it are cut off.
About → Show log → Live → Error might be the easiest way to see details. Possibly another bad file…

Do you have any way to see file dates? Maybe the dblock and dindex file are a pair. You can also try to decrypt the dindex. If it will decrypt (I’m not counting on it), its .zip file will have its dblock name inside.

A dindex file is the index to a dblock file. They should be paired. If you lose a dblock, you lose blocks of source files that were put there. If you lose a dindex, DB recreate has to do a longer download to open dblock files until it finds the blocks it needs. This can be seen in progress bar moving through 90-100%, with details available in a verbose log, e.g. About → Show log → Live → Verbose if one wants to watch.

The other way to figure out if these two files are a dblock/dindex pair is by browsing in the old database, following directions I’ll detail if you want to look. Relevant tables are Remotevolume and IndexBlockLink.
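If you want to try that, a query along these lines in DB Browser would list the pairing (just a sketch, assuming IndexBlockLink holds IndexVolumeID and BlockVolumeID values that point at Remotevolume.ID):

-- which dblock does this dindex describe?
SELECT idx.Name AS IndexFile, blk.Name AS BlockFile
FROM IndexBlockLink
JOIN Remotevolume idx ON idx.ID = IndexBlockLink.IndexVolumeID
JOIN Remotevolume blk ON blk.ID = IndexBlockLink.BlockVolumeID
WHERE idx.Name LIKE 'duplicati-i5b13500e%';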

Another option is to Create bug report and post a link for someone else to look in the sanitized database. Doing a file time check on the mentioned files would be useful. If they're older, you can temporarily copy the older database back in so that's the one that gives the report. Put the current one back in for further work though.

Thank you for spending your time helping me with this.

I downloaded the file duplicati-bfca3601c65d7412486eab2b8620a9812.dblock.zip.aes.

  • File size is 4096 bytes (reported by Windows Explorer, Total Commander and Notepad++)

  • Downloaded SharpAESCrypt.exe and ran it on the file

    e:\Duplicati_Error_Backup>SharpAESCrypt.exe d C115D1................0AB7BF9B4 duplicati-bfca3601c65d7412486eab2b8620a9812.dblock.zip.aes
    Error: File length is invalid
    
  • Ran SharpAESCrypt.exe on another downloaded file to prove the setup; that one returned data

  • The second file also shows an error, but a different one:

    e:\Duplicati_Error_Backup>SharpAESCrypt.exe d C115D18.......B7BF9B4 duplicati-i5b13500e0c3347dcb3f23e30495e4f75.dindex.zip.aes
    Error: Invalid header marker
    

The log file (recorded) contains some information on errors during Repair:

17. Sept. 2022 11:00: Failed while executing "Repair" with id: 4
Duplicati.Library.Interface.UserInformationException: Recreated database has missing blocks and 1 broken filelists. Consider using "list-broken-files" and "purge-broken-files" to purge broken data from the remote store and the database.
   bei Duplicati.Library.Main.Operation.RecreateDatabaseHandler.DoRun(LocalDatabase dbparent, Boolean updating, IFilter filter, NumberedFilterFilelistDelegate filelistfilter, BlockVolumePostProcessor blockprocessor)
   bei Duplicati.Library.Main.Operation.RecreateDatabaseHandler.Run(String path, IFilter filter, NumberedFilterFilelistDelegate filelistfilter, BlockVolumePostProcessor blockprocessor)
   bei Duplicati.Library.Main.Operation.RepairHandler.RunRepairLocal(IFilter filter)
   bei Duplicati.Library.Main.Operation.RepairHandler.Run(IFilter filter)
   bei Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)
   bei Duplicati.Library.Main.Controller.Repair(IFilter filter)
   bei Duplicati.Server.Runner.Run(IRunnerData data, Boolean fromQueue)

and the backup I executed today:

17. Sept. 2022 11:04: Failed while executing "Backup" with id: 4
Duplicati.Library.Interface.UserInformationException: The database was attempted repaired, but the repair did not complete. This database may be incomplete and the backup process cannot continue. You may delete the local database and attempt to repair it again.
   bei Duplicati.Library.Main.Operation.BackupHandler.<RunAsync>d__20.MoveNext()
--- Ende der Stapelüberwachung vom vorhergehenden Ort, an dem die Ausnahme ausgelöst wurde ---
   bei System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   bei CoCoL.ChannelExtensions.WaitForTaskOrThrow(Task task)
   bei Duplicati.Library.Main.Controller.<>c__DisplayClass14_0.<Backup>b__0(BackupResults result)
   bei Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)
   bei Duplicati.Library.Main.Controller.Backup(String[] inputsources, IFilter filter)
   bei Duplicati.Server.Runner.Run(IRunnerData data, Boolean fromQueue)

All the information in this and the last post applies to the rebuilt local database. No modifications to the data were made outside of Duplicati.

I might try another rebuild (taking 14 hours) with the live log activated. Which option do you suggest (verbose?)

That "File length is invalid" error is probably a rough size sanity check failing. I don't think SharpAESCrypt knows the actual size; Duplicati does that.

AES File Format

1 Octet - File size modulo 16 in least significant bit positions

The format also says the file should begin with AES, then soon after have CREATED_BY or some identifier.
Opening a sample .aes file in Notepad, I can see AES !CREATED_BY SharpAESCrypt v1.3.3.0.
If your file's start is different, then what happened was not truncation (typical), but something odder.
Figuring out whether it ever uploaded right (or did upload but pCloud lost it) will be hard with no history.
History would be in whichever database fits the timestamp on the files. Any way to see file times?

Duplicati has them in DB if pCloud has no other way. Job → Show log → Remote and click a list.
Small example below. Yours will be long, and maybe unwieldy. Looking right in DB might be faster.
Better still would be if pCloud can show file times, so we can make some more guesses on history.
Viewing the Duplicati Server Logs and Viewing the log files of a backup job (or log gaps) may help.

[
{"Name":"duplicati-20220916T181233Z.dlist.zip","LastAccess":"2022-09-16T14:12:34.1993477-04:00","LastModification":"2022-09-16T14:12:34.1993477-04:00","Size":665,"IsFolder":false},
{"Name":"duplicati-b7659de102dc84575a9acf1f6b5523051.dblock.zip","LastAccess":"2022-09-16T14:12:33.5334751-04:00","LastModification":"2022-09-16T14:12:33.5334751-04:00","Size":577,"IsFolder":false},
{"Name":"duplicati-i40dfa84b590447709049cf586516b94c.dindex.zip","LastAccess":"2022-09-16T14:12:34.0217655-04:00","LastModification":"2022-09-16T14:12:34.0217655-04:00","Size":609,"IsFolder":false}
]

That has size too, so seeing size history could also say whether it arrived wrong size or went bad.
I’m pretty sure there’s a size check at end of backup though, and errors would have been flagged.
This gets back to trying to know what sort of issue happened before the new backups wouldn’t go.
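If you'd rather query than click, something like this against the job database would pull one file's history with sizes (a sketch, assuming the remote history sits in a RemoteOperation table whose Data column is JSON holding the size and hash):

-- every recorded remote operation on the suspicious dblock, oldest first
SELECT datetime(Timestamp, 'unixepoch') AS Time, Operation, Path, Data
FROM RemoteOperation
WHERE Path LIKE 'duplicati-bfca3601%'
ORDER BY Timestamp;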

You can if you like, but it might not add much compared to some other things I’ve been requesting.
You’ll probably see it download all the dlist files, which have names and the hash IDs of file blocks.
After that it downloads the dindex files, which say which dblock has a given hash. A bad dindex gets an error.
Assuming that dindex held a needed ID, it will search all dblocks for that ID. A bad dblock gets an error.
End result is a block that some source file needs is missing, so purge-broken-files tidies what’s left.

Opened both .aes files; the result confirms the Duplicati errors:

duplicati-i5b13500e0c3347dcb3f23e30495e4f75.dindex.zip.aes (Error: Invalid header marker) shows no header in the editor

duplicati-bfca3601c65d7412486eab2b8620a9812.dblock.zip.aes (Error: File length is invalid) the header is there

I have mounted pCloud as a drive and can see file attributes. The timestamps from pCloud match the date/time of the original upload when it failed (scheduled at 03:22 am):

duplicati-bfca3601c65d7412486eab2b8620a9812.dblock.zip.aes	4.096	11.09.2022 03:26	----

duplicati-i5b13500e0c3347dcb3f23e30495e4f75.dindex.zip.aes	2.048	10.09.2022 03:27	----

Here is a list of all files at pCloud for this backup with a date newer than 08.09.2022:

duplicati-20220917T101921Z.dlist.zip.aes					13.833.373		17.09.2022 13:09	----
duplicati-b1a152d09aad54606a51091cdea438f4a.dblock.zip.aes	52.399.165		17.09.2022 13:09	----
duplicati-ia3fcfd67021349f998a855732cfcd620.dindex.zip.aes	57.837			17.09.2022 13:09	----
duplicati-b7f2390bfa6db4877b2cb5c2c497c31d7.dblock.zip.aes	21.753.309		17.09.2022 13:08	----
duplicati-ia4f18baa1e3e422796ba25d28b3585e7.dindex.zip.aes	14.925			17.09.2022 13:08	----
duplicati-20220911T012202Z.dlist.zip.aes					13.816.669		17.09.2022 12:16	----
.davfs.tmp039e99											268.288			11.09.2022 03:29	--h-
.davfs.tmp2f9619											4.096			11.09.2022 03:28	--h-
.davfs.tmp0a5e99											239.558			11.09.2022 03:27	--h-
duplicati-i6324d97e8d7347b8a1cc3ea90d4c465c.dindex.zip.aes	3.709			11.09.2022 03:27	----
duplicati-bfca3601c65d7412486eab2b8620a9812.dblock.zip.aes	4.096			11.09.2022 03:26	----
.davfs.tmp3cfe21											4.096			11.09.2022 03:26	--h-
duplicati-i5b13500e0c3347dcb3f23e30495e4f75.dindex.zip.aes	2.048			10.09.2022 03:27	----
.davfs.tmp0c4199											4.096			10.09.2022 03:27	--h-
duplicati-i642faf4e90b44b43920386c7f0a7e327.dindex.zip.aes	4.797			09.09.2022 03:30	----
duplicati-bd092300e2a1642c695580e3903b4ba6c.dblock.zip.aes	547.309			09.09.2022 03:30	----

There are some temporary files - there are some more on the drive with older dates, from when the backup was still working, but only very few single ones. Trying to decrypt two of them, SharpAESCrypt throws Error: File length is invalid.

Opened the remote log - it confirms the timestamps above:

17. Sept. 2022 13:10: list:
{"Name":"duplicati-bfca3601c65d7412486eab2b8620a9812.dblock.zip.aes","LastAccess":"2022-09-11T03:26:42+02:00","LastModification":"2022-09-11T03:26:42+02:00","Size":4096,"IsFolder":false},

{"Name":"duplicati-i5b13500e0c3347dcb3f23e30495e4f75.dindex.zip.aes","LastAccess":"2022-09-10T03:27:33+02:00","LastModification":"2022-09-10T03:27:33+02:00","Size":2048,"IsFolder":false},

If I missed something please let me know, I'll try my best. Thanks and have a nice evening.

The date matters. The dindex appears to be dated Sep 10, the dblock Sep 11, so might not be a pair.
Because decrypting the dindex won’t work, only way to know about pairing is in DB or DB bug report.
“Sep 9th, the day problems started” might be consistent with uploaded files from that day looking thin.
A normal completion would have a dlist file from that day. It looks like your time zone is +2 from UTC.
All these .davfs.tmp files might be a sign of an uploading issue. Some destination servers use tmp file during the upload, and rename it to its eventual name after the upload is complete. A pCloud warning:

Thousands of .davfs.tmp files (from the forum of Duplicacy, another backup program that can do WebDAV).
Feel free to do your own searches to guess at what those are, but that’s not a filename from Duplicati.
Several of those also have that 4096 byte length that’s been bothering us for an actually uploaded file.

If you’re getting occasional upload errors that are reported to Duplicati, Duplicati will retry, and job log’s “Complete log” will count the retries in the “BackendStatistics” section under the “RetryAttempts” value.
You can see "RetryAttempts": 5, in your post, but they’re probably from the TestHandler download.

An upload retry is done under a different name, and a delete of the failed upload's name is attempted.
You might have to check the old database to see RetryAttempts from earlier. Were old .davfs.tmp files common?
I suppose you could also get an opinion from the other two jobs. Maybe retries were mostly covering the issue.

Without old logs or DB, I’m less sure about what failed, in what ways, when, but making some progress.
If you have unreliable uploads (and it appears you do), this is potentially going to be a continuing irritant.
If you search with Google for “pcloud” “webdav” “duplicati” you’ll find other issues you can compare with.

If you’d rather end analysis and just try for the best way to get back to maybe-not-quite-stable normal, a delete of the broken dblock will possibly let list-broken-files and purge-broken-files remove affected files.

The broken dindex isn’t serving as a dindex, so DB recreate will probably have to search in the dblocks.
I’m not sure if deleting or leaving it is better. Might be the same result, but deleting will make less noise.
Duplicati can recreate deleted dindex if it has a good database matching the destination, but you don’t.
Your old database might have had info, but doesn’t match destination. Recreated DB lacks that old info.

As a side note, dindex files are helpful but less critical than dblock. This tool doesn’t even use dindex:

Duplicati.CommandLine.RecoveryTool.exe

This tool can be used in very specific situations, where you have to restore data from a corrupted backup.

Regular Duplicati wants things to look as they should, or it tries to guide you into getting them that way.
RecoveryTool tries to make the most of whatever it has left, in terms of dblock files that it can get open.
If you had to, you could probably keep the old version backup for this, and run a clean new backup, but taking that path is premature as you can probably get your current backup backing up as shown above.

I deleted the two files and started "delete and rebuild" again. It looks good this time: Repair reported the two missing files, but the database seems to be in sync again and the backup executed without errors.

 18. Sept. 2022 04:13 - Operation: Backup
Time
Start 2022-09-18 03:22:00
End 2022-09-18 04:13:36
Duration 00:51:37
Source files
Examined 113811 (83.38 GB)
Opened 89847 (80.63 GB)
Added 9 (61.69 KB)
Modified 148 (21.86 MB)
Deleted 0
 Test phase 
 Delete phase (old backup versions) 
 Warnings 0
 Errors 0
Complete log
................................................
 18. Sept. 2022 01:17 - Operation: Repair
Time
Start 2022-09-18 00:55:07
End 2022-09-18 01:17:22
Duration 00:22:16
Source files
Examined 0 (0 bytes)
Opened 0 (0 bytes)
Added 0 (0 bytes)
Modified 0 (0 bytes)
Deleted 0
 Database recreation phase 
 Warnings 2 
2022-09-18 01:16:08 +02 - [Warning-Duplicati.Library.Main.Database.LocalRecreateDatabase-MissingVolumesDetected]: Found 1 missing volumes; attempting to replace blocks from existing volumes
2022-09-18 01:16:56 +02 - [Warning-Duplicati.Library.Main.Database.LocalRecreateDatabase-MissingVolumesDetected]: Found 1 missing volumes; attempting to replace blocks from existing volumes
 Errors 1 
2022-09-18 01:12:09 +02 - [Error-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-MissingFileDetected]: Remote file referenced as duplicati-bfca3601c65d7412486eab2b8620a9812.dblock.zip.aes by duplicati-i6324d97e8d7347b8a1cc3ea90d4c465c.dindex.zip.aes, but not found in list, registering a missing remote file
Complete log

At least I thought it should. I then started a "check data(base)" (located under "For professionals", to the right of the database options; I hope the translation is correct, I have the German UI) and got an error again:

Error while running Privatdaten
Unexpected difference in fileset version 1: 17.09.2022 12:19:21 (database id: 12), found 155922 entries, but expected 155924

I'm not sure how to interpret this; it might be OK since I deleted the two files that are now missing from the older fileset.

Started a command-line compare, first versions 0 - 1, with these timestamps reported:

  1: 17.09.2022 10:19:21
  0: 18.09.2022 01:22:00

Then compared 0-2:

Listing changes
  2: 11.09.2022 01:22:02
  0: 18.09.2022 01:22:00

No errors reported; the changes seem to be OK.

The find command for a frequently modified password file looks good as well:

Listing files and versions:
E:\Privatdaten\apache-ftpserver\ftp_transfer\PWD_neu.kdbx
0       : 18.09.2022 03:22:00 108,47 KB
1       : 17.09.2022 12:19:21 108,47 KB
2       : 11.09.2022 03:22:02 108,47 KB
3       : 03.09.2022 03:22:00 108,47 KB
4       : 02.09.2022 03:22:01 108,47 KB
5       : 26.08.2022 03:22:00 106,72 KB
6       : 22.07.2022 03:22:00 107,06 KB
7       : 16.06.2022 03:22:05 104,33 KB
8       : 10.05.2022 03:26:36 102,67 KB
9       : 01.04.2022 03:27:26 102,34 KB
10      : 27.02.2022 02:29:14 100,09 KB
11      : 24.02.2022 01:48:20 99,87 KB
12      : 23.02.2022 23:52:24  -

Seems all is in sync again and I can access old files and versions. Thank you very much for your help.

Btw, I think Duplicati still has some room for improvement here. Whatever interface you use, it is just a question of time before data corruption during transfer happens. It seems Duplicati can handle it in most cases, but there seems to be at least one exception.
If I can contribute to identifying the gap I'll be glad to do so. Maybe the information is lost in this case, but it is very likely it will happen again sooner or later.
At least I know how to fix it next time, and maybe this information will help somebody else.

There is much useful information in the old database that I’ve been asking about. Does a copy exist?
If you don’t want to mess with the working backup, you can make a dummy backup job to look at DB.
What’s in the dummy job doesn’t matter. Just never run it. Only use it to look for things I’ve asked for.

Was duplicati-i5b13500e0c3347dcb3f23e30495e4f75.dindex.zip.aes kept to look at header problem?
Is that 2048 bytes long (again suspiciously binary-even)? I wonder what actually got put inside there?
Dragging it onto notepad would be easy. If it’s blank, that’s a clue. If it’s random junk, that’s a clue too.

I still have copies of the deleted files and the database from before the restore.

duplicati-bfca3601c65d7412486eab2b8620a9812.dblock.zip.aes:


SharpAESCrypt.exe d C.......4 duplicati-bfca3601c65d7412486eab2b8620a9812.dblock.zip.aes
Error: File length is invalid

duplicati-i6324d97e8d7347b8a1cc3ea90d4c465c.dindex.zip.aes:
SharpAESCrypt.exe d C…4 duplicati-i5b13500e0c3347dcb3f23e30495e4f75.dindex.zip.aes
Error: Invalid header marker


Neither will decrypt; the file lengths are correct. I think I reported this before, sorry I did not explicitly mention that they are not blank.

I'm not into databases, so this does not come easily to me. I will work through your text and try to find out what you asked for.

Yep. That’s a wrong start of file. Is the header marker anywhere? Might need to search, but flipping through sample spots might give you some idea of what’s there – maybe something visible, maybe all just random.

Correct on what basis? Duplicati said the 4096 should be 24813. This is (or was) all in some database.

Some of it doesn’t require a database browser. Just make the dummy job, copy the old database where Local database path is (there won’t be a database there before the first backup), and look into the logs.

On the home screen under Reporting for the fake job, Create bug report downloads a privacy-sanitized database copy which will probably be too big to just put in the forum, but maybe you have some cloud storage that will let you publish a link to file. That’s probably the easiest way to share the DB bug report.

Created a bug report of the original database; you can download it and the two .aes files here.

Thank you. The most obvious thing is that pCloud WebDAV uploads had a very bad Sep 10 and 11.
This would be visible in a log-file=<path> with log-file-log-level=retry, if you had one. Maybe time to start.

The last log in this job database was the Sep 9 backup, but it’s erased in the bug report, for privacy.
You can look at it in your Duplicati job logs, and Sep 10 and 11 might be in the server log, if backup completely failed. It looks like it exhausted the default number-of-retries which is 5, so errored out…

This by itself is not a problem. Next Duplicati backup attempts to pick up from where uploads ended.

Sep 9 looked normal. list at start checks things are in order, dblock file data (and its dindex) upload,
dlist says what’s in backup, retention deletes older dlist, list checks things, and sampled test occurs.
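For reference, the operation dumps below come straight from the job database, and counting put rows per run is a quick way to spot retry storms like the ones that follow (a sketch, assuming the RemoteOperation table that holds this history, with OperationID identifying the backup run):

-- runs with far more uploads than usual are the suspect ones
SELECT OperationID, COUNT(*) AS puts
FROM RemoteOperation
WHERE Operation = 'put'
GROUP BY OperationID
ORDER BY OperationID DESC;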

ID      OperationID     Timestamp       Operation       Path                                                            Data
4964    287             1662686871      list
4965    287             1662687049      put             duplicati-bd092300e2a1642c695580e3903b4ba6c.dblock.zip.aes      {"Size":547309,"Hash":"BeV/mRq0tKCHYkrFc6XezIAwA/7WvqQFguZEk5lnrZ4="}
4966    287             1662687052      put             duplicati-i642faf4e90b44b43920386c7f0a7e327.dindex.zip.aes      {"Size":4797,"Hash":"bUMLX4rwCSYJlQ7CMQO6Bb+F6ofmBVI71S5lUUbqe+0="}
4967    287             1662687054      put             duplicati-20220909T012200Z.dlist.zip.aes                        {"Size":13816701,"Hash":"ToGS8S0Iz6tPoBC3j+DOlMaJaKmol/c0zvp96Jd9ijg="}
4968    287             1662687073      delete          duplicati-20220812T012200Z.dlist.zip.aes        
4969    287             1662687512      list
4970    287             1662687532      get             duplicati-20220907T012200Z.dlist.zip.aes                        {"Size":13816221,"Hash":"zExvCpvP4k2vOdtpVMTJ2QAPPlVI7PCsutR7dKNzLRU="}
4971    287             1662687532      get             duplicati-i0607f51181ee4ba3b2af3cb5aa675813.dindex.zip.aes      {"Size":146077,"Hash":"Ws1Q8iWPxaHE4wCLpyaZflPRQI3Aprad3fB2FCUdV14="}
4972    287             1662687532      get             duplicati-b525acb4e23e44af8aa260994d8d65c21.dblock.zip.aes      {"Size":52333469,"Hash":"kIXuIv93XrKSZPaQqbvFMHZRlUjZqkmK61zV0Qbwm8M="}

Sep 10 has to try dblock twice, uploads dindex OK, exhausts default 5 retries on dlist, and errors out.
The retried dblock gets a new random name. The retried dlist gets its name incremented by 1 second.
Log file would be clearer, but seeing the hash and size be the same suggests that it’s the same data.

ID      OperationID     Timestamp       Operation       Path                                                            Data
4973    288             1662773040      list
4974    288             1662773154      put             duplicati-bb55ed12f64a84368a9d7165ef471e5f3.dblock.zip.aes      {"Size":384525,"Hash":"eyGcoUFdYE/PiFYhHG/z5RP4A1EMY7jfgAnCSzNZkLs="}
4975    288             1662773184      put             duplicati-b2a297cd9c6624c7891eb1aa8d731ce0e.dblock.zip.aes      {"Size":384525,"Hash":"eyGcoUFdYE/PiFYhHG/z5RP4A1EMY7jfgAnCSzNZkLs="}
4976    288             1662773204      put             duplicati-i5b13500e0c3347dcb3f23e30495e4f75.dindex.zip.aes      {"Size":5069,"Hash":"e8aRJJvi/uRuxa26YtxzSfjYNu0mhJynaiLg1x50e30="}
4977    288             1662773225      put             duplicati-20220910T012200Z.dlist.zip.aes                        {"Size":13817645,"Hash":"Nzwsz3pNlLtX8sDhw8pqrLLcBN3j2+oRK+/iBTkE0D0="}
4978    288             1662773256      put             duplicati-20220910T012201Z.dlist.zip.aes                        {"Size":13817645,"Hash":"Nzwsz3pNlLtX8sDhw8pqrLLcBN3j2+oRK+/iBTkE0D0="}
4979    288             1662773285      put             duplicati-20220910T012202Z.dlist.zip.aes                        {"Size":13817645,"Hash":"Nzwsz3pNlLtX8sDhw8pqrLLcBN3j2+oRK+/iBTkE0D0="}
4980    288             1662773325      put             duplicati-20220910T012203Z.dlist.zip.aes                        {"Size":13817645,"Hash":"Nzwsz3pNlLtX8sDhw8pqrLLcBN3j2+oRK+/iBTkE0D0="}
4981    288             1662773364      put             duplicati-20220910T012204Z.dlist.zip.aes                        {"Size":13817645,"Hash":"Nzwsz3pNlLtX8sDhw8pqrLLcBN3j2+oRK+/iBTkE0D0="}
4982    288             1662773404      put             duplicati-20220910T012205Z.dlist.zip.aes                        {"Size":13817645,"Hash":"Nzwsz3pNlLtX8sDhw8pqrLLcBN3j2+oRK+/iBTkE0D0="}

Sep 11 is still not uploading well, but one odd finding is it’s retrying Sep 10 dlist using reused names.
The size and content hash also seem to have changed. Some change may be normal (time stamps),
however I’m not sure if that’s enough to account for the size change. Regardless, I can’t dissect files.

ID      OperationID     Timestamp       Operation       Path                                                            Data
4983    289             1662859391      list
4984    289             1662859537      put             duplicati-bd96475eb84a1488192cbd3d88c6817ec.dblock.zip.aes      {"Size":24813,"Hash":"nE+sklcKP2E10/H93nUmEao4mRacK228TG71znudLhQ="}
4985    289             1662859575      put             duplicati-bfca3601c65d7412486eab2b8620a9812.dblock.zip.aes      {"Size":24813,"Hash":"nE+sklcKP2E10/H93nUmEao4mRacK228TG71znudLhQ="}
4986    289             1662859596      put             duplicati-i6324d97e8d7347b8a1cc3ea90d4c465c.dindex.zip.aes      {"Size":3709,"Hash":"u83hCCODlqVEdJIlHM/etDFtE0slHrAqlw3/z9eghA0="}
4987    289             1662859616      put             duplicati-20220911T012200Z.dlist.zip.aes                        {"Size":13817997,"Hash":"Xa5vSpHUUgR4W3ZvrWT2wRIKFEGDYT0aSaFS5BRHT3I="}
4988    289             1662859616      put             duplicati-20220910T012201Z.dlist.zip.aes                        {"Size":13816701,"Hash":"8mDIdHCDBVusing6h9jiRLz1bb0RT1Qj1meeXs35aSU="}
4989    289             1662859660      put             duplicati-20220911T012201Z.dlist.zip.aes                        {"Size":13817997,"Hash":"Xa5vSpHUUgR4W3ZvrWT2wRIKFEGDYT0aSaFS5BRHT3I="}
4990    289             1662859679      put             duplicati-20220910T012202Z.dlist.zip.aes                        {"Size":13816701,"Hash":"8mDIdHCDBVusing6h9jiRLz1bb0RT1Qj1meeXs35aSU="}
4991    289             1662859712      put             duplicati-20220910T012203Z.dlist.zip.aes                        {"Size":13816701,"Hash":"8mDIdHCDBVusing6h9jiRLz1bb0RT1Qj1meeXs35aSU="}
4992    289             1662859751      put             duplicati-20220910T012204Z.dlist.zip.aes                        {"Size":13816701,"Hash":"8mDIdHCDBVusing6h9jiRLz1bb0RT1Qj1meeXs35aSU="}

Let’s try looking at the size errors.

duplicati-i5b13500e0c3347dcb3f23e30495e4f75.dindex.zip.aes at size 5069 had a Sep 10 upload, but because the backup ended prematurely, its after-backup list was not done. The list at the start of Sep 11 found

{"Name":"duplicati-i5b13500e0c3347dcb3f23e30495e4f75.dindex.zip.aes","LastAccess":"2022-09-10T03:27:33+02:00","LastModification":"2022-09-10T03:27:33+02:00","Size":2048,"IsFolder":false},

duplicati-bfca3601c65d7412486eab2b8620a9812.dblock.zip.aes at size 24813 had an upload on Sep 11 that suffered a similar fate. The before-backup file list check at the start of Sep 12 found

{"Name":"duplicati-bfca3601c65d7412486eab2b8620a9812.dblock.zip.aes","LastAccess":"2022-09-11T03:26:42+02:00","LastModification":"2022-09-11T03:26:42+02:00","Size":4096,"IsFolder":false},

It looks like pCloud accepted these files fine (or there would have been retries) but corrupted contents.

I still haven’t figured out where the constraint error is from. That might need some logs that don’t exist.
After the above errors, your later backups just do the before-backup file list, then error. The list can try changing the Remotevolume table (sometimes logging why). The code that might be hitting the error is at:

The log messages look like information level, and retry level is a little more, so that would catch them.

Thank you for analyzing this. With your explanation I was able to find both files as put… and in the list of the next day with a different length as well.
Your suggestion is to change the log level for the job. Currently it was at the default, with no entry in Advanced options; I changed it to log level retry. OK? So this will hopefully give more information if this happens again.

You also need to say where to log to. This isn’t the regular job log here.

Retry level is a fairly light log. I hope we don’t need to go super-detailed such as seeing all the SQL.
That might give a very nice view of the exact UPDATE that got the constraint failure, but log is huge.
Let’s not go there unless lighter logging doesn’t give us the answer, if the constraint failure reoccurs.

Your less-than-fatal retry situation is a different worry. Retries are OK. Acceptance then loss is not…

Feature Request: PCloud backup #4337 is (like so many others) awaiting volunteer developer work, however it gives some worrisome comments about pCloud WebDAV, and a possible rclone solution.

Found a UNIQUE constraint clue. I’d been wondering if there was a duplicate name, and I found one.
SELECT Name FROM Remotevolume gave 2543 rows
SELECT DISTINCT Name FROM Remotevolume gave 2542 rows, so there’s a duplicate somewhere
SELECT Name FROM Remotevolume GROUP BY Name HAVING COUNT(Name) > 1
said duplicati-20220910T012206Z.dlist.zip.aes

ID      OperationID     Name                                            Type    Size            Hash                                            State           VerificationCount       DeleteGraceTime
3471    289             duplicati-20220910T012206Z.dlist.zip.aes        Files   13817645        Nzwsz3pNlLtX8sDhw8pqrLLcBN3j2+oRK+/iBTkE0D0=    Temporary       0                       0
3478    289             duplicati-20220910T012206Z.dlist.zip.aes        Files   13816701        8mDIdHCDBVusing6h9jiRLz1bb0RT1Qj1meeXs35aSU=    Uploading       0                       0

The Size and Hash are familiar, but the Name has not been mentioned before and isn't in the file list, so this finding might be going nowhere, but it needs a more expert opinion and none is available.