Remote backup task quit working: constraint failed UNIQUE constraint failed: Remotevolume.Name, Remotevolume.State

Thank you for the crosslink, interesting. Slowly I'm glimpsing the complexity of this. Maybe I should have a look at rclone, or change the backup concept to run the Duplicati backup on a local network drive which is synced with pCloud using their software. Or just wait: this worked for 6 months after all, and could be fixed without loss. Maybe the pCloud API will be supported some day.

The job does not run. Sorry I missed that; it just executed once on Sunday, rebuilding the local database. It stops, complaining about the 2 missing files I deleted:

2022-09-22 03:22:00 +02 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started
2022-09-22 03:22:19 +02 - [Error-Duplicati.Library.Main.Operation.BackupHandler-FatalError]: Fatal error
Duplicati.Library.Interface.UserInformationException: Unexpected difference in fileset version 1: 17.09.2022 12:19:21 (database id: 12), found 155922 entries, but expected 155924
   at Duplicati.Library.Main.Database.LocalDatabase.VerifyConsistency(Int64 blocksize, Int64 hashsize, Boolean verifyfilelists, IDbTransaction transaction)
   at Duplicati.Library.Main.Operation.Backup.BackupDatabase.<>c__DisplayClass34_0.<VerifyConsistencyAsync>b__0()
   at Duplicati.Library.Main.Operation.Common.SingleRunner.<>c__DisplayClass3_0.<RunOnMain>b__0()
   at Duplicati.Library.Main.Operation.Common.SingleRunner.<DoRunOnMain>d__2`1.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Duplicati.Library.Main.Operation.BackupHandler.<RunAsync>d__20.MoveNext()

Note: I changed the interface to English some days ago, but the log language seems to depend on the language the job was created with, so these logs originally appeared in German.

Meaning there was also a backup on Sunday, or there was never a backup after the database rebuild?
Version 0 is the latest. You got the error on version 1, Sat Sep 17. What does Restore show as the latest?
Sometimes one can use the delete command in Commandline to delete the complaining version, but being in a loop (yet only complaining once) sometimes means you just fail on the next bad one found…
It's probably worth a try, but we wouldn't want to delete too many versions if it seems like it keeps going.
The problem is that there aren't any easy, great solutions (I think).
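For reference, the shape of such a delete from the command line; a sketch only, with a placeholder storage URL (the job's Commandline screen in the GUI pre-fills the real URL and options):

Duplicati.CommandLine.exe delete "<storage-url>" --version=1

Here --version=1 matches the version the error message complains about.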

OK, it seems I got it back to normal again. I deleted the backup version lacking the two files I had deleted manually. Two backups have succeeded since then: one started by me after deleting the spoilt version, and one last night as scheduled.
With this, the problem should be fixed for my needs.
To wrap up what I/we did:

  1. Delete and rebuild the local database
  2. Delete the remote files with wrong size/spoilt header reported in the Duplicati log file
  3. Delete the backup version originally containing the spoilt files

Results:
After step 2 I was able to run one backup successfully, but only one.
After step 3 (the current state, after two runs) it is back to normal.
So step 2 alone did not solve the problem, and I even wonder if the database rebuild was necessary. Should it happen again, I would delete the faulty backup version as a first step.

Thanks again for your help with this, and to anybody involved with Duplicati for this great tool.
I'll keep Retry-level logging activated; if it happens again I will post it here.
Finally, this is the log for the last changes/backups, just to complete this:

2022-09-21 03:22:00 +02 - [Warning-Duplicati.Library.Main.Controller-DeprecatedOption]: The option log-level is deprecated: Use the log-file-log-level and console-log-level options instead
2022-09-21 03:22:00 +02 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started
2022-09-21 03:22:20 +02 - [Error-Duplicati.Library.Main.Operation.BackupHandler-FatalError]: Fatal error
Duplicati.Library.Interface.UserInformationException: Unexpected difference in fileset version 1: 17.09.2022 12:19:21 (database id: 12), found 155922 entries, but expected 155924
   at Duplicati.Library.Main.Database.LocalDatabase.VerifyConsistency(Int64 blocksize, Int64 hashsize, Boolean verifyfilelists, IDbTransaction transaction)
   at Duplicati.Library.Main.Operation.Backup.BackupDatabase.<>c__DisplayClass34_0.<VerifyConsistencyAsync>b__0()
   at Duplicati.Library.Main.Operation.Common.SingleRunner.<>c__DisplayClass3_0.<RunOnMain>b__0()
   at Duplicati.Library.Main.Operation.Common.SingleRunner.<DoRunOnMain>d__2`1.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Duplicati.Library.Main.Operation.BackupHandler.<RunAsync>d__20.MoveNext()
2022-09-22 03:22:00 +02 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started
2022-09-22 03:22:19 +02 - [Error-Duplicati.Library.Main.Operation.BackupHandler-FatalError]: Fatal error
Duplicati.Library.Interface.UserInformationException: Unexpected difference in fileset version 1: 17.09.2022 12:19:21 (database id: 12), found 155922 entries, but expected 155924
   at Duplicati.Library.Main.Database.LocalDatabase.VerifyConsistency(Int64 blocksize, Int64 hashsize, Boolean verifyfilelists, IDbTransaction transaction)
   at Duplicati.Library.Main.Operation.Backup.BackupDatabase.<>c__DisplayClass34_0.<VerifyConsistencyAsync>b__0()
   at Duplicati.Library.Main.Operation.Common.SingleRunner.<>c__DisplayClass3_0.<RunOnMain>b__0()
   at Duplicati.Library.Main.Operation.Common.SingleRunner.<DoRunOnMain>d__2`1.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Duplicati.Library.Main.Operation.BackupHandler.<RunAsync>d__20.MoveNext()
2022-09-23 00:31:11 +02 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Delete has started
2022-09-23 00:31:12 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  ()
2022-09-23 00:32:44 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (2,47 KB)
2022-09-23 00:32:44 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-StartCheck]: Start checking if backups can be removed
2022-09-23 00:32:44 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-FramesAndIntervals]: Time frames and intervals pairs: 7.00:00:00 / 1.00:00:00, 28.00:00:00 / 7.00:00:00, 365.00:00:00 / 31.00:00:00
2022-09-23 00:32:44 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-BackupList]: Backups to consider: 17.09.2022 12:19:21, 11.09.2022 03:22:02, 03.09.2022 03:22:00, 02.09.2022 03:22:01, 26.08.2022 03:22:00, 22.07.2022 03:22:00, 16.06.2022 03:22:05, 10.05.2022 03:26:36, 01.04.2022 03:27:26, 27.02.2022 02:29:14, 24.02.2022 01:48:20, 23.02.2022 23:52:24
2022-09-23 00:32:44 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-BackupsToDelete]: Backups outside of all time frames and thus getting deleted: 
2022-09-23 00:32:44 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-AllBackupsToDelete]: All backups to delete: 
2022-09-23 00:32:44 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler-DeleteRemoteFileset]: Deleting 1 remote fileset(s) ...
2022-09-23 00:32:57 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Started: duplicati-20220917T101921Z.dlist.zip.aes (13,19 MB)
2022-09-23 00:32:58 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Completed: duplicati-20220917T101921Z.dlist.zip.aes (13,19 MB)
2022-09-23 00:32:58 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler-DeleteResults]: Deleted 1 remote fileset(s)
2022-09-23 00:33:06 +02 - [Information-Duplicati.Library.Main.Database.LocalDeleteDatabase-CompactReason]: Compacting not required
2022-09-23 00:33:31 +02 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started
2022-09-23 00:33:46 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  ()
2022-09-23 00:35:05 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (2,47 KB)
2022-09-23 00:38:05 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-bb665c0271a4b40f4867a40c969995393.dblock.zip.aes (49,97 MB)
2022-09-23 00:38:07 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b0f6ede1e352a47fe8ba76f54150dec73.dblock.zip.aes (23,65 MB)
2022-09-23 00:38:42 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b0f6ede1e352a47fe8ba76f54150dec73.dblock.zip.aes (23,65 MB)
2022-09-23 00:38:42 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-ic0e635c07228428fb056adf9d9adda4f.dindex.zip.aes (43,90 KB)
2022-09-23 00:38:42 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-ic0e635c07228428fb056adf9d9adda4f.dindex.zip.aes (43,90 KB)
2022-09-23 00:38:58 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-bb665c0271a4b40f4867a40c969995393.dblock.zip.aes (49,97 MB)
2022-09-23 00:38:58 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-i0d1af0e359eb46999667a21c31830542.dindex.zip.aes (66,17 KB)
2022-09-23 00:38:58 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-i0d1af0e359eb46999667a21c31830542.dindex.zip.aes (66,17 KB)
2022-09-23 00:38:59 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-20220922T223331Z.dlist.zip.aes (13,21 MB)
2022-09-23 00:39:09 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-20220922T223331Z.dlist.zip.aes (13,21 MB)
2022-09-23 00:39:09 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-StartCheck]: Start checking if backups can be removed
2022-09-23 00:39:09 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-FramesAndIntervals]: Time frames and intervals pairs: 7.00:00:00 / 1.00:00:00, 28.00:00:00 / 7.00:00:00, 365.00:00:00 / 31.00:00:00
2022-09-23 00:39:09 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-BackupList]: Backups to consider: 18.09.2022 03:22:00, 11.09.2022 03:22:02, 03.09.2022 03:22:00, 02.09.2022 03:22:01, 26.08.2022 03:22:00, 22.07.2022 03:22:00, 16.06.2022 03:22:05, 10.05.2022 03:26:36, 01.04.2022 03:27:26, 27.02.2022 02:29:14, 24.02.2022 01:48:20, 23.02.2022 23:52:24
2022-09-23 00:39:09 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-BackupsToDelete]: Backups outside of all time frames and thus getting deleted: 
2022-09-23 00:39:09 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-AllBackupsToDelete]: All backups to delete: 
2022-09-23 00:39:14 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler-DeleteResults]: No remote filesets were deleted
2022-09-23 00:39:14 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  ()
2022-09-23 00:40:49 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (2,48 KB)
2022-09-23 00:40:49 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-20220401T012726Z.dlist.zip.aes (11,14 MB)
2022-09-23 00:40:55 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-20220401T012726Z.dlist.zip.aes (11,14 MB)
2022-09-23 00:40:55 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-id70e0c8cec9645a6ac6a9b90caca19ee.dindex.zip.aes (70,20 KB)
2022-09-23 00:40:56 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-id70e0c8cec9645a6ac6a9b90caca19ee.dindex.zip.aes (70,20 KB)
2022-09-23 00:40:56 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b9c9451aca94a43919ad6da78e3f4d3f1.dblock.zip.aes (49,93 MB)
2022-09-23 00:41:09 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-b9c9451aca94a43919ad6da78e3f4d3f1.dblock.zip.aes (49,93 MB)
2022-09-23 03:22:00 +02 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started
2022-09-23 03:22:14 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  ()
2022-09-23 03:25:02 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (2,48 KB)
2022-09-23 03:26:21 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b06da1b0ec1df4800a8b1c8e3e332996d.dblock.zip.aes (23,42 KB)
2022-09-23 03:26:22 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b06da1b0ec1df4800a8b1c8e3e332996d.dblock.zip.aes (23,42 KB)
2022-09-23 03:26:29 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-ie1563679a06249ba8509da0e8da75201.dindex.zip.aes (3,40 KB)
2022-09-23 03:26:29 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-ie1563679a06249ba8509da0e8da75201.dindex.zip.aes (3,40 KB)
2022-09-23 03:26:31 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-20220923T012200Z.dlist.zip.aes (13,21 MB)
2022-09-23 03:27:42 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-20220923T012200Z.dlist.zip.aes (13,21 MB)
2022-09-23 03:27:42 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-StartCheck]: Start checking if backups can be removed
2022-09-23 03:27:42 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-FramesAndIntervals]: Time frames and intervals pairs: 7.00:00:00 / 1.00:00:00, 28.00:00:00 / 7.00:00:00, 365.00:00:00 / 31.00:00:00
2022-09-23 03:27:42 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-BackupList]: Backups to consider: 23.09.2022 00:33:31, 18.09.2022 03:22:00, 11.09.2022 03:22:02, 03.09.2022 03:22:00, 02.09.2022 03:22:01, 26.08.2022 03:22:00, 22.07.2022 03:22:00, 16.06.2022 03:22:05, 10.05.2022 03:26:36, 01.04.2022 03:27:26, 27.02.2022 02:29:14, 24.02.2022 01:48:20, 23.02.2022 23:52:24
2022-09-23 03:27:42 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-BackupsToDelete]: Backups outside of all time frames and thus getting deleted: 
2022-09-23 03:27:42 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-AllBackupsToDelete]: All backups to delete: 
2022-09-23 03:27:47 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler-DeleteResults]: No remote filesets were deleted
2022-09-23 03:27:54 +02 - [Information-Duplicati.Library.Main.Database.LocalDeleteDatabase-CompactReason]: Compacting not required
2022-09-23 03:27:54 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  ()
2022-09-23 03:30:39 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (2,48 KB)
2022-09-23 03:30:40 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-20220902T012201Z.dlist.zip.aes (13,17 MB)
2022-09-23 03:31:18 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-20220902T012201Z.dlist.zip.aes (13,17 MB)
2022-09-23 03:31:18 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-i3041530125fa46209fd33d46ddd87b09.dindex.zip.aes (28,68 KB)
2022-09-23 03:31:19 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-i3041530125fa46209fd33d46ddd87b09.dindex.zip.aes (28,68 KB)
2022-09-23 03:31:19 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b93abe098f95344cebfa0d3b96fb22978.dblock.zip.aes (49,93 MB)
2022-09-23 03:33:54 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-b93abe098f95344cebfa0d3b96fb22978.dblock.zip.aes (49,93 MB)
2022-09-23 07:55:31 +02 - [Warning-Duplicati.Library.Main.Controller-DeprecatedOption]: The option verbose is deprecated: Set a log-level for the desired output method instead
2022-09-23 07:55:31 +02 - [Warning-Duplicati.Library.Main.Controller-UnsupportedOption]: The supplied option --full-result  is not supported and will be ignored
2022-09-23 07:55:31 +02 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation ListChanges has started

That’s good to hear. If you haven’t tested a sample restore yet, that would be an excellent thing to try.

I was able to mimic your database bug report (thanks for that) to reproduce the UNIQUE constraint error.
This confirmed my suspicion that the RemoteListAnalysis cited earlier hit a collision between two rows.

We don't have enough information on the messy middle portion to know how those rows came to be.
The database probably doesn't show them being created because partial work was rolled back.
This is where log files do better: a log is just a sequential record, with no ability to roll back any of its lines.
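A toy illustration of that difference, using nothing Duplicati-specific (Python's bundled sqlite3; the table and file names are made up):

import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE Remotevolume (Name TEXT, State TEXT)")
try:
    with con:  # one transaction: rolled back as a whole if anything throws
        con.execute("INSERT INTO Remotevolume VALUES "
                    "('duplicati-x.dlist.zip', 'Temporary')")
        raise RuntimeError("simulated crash mid-operation")
except RuntimeError:
    pass

# The row is gone as if it was never created, but a log line written
# before the crash would still be sitting in the log file.
print(con.execute("SELECT COUNT(*) FROM Remotevolume").fetchone()[0])  # 0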

2022-09-22 20:40:05 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Operation.BackupHandler-PreBackupVerify]: Starting - PreBackupVerify
2022-09-22 20:40:05 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.BackendManager-RemoteOperationList]: Starting - RemoteOperationList
2022-09-22 20:40:05 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  ()
2022-09-22 20:40:05 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (3 bytes)
2022-09-22 20:40:05 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.BackendManager-RemoteOperationList]: RemoteOperationList took 0:00:00:00.001
2022-09-22 20:40:05 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: Starting - ExecuteReader: SELECT "ID", "Timestamp" FROM "Fileset" ORDER BY "Timestamp" DESC
2022-09-22 20:40:05 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: ExecuteReader: SELECT "ID", "Timestamp" FROM "Fileset" ORDER BY "Timestamp" DESC took 0:00:00:00.000
2022-09-22 20:40:05 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: Starting - ExecuteReader: SELECT DISTINCT "Name", "State" FROM "Remotevolume" WHERE "Name" IN (SELECT "Name" FROM "Remotevolume" WHERE "State" IN ("Deleted", "Deleting")) AND NOT "State" IN ("Deleted", "Deleting")
2022-09-22 20:40:05 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: ExecuteReader: SELECT DISTINCT "Name", "State" FROM "Remotevolume" WHERE "Name" IN (SELECT "Name" FROM "Remotevolume" WHERE "State" IN ("Deleted", "Deleting")) AND NOT "State" IN ("Deleted", "Deleting") took 0:00:00:00.000
2022-09-22 20:40:05 -04 - [Information-Duplicati.Library.Main.Operation.FilelistProcessor-RemoteUnwantedMissingFile]: removing file listed as Temporary: duplicati-20220919T144756Z.dlist.zip
2022-09-22 20:40:05 -04 - [Information-Duplicati.Library.Main.Operation.FilelistProcessor-SchedulingMissingFileForDelete]: scheduling missing file for deletion, currently listed as Uploading: duplicati-20220919T144756Z.dlist.zip
2022-09-22 20:40:05 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Operation.BackupHandler-PreBackupVerify]: PreBackupVerify took 0:00:00:00.005
2022-09-22 20:40:06 -04 - [Error-Duplicati.Library.Main.Operation.BackupHandler-FatalError]: Fatal error
code = Constraint (19), message = System.Data.SQLite.SQLiteException (0x800027AF): constraint failed
UNIQUE constraint failed: Remotevolume.Name, Remotevolume.State
   at System.Data.SQLite.SQLite3.Reset(SQLiteStatement stmt)
   at System.Data.SQLite.SQLite3.Step(SQLiteStatement stmt)
   at System.Data.SQLite.SQLiteDataReader.NextResult()
   at System.Data.SQLite.SQLiteDataReader..ctor(SQLiteCommand cmd, CommandBehavior behave)
   at System.Data.SQLite.SQLiteCommand.ExecuteReader(CommandBehavior behavior)
   at System.Data.SQLite.SQLiteCommand.ExecuteNonQuery(CommandBehavior behavior)
   at Duplicati.Library.Main.Database.LocalDatabase.UpdateRemoteVolume(String name, RemoteVolumeState state, Int64 size, String hash, Boolean suppressCleanup, TimeSpan deleteGraceTime, IDbTransaction transaction)
   at Duplicati.Library.Main.Operation.FilelistProcessor.RemoteListAnalysis(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, IEnumerable`1 protectedFiles)
   at Duplicati.Library.Main.Operation.FilelistProcessor.VerifyRemoteList(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, IEnumerable`1 protectedFiles)
   at Duplicati.Library.Main.Operation.BackupHandler.PreBackupVerify(BackendManager backend, String protectedfile)
   at Duplicati.Library.Main.Operation.BackupHandler.<RunAsync>d__20.MoveNext()
2022-09-22 20:40:06 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: PRAGMA optimize
2022-09-22 20:40:06 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: PRAGMA optimize took 0:00:00:00.000
2022-09-22 20:40:06 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Controller-RunBackup]: Running Backup took 0:00:00:00.302
2022-09-22 20:40:06 -04 - [Error-Duplicati.Library.Main.Controller-FailedOperation]: The operation Backup has failed with error: constraint failed
UNIQUE constraint failed: Remotevolume.Name, Remotevolume.State
code = Constraint (19), message = System.Data.SQLite.SQLiteException (0x800027AF): constraint failed
UNIQUE constraint failed: Remotevolume.Name, Remotevolume.State
   at System.Data.SQLite.SQLite3.Reset(SQLiteStatement stmt)
   at System.Data.SQLite.SQLite3.Step(SQLiteStatement stmt)
   at System.Data.SQLite.SQLiteDataReader.NextResult()
   at System.Data.SQLite.SQLiteDataReader..ctor(SQLiteCommand cmd, CommandBehavior behave)
   at System.Data.SQLite.SQLiteCommand.ExecuteReader(CommandBehavior behavior)
   at System.Data.SQLite.SQLiteCommand.ExecuteNonQuery(CommandBehavior behavior)
   at Duplicati.Library.Main.Database.LocalDatabase.UpdateRemoteVolume(String name, RemoteVolumeState state, Int64 size, String hash, Boolean suppressCleanup, TimeSpan deleteGraceTime, IDbTransaction transaction)
   at Duplicati.Library.Main.Operation.FilelistProcessor.RemoteListAnalysis(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, IEnumerable`1 protectedFiles)
   at Duplicati.Library.Main.Operation.FilelistProcessor.VerifyRemoteList(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, IEnumerable`1 protectedFiles)
   at Duplicati.Library.Main.Operation.BackupHandler.PreBackupVerify(BackendManager backend, String protectedfile)
   at Duplicati.Library.Main.Operation.BackupHandler.<RunAsync>d__20.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at CoCoL.ChannelExtensions.WaitForTaskOrThrow(Task task)
   at Duplicati.Library.Main.Controller.<>c__DisplayClass14_0.<Backup>b__0(BackupResults result)
   at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)

I just made a little test backup, then used DB Browser for SQLite to copy the dlist row twice, using a name incremented by a second. The first extra row got State Temporary, and the second got State Uploading, like yours.
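In script form, that manual edit amounts to roughly this (a sketch against a copy of the job database; Remotevolume's full column list varies by Duplicati version, so the row is cloned generically, and the ID primary-key column name is an assumption):

import re
import sqlite3
from datetime import datetime, timedelta

def bump_name(name, seconds):
    # shift the timestamp embedded in the dlist file name
    stamp = re.search(r"\d{8}T\d{6}Z", name).group(0)
    t = datetime.strptime(stamp, "%Y%m%dT%H%M%SZ") + timedelta(seconds=seconds)
    return name.replace(stamp, t.strftime("%Y%m%dT%H%M%SZ"))

con = sqlite3.connect("test-backup-copy.sqlite")  # a copy, never the live DB
con.row_factory = sqlite3.Row
row = dict(con.execute("SELECT * FROM Remotevolume WHERE Name LIKE '%.dlist.zip%' "
                       "ORDER BY ID DESC LIMIT 1").fetchone())
del row["ID"]                            # let SQLite assign new primary keys
row["Name"] = bump_name(row["Name"], 1)  # one new Name, shared by both extra rows

for state in ("Temporary", "Uploading"):  # the two states seen in the broken DB
    row["State"] = state
    cols = ", ".join('"%s"' % c for c in row)
    marks = ", ".join("?" for _ in row)
    con.execute("INSERT INTO Remotevolume (%s) VALUES (%s)" % (cols, marks),
                list(row.values()))
con.commit()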

Temporary gets set for deletion at

Uploading gets set for deletion at

The theory is that at some later point in the code an (unfortunately unlogged) UpdateRemoteVolume causes these two rows to attempt to reach the same State, e.g. Deleting. That is a uniqueness violation.
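Here is the collision in miniature (in-memory SQLite; the real table has more columns, but the UNIQUE pair is exactly what the error message names):

import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE Remotevolume (
                   ID INTEGER PRIMARY KEY, Name TEXT, State TEXT,
                   UNIQUE (Name, State))""")
con.executemany("INSERT INTO Remotevolume (Name, State) VALUES (?, ?)",
                [("duplicati-20220919T144756Z.dlist.zip", "Temporary"),
                 ("duplicati-20220919T144756Z.dlist.zip", "Uploading")])
try:
    # roughly what an unlogged UpdateRemoteVolume touching both rows would do
    con.execute("UPDATE Remotevolume SET State = 'Deleting' WHERE Name = ?",
                ("duplicati-20220919T144756Z.dlist.zip",))
except sqlite3.IntegrityError as e:
    print(e)  # UNIQUE constraint failed: Remotevolume.Name, Remotevolume.State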

Restored a frequently changed file from an old version. It worked, the file opens in its application, but seeing complaints about missing list entries and files is not very promising.

Duplicati.CommandLine.exe restore "webdavs://ewebdav.pcloud.com/Backups/Duplicati/Privatdaten?auth-username=******&auth-password=*******" "E:\Privatdaten\OwnDataKL\PWD_neu.kdbx" --version=5 --restore-path="e:\Duplicati_Error_Backup\restore"
Restore started at 23.09.2022 18:04:06

Enter encryption passphrase: C.....4
  Listing remote folder ...
  Downloading file (13,17 MB) ...
  Downloading file (31,93 KB) ...
  Downloading file (106,51 KB) ...
  Downloading file (42,61 KB) ...
  ...
  Downloading file (58,22 KB) ...
  Downloading file (3,62 KB) ...
  Downloading file (42,83 KB) ...
Remote file referenced as duplicati-bfca3601c65d7412486eab2b8620a9812.dblock.zip.aes by duplicati-i6324d97e8d7347b8a1cc3ea90d4c465c.dindex.zip.aes, but not found in list, registering a missing remote file
  Downloading file (49,40 KB) ...
  Downloading file (46,34 KB) ...
  Downloading file (17,97 KB) ...
  Downloading file (91,42 KB) ...
  Downloading file (32,50 KB) ...
  ...
  Downloading file (17,97 KB) ...
  Downloading file (32,75 KB) ...
  Downloading file (53,09 KB) ...
  Downloading file (44,37 KB) ...
  Downloading file (38,26 KB) ...
Found 1 missing volumes; attempting to replace blocks from existing volumes
Found 1 missing volumes; attempting to replace blocks from existing volumes
Checking remote backup ...
  Listing remote folder ...
Checking existing target files ...
  1 files need to be restored (102,11 KB)
Scanning local files for needed data ...
  Downloading file (9,59 MB) ...
  0 files need to be restored (0 Bytes)
Verifying restored files ...
Restored 1 (102,11 KB) files to e:\Duplicati_Error_Backup\restore
Duration of restore: 00:11:44

Hello,

could you clarify the setup of your 3 jobs for a cloud backend: does each job point to a different directory on the backend (the preferred option), and if not, did you set a prefix for each job?

The three jobs start with a 1-hour offset, so they do not overlap. Each job has its own destination folder on pCloud.

This is the file that pCloud seemingly broke, so you deleted it. Ordinarily one would run list-broken-files and purge-broken-files right after that, but the exact sequence that was done is a little bit confusing for me. You could try those steps again.
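That is, something along these lines, with the real storage URL and options taken from the job's Commandline screen (run the list first and read its output before purging, since purge rewrites the dlist files):

Duplicati.CommandLine.exe list-broken-files "<storage-url>"
Duplicati.CommandLine.exe purge-broken-files "<storage-url>"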

What this restore result suggests, though, is that the file you chose will be purged because part of it is missing. A destination that kills files is a good way to damage a backup. FWIW, here's a Duplicacy pCloud problem from today:

Pcloud Webdav failing

The day that pCloud error started differs from yours, but both of you had seemingly good history before.

One of the other jobs hung at “verifying backend data” a day ago. I had to kill the Duplicati process. I'm wondering why there were two of them.

Killing the one using fewer resources showed no effect. When I killed the second, the interface was no longer accessible.
After a restart I started the incomplete job manually, but received a connection error:
System.Net.WebException: Unable to connect to the remote server ---> System.Net.Sockets.SocketException: A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond 45.131.244.151:443
I tried again a few times over the next hours without success, so pCloud had a problem with its WebDAV interface; it seems to have been down for several hours. The last two nights all three jobs succeeded.
Since I have enough cloud space, I'm thinking about setting up an rclone backup in parallel using the pCloud API, for testing. Sad to say, rclone is missing many of the features I like Duplicati for.

That’s strange, I tried

 curl https://ewebdav.pcloud.com
<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html><head>
<title>401 Unauthorized</title>
</head><body>
<h1>Unauthorized</h1>
<p>This server could not verify that you
are authorized to access the document
requested.  Either you supplied the wrong
credentials (e.g., bad password), or your
browser doesn't understand how to supply
the credentials required.</p>
<hr>
<address>Apache/2.4.38 (Debian) Server at ewebdav.pcloud.com Port 443</address>
</body></html>

so it accepts connections. Did you try to restart your computer?

Sorry if I wasn't clear about that: currently it's working again, for two nights now. The problems occurred on the 25th & 26th:

 Sep 28, 2022 4:24 AM - Operation: Backup
 Sep 27, 2022 4:26 AM - Operation: Backup
 Sep 24, 2022 4:32 AM - Operation: Backup
 Sep 23, 2022 4:29 AM - Operation: Backup
 Sep 22, 2022 4:39 AM - Operation: Backup
Sep 26, 2022 4:24 AM: Failed while executing "Backup" with id: 5
Sep 26, 2022 3:24 AM: Failed while executing "Backup" with id: 4
Sep 26, 2022 2:24 AM: Failed while executing "Backup" with id: 3
Sep 25, 2022 8:36 PM: Failed while executing "Backup" with id: 5
Sep 25, 2022 6:19 PM: Failed while executing "Backup" with id: 5

The scheduled backup on Sep 25 was not started since my own server was down. On the 25th at 9:20 I started it manually; 2 jobs succeeded but the last one hung. I took a screenshot while it was hanging.

If you stop the Duplicati server like that when your backend hangs, it's not surprising that the database gets corrupted. Duplicati uses an embedded database, which means database transactions can't help if the process is killed.

Now, Duplicati should stop a job by itself when the backend hangs. I tried this when another poster reported the same problem and could not reproduce it, but it could be specific to a backend, that is, work with my test backend (which I control, since it's my own sftp server) and not with your WebDAV backend. Even when it works it's a very long wait, since Duplicati by default does 5 retries, each with its own timeout.

Fully agree, but after waiting several hours I saw no other way but to kill the processes. Shutting down the machine wouldn't have made any difference, I assumed. Luckily the database was not corrupted this time. This was the first time I was forced to do so.

There are always two by default. The job of the first is to find the latest Duplicati and run it to do the real work.
AUTOUPDATER_Duplicati_POLICY can stop this (but you lose the autoupdater, so most people don't use it).

So all is well now except that pCloud WebDAV continues to be suspect and you’re seeking workarounds?

You can combine them to get roughly the usual Duplicati experience. Set up pCloud in rclone, then set up the Rclone storage type in Duplicati to use rclone as its file transfer system. It sometimes needs some tuning.
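A standalone sanity check before wiring it into Duplicati might look like this (the remote name "pcloud:" is just whatever you chose during rclone config):

rclone config       # interactive: create a remote of type "pcloud"
rclone lsd pcloud:  # confirm the token works by listing top-level folders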

Thank you, that was valuable information. I had rclone set up and tested standalone already, but I overlooked the option of configuring the rclone interface in Duplicati.
The setup seems a bit weird to me, asking for a local and a remote destination, but I finally got it working.
I set up a parallel job for the affected archive, to see if there is a difference in robustness.
Do you know if it is safe to switch an existing job from the WebDAV interface to rclone? Just in case rclone turns out to be the more stable option.
Besides that, all jobs have executed successfully for 4 nights now, the restore test was OK, everything seems to be fine.

That should be safe after stability and general results have been proven on the new backup.
Be sure not to reuse the same folder: each backup requires its own folder and its own database.

On the initial run using the rclone interface I received warnings, and the same again on the second run.
The first seems to be related to the fact that I used the original WebDAV-based job as a base; it seems some auth option is still activated. No problem.
The other is quite strange: again a file seems to be spoilt, with a length mismatch. No idea why this is not classified as an error.

2022-10-01 01:22:00 +02 - [Warning-Duplicati.Library.Main.Controller-UnsupportedOption]: The supplied option --auth-username is not supported and will be ignored
2022-10-01 01:22:10 +02 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingRemoteHash]: remote file duplicati-b536ad8f6b21f498081f1d6835fcd586d.dblock.zip.aes is listed as Uploaded with size  but should be 52365517, please verify the sha256 hash "Jg7iV0zBY1p5UDv5+5VDSsEE097ErAVslqbXCoO4Faw="
2022-10-01 01:25:18 +02 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingRemoteHash]: remote file duplicati-b536ad8f6b21f498081f1d6835fcd586d.dblock.zip.aes is listed as Uploaded with size 5337088 but should be 52365517, please verify the sha256 hash "Jg7iV0zBY1p5UDv5+5VDSsEE097ErAVslqbXCoO4Faw="

I downloaded the file: the reported length is 5337088, and decrypting says

SharpAESCrypt d *** duplicati-b536ad8f6b21f498081f1d6835fcd586d.dblock.zip.aes
Error: File length is invalid

This is very similar to the original problem we started with.

How to check the SHA-256? Generating it with Total Commander returns this:

057a7999fdbdd87e5a079db40f1509739cbd180c0dcffcd87275a76946c4218d *duplicati-b536ad8f6b21f498081f1d6835fcd586d.dblock.zip.aes

--auth-password is not supported gives a simple way to stop your warning (even though it's a different one).
Changing backend type leaves invalid backend options #3082 is the issue (awaiting a JavaScript volunteer).

Error seems more suitable; I don't know why it's just a Warning. Was this whole backup uploaded by rclone? Did everything else turn out OK? Although one error out of a whole lot of uploads is still one bad upload too many.
To confirm: rclone is set up for native pCloud, not WebDAV, right? Attempting to bypass the WebDAV error.
Possibly, though, it's a generic pCloud issue. Before concluding anything, I want to understand the test that was done.

The message seems like it's tailored to advanced users or developers, but Duplicati relies on Base64 encodings of SHA-256 hashes for lookups and integrity. https://cryptii.com/pipes/hex-to-base64 is one converter.
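Or in a couple of lines of Python, using the hex digest Total Commander printed above:

import base64

hex_digest = "057a7999fdbdd87e5a079db40f1509739cbd180c0dcffcd87275a76946c4218d"
print(base64.b64encode(bytes.fromhex(hex_digest)).decode())
# prints: BXp5mf292H5aB520DxUJc5y9GAwNz/zYcnWnaUbEIY0=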

You'd check the upload log for that dblock against BXp5mf292H5aB520DxUJc5y9GAwNz/zYcnWnaUbEIY0=

Finding the upload might be hard, but you can look in the job's About → Show log → Remote at the file's upload time.
Click on the file if you find it. Example below:

(screenshot of a Remote log entry, expanded to show the file name and the size Duplicati recorded)

Note that this also gave the size Duplicati thinks the file should have. If that's now wrong, the hash likely is too.

If you are curious enough to check, you can also use DB Browser for SQLite to look in the local database.
The Remotevolume table has a search box at the top for that Name. The RemoteOperation table calls the column Path.
That table is more interesting in a way, because it shows more clearly where the bad file sits in the sequence. The Remote log in the GUI can show this as well, and is probably just reading back the info in the table, but with harder searching.
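If scripting is easier than clicking around, the same two lookups against a copy of the job database would look roughly like this (column names beyond Name, State, and Path are from memory of the schema, so treat this as a sketch):

import sqlite3

con = sqlite3.connect("job-database-copy.sqlite")  # copy it while Duplicati is idle
name = "duplicati-b536ad8f6b21f498081f1d6835fcd586d.dblock.zip.aes"

# what Duplicati believes about the volume (expected size and hash)
for r in con.execute("SELECT Name, State, Size, Hash FROM Remotevolume "
                     "WHERE Name = ?", (name,)):
    print(r)

# every remote operation that touched the file, in sequence
for r in con.execute("SELECT ID, Timestamp, Operation, Path FROM RemoteOperation "
                     "WHERE Path = ? ORDER BY ID", (name,)):
    print(r)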