Remote backup task quit working: constraint failed UNIQUE constraint failed: Remotevolume.Name, Remotevolume.State

I deleted the two files and started “delete and rebuild” again. It looks good this time: the repair reported the two missing files, but the database seems to be in sync again and the backup ran without errors.

 18 Sept. 2022 04:13 - Operation: Backup
Time
Start 2022-09-18 03:22:00
End 2022-09-18 04:13:36
Duration 00:51:37
Source files
Examined 113811 (83.38 GB)
Opened 89847 (80.63 GB)
Added 9 (61.69 KB)
Modified 148 (21.86 MB)
Deleted 0
 Test phase 
 Delete phase (old backup versions) 
 Warnings 0
 Errors 0
Complete log 
................................................
 18 Sept. 2022 01:17 - Operation: Repair
Time
Start 2022-09-18 00:55:07
End 2022-09-18 01:17:22
Duration 00:22:16
Source files
Examined 0 (0 bytes)
Opened 0 (0 bytes)
Added 0 (0 bytes)
Modified 0 (0 bytes)
Deleted 0
 Database recreate phase 
 Warnings 2 
2022-09-18 01:16:08 +02 - [Warning-Duplicati.Library.Main.Database.LocalRecreateDatabase-MissingVolumesDetected]: Found 1 missing volumes; attempting to replace blocks from existing volumes
2022-09-18 01:16:56 +02 - [Warning-Duplicati.Library.Main.Database.LocalRecreateDatabase-MissingVolumesDetected]: Found 1 missing volumes; attempting to replace blocks from existing volumes
 Errors 1 
2022-09-18 01:12:09 +02 - [Error-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-MissingFileDetected]: Remote file referenced as duplicati-bfca3601c65d7412486eab2b8620a9812.dblock.zip.aes by duplicati-i6324d97e8d7347b8a1cc3ea90d4c465c.dindex.zip.aes, but not found in list, registering a missing remote file
Complete log 

At least I thought it should be fine. I then started a “check data(base)” (located under “For professionals”, next to “Database” – I hope the translation is correct, I have the German UI) and got an error again:

Error while running Privatdaten
Unexpected difference in fileset version 1: 17.09.2022 12:19:21 (database id: 12), found 155922 entries, but expected 155924

I’m not sure how to interpret this; it might be OK, since I deleted the two files that are now missing from the older fileset.

Started a Commandline compare, first versions 0 - 1, with these timestamps reported:

  1: 17.09.2022 10:19:21
  0: 18.09.2022 01:22:00

Then compared 0-2:

Listing changes
  2: 11.09.2022 01:22:02
  0: 18.09.2022 01:22:00

No errors reported; the changes seem to be OK.

The find command for a frequently modified password file looks good as well:

Listing files and versions:
E:\Privatdaten\apache-ftpserver\ftp_transfer\PWD_neu.kdbx
0       : 18.09.2022 03:22:00 108,47 KB
1       : 17.09.2022 12:19:21 108,47 KB
2       : 11.09.2022 03:22:02 108,47 KB
3       : 03.09.2022 03:22:00 108,47 KB
4       : 02.09.2022 03:22:01 108,47 KB
5       : 26.08.2022 03:22:00 106,72 KB
6       : 22.07.2022 03:22:00 107,06 KB
7       : 16.06.2022 03:22:05 104,33 KB
8       : 10.05.2022 03:26:36 102,67 KB
9       : 01.04.2022 03:27:26 102,34 KB
10      : 27.02.2022 02:29:14 100,09 KB
11      : 24.02.2022 01:48:20 99,87 KB
12      : 23.02.2022 23:52:24  -

Seems all is in sync again and I can access old files and versions. Thank you very much for your help.

Btw. I think Duplicati still has some room for improvement here. Whatever interface you use, it is just a question of time until data corruption during transfer happens. It seems Duplicati can handle it in most cases, but there appears to be at least one exception.
If I can contribute to identifying the gap, I’ll be glad to do so. Maybe the information is lost in this case, but it is very likely to happen again sooner or later.
At least I know how to fix it next time, and maybe this information will help somebody else.

There is much useful information in the old database that I’ve been asking about. Does a copy exist?
If you don’t want to mess with the working backup, you can make a dummy backup job to look at the DB.
What’s in the dummy job doesn’t matter. Just never run it. Only use it to look for things I’ve asked for.

Was duplicati-i5b13500e0c3347dcb3f23e30495e4f75.dindex.zip.aes kept to look at header problem?
Is that 2048 bytes long (again suspiciously binary-even)? I wonder what actually got put inside there?
Dragging it onto notepad would be easy. If it’s blank, that’s a clue. If it’s random junk, that’s a clue too.

I still have copies of the deleted files and of the database from before the restore

duplicati-bfca3601c65d7412486eab2b8620a9812.dblock.zip.aes:


SharpAESCrypt.exe d C.......4 duplicati-bfca3601c65d7412486eab2b8620a9812.dblock.zip.aes
Error: File length is invalid

duplicati-i6324d97e8d7347b8a1cc3ea90d4c465c.dindex.zip.aes:
SharpAESCrypt.exe d C…4 duplicati-i5b13500e0c3347dcb3f23e30495e4f75.dindex.zip.aes
Error: Invalid header marker


Both won’t decrypt; the file lengths are correct. I think I reported this before, sorry I did not explicitly mention they are not blank.

I’m not into databases, so this does not come easily to me. I will work through your text and try to find out what you asked for.

Yep. That’s a wrong start of file. Is the header marker anywhere? Might need to search, but flipping through sample spots might give you some idea of what’s there – maybe something visible, maybe all just random.

Correct on what basis? Duplicati said the 4096 should be 24813. This is (or was) all in some database.

Some of it doesn’t require a database browser. Just make the dummy job, copy the old database to where Local database path points (there won’t be a database there before the first backup), and look into the logs.

On the home screen, under Reporting for the fake job, Create bug report downloads a privacy-sanitized database copy. It will probably be too big to just put in the forum, but maybe you have some cloud storage that will let you publish a link to the file. That’s probably the easiest way to share the DB bug report.

Created a bug report of the original database; you can download it and the two .aes files here

Thank you. The most obvious thing is that pCloud WebDAV uploads had a very bad Sep 10 and 11.
This would be visible in a log-file=&lt;path&gt; with log-file-log-level=retry if you had one. Maybe time to start one.

The last log in this job database was the Sep 9 backup, but it’s erased in the bug report, for privacy.
You can look at it in your Duplicati job logs, and Sep 10 and 11 might be in the server log, if backup completely failed. It looks like it exhausted the default number-of-retries which is 5, so errored out…

This by itself is not a problem. Next Duplicati backup attempts to pick up from where uploads ended.

Sep 9 looked normal: the list at the start checks things are in order, the dblock file data (and its dindex) upload, the dlist says what’s in the backup, retention deletes an older dlist, a list checks things, and a sampled test occurs.
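For reference, these remote operation records are presumably just rows from the RemoteOperation table in the database bug report; a query along these lines (assumed, but the column names match the dumps below, and OperationIDs 287–289 are the Sep 9–11 backups) would pull them:

-- Assumed query against the job database's RemoteOperation table.
-- Timestamp is raw Unix epoch seconds; datetime(Timestamp, 'unixepoch')
-- would make it human-readable if preferred.
SELECT ID, OperationID, Timestamp, Operation, Path, Data
  FROM RemoteOperation
 WHERE OperationID IN (287, 288, 289)
 ORDER BY ID;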

ID      OperationID     Timestamp       Operation       Path                                                            Data
4964    287             1662686871      list
4965    287             1662687049      put             duplicati-bd092300e2a1642c695580e3903b4ba6c.dblock.zip.aes      {"Size":547309,"Hash":"BeV/mRq0tKCHYkrFc6XezIAwA/7WvqQFguZEk5lnrZ4="}
4966    287             1662687052      put             duplicati-i642faf4e90b44b43920386c7f0a7e327.dindex.zip.aes      {"Size":4797,"Hash":"bUMLX4rwCSYJlQ7CMQO6Bb+F6ofmBVI71S5lUUbqe+0="}
4967    287             1662687054      put             duplicati-20220909T012200Z.dlist.zip.aes                        {"Size":13816701,"Hash":"ToGS8S0Iz6tPoBC3j+DOlMaJaKmol/c0zvp96Jd9ijg="}
4968    287             1662687073      delete          duplicati-20220812T012200Z.dlist.zip.aes        
4969    287             1662687512      list
4970    287             1662687532      get             duplicati-20220907T012200Z.dlist.zip.aes                        {"Size":13816221,"Hash":"zExvCpvP4k2vOdtpVMTJ2QAPPlVI7PCsutR7dKNzLRU="}
4971    287             1662687532      get             duplicati-i0607f51181ee4ba3b2af3cb5aa675813.dindex.zip.aes      {"Size":146077,"Hash":"Ws1Q8iWPxaHE4wCLpyaZflPRQI3Aprad3fB2FCUdV14="}
4972    287             1662687532      get             duplicati-b525acb4e23e44af8aa260994d8d65c21.dblock.zip.aes      {"Size":52333469,"Hash":"kIXuIv93XrKSZPaQqbvFMHZRlUjZqkmK61zV0Qbwm8M="}

Sep 10 has to try the dblock twice, uploads the dindex OK, exhausts the default 5 retries on the dlist, and errors out.
The retried dblock gets a new random name. The retried dlist gets its name incremented by 1 second.
A log file would be clearer, but seeing the hash and size stay the same suggests that it’s the same data.

ID      OperationID     Timestamp       Operation       Path                                                            Data
4973    288             1662773040      list
4974    288             1662773154      put             duplicati-bb55ed12f64a84368a9d7165ef471e5f3.dblock.zip.aes      {"Size":384525,"Hash":"eyGcoUFdYE/PiFYhHG/z5RP4A1EMY7jfgAnCSzNZkLs="}
4975    288             1662773184      put             duplicati-b2a297cd9c6624c7891eb1aa8d731ce0e.dblock.zip.aes      {"Size":384525,"Hash":"eyGcoUFdYE/PiFYhHG/z5RP4A1EMY7jfgAnCSzNZkLs="}
4976    288             1662773204      put             duplicati-i5b13500e0c3347dcb3f23e30495e4f75.dindex.zip.aes      {"Size":5069,"Hash":"e8aRJJvi/uRuxa26YtxzSfjYNu0mhJynaiLg1x50e30="}
4977    288             1662773225      put             duplicati-20220910T012200Z.dlist.zip.aes                        {"Size":13817645,"Hash":"Nzwsz3pNlLtX8sDhw8pqrLLcBN3j2+oRK+/iBTkE0D0="}
4978    288             1662773256      put             duplicati-20220910T012201Z.dlist.zip.aes                        {"Size":13817645,"Hash":"Nzwsz3pNlLtX8sDhw8pqrLLcBN3j2+oRK+/iBTkE0D0="}
4979    288             1662773285      put             duplicati-20220910T012202Z.dlist.zip.aes                        {"Size":13817645,"Hash":"Nzwsz3pNlLtX8sDhw8pqrLLcBN3j2+oRK+/iBTkE0D0="}
4980    288             1662773325      put             duplicati-20220910T012203Z.dlist.zip.aes                        {"Size":13817645,"Hash":"Nzwsz3pNlLtX8sDhw8pqrLLcBN3j2+oRK+/iBTkE0D0="}
4981    288             1662773364      put             duplicati-20220910T012204Z.dlist.zip.aes                        {"Size":13817645,"Hash":"Nzwsz3pNlLtX8sDhw8pqrLLcBN3j2+oRK+/iBTkE0D0="}
4982    288             1662773404      put             duplicati-20220910T012205Z.dlist.zip.aes                        {"Size":13817645,"Hash":"Nzwsz3pNlLtX8sDhw8pqrLLcBN3j2+oRK+/iBTkE0D0="}

Sep 11 is still not uploading well, but one odd finding is that it’s retrying the Sep 10 dlist uploads using reused names.
The size and content hash also seem to have changed. Some change may be normal (time stamps),
however I’m not sure if that’s enough to account for the size change. Regardless, I can’t dissect the files.

ID      OperationID     Timestamp       Operation       Path                                                            Data
4983    289             1662859391      list
4984    289             1662859537      put             duplicati-bd96475eb84a1488192cbd3d88c6817ec.dblock.zip.aes      {"Size":24813,"Hash":"nE+sklcKP2E10/H93nUmEao4mRacK228TG71znudLhQ="}
4985    289             1662859575      put             duplicati-bfca3601c65d7412486eab2b8620a9812.dblock.zip.aes      {"Size":24813,"Hash":"nE+sklcKP2E10/H93nUmEao4mRacK228TG71znudLhQ="}
4986    289             1662859596      put             duplicati-i6324d97e8d7347b8a1cc3ea90d4c465c.dindex.zip.aes      {"Size":3709,"Hash":"u83hCCODlqVEdJIlHM/etDFtE0slHrAqlw3/z9eghA0="}
4987    289             1662859616      put             duplicati-20220911T012200Z.dlist.zip.aes                        {"Size":13817997,"Hash":"Xa5vSpHUUgR4W3ZvrWT2wRIKFEGDYT0aSaFS5BRHT3I="}
4988    289             1662859616      put             duplicati-20220910T012201Z.dlist.zip.aes                        {"Size":13816701,"Hash":"8mDIdHCDBVusing6h9jiRLz1bb0RT1Qj1meeXs35aSU="}
4989    289             1662859660      put             duplicati-20220911T012201Z.dlist.zip.aes                        {"Size":13817997,"Hash":"Xa5vSpHUUgR4W3ZvrWT2wRIKFEGDYT0aSaFS5BRHT3I="}
4990    289             1662859679      put             duplicati-20220910T012202Z.dlist.zip.aes                        {"Size":13816701,"Hash":"8mDIdHCDBVusing6h9jiRLz1bb0RT1Qj1meeXs35aSU="}
4991    289             1662859712      put             duplicati-20220910T012203Z.dlist.zip.aes                        {"Size":13816701,"Hash":"8mDIdHCDBVusing6h9jiRLz1bb0RT1Qj1meeXs35aSU="}
4992    289             1662859751      put             duplicati-20220910T012204Z.dlist.zip.aes                        {"Size":13816701,"Hash":"8mDIdHCDBVusing6h9jiRLz1bb0RT1Qj1meeXs35aSU="}

Let’s try looking at the size errors.

duplicati-i5b13500e0c3347dcb3f23e30495e4f75.dindex.zip.aes at size 5069 had a Sep 10 upload, but because the backup ended prematurely, its after-backup list was not done. The list at the start of Sep 11 found:

{"Name":"duplicati-i5b13500e0c3347dcb3f23e30495e4f75.dindex.zip.aes","LastAccess":"2022-09-10T03:27:33+02:00","LastModification":"2022-09-10T03:27:33+02:00","Size":2048,"IsFolder":false},

duplicati-bfca3601c65d7412486eab2b8620a9812.dblock.zip.aes at size 24813 had its upload on Sep 11 suffer a similar fate. The before-backup file list check at the start of Sep 12 found:

{"Name":"duplicati-bfca3601c65d7412486eab2b8620a9812.dblock.zip.aes","LastAccess":"2022-09-11T03:26:42+02:00","LastModification":"2022-09-11T03:26:42+02:00","Size":4096,"IsFolder":false},

It looks like pCloud accepted these files fine (or there would have been retries) but corrupted contents.

I still haven’t figured out where the constraint error is from. That might need some logs that don’t exist.
After the above errors, your later backups just do the before-backup file list, then error. The list can try changing the Remotevolume table (sometimes logging why). The code that might be hitting the error is at:

The log messages look like information level, and retry level is a little more, so that would catch them.

Thank you for analyzing this. With your explanation I was able to find both files as put… and in the list of the next day with a different length as well.
Your suggestion is to change the log level for the job. This was at the default, with no entry in Advanced options; I changed it to log level retry. OK? So hopefully this will give more information if it happens again.

You also need to say where to log to. This isn’t the regular job log here.

Retry level is a fairly light log. I hope we don’t need to go super-detailed such as seeing all the SQL.
That might give a very nice view of the exact UPDATE that got the constraint failure, but the log is huge.
Let’s not go there unless lighter logging doesn’t give us the answer, if the constraint failure reoccurs.

Your less-than-fatal retry situation is a different worry. Retries are OK. Acceptance then loss is not…

Feature Request: PCloud backup #4337 is (like so many others) awaiting volunteer developer work, however it gives some worrisome comments about pCloud WebDAV, and a possible rclone solution.

Found a UNIQUE constraint clue. I’d been wondering if there was a duplicate name, and I found one.
SELECT Name FROM Remotevolume gave 2543 rows
SELECT DISTINCT Name FROM Remotevolume gave 2542 rows, so there’s a duplicate somewhere
SELECT Name FROM Remotevolume GROUP BY Name HAVING COUNT(Name) > 1
said duplicati-20220910T012206Z.dlist.zip.aes
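Pulling both rows for that name shows the collision; something like the following query (assumed – the column list just matches the dump below) gets them:

SELECT ID, OperationID, Name, Type, Size, Hash, State,
       VerificationCount, DeleteGraceTime
  FROM Remotevolume
 WHERE Name = 'duplicati-20220910T012206Z.dlist.zip.aes';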

ID      OperationID     Name                                            Type    Size            Hash                                            State           VerificationCount       DeleteGraceTime
3471    289             duplicati-20220910T012206Z.dlist.zip.aes        Files   13817645        Nzwsz3pNlLtX8sDhw8pqrLLcBN3j2+oRK+/iBTkE0D0=    Temporary       0                       0
3478    289             duplicati-20220910T012206Z.dlist.zip.aes        Files   13816701        8mDIdHCDBVusing6h9jiRLz1bb0RT1Qj1meeXs35aSU=    Uploading       0                       0

The Size and Hash are familiar, but the Name has not been mentioned before and isn’t in the file list, so this finding might be going nowhere, but it needs a more expert opinion and none is available.

Thank you for the crosslink, interesting. Slowly I’m glimpsing the complexity of this. Maybe I should have a look at rclone, or change the backup concept and run the Duplicati backup to a local network drive which is synced with pCloud using their software. Or just wait – this worked for 6 months after all, and could be fixed without loss. Maybe the pCloud API will be supported some day.

The job does not run. Sorry, I missed that; it just executed once on Sunday, rebuilding the local database. It stops, complaining about the 2 missing files I deleted:

2022-09-22 03:22:00 +02 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: Die Operation Backup wurde gestartet
2022-09-22 03:22:19 +02 - [Error-Duplicati.Library.Main.Operation.BackupHandler-FatalError]: Fatal error
Duplicati.Library.Interface.UserInformationException: Unexpected difference in fileset version 1: 17.09.2022 12:19:21 (database id: 12), found 155922 entries, but expected 155924
   bei Duplicati.Library.Main.Database.LocalDatabase.VerifyConsistency(Int64 blocksize, Int64 hashsize, Boolean verifyfilelists, IDbTransaction transaction)
   bei Duplicati.Library.Main.Operation.Backup.BackupDatabase.<>c__DisplayClass34_0.<VerifyConsistencyAsync>b__0()
   bei Duplicati.Library.Main.Operation.Common.SingleRunner.<>c__DisplayClass3_0.<RunOnMain>b__0()
   bei Duplicati.Library.Main.Operation.Common.SingleRunner.<DoRunOnMain>d__2`1.MoveNext()
--- Ende der Stapelüberwachung vom vorhergehenden Ort, an dem die Ausnahme ausgelöst wurde ---
   bei System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   bei System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   bei Duplicati.Library.Main.Operation.BackupHandler.<RunAsync>d__20.MoveNext()

Sorry for the German log; I changed the interface to English some days ago, but it seems to depend on the language the job was created with.

Meaning there was also a backup on Sunday or there was never a backup after the database rebuild?
Version 0 is the latest. You got the error on version 1 Sat Sep 17. What does Restore show as latest?
Sometimes one can use the delete command in Commandline to delete the complaining version, but

being in a loop (yet only complaining once) sometimes means you just fail on the next bad one found…
It’s probably worth a try, but we wouldn’t want to delete too many versions if it seems like it keeps going.
Problem is that there aren’t any easy great solutions (I think).

OK, it seems I got it back to normal again. I deleted the backup version lacking the two files I had deleted manually. Two backups have succeeded since then: one started by me after deleting the spoilt version, and one last night as scheduled.
With this, the problem should be fixed for my needs.
To wrap up what I/we did:

  1. Delete and Rebuild local Database
  2. Delete remote files with wrong size/spoilt header reported in Duplicati logfile
  3. Delete backup version originally containing spoilt files
    Results:
    After step 2 I was able to run one backup successfully, but only one.
    After step 3 (the current state, after two runs) it’s back to normal.
    Step 2 at least did not solve the problem by itself; I even wonder if the rebuild of the database was necessary. Should it happen again, I would just delete the faulty backup version as a first step.

Thanks again for your help with this, and to anybody involved with Duplicati for this great tool.
I’ll keep Retry-level logging activated; if it happens again I will post it here.
Finally, this is the log for the last changes/backups, just to complete this:

2022-09-21 03:22:00 +02 - [Warning-Duplicati.Library.Main.Controller-DeprecatedOption]: The option log-level is deprecated: Use the log-file-log-level and console-log-level options instead
2022-09-21 03:22:00 +02 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started
2022-09-21 03:22:20 +02 - [Error-Duplicati.Library.Main.Operation.BackupHandler-FatalError]: Fatal error
Duplicati.Library.Interface.UserInformationException: Unexpected difference in fileset version 1: 17.09.2022 12:19:21 (database id: 12), found 155922 entries, but expected 155924
   bei Duplicati.Library.Main.Database.LocalDatabase.VerifyConsistency(Int64 blocksize, Int64 hashsize, Boolean verifyfilelists, IDbTransaction transaction)
   bei Duplicati.Library.Main.Operation.Backup.BackupDatabase.<>c__DisplayClass34_0.<VerifyConsistencyAsync>b__0()
   bei Duplicati.Library.Main.Operation.Common.SingleRunner.<>c__DisplayClass3_0.<RunOnMain>b__0()
   bei Duplicati.Library.Main.Operation.Common.SingleRunner.<DoRunOnMain>d__2`1.MoveNext()
--- Ende der Stapelüberwachung vom vorhergehenden Ort, an dem die Ausnahme ausgelöst wurde ---
   bei System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   bei System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   bei Duplicati.Library.Main.Operation.BackupHandler.<RunAsync>d__20.MoveNext()
2022-09-22 03:22:00 +02 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: Die Operation Backup wurde gestartet
2022-09-22 03:22:19 +02 - [Error-Duplicati.Library.Main.Operation.BackupHandler-FatalError]: Fatal error
Duplicati.Library.Interface.UserInformationException: Unexpected difference in fileset version 1: 17.09.2022 12:19:21 (database id: 12), found 155922 entries, but expected 155924
   bei Duplicati.Library.Main.Database.LocalDatabase.VerifyConsistency(Int64 blocksize, Int64 hashsize, Boolean verifyfilelists, IDbTransaction transaction)
   bei Duplicati.Library.Main.Operation.Backup.BackupDatabase.<>c__DisplayClass34_0.<VerifyConsistencyAsync>b__0()
   bei Duplicati.Library.Main.Operation.Common.SingleRunner.<>c__DisplayClass3_0.<RunOnMain>b__0()
   bei Duplicati.Library.Main.Operation.Common.SingleRunner.<DoRunOnMain>d__2`1.MoveNext()
--- Ende der Stapelüberwachung vom vorhergehenden Ort, an dem die Ausnahme ausgelöst wurde ---
   bei System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   bei System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   bei Duplicati.Library.Main.Operation.BackupHandler.<RunAsync>d__20.MoveNext()
2022-09-23 00:31:11 +02 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Delete has started
2022-09-23 00:31:12 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  ()
2022-09-23 00:32:44 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (2,47 KB)
2022-09-23 00:32:44 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-StartCheck]: Start checking if backups can be removed
2022-09-23 00:32:44 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-FramesAndIntervals]: Time frames and intervals pairs: 7.00:00:00 / 1.00:00:00, 28.00:00:00 / 7.00:00:00, 365.00:00:00 / 31.00:00:00
2022-09-23 00:32:44 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-BackupList]: Backups to consider: 17.09.2022 12:19:21, 11.09.2022 03:22:02, 03.09.2022 03:22:00, 02.09.2022 03:22:01, 26.08.2022 03:22:00, 22.07.2022 03:22:00, 16.06.2022 03:22:05, 10.05.2022 03:26:36, 01.04.2022 03:27:26, 27.02.2022 02:29:14, 24.02.2022 01:48:20, 23.02.2022 23:52:24
2022-09-23 00:32:44 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-BackupsToDelete]: Backups outside of all time frames and thus getting deleted: 
2022-09-23 00:32:44 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-AllBackupsToDelete]: All backups to delete: 
2022-09-23 00:32:44 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler-DeleteRemoteFileset]: Deleting 1 remote fileset(s) ...
2022-09-23 00:32:57 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Started: duplicati-20220917T101921Z.dlist.zip.aes (13,19 MB)
2022-09-23 00:32:58 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Completed: duplicati-20220917T101921Z.dlist.zip.aes (13,19 MB)
2022-09-23 00:32:58 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler-DeleteResults]: Deleted 1 remote fileset(s)
2022-09-23 00:33:06 +02 - [Information-Duplicati.Library.Main.Database.LocalDeleteDatabase-CompactReason]: Compacting not required
2022-09-23 00:33:31 +02 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started
2022-09-23 00:33:46 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  ()
2022-09-23 00:35:05 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (2,47 KB)
2022-09-23 00:38:05 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-bb665c0271a4b40f4867a40c969995393.dblock.zip.aes (49,97 MB)
2022-09-23 00:38:07 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b0f6ede1e352a47fe8ba76f54150dec73.dblock.zip.aes (23,65 MB)
2022-09-23 00:38:42 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b0f6ede1e352a47fe8ba76f54150dec73.dblock.zip.aes (23,65 MB)
2022-09-23 00:38:42 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-ic0e635c07228428fb056adf9d9adda4f.dindex.zip.aes (43,90 KB)
2022-09-23 00:38:42 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-ic0e635c07228428fb056adf9d9adda4f.dindex.zip.aes (43,90 KB)
2022-09-23 00:38:58 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-bb665c0271a4b40f4867a40c969995393.dblock.zip.aes (49,97 MB)
2022-09-23 00:38:58 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-i0d1af0e359eb46999667a21c31830542.dindex.zip.aes (66,17 KB)
2022-09-23 00:38:58 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-i0d1af0e359eb46999667a21c31830542.dindex.zip.aes (66,17 KB)
2022-09-23 00:38:59 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-20220922T223331Z.dlist.zip.aes (13,21 MB)
2022-09-23 00:39:09 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-20220922T223331Z.dlist.zip.aes (13,21 MB)
2022-09-23 00:39:09 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-StartCheck]: Start checking if backups can be removed
2022-09-23 00:39:09 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-FramesAndIntervals]: Time frames and intervals pairs: 7.00:00:00 / 1.00:00:00, 28.00:00:00 / 7.00:00:00, 365.00:00:00 / 31.00:00:00
2022-09-23 00:39:09 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-BackupList]: Backups to consider: 18.09.2022 03:22:00, 11.09.2022 03:22:02, 03.09.2022 03:22:00, 02.09.2022 03:22:01, 26.08.2022 03:22:00, 22.07.2022 03:22:00, 16.06.2022 03:22:05, 10.05.2022 03:26:36, 01.04.2022 03:27:26, 27.02.2022 02:29:14, 24.02.2022 01:48:20, 23.02.2022 23:52:24
2022-09-23 00:39:09 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-BackupsToDelete]: Backups outside of all time frames and thus getting deleted: 
2022-09-23 00:39:09 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-AllBackupsToDelete]: All backups to delete: 
2022-09-23 00:39:14 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler-DeleteResults]: No remote filesets were deleted
2022-09-23 00:39:14 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  ()
2022-09-23 00:40:49 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (2,48 KB)
2022-09-23 00:40:49 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-20220401T012726Z.dlist.zip.aes (11,14 MB)
2022-09-23 00:40:55 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-20220401T012726Z.dlist.zip.aes (11,14 MB)
2022-09-23 00:40:55 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-id70e0c8cec9645a6ac6a9b90caca19ee.dindex.zip.aes (70,20 KB)
2022-09-23 00:40:56 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-id70e0c8cec9645a6ac6a9b90caca19ee.dindex.zip.aes (70,20 KB)
2022-09-23 00:40:56 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b9c9451aca94a43919ad6da78e3f4d3f1.dblock.zip.aes (49,93 MB)
2022-09-23 00:41:09 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-b9c9451aca94a43919ad6da78e3f4d3f1.dblock.zip.aes (49,93 MB)
2022-09-23 03:22:00 +02 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started
2022-09-23 03:22:14 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  ()
2022-09-23 03:25:02 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (2,48 KB)
2022-09-23 03:26:21 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b06da1b0ec1df4800a8b1c8e3e332996d.dblock.zip.aes (23,42 KB)
2022-09-23 03:26:22 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b06da1b0ec1df4800a8b1c8e3e332996d.dblock.zip.aes (23,42 KB)
2022-09-23 03:26:29 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-ie1563679a06249ba8509da0e8da75201.dindex.zip.aes (3,40 KB)
2022-09-23 03:26:29 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-ie1563679a06249ba8509da0e8da75201.dindex.zip.aes (3,40 KB)
2022-09-23 03:26:31 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-20220923T012200Z.dlist.zip.aes (13,21 MB)
2022-09-23 03:27:42 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-20220923T012200Z.dlist.zip.aes (13,21 MB)
2022-09-23 03:27:42 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-StartCheck]: Start checking if backups can be removed
2022-09-23 03:27:42 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-FramesAndIntervals]: Time frames and intervals pairs: 7.00:00:00 / 1.00:00:00, 28.00:00:00 / 7.00:00:00, 365.00:00:00 / 31.00:00:00
2022-09-23 03:27:42 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-BackupList]: Backups to consider: 23.09.2022 00:33:31, 18.09.2022 03:22:00, 11.09.2022 03:22:02, 03.09.2022 03:22:00, 02.09.2022 03:22:01, 26.08.2022 03:22:00, 22.07.2022 03:22:00, 16.06.2022 03:22:05, 10.05.2022 03:26:36, 01.04.2022 03:27:26, 27.02.2022 02:29:14, 24.02.2022 01:48:20, 23.02.2022 23:52:24
2022-09-23 03:27:42 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-BackupsToDelete]: Backups outside of all time frames and thus getting deleted: 
2022-09-23 03:27:42 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-AllBackupsToDelete]: All backups to delete: 
2022-09-23 03:27:47 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler-DeleteResults]: No remote filesets were deleted
2022-09-23 03:27:54 +02 - [Information-Duplicati.Library.Main.Database.LocalDeleteDatabase-CompactReason]: Compacting not required
2022-09-23 03:27:54 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  ()
2022-09-23 03:30:39 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (2,48 KB)
2022-09-23 03:30:40 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-20220902T012201Z.dlist.zip.aes (13,17 MB)
2022-09-23 03:31:18 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-20220902T012201Z.dlist.zip.aes (13,17 MB)
2022-09-23 03:31:18 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-i3041530125fa46209fd33d46ddd87b09.dindex.zip.aes (28,68 KB)
2022-09-23 03:31:19 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-i3041530125fa46209fd33d46ddd87b09.dindex.zip.aes (28,68 KB)
2022-09-23 03:31:19 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b93abe098f95344cebfa0d3b96fb22978.dblock.zip.aes (49,93 MB)
2022-09-23 03:33:54 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-b93abe098f95344cebfa0d3b96fb22978.dblock.zip.aes (49,93 MB)
2022-09-23 07:55:31 +02 - [Warning-Duplicati.Library.Main.Controller-DeprecatedOption]: The option verbose is deprecated: Set a log-level for the desired output method instead
2022-09-23 07:55:31 +02 - [Warning-Duplicati.Library.Main.Controller-UnsupportedOption]: The supplied option --full-result  is not supported and will be ignored
2022-09-23 07:55:31 +02 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation ListChanges has started

That’s good to hear. If you haven’t tested a sample restore yet, that would be an excellent thing to try.

I was able to mimic your database bug report (thanks for that) to reproduce the UNIQUE constraint error.
This confirmed my suspicion that RemoteListAnalysis cited earlier had a collision between two rows.

We don’t have enough information on the messy middle portion to know how those rows came to be.
The reason the database doesn’t show them being created is probably due to rollback of partial work.
This is where log files do better: a log file is just sequential, without the ability to roll back any of its lines.

2022-09-22 20:40:05 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Operation.BackupHandler-PreBackupVerify]: Starting - PreBackupVerify
2022-09-22 20:40:05 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.BackendManager-RemoteOperationList]: Starting - RemoteOperationList
2022-09-22 20:40:05 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  ()
2022-09-22 20:40:05 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (3 bytes)
2022-09-22 20:40:05 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.BackendManager-RemoteOperationList]: RemoteOperationList took 0:00:00:00.001
2022-09-22 20:40:05 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: Starting - ExecuteReader: SELECT "ID", "Timestamp" FROM "Fileset" ORDER BY "Timestamp" DESC
2022-09-22 20:40:05 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: ExecuteReader: SELECT "ID", "Timestamp" FROM "Fileset" ORDER BY "Timestamp" DESC took 0:00:00:00.000
2022-09-22 20:40:05 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: Starting - ExecuteReader: SELECT DISTINCT "Name", "State" FROM "Remotevolume" WHERE "Name" IN (SELECT "Name" FROM "Remotevolume" WHERE "State" IN ("Deleted", "Deleting")) AND NOT "State" IN ("Deleted", "Deleting")
2022-09-22 20:40:05 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: ExecuteReader: SELECT DISTINCT "Name", "State" FROM "Remotevolume" WHERE "Name" IN (SELECT "Name" FROM "Remotevolume" WHERE "State" IN ("Deleted", "Deleting")) AND NOT "State" IN ("Deleted", "Deleting") took 0:00:00:00.000
2022-09-22 20:40:05 -04 - [Information-Duplicati.Library.Main.Operation.FilelistProcessor-RemoteUnwantedMissingFile]: removing file listed as Temporary: duplicati-20220919T144756Z.dlist.zip
2022-09-22 20:40:05 -04 - [Information-Duplicati.Library.Main.Operation.FilelistProcessor-SchedulingMissingFileForDelete]: scheduling missing file for deletion, currently listed as Uploading: duplicati-20220919T144756Z.dlist.zip
2022-09-22 20:40:05 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Operation.BackupHandler-PreBackupVerify]: PreBackupVerify took 0:00:00:00.005
2022-09-22 20:40:06 -04 - [Error-Duplicati.Library.Main.Operation.BackupHandler-FatalError]: Fatal error
code = Constraint (19), message = System.Data.SQLite.SQLiteException (0x800027AF): constraint failed
UNIQUE constraint failed: Remotevolume.Name, Remotevolume.State
   at System.Data.SQLite.SQLite3.Reset(SQLiteStatement stmt)
   at System.Data.SQLite.SQLite3.Step(SQLiteStatement stmt)
   at System.Data.SQLite.SQLiteDataReader.NextResult()
   at System.Data.SQLite.SQLiteDataReader..ctor(SQLiteCommand cmd, CommandBehavior behave)
   at System.Data.SQLite.SQLiteCommand.ExecuteReader(CommandBehavior behavior)
   at System.Data.SQLite.SQLiteCommand.ExecuteNonQuery(CommandBehavior behavior)
   at Duplicati.Library.Main.Database.LocalDatabase.UpdateRemoteVolume(String name, RemoteVolumeState state, Int64 size, String hash, Boolean suppressCleanup, TimeSpan deleteGraceTime, IDbTransaction transaction)
   at Duplicati.Library.Main.Operation.FilelistProcessor.RemoteListAnalysis(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, IEnumerable`1 protectedFiles)
   at Duplicati.Library.Main.Operation.FilelistProcessor.VerifyRemoteList(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, IEnumerable`1 protectedFiles)
   at Duplicati.Library.Main.Operation.BackupHandler.PreBackupVerify(BackendManager backend, String protectedfile)
   at Duplicati.Library.Main.Operation.BackupHandler.<RunAsync>d__20.MoveNext()
2022-09-22 20:40:06 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: PRAGMA optimize
2022-09-22 20:40:06 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: PRAGMA optimize took 0:00:00:00.000
2022-09-22 20:40:06 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Controller-RunBackup]: Running Backup took 0:00:00:00.302
2022-09-22 20:40:06 -04 - [Error-Duplicati.Library.Main.Controller-FailedOperation]: The operation Backup has failed with error: constraint failed
UNIQUE constraint failed: Remotevolume.Name, Remotevolume.State
code = Constraint (19), message = System.Data.SQLite.SQLiteException (0x800027AF): constraint failed
UNIQUE constraint failed: Remotevolume.Name, Remotevolume.State
   at System.Data.SQLite.SQLite3.Reset(SQLiteStatement stmt)
   at System.Data.SQLite.SQLite3.Step(SQLiteStatement stmt)
   at System.Data.SQLite.SQLiteDataReader.NextResult()
   at System.Data.SQLite.SQLiteDataReader..ctor(SQLiteCommand cmd, CommandBehavior behave)
   at System.Data.SQLite.SQLiteCommand.ExecuteReader(CommandBehavior behavior)
   at System.Data.SQLite.SQLiteCommand.ExecuteNonQuery(CommandBehavior behavior)
   at Duplicati.Library.Main.Database.LocalDatabase.UpdateRemoteVolume(String name, RemoteVolumeState state, Int64 size, String hash, Boolean suppressCleanup, TimeSpan deleteGraceTime, IDbTransaction transaction)
   at Duplicati.Library.Main.Operation.FilelistProcessor.RemoteListAnalysis(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, IEnumerable`1 protectedFiles)
   at Duplicati.Library.Main.Operation.FilelistProcessor.VerifyRemoteList(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, IEnumerable`1 protectedFiles)
   at Duplicati.Library.Main.Operation.BackupHandler.PreBackupVerify(BackendManager backend, String protectedfile)
   at Duplicati.Library.Main.Operation.BackupHandler.<RunAsync>d__20.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at CoCoL.ChannelExtensions.WaitForTaskOrThrow(Task task)
   at Duplicati.Library.Main.Controller.<>c__DisplayClass14_0.<Backup>b__0(BackupResults result)
   at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)

I just made a little test backup, then used DB Browser for SQLite to copy the dlist row twice, using a name incremented by a second. First extra row got State Temporary, and second got State Uploading, like yours.

Temporary gets set for deletion at

Uploading gets set for deletion at

The theory is that at some later point in the code, there’s an (unfortunately unlogged) UpdateRemoteVolume call causing these two rows to attempt to reach the same State, e.g. Deleting. That is a uniqueness violation.
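Here is a minimal SQLite sketch of that theory. The cut-down table is an assumption (the real Remotevolume table has more columns), the UNIQUE (Name, State) constraint is inferred from the error message, and the final UPDATE merely stands in for whatever statement UpdateRemoteVolume actually runs:

-- Cut-down table; only the columns needed to show the collision (assumed schema).
CREATE TABLE Remotevolume (
    ID    INTEGER PRIMARY KEY,
    Name  TEXT NOT NULL,
    State TEXT NOT NULL,
    UNIQUE (Name, State)
);

-- The two rows found in the bug report database:
INSERT INTO Remotevolume (ID, Name, State) VALUES
    (3471, 'duplicati-20220910T012206Z.dlist.zip.aes', 'Temporary'),
    (3478, 'duplicati-20220910T012206Z.dlist.zip.aes', 'Uploading');

-- If a later, unlogged update moves both rows toward the same state,
-- the second row to change collides with the first:
UPDATE Remotevolume
   SET State = 'Deleting'
 WHERE Name = 'duplicati-20220910T012206Z.dlist.zip.aes';
-- Error: UNIQUE constraint failed: Remotevolume.Name, Remotevolume.State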

Restored a frequently changed file from an old version. It worked, the file opens in its application, but seeing complaints about missing list entries and files is not very promising.

Duplicati.CommandLine.exe restore "webdavs://ewebdav.pcloud.com/Backups/Duplicati/Privatdaten?auth-username=******&auth-password=*******" "E:\Privatdaten\OwnDataKL\PWD_neu.kdbx" --version=5 --restore-path="e:\Duplicati_Error_Backup\restore"
Restore started at 23.09.2022 18:04:06

Verschlüsselungspassphrase eingeben: C.....4
  Listing remote folder ...
  Downloading file (13,17 MB) ...
  Downloading file (31,93 KB) ...
  Downloading file (106,51 KB) ...
  Downloading file (42,61 KB) ...
  ...
  Downloading file (58,22 KB) ...
  Downloading file (3,62 KB) ...
  Downloading file (42,83 KB) ...
Remote file referenced as duplicati-bfca3601c65d7412486eab2b8620a9812.dblock.zip.aes by duplicati-i6324d97e8d7347b8a1cc3ea90d4c465c.dindex.zip.aes, but not found in list, registering a missing remote file
  Downloading file (49,40 KB) ...
  Downloading file (46,34 KB) ...
  Downloading file (17,97 KB) ...
  Downloading file (91,42 KB) ...
  Downloading file (32,50 KB) ...
  ...
  Downloading file (17,97 KB) ...
  Downloading file (32,75 KB) ...
  Downloading file (53,09 KB) ...
  Downloading file (44,37 KB) ...
  Downloading file (38,26 KB) ...
Found 1 missing volumes; attempting to replace blocks from existing volumes
Found 1 missing volumes; attempting to replace blocks from existing volumes
Checking remote backup ...
  Listing remote folder ...
Checking existing target files ...
  1 files need to be restored (102,11 KB)
Scanning local files for needed data ...
  Downloading file (9,59 MB) ...
  0 files need to be restored (0 Bytes)
Verifying restored files ...
Restored 1 (102,11 KB) files to e:\Duplicati_Error_Backup\restore
Duration of restore: 00:11:44

Hello

could you set the record straight about your 3 jobs for a cloud backend: is each job pointing to a different directory on the backend (the preferred option), and if not, did you set a prefix for each job?

The three jobs start with a 1-hour offset, so they do not overlap. Each job has its own destination folder on pCloud.

This is the file that pCloud seemingly broke, so you deleted it. Ordinarily one would list and purge broken files right after that, but the exact sequence you did is a little confusing to me. You could try those steps again.

What this restore result suggests, though, is that the file you chose will be purged because it’s partly missing. A destination killing files is a good way to damage a backup. FWIW, here’s a Duplicacy pCloud problem from today:

Pcloud Webdav failing

The day that pCloud error started is different from yours, but both of you had a seemingly good history before.

One of the other jobs hung at “verifying backend data” a day ago. I had to kill the Duplicati process. I’m wondering why there were two of them.


Killing the one using fewer resources showed no effect. When I killed the second one, the interface was no longer accessible.
After a restart I started the incomplete job manually, but received a connection error:
System.Net.WebException: Die Verbindung mit dem Remoteserver kann nicht hergestellt werden. ---> System.Net.Sockets.SocketException: Ein Verbindungsversuch ist fehlgeschlagen, da die Gegenstelle nach einer bestimmten Zeitspanne nicht richtig reagiert hat, oder die hergestellte Verbindung war fehlerhaft, da der verbundene Host nicht reagiert hat 45.131.244.151:443
Tried again a few times over the next hours without success, so pCloud had a problem with the WebDAV interface; it seems it had been down for several hours. The last two nights all three jobs succeeded.
Since I have enough cloud space, I’m thinking about setting up an rclone backup in parallel using the pCloud API for testing. Sad to say, rclone is missing many features I like Duplicati for.

That’s strange, I tried

 curl https://ewebdav.pcloud.com
<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html><head>
<title>401 Unauthorized</title>
</head><body>
<h1>Unauthorized</h1>
<p>This server could not verify that you
are authorized to access the document
requested.  Either you supplied the wrong
credentials (e.g., bad password), or your
browser doesn't understand how to supply
the credentials required.</p>
<hr>
<address>Apache/2.4.38 (Debian) Server at ewebdav.pcloud.com Port 443</address>
</body></html>

so it accepts connections. Did you try to restart your computer?