Remote backup task quit working : constraint failed UNIQUE constraint failed: Remotevolume.Name, Remotevolume.State

Thank you. The most obvious thing is that pCloud WebDAV uploads had a very bad Sep 10 and 11.
This would be visible in a log file (log-file=<path> with log-file-log-level=retry) if you had one. Maybe it's time to start keeping one.

The last log in this job database is from the Sep 9 backup, but it's erased in the bug report for privacy.
You can look at it in your Duplicati job logs, and Sep 10 and 11 might be in the server log if the backup failed completely. It looks like it exhausted the default number-of-retries, which is 5, and errored out.

This by itself is not a problem. The next Duplicati backup attempts to pick up from where uploads ended.

Sep 9 looked normal: the list at the start checks that things are in order, the dblock file data (and its dindex) upload, the dlist records what's in the backup, retention deletes an older dlist, a list checks things again, and a sampled test (the three get rows below) occurs.

ID      OperationID     Timestamp       Operation       Path                                                            Data
4964    287             1662686871      list
4965    287             1662687049      put             duplicati-bd092300e2a1642c695580e3903b4ba6c.dblock.zip.aes      {"Size":547309,"Hash":"BeV/mRq0tKCHYkrFc6XezIAwA/7WvqQFguZEk5lnrZ4="}
4966    287             1662687052      put             duplicati-i642faf4e90b44b43920386c7f0a7e327.dindex.zip.aes      {"Size":4797,"Hash":"bUMLX4rwCSYJlQ7CMQO6Bb+F6ofmBVI71S5lUUbqe+0="}
4967    287             1662687054      put             duplicati-20220909T012200Z.dlist.zip.aes                        {"Size":13816701,"Hash":"ToGS8S0Iz6tPoBC3j+DOlMaJaKmol/c0zvp96Jd9ijg="}
4968    287             1662687073      delete          duplicati-20220812T012200Z.dlist.zip.aes        
4969    287             1662687512      list
4970    287             1662687532      get             duplicati-20220907T012200Z.dlist.zip.aes                        {"Size":13816221,"Hash":"zExvCpvP4k2vOdtpVMTJ2QAPPlVI7PCsutR7dKNzLRU="}
4971    287             1662687532      get             duplicati-i0607f51181ee4ba3b2af3cb5aa675813.dindex.zip.aes      {"Size":146077,"Hash":"Ws1Q8iWPxaHE4wCLpyaZflPRQI3Aprad3fB2FCUdV14="}
4972    287             1662687532      get             duplicati-b525acb4e23e44af8aa260994d8d65c21.dblock.zip.aes      {"Size":52333469,"Hash":"kIXuIv93XrKSZPaQqbvFMHZRlUjZqkmK61zV0Qbwm8M="}

Sep 10 has to try the dblock twice, uploads the dindex OK, exhausts the default 5 retries on the dlist, and errors out. A retried dblock gets a new random name; a retried dlist gets the timestamp in its name incremented by 1 second. A log file would be clearer, but seeing the same hash and size on each attempt suggests it's the same data.
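As an illustration of the dlist naming convention visible in the table below (a sketch, not Duplicati's actual code), the embedded UTC timestamp just moves forward one second per retry:

```python
from datetime import datetime, timedelta

# Illustration only: a dlist name embeds a UTC timestamp, and each retry
# below reappears with that stamp bumped by one second.
def bump_dlist_name(name: str) -> str:
    prefix, fmt = "duplicati-", "%Y%m%dT%H%M%SZ"
    stamp = datetime.strptime(name[len(prefix):len(prefix) + 16], fmt)
    return prefix + (stamp + timedelta(seconds=1)).strftime(fmt) + ".dlist.zip.aes"

print(bump_dlist_name("duplicati-20220910T012200Z.dlist.zip.aes"))
# duplicati-20220910T012201Z.dlist.zip.aes
```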

ID      OperationID     Timestamp       Operation       Path                                                            Data
4973    288             1662773040      list
4974    288             1662773154      put             duplicati-bb55ed12f64a84368a9d7165ef471e5f3.dblock.zip.aes      {"Size":384525,"Hash":"eyGcoUFdYE/PiFYhHG/z5RP4A1EMY7jfgAnCSzNZkLs="}
4975    288             1662773184      put             duplicati-b2a297cd9c6624c7891eb1aa8d731ce0e.dblock.zip.aes      {"Size":384525,"Hash":"eyGcoUFdYE/PiFYhHG/z5RP4A1EMY7jfgAnCSzNZkLs="}
4976    288             1662773204      put             duplicati-i5b13500e0c3347dcb3f23e30495e4f75.dindex.zip.aes      {"Size":5069,"Hash":"e8aRJJvi/uRuxa26YtxzSfjYNu0mhJynaiLg1x50e30="}
4977    288             1662773225      put             duplicati-20220910T012200Z.dlist.zip.aes                        {"Size":13817645,"Hash":"Nzwsz3pNlLtX8sDhw8pqrLLcBN3j2+oRK+/iBTkE0D0="}
4978    288             1662773256      put             duplicati-20220910T012201Z.dlist.zip.aes                        {"Size":13817645,"Hash":"Nzwsz3pNlLtX8sDhw8pqrLLcBN3j2+oRK+/iBTkE0D0="}
4979    288             1662773285      put             duplicati-20220910T012202Z.dlist.zip.aes                        {"Size":13817645,"Hash":"Nzwsz3pNlLtX8sDhw8pqrLLcBN3j2+oRK+/iBTkE0D0="}
4980    288             1662773325      put             duplicati-20220910T012203Z.dlist.zip.aes                        {"Size":13817645,"Hash":"Nzwsz3pNlLtX8sDhw8pqrLLcBN3j2+oRK+/iBTkE0D0="}
4981    288             1662773364      put             duplicati-20220910T012204Z.dlist.zip.aes                        {"Size":13817645,"Hash":"Nzwsz3pNlLtX8sDhw8pqrLLcBN3j2+oRK+/iBTkE0D0="}
4982    288             1662773404      put             duplicati-20220910T012205Z.dlist.zip.aes                        {"Size":13817645,"Hash":"Nzwsz3pNlLtX8sDhw8pqrLLcBN3j2+oRK+/iBTkE0D0="}

Sep 11 is still not uploading well, but one odd finding is that it retries the Sep 10 dlist using reused names. The size and the content hash also seem to have changed. Some change may be normal (timestamps), however I'm not sure that's enough to account for the size change. Regardless, I can't dissect the files.

ID      OperationID     Timestamp       Operation       Path                                                            Data
4983    289             1662859391      list
4984    289             1662859537      put             duplicati-bd96475eb84a1488192cbd3d88c6817ec.dblock.zip.aes      {"Size":24813,"Hash":"nE+sklcKP2E10/H93nUmEao4mRacK228TG71znudLhQ="}
4985    289             1662859575      put             duplicati-bfca3601c65d7412486eab2b8620a9812.dblock.zip.aes      {"Size":24813,"Hash":"nE+sklcKP2E10/H93nUmEao4mRacK228TG71znudLhQ="}
4986    289             1662859596      put             duplicati-i6324d97e8d7347b8a1cc3ea90d4c465c.dindex.zip.aes      {"Size":3709,"Hash":"u83hCCODlqVEdJIlHM/etDFtE0slHrAqlw3/z9eghA0="}
4987    289             1662859616      put             duplicati-20220911T012200Z.dlist.zip.aes                        {"Size":13817997,"Hash":"Xa5vSpHUUgR4W3ZvrWT2wRIKFEGDYT0aSaFS5BRHT3I="}
4988    289             1662859616      put             duplicati-20220910T012201Z.dlist.zip.aes                        {"Size":13816701,"Hash":"8mDIdHCDBVusing6h9jiRLz1bb0RT1Qj1meeXs35aSU="}
4989    289             1662859660      put             duplicati-20220911T012201Z.dlist.zip.aes                        {"Size":13817997,"Hash":"Xa5vSpHUUgR4W3ZvrWT2wRIKFEGDYT0aSaFS5BRHT3I="}
4990    289             1662859679      put             duplicati-20220910T012202Z.dlist.zip.aes                        {"Size":13816701,"Hash":"8mDIdHCDBVusing6h9jiRLz1bb0RT1Qj1meeXs35aSU="}
4991    289             1662859712      put             duplicati-20220910T012203Z.dlist.zip.aes                        {"Size":13816701,"Hash":"8mDIdHCDBVusing6h9jiRLz1bb0RT1Qj1meeXs35aSU="}
4992    289             1662859751      put             duplicati-20220910T012204Z.dlist.zip.aes                        {"Size":13816701,"Hash":"8mDIdHCDBVusing6h9jiRLz1bb0RT1Qj1meeXs35aSU="}
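Picking that name reuse out of the tables by eye is tedious. A query sketch like this (same hypothetical database path as before) would list every remote name that was put more than once, and in which backup runs:

```python
import sqlite3

db = sqlite3.connect("/path/to/Duplicati/XXXXXXXX.sqlite")  # hypothetical path

# Any remote name uploaded more than once, and the operations involved.
query = (
    "SELECT Path, COUNT(*), GROUP_CONCAT(DISTINCT OperationID)"
    " FROM RemoteOperation WHERE Operation = 'put'"
    " GROUP BY Path HAVING COUNT(*) > 1"
)
for path, puts, op_ids in db.execute(query):
    print(f"{path}: {puts} puts in operations {op_ids}")
```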

Let’s try looking at the size errors.

duplicati-i5b13500e0c3347dcb3f23e30495e4f75.dindex.zip.aes at size 5069 had a Sep 10 upload, but because the backup ended prematurely, its after-backup list was never done. The list at the start of Sep 11 found

{"Name":"duplicati-i5b13500e0c3347dcb3f23e30495e4f75.dindex.zip.aes","LastAccess":"2022-09-10T03:27:33+02:00","LastModification":"2022-09-10T03:27:33+02:00","Size":2048,"IsFolder":false},

duplicati-bfca3601c65d7412486eab2b8620a9812.dblock.zip.aes at size 24813 had an upload on Sep 11 that suffered a similar fate. The before-backup file list check at the start of Sep 12 found

{"Name":"duplicati-bfca3601c65d7412486eab2b8620a9812.dblock.zip.aes","LastAccess":"2022-09-11T03:26:42+02:00","LastModification":"2022-09-11T03:26:42+02:00","Size":4096,"IsFolder":false},

It looks like pCloud accepted these files fine (or there would have been retries) but corrupted their contents.
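The check itself is easy to script. A tiny sketch, using the sizes copied from the entries above, shows the mismatch test:

```python
# Sizes copied from the rows above: what each put recorded at upload time
# versus what a later remote file listing reported for the same name.
uploads = {
    "duplicati-i5b13500e0c3347dcb3f23e30495e4f75.dindex.zip.aes": 5069,
    "duplicati-bfca3601c65d7412486eab2b8620a9812.dblock.zip.aes": 24813,
}
listing = [
    {"Name": "duplicati-i5b13500e0c3347dcb3f23e30495e4f75.dindex.zip.aes", "Size": 2048},
    {"Name": "duplicati-bfca3601c65d7412486eab2b8620a9812.dblock.zip.aes", "Size": 4096},
]

for entry in listing:
    expected = uploads.get(entry["Name"])
    if expected is not None and expected != entry["Size"]:
        print(f"{entry['Name']}: uploaded {expected} bytes, remote now shows {entry['Size']}")
```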

I still haven't figured out where the constraint error comes from. That might need some logs that don't exist yet.
After the above errors, your later backups just do the before-backup file list, then error out. That list can try changing the Remotevolume table (sometimes logging why). The code that might be hitting the error is at:

The log messages look like Information level, and Retry level is a little more verbose than that, so a retry-level log file would catch them.
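If anyone wants to poke at the Remotevolume table directly, a diagnostic sketch like this (assuming only the Name and State columns that the error message itself names) would show names that occupy more than one row, which is the kind of situation a state change could collide with:

```python
import sqlite3

db = sqlite3.connect("/path/to/Duplicati/XXXXXXXX.sqlite")  # hypothetical path

# Names with more than one Remotevolume row; an update that lands a second
# row on an existing (Name, State) pair is what the unique constraint in
# the error message would reject.
query = (
    "SELECT Name, GROUP_CONCAT(State) FROM Remotevolume"
    " GROUP BY Name HAVING COUNT(*) > 1"
)
for name, states in db.execute(query):
    print(f"{name}: states {states}")
```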