"Unexpected number of remote volumes marked as deleted" error

I am getting an “Unexpected number of remote volumes marked as deleted.” error every time I run my backup job.

Googling turned up several topics with similar errors, but most were very old or were fixed by upgrading Duplicati. Since the version they upgraded to is older than what I’m running, I don’t think that would help. :slight_smile:

I have tried running the repair operation multiple times, but it does not fix anything.

I have also deleted and recreated the database via the web UI multiple times. Again, it did not help.

How can I fix this?

OS: Ubuntu 16.04
Duplicati: 2.0.2.1_beta_2017-08-01

Here is the sanitized output I get when running the backup command from the command line.

Reach# mono "/usr/lib/duplicati/Duplicati.CommandLine.exe" backup "ftp://goflex.lionsarch.tyria/GoFlex Home Personal/Duplicati/Reach/GoFlex Reach Home?auth-username=<ftpusername>&auth-password=password" "/home/<localusername>/" "/etc/hosts" "/var/lib/mysql/" "/opt/databasedumps/" --backup-name="GoFlex Reach Home" --dbpath="/root/.config/Duplicati/AHGLUNAFNK.sqlite" --encryption-module="aes" --compression-module="zip" --dblock-size="50mb" --keep-time="3Y" --passphrase="backuppassword" --send-mail-from="no-reply-reach" --send-mail-level="All" --send-mail-password="mailpass" --send-mail-to="mailto" --send-mail-url="mailurl" --send-mail-username="mailuser" --zip-compression-level="9" --run-script-timeout="3h" --exclude-files-attributes="temporary,system" --disable-module="console-password-input" --exclude="<a bunch of exclude flags I've removed from this post>"
Backup started at 12/27/2017 1:07:33 PM
Checking remote backup ...
  Listing remote folder ...
promoting uploaded complete file from Uploading to Uploaded: duplicati-20171227T191233Z.dlist.zip.aes
Expected there to be a temporary fileset for synthetic filelist (21, duplicati-b001c8ef433d745b78d76bc03be5cf0fa.dblock.zip.aes), but none was found?
Scanning local files ...
  54040 files need to be examined (10.04 GB) (still counting)
  67680 files need to be examined (10.13 GB) (still counting)
  83257 files need to be examined (10.56 GB) (still counting)
  98338 files need to be examined (7.21 GB) (still counting)
  113448 files need to be examined (7.29 GB) (still counting)
  129146 files need to be examined (9.99 GB) (still counting)
  142060 files need to be examined (10.43 GB) (still counting)
  157211 files need to be examined (10.23 GB) (still counting)
  171613 files need to be examined (7.40 GB) (still counting)
  186575 files need to be examined (5.50 GB) (still counting)
  203106 files need to be examined (6.78 GB) (still counting)
  221252 files need to be examined (6.92 GB) (still counting)
  235631 files need to be examined (7.21 GB) (still counting)
  247748 files need to be examined (7.65 GB) (still counting)
  262362 files need to be examined (8.41 GB) (still counting)
  277052 files need to be examined (8.35 GB) (still counting)
  294158 files need to be examined (8.81 GB) (still counting)
  312092 files need to be examined (9.75 GB) (still counting)
  327488 files need to be examined (14.19 GB) (still counting)
  343615 files need to be examined (18.96 GB) (still counting)
  364082 files need to be examined (87.38 GB) (still counting)
  380246 files need to be examined (87.86 GB) (still counting)
  398018 files need to be examined (88.45 GB) (still counting)
  414861 files need to be examined (88.61 GB) (still counting)
  427148 files need to be examined (89.27 GB) (still counting)
  441818 files need to be examined (114.69 GB) (still counting)
  452367 files need to be examined (118.26 GB) (still counting)
  464372 files need to be examined (216.78 GB) (still counting)
  474967 files need to be examined (222.76 GB) (still counting)
  489727 files need to be examined (224.32 GB) (still counting)
  506209 files need to be examined (230.69 GB) (still counting)
  507212 files need to be examined (233.88 GB)
  497502 files need to be examined (233.80 GB)
  487694 files need to be examined (233.75 GB)
  478752 files need to be examined (233.42 GB)
  467801 files need to be examined (233.36 GB)
  457881 files need to be examined (233.32 GB)
  449432 files need to be examined (233.29 GB)
  438388 files need to be examined (231.96 GB)
  428816 files need to be examined (231.92 GB)
  419558 files need to be examined (231.89 GB)
  407020 files need to be examined (231.87 GB)
  396889 files need to be examined (231.57 GB)
  387065 files need to be examined (231.51 GB)
  378708 files need to be examined (231.34 GB)
  371759 files need to be examined (231.00 GB)
  365839 files need to be examined (230.98 GB)
  359989 files need to be examined (230.83 GB)
  352953 files need to be examined (230.50 GB)
  345417 files need to be examined (230.06 GB)
  338649 files need to be examined (229.98 GB)
  331673 files need to be examined (229.95 GB)
  325954 files need to be examined (229.90 GB)
  319188 files need to be examined (229.81 GB)
  312426 files need to be examined (229.76 GB)
  307315 files need to be examined (229.43 GB)
  300634 files need to be examined (229.34 GB)
  292639 files need to be examined (229.09 GB)
  286077 files need to be examined (228.53 GB)
  278045 files need to be examined (228.38 GB)
  270419 files need to be examined (227.81 GB)
  261494 files need to be examined (223.95 GB)
  252545 files need to be examined (223.77 GB)
  243601 files need to be examined (222.77 GB)
  234324 files need to be examined (220.49 GB)
  226274 files need to be examined (217.91 GB)
  216232 files need to be examined (150.40 GB)
  206788 files need to be examined (150.32 GB)
  197373 files need to be examined (149.94 GB)
  187970 files need to be examined (149.72 GB)
  177859 files need to be examined (149.64 GB)
  167406 files need to be examined (149.52 GB)
  157456 files need to be examined (149.04 GB)
  147055 files need to be examined (148.74 GB)
  137373 files need to be examined (148.69 GB)
  129160 files need to be examined (148.65 GB)
  120268 files need to be examined (148.54 GB)
  112224 files need to be examined (146.92 GB)
  105536 files need to be examined (124.50 GB)
  97179 files need to be examined (122.69 GB)
  93589 files need to be examined (116.22 GB)
  81442 files need to be examined (115.70 GB)
  74303 files need to be examined (109.93 GB)
  65793 files need to be examined (17.44 GB)
  56302 files need to be examined (12.43 GB)
  55682 files need to be examined (12.39 GB)
  52433 files need to be examined (11.83 GB)
  45164 files need to be examined (11.18 GB)
  37147 files need to be examined (10.96 GB)
  29288 files need to be examined (9.42 GB)
  27422 files need to be examined (9.37 GB)
  25429 files need to be examined (9.34 GB)
  19577 files need to be examined (5.86 GB)
  10352 files need to be examined (3.63 GB)
  0 files need to be examined (0 bytes)
  Uploading file (25.27 MB) ...
  Uploading file (255.03 KB) ...
  Uploading file (37.94 MB) ...
Fatal error => Unexpected number of remote volumes marked as deleted. Found 0 filesets, but 1 volumes

System.Exception: Unexpected number of remote volumes marked as deleted. Found 0 filesets, but 1 volumes
  at Duplicati.Library.Main.Database.LocalDeleteDatabase+<DropFilesetsFromTable>d__5.MoveNext () [0x00252] in <118ad25945a24a3991f7b65e7a45ea1e>:0 
  at System.Collections.Generic.LargeArrayBuilder`1[T].AddRange (System.Collections.Generic.IEnumerable`1[T] items) [0x0005e] in <521f3fe476f34ea88748e97b3b984d05>:0 
  at System.Collections.Generic.EnumerableHelpers.ToArray[T] (System.Collections.Generic.IEnumerable`1[T] source) [0x00033] in <521f3fe476f34ea88748e97b3b984d05>:0 
  at System.Linq.Enumerable.ToArray[TSource] (System.Collections.Generic.IEnumerable`1[T] source) [0x00018] in <521f3fe476f34ea88748e97b3b984d05>:0 
  at Duplicati.Library.Main.Operation.DeleteHandler.DoRun (Duplicati.Library.Main.Database.LocalDeleteDatabase db, System.Data.IDbTransaction& transaction, System.Boolean hasVerifiedBacked, System.Boolean forceCompact) [0x0015c] in <118ad25945a24a3991f7b65e7a45ea1e>:0 
  at Duplicati.Library.Main.Operation.BackupHandler.CompactIfRequired (Duplicati.Library.Main.BackendManager backend, System.Int64 lastVolumeSize) [0x00092] in <118ad25945a24a3991f7b65e7a45ea1e>:0 
  at Duplicati.Library.Main.Operation.BackupHandler.Run (System.String[] sources, Duplicati.Library.Utility.IFilter filter) [0x00860] in <118ad25945a24a3991f7b65e7a45ea1e>:0 
  at Duplicati.Library.Main.Controller+<>c__DisplayClass16_0.<Backup>b__0 (Duplicati.Library.Main.BackupResults result) [0x0030f] in <118ad25945a24a3991f7b65e7a45ea1e>:0 
  at Duplicati.Library.Main.Controller.RunAction[T] (T result, System.String[]& paths, Duplicati.Library.Utility.IFilter& filter, System.Action`1[T] method) [0x0014b] in <118ad25945a24a3991f7b65e7a45ea1e>:0 
  at Duplicati.Library.Main.Controller.Backup (System.String[] inputsources, Duplicati.Library.Utility.IFilter filter) [0x00068] in <118ad25945a24a3991f7b65e7a45ea1e>:0 
  at Duplicati.CommandLine.Commands.Backup (System.IO.TextWriter outwriter, System.Action`1[T] setup, System.Collections.Generic.List`1[T] args, System.Collections.Generic.Dictionary`2[TKey,TValue] options, Duplicati.Library.Utility.IFilter filter) [0x000ee] in <08be045d41054657bb406455b4cf1a83>:0 
  at (wrapper delegate-invoke) System.Func`6[System.IO.TextWriter,System.Action`1[Duplicati.Library.Main.Controller],System.Collections.Generic.List`1[System.String],System.Collections.Generic.Dictionary`2[System.String,System.String],Duplicati.Library.Utility.IFilter,System.Int32]:invoke_TResult_T1_T2_T3_T4_T5 (System.IO.TextWriter,System.Action`1<Duplicati.Library.Main.Controller>,System.Collections.Generic.List`1<string>,System.Collections.Generic.Dictionary`2<string, string>,Duplicati.Library.Utility.IFilter)
  at Duplicati.CommandLine.Program.RunCommandLine (System.IO.TextWriter outwriter, System.IO.TextWriter errwriter, System.Action`1[T] setup, System.String[] args) [0x003fb] in <08be045d41054657bb406455b4cf1a83>:0 

Here is what I get when I run the repair operation from the command line:

Reach# mono "/usr/lib/duplicati/Duplicati.CommandLine.exe" repair "ftp://goflex.lionsarch.tyria/GoFlex Home Personal/Duplicati/Reach/GoFlex Reach Home?auth-username=ftpuser&auth-password=ftppass" --passphrase="password" --dbpath="/root/.config/Duplicati/AHGLUNAFNK.sqlite"
  Listing remote folder ...
Destination and database are synchronized, not making any changes

I’d post the output of the delete and recreate operation, but I’m not 100% sure how to run that from the command line. Do I just rm the database file and then run a repair?

As you probably noted, database recreation can take quite a while. If you have the time available then it’s certainly worth a try, but I’d rename or move the existing database rather than delete it (just in case).
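If it helps, the rename-instead-of-delete step can be sketched like this (the path is the `--dbpath` from the backup command above; the helper name is my own, not a Duplicati API):

```python
import shutil
import time

def sideline_database(db_path):
    """Move the job database out of the way (instead of rm'ing it) so a
    subsequent repair recreates it from scratch; returns the backup path."""
    backup_path = db_path + "." + time.strftime("%Y%m%d") + ".bak"
    shutil.move(db_path, backup_path)
    return backup_path

# e.g. sideline_database("/root/.config/Duplicati/AHGLUNAFNK.sqlite"),
# then run the repair command with the same --dbpath.
```

That way, if the recreate goes sideways, you can move the `.bak` file back into place.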

As for the actual “Unexpected number of remote volumes marked as deleted” error you’re seeing, I haven’t dealt with it before so I’m not exactly sure what’s going on. I believe it’s saying that the number of files deleted from the local database (the filesets) didn’t match the number of destination files that were flagged to eventually be deleted (the volumes) during the next cleanup / compacting phase.

So in your case, there were 0 files deleted from the database, yet 1 destination file was found to be flagged for later deletion.

This is odd because, code-wise, this should only happen if Duplicati is interrupted exactly during this block of code. Do you recall a power outage or system crash that might have happened just before you started getting this?

As for resolving it, @kees-z or @kenkendk might have a suggestion.

Well, I finally had time to get back to this issue.

I went ahead and upgraded to v2.0.2.15-2.0.2.15_experimental_2018-01-03 downloaded from GitHub.

I moved the old database and ran the repair operation from the command line. It completed successfully, except for these messages partway through:

Operation Get with file duplicati-20170611T183256Z.dlist.zip.aes attempt 1 of 5 failed with message: File length is invalid => File length is invalid
  Downloading file (40.42 MB) ...
Operation Get with file duplicati-20170611T183256Z.dlist.zip.aes attempt 2 of 5 failed with message: File length is invalid => File length is invalid
  Downloading file (40.42 MB) ...
Operation Get with file duplicati-20170611T183256Z.dlist.zip.aes attempt 3 of 5 failed with message: File length is invalid => File length is invalid
  Downloading file (40.42 MB) ...
Operation Get with file duplicati-20170611T183256Z.dlist.zip.aes attempt 4 of 5 failed with message: File length is invalid => File length is invalid
  Downloading file (40.42 MB) ...
Operation Get with file duplicati-20170611T183256Z.dlist.zip.aes attempt 5 of 5 failed with message: File length is invalid => File length is invalid
  Downloading file (41.32 MB) ...
Failed to process file: duplicati-20170611T183256Z.dlist.zip.aes => File length is invalid

It ended with this:

Processing required 1 blocklist volumes
  Downloading file (49.99 MB) ...
Recreate completed, verifying the database consistency
Recreate completed, and consistency checks completed, marking database as complete

Then I ran a backup, and received the same error as before.

Here is the error message again. Just in case there’s something different I didn’t spot.

Fatal error => Unexpected number of remote volumes marked as deleted. Found 0 filesets, but 1 volumes

System.Exception: Unexpected number of remote volumes marked as deleted. Found 0 filesets, but 1 volumes
  at Duplicati.Library.Main.Database.LocalDeleteDatabase+<DropFilesetsFromTable>d__5.MoveNext () [0x00252] in <ab07a85a74244406aaefc2202f7e530f>:0 
  at System.Collections.Generic.LargeArrayBuilder`1[T].AddRange (System.Collections.Generic.IEnumerable`1[T] items) [0x0005e] in <521f3fe476f34ea88748e97b3b984d05>:0 
  at System.Collections.Generic.EnumerableHelpers.ToArray[T] (System.Collections.Generic.IEnumerable`1[T] source) [0x00033] in <521f3fe476f34ea88748e97b3b984d05>:0 
  at System.Linq.Enumerable.ToArray[TSource] (System.Collections.Generic.IEnumerable`1[T] source) [0x00018] in <521f3fe476f34ea88748e97b3b984d05>:0 
  at Duplicati.Library.Main.Operation.DeleteHandler.DoRun (Duplicati.Library.Main.Database.LocalDeleteDatabase db, System.Data.IDbTransaction& transaction, System.Boolean hasVerifiedBacked, System.Boolean forceCompact, Duplicati.Library.Main.BackendManager sharedManager) [0x00169] in <ab07a85a74244406aaefc2202f7e530f>:0 
  at Duplicati.Library.Main.Operation.BackupHandler.CompactIfRequired (Duplicati.Library.Main.BackendManager backend, System.Int64 lastVolumeSize) [0x000a5] in <ab07a85a74244406aaefc2202f7e530f>:0 
  at Duplicati.Library.Main.Operation.BackupHandler.Run (System.String[] sources, Duplicati.Library.Utility.IFilter filter) [0x008a5] in <ab07a85a74244406aaefc2202f7e530f>:0 
  at Duplicati.Library.Main.Controller+<>c__DisplayClass17_0.<Backup>b__0 (Duplicati.Library.Main.BackupResults result) [0x00036] in <ab07a85a74244406aaefc2202f7e530f>:0 
  at Duplicati.Library.Main.Controller.RunAction[T] (T result, System.String[]& paths, Duplicati.Library.Utility.IFilter& filter, System.Action`1[T] method) [0x0014b] in <ab07a85a74244406aaefc2202f7e530f>:0 
  at Duplicati.Library.Main.Controller.Backup (System.String[] inputsources, Duplicati.Library.Utility.IFilter filter) [0x00068] in <ab07a85a74244406aaefc2202f7e530f>:0 
  at Duplicati.CommandLine.Commands.Backup (System.IO.TextWriter outwriter, System.Action`1[T] setup, System.Collections.Generic.List`1[T] args, System.Collections.Generic.Dictionary`2[TKey,TValue] options, Duplicati.Library.Utility.IFilter filter) [0x000ee] in <4d8cdd60210946ec8cc8a22c1148fc2f>:0 
  at (wrapper delegate-invoke) System.Func`6[System.IO.TextWriter,System.Action`1[Duplicati.Library.Main.Controller],System.Collections.Generic.List`1[System.String],System.Collections.Generic.Dictionary`2[System.String,System.String],Duplicati.Library.Utility.IFilter,System.Int32]:invoke_TResult_T1_T2_T3_T4_T5 (System.IO.TextWriter,System.Action`1<Duplicati.Library.Main.Controller>,System.Collections.Generic.List`1<string>,System.Collections.Generic.Dictionary`2<string, string>,Duplicati.Library.Utility.IFilter)
  at Duplicati.CommandLine.Program.RunCommandLine (System.IO.TextWriter outwriter, System.IO.TextWriter errwriter, System.Action`1[T] setup, System.String[] args) [0x003fb] in <4d8cdd60210946ec8cc8a22c1148fc2f>:0 

If you need the rest of the output for some reason, I have saved it.

It’s been months since this started happening, so I don’t remember. I actually have the exact same data being backed up to Google Drive in another job, so this hasn’t been at the top of my priority list. (The data is in a work account, so I want a local backup. Plus, I have yet to figure out why my internet connection starts losing packets whenever I try to back up to Google Drive here at home. It works fine from work…)

Thanks for following up on how things went for you after the version change.

At this point I think we’ll need input from somebody more familiar with the database than I am (such as @kenkendk) for how to resolve the “Unexpected number of remote volumes marked as deleted” error. :frowning:

Email notified me of this post, so I tried the --auto-cleanup flag. It did not help.

I’m on Duplicati 2.0.3.3_beta_2018-04-02 and Ubuntu 18.04 now.

The end of the output:

0 files need to be examined (0 bytes)
  Uploading file (17.74 MB) ...
  Uploading file (218.09 KB) ...
  Uploading file (81.68 MB) ...
Fatal error => Unexpected number of remote volumes marked as deleted. Found 0 filesets, but 1 volumes

System.Exception: Unexpected number of remote volumes marked as deleted. Found 0 filesets, but 1 volumes
  at Duplicati.Library.Main.Database.LocalDeleteDatabase+<DropFilesetsFromTable>d__5.MoveNext () [0x00252] in <ae134c5a9abb455eb7f06c134d211773>:0 
  at System.Collections.Generic.LargeArrayBuilder`1[T].AddRange (System.Collections.Generic.IEnumerable`1[T] items) [0x0005e] in <4bd4dbd20d094f3f8734f677798e9741>:0 
  at System.Collections.Generic.EnumerableHelpers.ToArray[T] (System.Collections.Generic.IEnumerable`1[T] source) [0x00033] in <4bd4dbd20d094f3f8734f677798e9741>:0 
  at System.Linq.Enumerable.ToArray[TSource] (System.Collections.Generic.IEnumerable`1[T] source) [0x00018] in <4bd4dbd20d094f3f8734f677798e9741>:0 
  at Duplicati.Library.Main.Operation.DeleteHandler.DoRun (Duplicati.Library.Main.Database.LocalDeleteDatabase db, System.Data.IDbTransaction& transaction, System.Boolean hasVerifiedBacked, System.Boolean forceCompact, Duplicati.Library.Main.BackendManager sharedManager) [0x00169] in <ae134c5a9abb455eb7f06c134d211773>:0 
  at Duplicati.Library.Main.Operation.BackupHandler.CompactIfRequired (Duplicati.Library.Main.BackendManager backend, System.Int64 lastVolumeSize) [0x000a5] in <ae134c5a9abb455eb7f06c134d211773>:0 
  at Duplicati.Library.Main.Operation.BackupHandler.Run (System.String[] sources, Duplicati.Library.Utility.IFilter filter) [0x008e0] in <ae134c5a9abb455eb7f06c134d211773>:0 
  at Duplicati.Library.Main.Controller+<>c__DisplayClass17_0.<Backup>b__0 (Duplicati.Library.Main.BackupResults result) [0x00036] in <ae134c5a9abb455eb7f06c134d211773>:0 
  at Duplicati.Library.Main.Controller.RunAction[T] (T result, System.String[]& paths, Duplicati.Library.Utility.IFilter& filter, System.Action`1[T] method) [0x0014b] in <ae134c5a9abb455eb7f06c134d211773>:0 
  at Duplicati.Library.Main.Controller.Backup (System.String[] inputsources, Duplicati.Library.Utility.IFilter filter) [0x00068] in <ae134c5a9abb455eb7f06c134d211773>:0 
  at Duplicati.CommandLine.Commands.Backup (System.IO.TextWriter outwriter, System.Action`1[T] setup, System.Collections.Generic.List`1[T] args, System.Collections.Generic.Dictionary`2[TKey,TValue] options, Duplicati.Library.Utility.IFilter filter) [0x000ee] in <71ed3aa6fd0e4ec4934b502e9e6ebc99>:0 
  at (wrapper delegate-invoke) System.Func`6[System.IO.TextWriter,System.Action`1[Duplicati.Library.Main.Controller],System.Collections.Generic.List`1[System.String],System.Collections.Generic.Dictionary`2[System.String,System.String],Duplicati.Library.Utility.IFilter,System.Int32].invoke_TResult_T1_T2_T3_T4_T5(System.IO.TextWriter,System.Action`1<Duplicati.Library.Main.Controller>,System.Collections.Generic.List`1<string>,System.Collections.Generic.Dictionary`2<string, string>,Duplicati.Library.Utility.IFilter)
  at Duplicati.CommandLine.Program.RunCommandLine (System.IO.TextWriter outwriter, System.IO.TextWriter errwriter, System.Action`1[T] setup, System.String[] args) [0x003fb] in <71ed3aa6fd0e4ec4934b502e9e6ebc99>:0 

OK, so last time I tried the auto-cleanup flag, I forgot to add the =true part. So I tried again. I also logged profiling output to a log file that is now using up a few GB of disk space…

My sanitized command:

sudo mono /usr/lib/duplicati/Duplicati.CommandLine.exe backup "ftp://<blah>" /home/user/ /etc/hosts /var/lib/mysql/ /opt/databasedumps/ --backup-name="BackupName" --dbpath=/root/.config/Duplicati/BackupName.sqlite --encryption-module=aes --compression-module=zip --dblock-size=50mb --keep-time=3Y --passphrase=<pass> <email options> --zip-compression-level=9 --run-script-timeout=3h --exclude-files-attributes="system,temporary" --disable-module=console-password-input <--exclude flags> --auto-cleanup=true --log-level=Profiling --log-file=/home/user/2018.05.10.duplicati.log

The last bit of the log file:

<lots of stuff>
2018-05-11 03:52:18Z - Profiling: ExecuteNonQuery: INSERT INTO "DeletedBlock" ("Hash", "Size", "VolumeID") SELECT "Hash", "Size", "VolumeID" FROM "Block" WHERE "ID" NOT IN (SELECT DISTINCT "BlockID" AS "BlockID" FROM "BlocksetEntry" UNION SELECT DISTINCT "ID" FROM "Block", "BlocklistHash" WHERE "Block"."Hash" = "BlocklistHash"."Hash")  took 00:00:21.171
2018-05-11 03:52:18Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM "Block" WHERE "ID" NOT IN (SELECT DISTINCT "BlockID" FROM "BlocksetEntry" UNION SELECT DISTINCT "ID" FROM "Block", "BlocklistHash" WHERE "Block"."Hash" = "BlocklistHash"."Hash") 
2018-05-11 03:52:40Z - Profiling: ExecuteNonQuery: DELETE FROM "Block" WHERE "ID" NOT IN (SELECT DISTINCT "BlockID" FROM "BlocksetEntry" UNION SELECT DISTINCT "ID" FROM "Block", "BlocklistHash" WHERE "Block"."Hash" = "BlocklistHash"."Hash")  took 00:00:22.519
2018-05-11 03:52:40Z - Profiling: Starting - ExecuteNonQuery: UPDATE "RemoteVolume" SET "State" = ? WHERE "Type" = ? AND "State" IN (?, ?) AND "ID" NOT IN (SELECT "VolumeID" FROM "Fileset") 
2018-05-11 03:52:40Z - Profiling: ExecuteNonQuery: UPDATE "RemoteVolume" SET "State" = ? WHERE "Type" = ? AND "State" IN (?, ?) AND "ID" NOT IN (SELECT "VolumeID" FROM "Fileset")  took 00:00:00.019
2018-05-11 03:52:40Z - Error: Fatal error
System.Exception: Unexpected number of remote volumes marked as deleted. Found 0 filesets, but 1 volumes
  at Duplicati.Library.Main.Database.LocalDeleteDatabase+<DropFilesetsFromTable>d__5.MoveNext () [0x00252] in <ae134c5a9abb455eb7f06c134d211773>:0 
  at System.Collections.Generic.LargeArrayBuilder`1[T].AddRange (System.Collections.Generic.IEnumerable`1[T] items) [0x0005e] in <c0502b9f220448589f6c0bfcd98a6c9b>:0 
  at System.Collections.Generic.EnumerableHelpers.ToArray[T] (System.Collections.Generic.IEnumerable`1[T] source) [0x00033] in <c0502b9f220448589f6c0bfcd98a6c9b>:0 
  at System.Linq.Enumerable.ToArray[TSource] (System.Collections.Generic.IEnumerable`1[T] source) [0x00018] in <c0502b9f220448589f6c0bfcd98a6c9b>:0 
  at Duplicati.Library.Main.Operation.DeleteHandler.DoRun (Duplicati.Library.Main.Database.LocalDeleteDatabase db, System.Data.IDbTransaction& transaction, System.Boolean hasVerifiedBacked, System.Boolean forceCompact, Duplicati.Library.Main.BackendManager sharedManager) [0x00169] in <ae134c5a9abb455eb7f06c134d211773>:0 
  at Duplicati.Library.Main.Operation.BackupHandler.CompactIfRequired (Duplicati.Library.Main.BackendManager backend, System.Int64 lastVolumeSize) [0x000a5] in <ae134c5a9abb455eb7f06c134d211773>:0 
  at Duplicati.Library.Main.Operation.BackupHandler.Run (System.String[] sources, Duplicati.Library.Utility.IFilter filter) [0x0076f] in <ae134c5a9abb455eb7f06c134d211773>:0 

2018-05-11 03:52:41Z - Profiling: Running Backup took 03:53:45.181
<stuff dealing with emailing me about the failed job.>

I think if a boolean parameter is added without a value it defaults to true (meaning --auto-cleanup is the same as --auto-cleanup=true).

As far as the “Unexpected number of remote volumes marked as deleted” error goes, our best bet is to try again to get @kenkendk involved.

Since you’re the third person (along with @glaurent and @tacioandrade) I know of to report this so far, it may be an issue with some database state while updating to / beyond 2.0.3.3.

Yes. All boolean options are false by default, so supplying them will set them to true, unless the value is either false, no, 0 or off.
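As a sketch of that rule (my own illustration, not Duplicati’s actual parser):

```python
def parse_bool_option(value):
    """Duplicati-style boolean option: supplying the flag with no value
    means True; only an explicit false/no/0/off reads as False.
    (Illustration of the rule described above, not the real parser.)"""
    if value is None:  # flag given without an "=value" part
        return True
    return value.strip().lower() not in ("false", "no", "0", "off")

# --auto-cleanup       -> parse_bool_option(None)   -> True
# --auto-cleanup=true  -> parse_bool_option("true") -> True
# --auto-cleanup=off   -> parse_bool_option("off")  -> False
```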

The error is a fail-safe check that tests that the database query performs as expected, which is somehow not the case here.
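Based on the DELETE/UPDATE statements visible in the profiling log earlier in this thread, the shape of the check is roughly this (a toy schema and my own column choices, not Duplicati’s real code):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE "Fileset" ("ID" INTEGER PRIMARY KEY, "VolumeID" INTEGER);
    CREATE TABLE "RemoteVolume" ("ID" INTEGER PRIMARY KEY, "Name" TEXT,
                                 "Type" TEXT, "State" TEXT);
""")
# One dlist volume referenced by a fileset, plus one orphaned dlist volume
# that no fileset points at -- the suspected bad state in the local database.
con.execute("INSERT INTO RemoteVolume VALUES (1, 'duplicati-a.dlist.zip.aes', 'Files', 'Verified')")
con.execute("INSERT INTO RemoteVolume VALUES (2, 'duplicati-b.dlist.zip.aes', 'Files', 'Verified')")
con.execute("INSERT INTO Fileset VALUES (1, 1)")

# Retention deletes 0 filesets...
deleted_filesets = con.execute("DELETE FROM Fileset WHERE 0").rowcount

# ...but the follow-up UPDATE still marks the orphaned volume for deletion,
# like the UPDATE "RemoteVolume" query in the profiling log.
marked_volumes = con.execute(
    'UPDATE "RemoteVolume" SET "State" = \'Deleting\' '
    'WHERE "Type" = \'Files\' AND "State" IN (\'Uploaded\', \'Verified\') '
    'AND "ID" NOT IN (SELECT "VolumeID" FROM "Fileset")').rowcount

# The fail-safe compares the two counts; here they disagree, which is
# exactly "Found 0 filesets, but 1 volumes".
print(f"Found {deleted_filesets} filesets, but {marked_volumes} volumes")
```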

It looks like it is trying to delete the oldest backup and this causes some trouble.

You can try:

sudo mono /usr/lib/duplicati/Duplicati.CommandLine.exe list "ftp://<blah>" --dbpath=/root/.config/Duplicati/BackupName.sqlite

This will show you all the versions Duplicati can find. You can then try to delete the oldest one like this (replace 8 with the right version from the list command):

sudo mono /usr/lib/duplicati/Duplicati.CommandLine.exe delete "ftp://<blah>" --dbpath=/root/.config/Duplicati/BackupName.sqlite 8

Maybe it works better when deleted manually.

Er, I have it set to keep backups for 3 years. Why would it be trying to delete anything when it hasn’t been 3 years since I started using Duplicati?

In any case, I did try deleting the oldest fileset. This is what I got:

mono /usr/lib/duplicati/Duplicati.CommandLine.exe delete "ftp://..." --dbpath=/root/.config/Duplicati/blah.sqlite --passphrase=password --version=28
  Listing remote folder ...
promoting uploaded complete file from Uploading to Uploaded: duplicati-20180510T235856Z.dlist.zip.aes
Deleting 1 remote fileset(s) ...

System.Exception: Unexpected number of remote volumes marked as deleted. Found 1 filesets, but 2 volumes
  at Duplicati.Library.Main.Database.LocalDeleteDatabase+<DropFilesetsFromTable>d__5.MoveNext () [0x00252] in <ae134c5a9abb455eb7f06c134d211773>:0 
  at System.Collections.Generic.LargeArrayBuilder`1[T].AddRange (System.Collections.Generic.IEnumerable`1[T] items) [0x0005e] in <c0502b9f220448589f6c0bfcd98a6c9b>:0 
  at System.Collections.Generic.EnumerableHelpers.ToArray[T] (System.Collections.Generic.IEnumerable`1[T] source) [0x00033] in <c0502b9f220448589f6c0bfcd98a6c9b>:0 
  at System.Linq.Enumerable.ToArray[TSource] (System.Collections.Generic.IEnumerable`1[T] source) [0x00018] in <c0502b9f220448589f6c0bfcd98a6c9b>:0 
  at Duplicati.Library.Main.Operation.DeleteHandler.DoRun (Duplicati.Library.Main.Database.LocalDeleteDatabase db, System.Data.IDbTransaction& transaction, System.Boolean hasVerifiedBacked, System.Boolean forceCompact, Duplicati.Library.Main.BackendManager sharedManager) [0x00169] in <ae134c5a9abb455eb7f06c134d211773>:0 
  at Duplicati.Library.Main.Operation.DeleteHandler.Run () [0x00070] in <ae134c5a9abb455eb7f06c134d211773>:0 
  at Duplicati.Library.Main.Controller.<Delete>b__20_0 (Duplicati.Library.Main.DeleteResults result) [0x00012] in <ae134c5a9abb455eb7f06c134d211773>:0 
  at Duplicati.Library.Main.Controller.RunAction[T] (T result, System.String[]& paths, Duplicati.Library.Utility.IFilter& filter, System.Action`1[T] method) [0x0014b] in <ae134c5a9abb455eb7f06c134d211773>:0 
  at Duplicati.Library.Main.Controller.RunAction[T] (T result, System.Action`1[T] method) [0x00009] in <ae134c5a9abb455eb7f06c134d211773>:0 
  at Duplicati.Library.Main.Controller.Delete () [0x00006] in <ae134c5a9abb455eb7f06c134d211773>:0 
  at Duplicati.CommandLine.Commands.Delete (System.IO.TextWriter outwriter, System.Action`1[T] setup, System.Collections.Generic.List`1[T] args, System.Collections.Generic.Dictionary`2[TKey,TValue] options, Duplicati.Library.Utility.IFilter filter) [0x00087] in <71ed3aa6fd0e4ec4934b502e9e6ebc99>:0 
  at (wrapper delegate-invoke) System.Func`6[System.IO.TextWriter,System.Action`1[Duplicati.Library.Main.Controller],System.Collections.Generic.List`1[System.String],System.Collections.Generic.Dictionary`2[System.String,System.String],Duplicati.Library.Utility.IFilter,System.Int32].invoke_TResult_T1_T2_T3_T4_T5(System.IO.TextWriter,System.Action`1<Duplicati.Library.Main.Controller>,System.Collections.Generic.List`1<string>,System.Collections.Generic.Dictionary`2<string, string>,Duplicati.Library.Utility.IFilter)
  at Duplicati.CommandLine.Program.RunCommandLine (System.IO.TextWriter outwriter, System.IO.TextWriter errwriter, System.Action`1[T] setup, System.String[] args) [0x003fb] in <71ed3aa6fd0e4ec4934b502e9e6ebc99>:0 

Is there a way to list the volumes that are marked as deleted, then compare that value to what’s really there and fix it manually somehow?

Good question; it should not be deleting anything then, but the error message indicates that this is what is happening. You should see a message like “Deleting 1 remote fileset(s)” in the log for the backup.

My guess is that somehow a remote volume is marked as deleted incorrectly, and then all the database queries pick up this problem.

If you can make a bugreport for the database, I can have a look and see if I can fix it, but it will be a week or two before I can look.
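In the meantime, a way to peek at what the local database thinks (a sketch against the RemoteVolume table seen in the profiling log above; run it against a copy of the .sqlite file and treat it as read-only):

```python
import sqlite3

def volume_state_counts(con):
    """Tally remote volumes per State; a Deleting/Deleted dlist entry with
    no matching fileset is what the fail-safe seems to trip over."""
    return dict(con.execute(
        'SELECT "State", COUNT(*) FROM "RemoteVolume" GROUP BY "State"'))

# e.g. volume_state_counts(sqlite3.connect(
#          "/root/.config/Duplicati/BackupName.sqlite"))
```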

I’m getting this “Unexpected number of remote volumes marked as deleted” error now as well after a series of repairs, deletes, and recreates attempting to resolve a different issue (“Unexpected difference in fileset” errors [2.0.3.9]).

I’m using 2.0.3.9 with custom retention - would a database bugreport be useful? (It’s a 24MB database for 55 versions on 3.22GB / 12,869 files of source data - and yes, 25 of those versions were with 7zip before I understood the issues with that.)

I don’t know if it’s related, but a full-remote-verification test of the backup yields a single “complaint” of:

duplicati-ic98a84640e544f96a9c604a629d03646.dindex.zip.aes: 8 errors
	Extra: 72GlTPhZBSyUypO2KjA4a5euf0xSaMw6DuhNCWcB6u8=
	Extra: A7D0Yc+WJ3uhZsboMetGNxxEfumIv5+hWNipQ+dGA5I=
	Extra: ACh5r+Coq6UD2gZrEGosOEo7C76gyY+GIUdWxtVvCBs=
	Extra: EysjZ9dqEp9fTJlcvwRu15xnddKyFmvKF0HMSGxbwQg=
	Extra: NVYEvgQ4zxYoVZQyq/ENwSRM/ygh9PS+cERsHlkTC5Q=
	Extra: pJPSjziAnss6dfH5td/pgN42uXnc3kYiqMH9rCzIkf8=
	Extra: pW0VfeIcdZXraUqO6L6ARkG4bxmxi1CDoy/TSLs6AFc=
	Extra: duplicati-bc62ac1e7e23a4e199bf887db828d1691.dblock.zip.aes

Here’s the backup error messages as included in the notification email:

Failed: One or more errors occurred.
Details: System.AggregateException: One or more errors occurred. ---> System.AggregateException: Unexpected number of remote volumes marked as deleted. Found 2 filesets, but 27 volumes ---> System.Exception: Unexpected number of remote volumes marked as deleted. Found 2 filesets, but 27 volumes
  at Duplicati.Library.Main.Database.LocalDeleteDatabase+<DropFilesetsFromTable>d__6.MoveNext () [0x0023f] in <0ce58d578b8642d49036dc15fbad38f1>:0
  at System.Collections.Generic.LargeArrayBuilder`1[T].AddRange (System.Collections.Generic.IEnumerable`1[T] items) [0x0005e] in <abeb153532fb46718501a600866a8d76>:0
  at System.Collections.Generic.EnumerableHelpers.ToArray[T] (System.Collections.Generic.IEnumerable`1[T] source) [0x00033] in <abeb153532fb46718501a600866a8d76>:0
  at System.Linq.Enumerable.ToArray[TSource] (System.Collections.Generic.IEnumerable`1[T] source) [0x00018] in <abeb153532fb46718501a600866a8d76>:0
  at Duplicati.Library.Main.Operation.DeleteHandler.DoRun (Duplicati.Library.Main.Database.LocalDeleteDatabase db, System.Data.IDbTransaction& transaction, System.Boolean hasVerifiedBacked, System.Boolean forceCompact, Duplicati.Library.Main.BackendManager sharedManager) [0x00179] in <0ce58d578b8642d49036dc15fbad38f1>:0
  at Duplicati.Library.Main.Operation.BackupHandler.CompactIfRequired (Duplicati.Library.Main.BackendManager backend, System.Int64 lastVolumeSize) [0x000a5] in <0ce58d578b8642d49036dc15fbad38f1>:0
  at Duplicati.Library.Main.Operation.BackupHandler+<RunAsync>d__19.MoveNext () [0x00db0] in <0ce58d578b8642d49036dc15fbad38f1>:0
   --- End of inner exception stack trace ---
  at Duplicati.Library.Main.Operation.BackupHandler+<RunAsync>d__19.MoveNext () [0x0102c] in <0ce58d578b8642d49036dc15fbad38f1>:0
   --- End of inner exception stack trace ---
  at CoCoL.ChannelExtensions.WaitForTaskOrThrow (System.Threading.Tasks.Task task) [0x0005d] in <6973ce2780de4b28aaa2c5ffc59993b1>:0
  at Duplicati.Library.Main.Operation.BackupHandler.Run (System.String[] sources, Duplicati.Library.Utility.IFilter filter) [0x00008] in <0ce58d578b8642d49036dc15fbad38f1>:0
  at Duplicati.Library.Main.Controller+<>c__DisplayClass13_0.<Backup>b__0 (Duplicati.Library.Main.BackupResults result) [0x00035] in <0ce58d578b8642d49036dc15fbad38f1>:0
  at Duplicati.Library.Main.Controller.RunAction[T] (T result, System.String[]& paths, Duplicati.Library.Utility.IFilter& filter, System.Action`1[T] method) [0x0011d] in <0ce58d578b8642d49036dc15fbad38f1>:0
---> (Inner Exception #0) System.AggregateException: Unexpected number of remote volumes marked as deleted. Found 2 filesets, but 27 volumes ---> System.Exception: Unexpected number of remote volumes marked as deleted. Found 2 filesets, but 27 volumes
  at Duplicati.Library.Main.Database.LocalDeleteDatabase+<DropFilesetsFromTable>d__6.MoveNext () [0x0023f] in <0ce58d578b8642d49036dc15fbad38f1>:0
  at System.Collections.Generic.LargeArrayBuilder`1[T].AddRange (System.Collections.Generic.IEnumerable`1[T] items) [0x0005e] in <abeb153532fb46718501a600866a8d76>:0
  at System.Collections.Generic.EnumerableHelpers.ToArray[T] (System.Collections.Generic.IEnumerable`1[T] source) [0x00033] in <abeb153532fb46718501a600866a8d76>:0
  at System.Linq.Enumerable.ToArray[TSource] (System.Collections.Generic.IEnumerable`1[T] source) [0x00018] in <abeb153532fb46718501a600866a8d76>:0
  at Duplicati.Library.Main.Operation.DeleteHandler.DoRun (Duplicati.Library.Main.Database.LocalDeleteDatabase db, System.Data.IDbTransaction& transaction, System.Boolean hasVerifiedBacked, System.Boolean forceCompact, Duplicati.Library.Main.BackendManager sharedManager) [0x00179] in <0ce58d578b8642d49036dc15fbad38f1>:0
  at Duplicati.Library.Main.Operation.BackupHandler.CompactIfRequired (Duplicati.Library.Main.BackendManager backend, System.Int64 lastVolumeSize) [0x000a5] in <0ce58d578b8642d49036dc15fbad38f1>:0
  at Duplicati.Library.Main.Operation.BackupHandler+<RunAsync>d__19.MoveNext () [0x00db0] in <0ce58d578b8642d49036dc15fbad38f1>:0
   --- End of inner exception stack trace ---
  at Duplicati.Library.Main.Operation.BackupHandler+<RunAsync>d__19.MoveNext () [0x0102c] in <0ce58d578b8642d49036dc15fbad38f1>:0
---> (Inner Exception #0) System.Exception: Unexpected number of remote volumes marked as deleted. Found 2 filesets, but 27 volumes
  at Duplicati.Library.Main.Database.LocalDeleteDatabase+<DropFilesetsFromTable>d__6.MoveNext () [0x0023f] in <0ce58d578b8642d49036dc15fbad38f1>:0
  at System.Collections.Generic.LargeArrayBuilder`1[T].AddRange (System.Collections.Generic.IEnumerable`1[T] items) [0x0005e] in <abeb153532fb46718501a600866a8d76>:0
  at System.Collections.Generic.EnumerableHelpers.ToArray[T] (System.Collections.Generic.IEnumerable`1[T] source) [0x00033] in <abeb153532fb46718501a600866a8d76>:0
  at System.Linq.Enumerable.ToArray[TSource] (System.Collections.Generic.IEnumerable`1[T] source) [0x00018] in <abeb153532fb46718501a600866a8d76>:0
  at Duplicati.Library.Main.Operation.DeleteHandler.DoRun (Duplicati.Library.Main.Database.LocalDeleteDatabase db, System.Data.IDbTransaction& transaction, System.Boolean hasVerifiedBacked, System.Boolean forceCompact, Duplicati.Library.Main.BackendManager sharedManager) [0x00179] in <0ce58d578b8642d49036dc15fbad38f1>:0
  at Duplicati.Library.Main.Operation.BackupHandler.CompactIfRequired (Duplicati.Library.Main.BackendManager backend, System.Int64 lastVolumeSize) [0x000a5] in <0ce58d578b8642d49036dc15fbad38f1>:0
  at Duplicati.Library.Main.Operation.BackupHandler+<RunAsync>d__19.MoveNext () [0x00db0] in <0ce58d578b8642d49036dc15fbad38f1>:0 <---

---> (Inner Exception #1) System.AggregateException: One or more errors occurred. ---> System.Exception: Unable to find log in lookup table, this may be caused by attempting to transport call contexts between AppDomains (eg. with remoting calls)
  at Duplicati.Library.Logging.Log.get_CurrentScope () [0x0004d] in <a0d17c5f1b8942efac55f621de8cc00b>:0
  at Duplicati.Library.Logging.Log.WriteMessage (Duplicati.Library.Logging.LogMessageType type, System.String tag, System.String id, System.Exception ex, System.String message, System.Object[] arguments) [0x0001e] in <a0d17c5f1b8942efac55f621de8cc00b>:0
  at Duplicati.Library.Logging.Log.WriteWarningMessage (System.String tag, System.String id, System.Exception ex, System.String message, System.Object[] arguments) [0x00000] in <a0d17c5f1b8942efac55f621de8cc00b>:0
  at Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess+<>c.<Run>b__1_3 (System.String rootpath, System.String errorpath, System.Exception ex) [0x00000] in <0ce58d578b8642d49036dc15fbad38f1>:0
  at Duplicati.Library.Utility.Utility+<EnumerateFileSystemEntries>d__23.MoveNext () [0x001d9] in <5336a8b903594fa5ae5a9692dae4b7fe>:0
  at System.Linq.Enumerable+SelectManySingleSelectorIterator`2[TSource,TResult].MoveNext () [0x0006f] in <abeb153532fb46718501a600866a8d76>:0
  at Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess+<ExpandWorkList>d__4.MoveNext () [0x000da] in <0ce58d578b8642d49036dc15fbad38f1>:0
  at Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess+<>c__DisplayClass1_0+<<Run>b__0>d.MoveNext () [0x00263] in <0ce58d578b8642d49036dc15fbad38f1>:0
--- End of stack trace from previous location where exception was thrown ---
  at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw () [0x0000c] in <a6f810169ec746eeb2789b925e2858c0>:0
  at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Threading.Tasks.Task task) [0x0003e] in <a6f810169ec746eeb2789b925e2858c0>:0
  at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Threading.Tasks.Task task) [0x00028] in <a6f810169ec746eeb2789b925e2858c0>:0
  at System.Runtime.CompilerServices.TaskAwaiter.ValidateEnd (System.Threading.Tasks.Task task) [0x00008] in <a6f810169ec746eeb2789b925e2858c0>:0
  at System.Runtime.CompilerServices.TaskAwaiter.GetResult () [0x00000] in <a6f810169ec746eeb2789b925e2858c0>:0
  at CoCoL.AutomationExtensions+<RunTask>d__10`1[T].MoveNext () [0x000cc] in <6973ce2780de4b28aaa2c5ffc59993b1>:0
   --- End of inner exception stack trace ---
---> (Inner Exception #0) System.Exception: Unable to find log in lookup table, this may be caused by attempting to transport call contexts between AppDomains (eg. with remoting calls)
  at Duplicati.Library.Logging.Log.get_CurrentScope () [0x0004d] in <a0d17c5f1b8942efac55f621de8cc00b>:0
  at Duplicati.Library.Logging.Log.WriteMessage (Duplicati.Library.Logging.LogMessageType type, System.String tag, System.String id, System.Exception ex, System.String message, System.Object[] arguments) [0x0001e] in <a0d17c5f1b8942efac55f621de8cc00b>:0
  at Duplicati.Library.Logging.Log.WriteWarningMessage (System.String tag, System.String id, System.Exception ex, System.String message, System.Object[] arguments) [0x00000] in <a0d17c5f1b8942efac55f621de8cc00b>:0
  at Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess+<>c.<Run>b__1_3 (System.String rootpath, System.String errorpath, System.Exception ex) [0x00000] in <0ce58d578b8642d49036dc15fbad38f1>:0
  at Duplicati.Library.Utility.Utility+<EnumerateFileSystemEntries>d__23.MoveNext () [0x001d9] in <5336a8b903594fa5ae5a9692dae4b7fe>:0
  at System.Linq.Enumerable+SelectManySingleSelectorIterator`2[TSource,TResult].MoveNext () [0x0006f] in <abeb153532fb46718501a600866a8d76>:0
  at Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess+<ExpandWorkList>d__4.MoveNext () [0x000da] in <0ce58d578b8642d49036dc15fbad38f1>:0
  at Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess+<>c__DisplayClass1_0+<<Run>b__0>d.MoveNext () [0x00263] in <0ce58d578b8642d49036dc15fbad38f1>:0
--- End of stack trace from previous location where exception was thrown ---
  at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw () [0x0000c] in <a6f810169ec746eeb2789b925e2858c0>:0
  at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Threading.Tasks.Task task) [0x0003e] in <a6f810169ec746eeb2789b925e2858c0>:0
  at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Threading.Tasks.Task task) [0x00028] in <a6f810169ec746eeb2789b925e2858c0>:0
  at System.Runtime.CompilerServices.TaskAwaiter.ValidateEnd (System.Threading.Tasks.Task task) [0x00008] in <a6f810169ec746eeb2789b925e2858c0>:0
  at System.Runtime.CompilerServices.TaskAwaiter.GetResult () [0x00000] in <a6f810169ec746eeb2789b925e2858c0>:0
  at CoCoL.AutomationExtensions+<RunTask>d__10`1[T].MoveNext () [0x000cc] in <6973ce2780de4b28aaa2c5ffc59993b1>:0 <---
<---
<---

Log data:

2018-08-04 03:22:22 -05 - [Error-Duplicati.Library.Main.Operation.BackupHandler-FatalError]: Fatal error
System.Exception: Unexpected number of remote volumes marked as deleted. Found 2 filesets, but 27 volumes
  at Duplicati.Library.Main.Database.LocalDeleteDatabase+<DropFilesetsFromTable>d__6.MoveNext () [0x0023f] in <0ce58d578b8642d49036dc15fbad38f1>:0
  at System.Collections.Generic.LargeArrayBuilder`1[T].AddRange (System.Collections.Generic.IEnumerable`1[T] items) [0x0005e] in <abeb153532fb46718501a600866a8d76>:0
  at System.Collections.Generic.EnumerableHelpers.ToArray[T] (System.Collections.Generic.IEnumerable`1[T] source) [0x00033] in <abeb153532fb46718501a600866a8d76>:0
  at System.Linq.Enumerable.ToArray[TSource] (System.Collections.Generic.IEnumerable`1[T] source) [0x00018] in <abeb153532fb46718501a600866a8d76>:0
  at Duplicati.Library.Main.Operation.DeleteHandler.DoRun (Duplicati.Library.Main.Database.LocalDeleteDatabase db, System.Data.IDbTransaction& transaction, System.Boolean hasVerifiedBacked, System.Boolean forceCompact, Duplicati.Library.Main.BackendManager sharedManager) [0x00179] in <0ce58d578b8642d49036dc15fbad38f1>:0
  at Duplicati.Library.Main.Operation.BackupHandler.CompactIfRequired (Duplicati.Library.Main.BackendManager backend, System.Int64 lastVolumeSize) [0x000a5] in <0ce58d578b8642d49036dc15fbad38f1>:0
  at Duplicati.Library.Main.Operation.BackupHandler+<RunAsync>d__19.MoveNext () [0x00db0] in <0ce58d578b8642d49036dc15fbad38f1>:0

It is.

With 2.0.3.10 I get the following message on a database Recreate:

[Error-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-MissingFileDetected]: Remote file referenced as duplicati-bc62ac1e7e23a4e199bf887db828d1691.dblock.zip.aes by duplicati-ic98a84640e544f96a9c604a629d03646.dindex.zip.aes, but not found in list, registering a missing remote file

This matches up with the full-remote-verification “complaint” mentioned above, meaning the issue appears to be that somehow a dindex file didn’t get updated to no longer reference a deleted dblock file. I believe on its own this isn’t a problem, but if a database Recreate happens (meaning the database is rebuilt from the dlist and dindex files on the destination), the stale reference is brought into the database and starts causing issues.
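A dangling reference like that should also be visible in the local job database. Here’s a short sqlite3 sketch; it is a sketch only, assuming the schema has a “Remotevolume” table (Name, Type, State) and an “IndexBlockLink” table (IndexVolumeID, BlockVolumeID) tying index files to block files. Verify those table names against your own database first:

```python
import sqlite3

def stale_index_links(db_path):
    """List dindex volumes that still reference a dblock volume whose
    state is 'Deleted' or 'Deleting'.

    Assumes (unverified) that the local database has a "Remotevolume"
    table (ID, Name, Type, State) and an "IndexBlockLink" table
    (IndexVolumeID, BlockVolumeID) linking index files to block files.
    """
    con = sqlite3.connect(db_path)
    try:
        return con.execute(
            '''SELECT i."Name", b."Name", b."State"
               FROM "IndexBlockLink" l
               JOIN "Remotevolume" i ON i."ID" = l."IndexVolumeID"
               JOIN "Remotevolume" b ON b."ID" = l."BlockVolumeID"
               WHERE b."State" IN ('Deleted', 'Deleting')'''
        ).fetchall()
    finally:
        con.close()
```

If this returns rows, those dindex files are the ones carrying references to dblock files that no longer exist on the destination.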

@kenkendk, do you have any thoughts on getting either of these implemented?

  1. Make it so this issue during retention cleanup does NOT cause a backup to fail (the backup works, it’s the cleanup that’s failing)

  2. Identify and fix the underlying cause of the “bad” dindex file


For the record, I have been unable to resolve this using any of the following (but I’m not too worried because the backups themselves are still working):

  • database Repairs
  • database Recreates
  • dindex + dlist deletes followed by Repair
  • manually creating an empty version of the missing dblock file (followed by Repair or Recreate)
  • deleting the broken dindex file (to remove the destination broken-ness) followed by a database Recreate (to remove the database broken-ness - and yes, this likely requires downloading some dblock files to recalculate hashes)

Just a quick update - the backups themselves only seem to be working (as in, show up in the restore list) on days when the number of remote volumes marked as deleted changes. So basically, when something “ages out” and needs to be flagged for cleanup by the retention code, the backup DOES seem to run but also causes the error discussed above.

For backups where the retention policy does NOT cause any new items to be flagged for deletion, I get the error but the backup does NOT show up in the restore list.
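To see the raw numbers the retention code is comparing, here’s a rough sqlite3 sketch against the local job database. It assumes (and this is an assumption, not a confirmed schema) that the “Remotevolume” table has “Type” and “State” columns:

```python
import sqlite3

def deleted_volume_counts(db_path):
    """Tally remote volumes marked 'Deleted', grouped by Type.

    A rough way to see the numbers behind "Found N filesets, but M
    volumes". Assumes (unverified) that Duplicati's "Remotevolume"
    table has "Type" and "State" columns.
    """
    con = sqlite3.connect(db_path)
    try:
        return dict(con.execute(
            'SELECT "Type", COUNT(*) FROM "Remotevolume" '
            'WHERE "State" = ? GROUP BY "Type"', ('Deleted',)
        ).fetchall())
    finally:
        con.close()
```

Comparing those per-type counts before and after a backup run might show whether the mismatch sits in the dlist (Files) volumes, as the later posts in this thread suggest.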

For the restore list oddity, you could set --unencrypted-database (if necessary for your database browser) and see if the Fileset table gets an entry. In case something else is upsetting the listing, I did an SQL comparison:

web UI restore initial web page build

2018-09-17 07:39:32 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Controller-RunList]: Starting - Running List
2018-09-17 07:39:32 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteScalarInt64]: Starting - ExecuteScalarInt64: INSERT INTO "Operation" ("Description", "Timestamp") VALUES ("List", 1537184372); SELECT last_insert_rowid();
2018-09-17 07:39:32 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteScalarInt64]: ExecuteScalarInt64: INSERT INTO "Operation" ("Description", "Timestamp") VALUES ("List", 1537184372); SELECT last_insert_rowid(); took 0:00:00:00.091
2018-09-17 07:39:32 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: Starting - ExecuteReader: SELECT "ID", "Timestamp" FROM "Fileset" ORDER BY "Timestamp" DESC
2018-09-17 07:39:32 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: ExecuteReader: SELECT "ID", "Timestamp" FROM "Fileset" ORDER BY "Timestamp" DESC took 0:00:00:00.000
2018-09-17 07:39:32 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: CREATE TEMPORARY TABLE "Filesets-CAC30430AA7F2F4881D06CEBC079FD5F" AS SELECT DISTINCT "ID" AS "FilesetID", "Timestamp" AS "Timestamp" FROM "Fileset"  WHERE  "Timestamp" <= 1536864403
2018-09-17 07:39:32 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: CREATE TEMPORARY TABLE "Filesets-CAC30430AA7F2F4881D06CEBC079FD5F" AS SELECT DISTINCT "ID" AS "FilesetID", "Timestamp" AS "Timestamp" FROM "Fileset"  WHERE  "Timestamp" <= 1536864403 took 0:00:00:00.000
2018-09-17 07:39:32 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: CREATE INDEX "Filesets-CAC30430AA7F2F4881D06CEBC079FD5F_FilesetIDTimestampIndex" ON "Filesets-CAC30430AA7F2F4881D06CEBC079FD5F" ("FilesetID", "Timestamp" DESC)
2018-09-17 07:39:32 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: CREATE INDEX "Filesets-CAC30430AA7F2F4881D06CEBC079FD5F_FilesetIDTimestampIndex" ON "Filesets-CAC30430AA7F2F4881D06CEBC079FD5F" ("FilesetID", "Timestamp" DESC) took 0:00:00:00.000
2018-09-17 07:39:32 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: DELETE FROM "Filesets-CAC30430AA7F2F4881D06CEBC079FD5F" WHERE "FilesetID" NOT IN (SELECT "FilesetID" FROM "Filesets-CAC30430AA7F2F4881D06CEBC079FD5F" ORDER BY "Timestamp" DESC LIMIT 1 )
2018-09-17 07:39:32 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: DELETE FROM "Filesets-CAC30430AA7F2F4881D06CEBC079FD5F" WHERE "FilesetID" NOT IN (SELECT "FilesetID" FROM "Filesets-CAC30430AA7F2F4881D06CEBC079FD5F" ORDER BY "Timestamp" DESC LIMIT 1 ) took 0:00:00:00.000
2018-09-17 07:39:32 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: Starting - ExecuteReader: SELECT DISTINCT "A"."FilesetID", "B"."FileCount", "B"."FileSizes" FROM "Filesets-CAC30430AA7F2F4881D06CEBC079FD5F" A LEFT OUTER JOIN ( SELECT "A"."FilesetID" AS "FilesetID", COUNT(*) AS "FileCount", SUM("C"."Length") AS "FileSizes" FROM "FilesetEntry" A, "File" B, "Blockset" C WHERE "A"."FileID" = "B"."ID" AND "B"."BlocksetID" = "C"."ID" AND "A"."FilesetID" IN (SELECT DISTINCT "FilesetID" FROM "Filesets-CAC30430AA7F2F4881D06CEBC079FD5F") GROUP BY "A"."FilesetID"  ) B ON "A"."FilesetID" = "B"."FilesetID" ORDER BY "A"."Timestamp" DESC 
2018-09-17 07:39:32 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: ExecuteReader: SELECT DISTINCT "A"."FilesetID", "B"."FileCount", "B"."FileSizes" FROM "Filesets-CAC30430AA7F2F4881D06CEBC079FD5F" A LEFT OUTER JOIN ( SELECT "A"."FilesetID" AS "FilesetID", COUNT(*) AS "FileCount", SUM("C"."Length") AS "FileSizes" FROM "FilesetEntry" A, "File" B, "Blockset" C WHERE "A"."FileID" = "B"."ID" AND "B"."BlocksetID" = "C"."ID" AND "A"."FilesetID" IN (SELECT DISTINCT "FilesetID" FROM "Filesets-CAC30430AA7F2F4881D06CEBC079FD5F") GROUP BY "A"."FilesetID"  ) B ON "A"."FilesetID" = "B"."FilesetID" ORDER BY "A"."Timestamp" DESC  took 0:00:00:00.008
2018-09-17 07:39:32 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: CREATE TEMPORARY TABLE "Filenames-F22B356E6D9F8440BD0B572E8C95EB55" ("Path" TEXT NOT NULL)
2018-09-17 07:39:32 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: CREATE TEMPORARY TABLE "Filenames-F22B356E6D9F8440BD0B572E8C95EB55" ("Path" TEXT NOT NULL) took 0:00:00:00.000
2018-09-17 07:39:32 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: INSERT INTO "Filenames-F22B356E6D9F8440BD0B572E8C95EB55" SELECT DISTINCT "Path" FROM "File" WHERE "Path" LIKE "C:\USERS\TED\%"
2018-09-17 07:39:33 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: INSERT INTO "Filenames-F22B356E6D9F8440BD0B572E8C95EB55" SELECT DISTINCT "Path" FROM "File" WHERE "Path" LIKE "C:\USERS\TED\%" took 0:00:00:00.009
2018-09-17 07:39:33 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: DELETE FROM "Filenames-F22B356E6D9F8440BD0B572E8C95EB55" WHERE "Path" NOT IN (SELECT DISTINCT "Path" FROM "File", "FilesetEntry" WHERE "FilesetEntry"."FileID" = "File"."ID" AND "FilesetEntry"."FilesetID" IN (SELECT "FilesetID" FROM "Filesets-CAC30430AA7F2F4881D06CEBC079FD5F") ) 
2018-09-17 07:39:33 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: DELETE FROM "Filenames-F22B356E6D9F8440BD0B572E8C95EB55" WHERE "Path" NOT IN (SELECT DISTINCT "Path" FROM "File", "FilesetEntry" WHERE "FilesetEntry"."FileID" = "File"."ID" AND "FilesetEntry"."FilesetID" IN (SELECT "FilesetID" FROM "Filesets-CAC30430AA7F2F4881D06CEBC079FD5F") )  took 0:00:00:00.058
2018-09-17 07:39:33 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: CREATE TEMPORARY TABLE "Filenames-7131D1FA921E16459307B54FF71542E2" ("Path" TEXT NOT NULL)
2018-09-17 07:39:33 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: CREATE TEMPORARY TABLE "Filenames-7131D1FA921E16459307B54FF71542E2" ("Path" TEXT NOT NULL) took 0:00:00:00.000
2018-09-17 07:39:33 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: Starting - ExecuteReader: SELECT DISTINCT "Path" FROM "Filenames-F22B356E6D9F8440BD0B572E8C95EB55" 
2018-09-17 07:39:33 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: ExecuteReader: SELECT DISTINCT "Path" FROM "Filenames-F22B356E6D9F8440BD0B572E8C95EB55"  took 0:00:00:00.000
2018-09-17 07:39:33 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: CREATE INDEX "Filenames-7131D1FA921E16459307B54FF71542E2_PathIndex" ON "Filenames-7131D1FA921E16459307B54FF71542E2" ("Path")
2018-09-17 07:39:33 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: CREATE INDEX "Filenames-7131D1FA921E16459307B54FF71542E2_PathIndex" ON "Filenames-7131D1FA921E16459307B54FF71542E2" ("Path") took 0:00:00:00.000
2018-09-17 07:39:33 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: Starting - ExecuteReader: SELECT "C"."Path", "D"."Length", "C"."FilesetID" FROM (SELECT "A"."Path", "B"."FilesetID" FROM "Filenames-7131D1FA921E16459307B54FF71542E2" A, (SELECT "FilesetID", "Timestamp" FROM "Filesets-CAC30430AA7F2F4881D06CEBC079FD5F" ORDER BY "Timestamp" DESC) B ORDER BY "A"."Path" ASC, "B"."Timestamp" DESC) C LEFT OUTER JOIN (SELECT "Length", "FilesetEntry"."FilesetID", "File"."Path" FROM "Blockset", "FilesetEntry", "File" WHERE "File"."BlocksetID" = "Blockset"."ID" AND "FilesetEntry"."FileID" = "File"."ID" AND FilesetEntry."FilesetID" IN (SELECT DISTINCT "FilesetID" FROM "Filesets-CAC30430AA7F2F4881D06CEBC079FD5F") ) D ON "C"."FilesetID" = "D"."FilesetID" AND "C"."Path" = "D"."Path"
2018-09-17 07:39:33 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: ExecuteReader: SELECT "C"."Path", "D"."Length", "C"."FilesetID" FROM (SELECT "A"."Path", "B"."FilesetID" FROM "Filenames-7131D1FA921E16459307B54FF71542E2" A, (SELECT "FilesetID", "Timestamp" FROM "Filesets-CAC30430AA7F2F4881D06CEBC079FD5F" ORDER BY "Timestamp" DESC) B ORDER BY "A"."Path" ASC, "B"."Timestamp" DESC) C LEFT OUTER JOIN (SELECT "Length", "FilesetEntry"."FilesetID", "File"."Path" FROM "Blockset", "FilesetEntry", "File" WHERE "File"."BlocksetID" = "Blockset"."ID" AND "FilesetEntry"."FileID" = "File"."ID" AND FilesetEntry."FilesetID" IN (SELECT DISTINCT "FilesetID" FROM "Filesets-CAC30430AA7F2F4881D06CEBC079FD5F") ) D ON "C"."FilesetID" = "D"."FilesetID" AND "C"."Path" = "D"."Path" took 0:00:00:00.037
2018-09-17 07:39:33 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: DROP TABLE IF EXISTS "Filenames-F22B356E6D9F8440BD0B572E8C95EB55" 
2018-09-17 07:39:33 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: DROP TABLE IF EXISTS "Filenames-F22B356E6D9F8440BD0B572E8C95EB55"  took 0:00:00:00.000
2018-09-17 07:39:33 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: DROP TABLE IF EXISTS "Filenames-7131D1FA921E16459307B54FF71542E2"
2018-09-17 07:39:33 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: DROP TABLE IF EXISTS "Filenames-7131D1FA921E16459307B54FF71542E2" took 0:00:00:00.000
2018-09-17 07:39:33 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: DROP TABLE IF EXISTS "Filesets-CAC30430AA7F2F4881D06CEBC079FD5F" 
2018-09-17 07:39:33 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: DROP TABLE IF EXISTS "Filesets-CAC30430AA7F2F4881D06CEBC079FD5F"  took 0:00:00:00.000
2018-09-17 07:39:33 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Controller-RunList]: Running List took 0:00:00:00.263

Commandline list, no filename

Usage: find <storage-URL> ["<filename>"] [<options>]

  Finds specific files in specific backups. If <filename> is specified, all
  occurrences of <filename> in the backup are listed. <filename> can contain
  * and ? as wildcards. File names in [brackets] are interpreted as regular
  expression. Latest backup is searched by default. If entire path is
  specified, all available versions of the file are listed. If no <filename>
  is specified, a list of all available backups is shown.

2018-09-17 07:43:58 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Controller-RunList]: Starting - Running List
2018-09-17 07:43:58 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteScalarInt64]: Starting - ExecuteScalarInt64: INSERT INTO "Operation" ("Description", "Timestamp") VALUES ("List", 1537184638); SELECT last_insert_rowid();
2018-09-17 07:43:58 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteScalarInt64]: ExecuteScalarInt64: INSERT INTO "Operation" ("Description", "Timestamp") VALUES ("List", 1537184638); SELECT last_insert_rowid(); took 0:00:00:00.074
2018-09-17 07:43:58 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: Starting - ExecuteReader: SELECT "ID", "Timestamp" FROM "Fileset" ORDER BY "Timestamp" DESC
2018-09-17 07:43:58 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: ExecuteReader: SELECT "ID", "Timestamp" FROM "Fileset" ORDER BY "Timestamp" DESC took 0:00:00:00.000
2018-09-17 07:43:58 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: CREATE TEMPORARY TABLE "Filesets-A95CCE2BD58BB644B3FD68954F5A96C8" AS SELECT DISTINCT "ID" AS "FilesetID", "Timestamp" AS "Timestamp" FROM "Fileset" 
2018-09-17 07:43:58 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: CREATE TEMPORARY TABLE "Filesets-A95CCE2BD58BB644B3FD68954F5A96C8" AS SELECT DISTINCT "ID" AS "FilesetID", "Timestamp" AS "Timestamp" FROM "Fileset"  took 0:00:00:00.000
2018-09-17 07:43:58 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: CREATE INDEX "Filesets-A95CCE2BD58BB644B3FD68954F5A96C8_FilesetIDTimestampIndex" ON "Filesets-A95CCE2BD58BB644B3FD68954F5A96C8" ("FilesetID", "Timestamp" DESC)
2018-09-17 07:43:58 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: CREATE INDEX "Filesets-A95CCE2BD58BB644B3FD68954F5A96C8_FilesetIDTimestampIndex" ON "Filesets-A95CCE2BD58BB644B3FD68954F5A96C8" ("FilesetID", "Timestamp" DESC) took 0:00:00:00.000
2018-09-17 07:43:58 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: Starting - ExecuteReader: SELECT DISTINCT "A"."FilesetID", "B"."FileCount", "B"."FileSizes" FROM "Filesets-A95CCE2BD58BB644B3FD68954F5A96C8" A LEFT OUTER JOIN ( SELECT "A"."FilesetID" AS "FilesetID", COUNT(*) AS "FileCount", SUM("C"."Length") AS "FileSizes" FROM "FilesetEntry" A, "File" B, "Blockset" C WHERE "A"."FileID" = "B"."ID" AND "B"."BlocksetID" = "C"."ID" AND "A"."FilesetID" IN (SELECT DISTINCT "FilesetID" FROM "Filesets-A95CCE2BD58BB644B3FD68954F5A96C8") GROUP BY "A"."FilesetID"  ) B ON "A"."FilesetID" = "B"."FilesetID" ORDER BY "A"."Timestamp" DESC 
2018-09-17 07:43:59 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: ExecuteReader: SELECT DISTINCT "A"."FilesetID", "B"."FileCount", "B"."FileSizes" FROM "Filesets-A95CCE2BD58BB644B3FD68954F5A96C8" A LEFT OUTER JOIN ( SELECT "A"."FilesetID" AS "FilesetID", COUNT(*) AS "FileCount", SUM("C"."Length") AS "FileSizes" FROM "FilesetEntry" A, "File" B, "Blockset" C WHERE "A"."FileID" = "B"."ID" AND "B"."BlocksetID" = "C"."ID" AND "A"."FilesetID" IN (SELECT DISTINCT "FilesetID" FROM "Filesets-A95CCE2BD58BB644B3FD68954F5A96C8") GROUP BY "A"."FilesetID"  ) B ON "A"."FilesetID" = "B"."FilesetID" ORDER BY "A"."Timestamp" DESC  took 0:00:00:00.152
2018-09-17 07:43:59 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: DROP TABLE IF EXISTS "Filesets-A95CCE2BD58BB644B3FD68954F5A96C8" 
2018-09-17 07:43:59 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: DROP TABLE IF EXISTS "Filesets-A95CCE2BD58BB644B3FD68954F5A96C8"  took 0:00:00:00.000
2018-09-17 07:43:59 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Controller-RunList]: Running List took 0:00:00:00.271


Commandline list, no filename, and with --list-sets-only

  --list-sets-only (Boolean): List only filesets
    Use this option to only list filesets and avoid traversing file names and
    other metadata which slows down the process
    * default value: false

2018-09-17 07:36:04 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Controller-RunList]: Starting - Running List
2018-09-17 07:36:04 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteScalarInt64]: Starting - ExecuteScalarInt64: INSERT INTO "Operation" ("Description", "Timestamp") VALUES ("List", 1537184164); SELECT last_insert_rowid();
2018-09-17 07:36:04 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteScalarInt64]: ExecuteScalarInt64: INSERT INTO "Operation" ("Description", "Timestamp") VALUES ("List", 1537184164); SELECT last_insert_rowid(); took 0:00:00:00.307
2018-09-17 07:36:04 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: Starting - ExecuteReader: SELECT "ID", "Timestamp" FROM "Fileset" ORDER BY "Timestamp" DESC
2018-09-17 07:36:04 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: ExecuteReader: SELECT "ID", "Timestamp" FROM "Fileset" ORDER BY "Timestamp" DESC took 0:00:00:00.000
2018-09-17 07:36:04 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: CREATE TEMPORARY TABLE "Filesets-F7B41C5513744440932872D9CDBC7852" AS SELECT DISTINCT "ID" AS "FilesetID", "Timestamp" AS "Timestamp" FROM "Fileset" 
2018-09-17 07:36:04 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: CREATE TEMPORARY TABLE "Filesets-F7B41C5513744440932872D9CDBC7852" AS SELECT DISTINCT "ID" AS "FilesetID", "Timestamp" AS "Timestamp" FROM "Fileset"  took 0:00:00:00.000
2018-09-17 07:36:04 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: CREATE INDEX "Filesets-F7B41C5513744440932872D9CDBC7852_FilesetIDTimestampIndex" ON "Filesets-F7B41C5513744440932872D9CDBC7852" ("FilesetID", "Timestamp" DESC)
2018-09-17 07:36:04 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: CREATE INDEX "Filesets-F7B41C5513744440932872D9CDBC7852_FilesetIDTimestampIndex" ON "Filesets-F7B41C5513744440932872D9CDBC7852" ("FilesetID", "Timestamp" DESC) took 0:00:00:00.000
2018-09-17 07:36:04 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: Starting - ExecuteReader: SELECT DISTINCT "ID" FROM "Fileset" ORDER BY "Timestamp" DESC <-- (NOTE: this SQL got simpler)
2018-09-17 07:36:04 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: ExecuteReader: SELECT DISTINCT "ID" FROM "Fileset" ORDER BY "Timestamp" DESC  took 0:00:00:00.000
2018-09-17 07:36:04 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: DROP TABLE IF EXISTS "Filesets-F7B41C5513744440932872D9CDBC7852" 
2018-09-17 07:36:04 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: DROP TABLE IF EXISTS "Filesets-F7B41C5513744440932872D9CDBC7852"  took 0:00:00:00.000
2018-09-17 07:36:04 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Controller-RunList]: Running List took 0:00:00:00.320

Each one gets less ambitious than the one before it. Maybe at some point your missing version will show up; however, if you chose to peek at the database and saw nothing added, no amount of simplified SQL is going to show it.

Thanks for the suggestion!

My guess is the “error” is handled slightly differently if the number has changed, and for whatever reason that lets the backup “succeed”; otherwise the backup is considered failed, so it doesn’t get recorded as restorable.

I’ve gotten busy so may not have much time to test anymore, but if/when I do I’ll try to update here. :slight_smile:

Another quick update here as I think I’ve figured out another part of the puzzle.

My latest error is “Failed: Unexpected number of remote volumes marked as deleted. Found 29 filesets, but 54 volumes”.

  1. In looking at my destination location I see 25 dlist files with file names that show dates OLDER than the oldest version restorable from the job menu. I’m guessing the 54 volumes found are due to the 25 “extra” dlist files (24 + 25 = 54).

  2. In my job restore menu I see 59 restorable versions, but a “Direct restore from destination” shows 87 restorable versions!

  3. The newest of the 25 “extra” dlist files happens to match the date of the newest “Direct” restore version that is NOT available on the job restore list.

  4. All 25 of the “extra” dlist files are 7z files as are the dblocks for related timeframes (because I didn’t realize 7z had issues when I started my backups then ended up switching to zip).

So - if everybody else with this error happened to have used 7z files at some point, that may be our smoking gun. (Maybe the retention policy code that’s throwing the error can’t handle the extension change.)

But I suspect that is NOT the case and that having extra dlist files of any sort will cause this error.

Thanks for your time.

In looking at my destination location I see 25 dlist files with file names that show dates OLDER than the oldest version restorable from the job menu. I’m guessing the 54 volumes found are due to the 25 “extra” dlist files (24 + 25 = 54).

I’m guessing that you meant 29 instead of 25 (for math reasons lol). But then 59 + 29 amounts to 88, not 87. Are the numbers correct?

Edit:

29 instead of 25 (for math reasons lol)

I got confused too; it should be 29 + 25, not 24 + 29. So I don’t know what the numbers should be anymore.

Yeah, I either miscounted, mistyped, or both. :blush:

I’ll try to recheck my numbers when I can get to the machine.

OK - progress has been made! Just to be clear - I’m pretty sure I brought this on myself, and am slowly looking into it just because I want to know how I managed to inflict it upon myself. :slight_smile:

To re-iterate, I started my backup with 7z (before I knew about the issues) and then switched to zip. So I now have:

  • 88 dlist files (63 zip + 25 7z)
  • 92 dindex files (92 zip and 0 7z)
  • 92 dblock files (79 zip and 13 7z)
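
In case it helps anyone else do this tally, here’s a quick Python sketch for counting destination files by type and compression module. (The filename pattern is my assumption based on the names quoted earlier in this thread, e.g. `duplicati-20170913T190637Z.dlist.7z.aes`; the folder path is hypothetical.)

```python
import os
import re
from collections import Counter

# Assumed Duplicati destination filename pattern:
#   duplicati-<...>.<dlist|dindex|dblock>.<zip|7z>[.aes]
PATTERN = re.compile(r"^duplicati-.*\.(dlist|dindex|dblock)\.(zip|7z)(\.aes)?$")

def tally(folder):
    """Count destination files grouped by (file type, compression module)."""
    counts = Counter()
    for name in os.listdir(folder):
        m = PATTERN.match(name)
        if m:
            counts[(m.group(1), m.group(2))] += 1
    return counts

# Example usage (hypothetical path):
#   for (ftype, comp), n in sorted(tally("/path/to/destination").items()):
#       print(f"{ftype}.{comp}: {n}")
```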

I suspect the all-zip dindex files are because at one point I tested deleting all my dindex files and doing a Database repair - and because I am (now) using zip, all the “fresh” dindex files came in as zip.

Now this MAY be the underlying cause of the issue - I’m not sure, but if I do a database Recreate with the above sets of files everything runs fine EXCEPT:

  • I get a “Got 25 warnings” message. This happens to match the number of 7z dlist files I have, and the 5 warnings shown in the log are all for 7z files:
2018-11-16 15:06:06 -06 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileProcessingFailed]: Failed to process file: duplicati-20170913T190637Z.dlist.7z.aes,
2018-11-16 15:06:06 -06 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileProcessingFailed]: Failed to process file: duplicati-20170921T180007Z.dlist.7z.aes,
2018-11-16 15:06:06 -06 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileProcessingFailed]: Failed to process file: duplicati-20171009T163147Z.dlist.7z.aes,
2018-11-16 15:06:07 -06 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileProcessingFailed]: Failed to process file: duplicati-20171016T180000Z.dlist.7z.aes,
2018-11-16 15:06:07 -06 - [Warning-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-FileProcessingFailed]: Failed to process file: duplicati-20171023T180000Z.dlist.7z.aes,
  • Running Select * From "Remotevolume" Where "Type" = "Files" And "ID" Not In (Select "VolumeID" From "FileSet") on my database returns 25 rows (all 7z dlist files)

  • Running Select * From "FileSet" on my database returns 63 rows with IDs starting AFTER the highest dlist.7z ID (and of course 63 dlist.zip + 25 dlist.7z = my total of 88 dlist files)
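
For anyone who wants to run the same checks without an SQL browser, here’s a small Python sketch using the standard `sqlite3` module against the job database (the `.sqlite` file named in `--dbpath`). The table and column names are taken from the queries above; everything else is my own wrapping.

```python
import sqlite3

def orphaned_dlist_volumes(db_path):
    """Return (names of dlist volumes with no matching Fileset row,
    total Fileset row count) from a Duplicati job database."""
    con = sqlite3.connect(db_path)
    try:
        # dlist files are stored with Type = 'Files' in Remotevolume
        orphans = con.execute(
            'SELECT "Name" FROM "Remotevolume"'
            ' WHERE "Type" = \'Files\''
            ' AND "ID" NOT IN (SELECT "VolumeID" FROM "Fileset")'
        ).fetchall()
        filesets = con.execute('SELECT COUNT(*) FROM "Fileset"').fetchone()[0]
        return [row[0] for row in orphans], filesets
    finally:
        con.close()
```

On my database this would return the 25 dlist.7z names plus a fileset count of 63.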

So it appears that during the database Recreate, the RemoteVolume table DOES get the 7z files noted in it, but the FileSet table does NOT (likely part of the 25 “RecreateDatabaseHandler-FileProcessingFailed” warnings).

Because of this discrepancy between the Fileset and Remotevolume tables, the DropFilesetsFromTable method (called as part of retention policy cleanup) FAILS with "Unexpected number of remote volumes marked as deleted. Found 29 filesets, but 54 volumes" (which should probably be re-worded as "Unexpected number of remote volumes (dlist files) marked as deletable. Expected 29 but found 54").
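
Purely as an illustration of what I think is happening (a Python sketch, not Duplicati’s actual C# code): retention cleanup deletes some filesets, marks the matching dlist volumes as deleted, and expects the two counts to match one-to-one. Orphaned dlist volumes inflate the volume-side count and trip the check.

```python
# Sketch of the invariant DropFilesetsFromTable appears to enforce
# (illustrative only - names and structure are my assumption):
# each deleted fileset should mark exactly one dlist volume as
# deleted, so the two counts must be equal.
def check_deleted_counts(deleted_filesets, deleted_volumes):
    if len(deleted_filesets) != len(deleted_volumes):
        raise Exception(
            "Unexpected number of remote volumes marked as deleted. "
            f"Found {len(deleted_filesets)} filesets, "
            f"but {len(deleted_volumes)} volumes"
        )
```

In my case the 25 orphaned dlist.7z volumes make the volume side 29 + 25 = 54 while only 29 filesets were selected, hence the exact error text above.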

Unfortunately, this error causes the entire backup job to be flagged as unsuccessful, so even though the backup worked (and is in fact restorable) it’s treated as a failed backup.


I don’t know if the dlist.7z files are ACTUALLY bad (meaning the Fileset table is correct) or there’s a bug in that code (meaning the RemoteVolumes table is correct) or something else completely - but at least it’s not a total black box anymore. :slight_smile:

Hopefully I’ll get some time to fine-tune the debugging in RecreateDatabaseHandler and do a local test to figure out exactly where the errors are coming from.

Of course I wouldn’t object if somebody else wanted to take a look at it as well… :wink:

I am using Duplicati to back up my personal files to B2 cloud storage. I came across this error twice by making the same stupid mistake: my C: drive was on the verge of filling up, so I deleted the files in my temporary folder to free up space, completely forgetting that Duplicati stores the dblock files there during the backup process. As soon as I deleted them I got the error “Unexpected number of remote volumes marked as deleted. Found 1 filesets, but 0 volumes” as above.

The only solution that worked for me was to delete all the files from the backup in the cloud and start the job again from scratch. I’m not as computer literate as a lot of people on this site, but I can only assume that the Duplicati software expects to see a deleted volume in the cloud but can only find a deleted fileset (i.e. the one that I deleted from the Temp folder). The folder I’m backing up contains lots of large videos, which could explain why it only finds 1 fileset.