"Unexpected number of remote volumes detected: 0!"-error

So an update:

  1. I changed the source data to only contain C:\Users\ (it used to be the whole of C:\)
  2. I ran the backup – it scanned up to roughly 70 GB, but eventually it finished (with a lot of warnings)
  3. If I now start the backup, I get the same error: Unexpected number of remote volumes detected: 0!

So I am quite a bit lost – I have 753 GB free on the source machine (of a total of 1 TB). What I also saw in the log files is that it does not seem to use VSS, as I got ‘File is in use’ errors.

One thing I am now considering is whether CrashPlan might perhaps be interfering with its own VSS snapshots?

@maarten, I was under the impression that, unlike CrashPlan, Duplicati does not make a subfolder for each backup “source”, so the “unexpected number of remote volumes detected” error CAN be caused by multiple sources backing up to the same destination folder (though I have not verified this). Could that be what’s going on for you?

I’d suggest creating a new (extra small) backup set pointing to a different destination folder and see if the error happens with that one. If not, change the destination folder to the same as your existing set and see if the error DOES happen. If it does, then I think there’s a way to do a verify or cleanup of the destination contents but I don’t recall exactly what it is. (sorry)

If multiple sources going to a single destination IS the cause of this issue and for whatever reason you don’t want to set up multiple destinations, I believe you can use the --prefix parameter to let multiple backup jobs share a single destination folder. (Note that this is NOT a way to get de-duplication across multiple sources.)

Also, unless (and sometimes even if) you’re running Duplicati under a system account, certain files (think NTUSER.DAT, temp files, files open in some apps) are still locked – even with the Volume Shadow Copy Service. Even with just C:\Users I expect you’ll want to include some exclusions, such as skipping the Duplicati control dir (which will always be in use during a backup). I do that with an “Exclude Regular Expression” similar to the one below (there’s a quick way to sanity-check it after the breakdown):

.:\\Users\\[^\\]*\\AppData\\Roaming\\Duplicati\\control_dir.*\\

Which pretty much just says:

  • any drive letter
  • followed by “Users”
  • then any SINGLE user name folder
  • then "\AppData\Roaming\Duplicati\"
  • then any folder that starts with “control_dir” (because that folder is named differently for different versions of Duplicati)
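
If you want to sanity-check the expression before trusting it with your backup, a tiny C# snippet like this works (the control_dir_v2 folder name below is just a made-up example):

    using System;
    using System.Text.RegularExpressions;

    class ExcludeCheck
    {
        static void Main()
        {
            // The exclude expression from above
            var exclude = new Regex(@".:\\Users\\[^\\]*\\AppData\\Roaming\\Duplicati\\control_dir.*\\");

            // Made-up sample paths: the first should match (be excluded), the second should not
            string[] samples =
            {
                @"C:\Users\maarten\AppData\Roaming\Duplicati\control_dir_v2\",
                @"C:\Users\maarten\Documents\notes.txt",
            };

            foreach (var path in samples)
                Console.WriteLine($"{exclude.IsMatch(path)}  {path}");
        }
    }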

Good luck!

No, I am only testing with one machine. The first time it failed (hence the backupset-id 2) I also removed the whole folder on the target system.

I’ll test with a new backup though, as that is a simple thing to do.

Hmm, that backup (no exclusions, just my home folder, a different destination folder) also finished with errors, unfortunately:

disk i/o error

Fatal error
System.Data.SQLite.SQLiteException (0x80004005): disk I/O error
disk I/O error
   at System.Data.SQLite.SQLite3.Reset(SQLiteStatement stmt)
   at System.Data.SQLite.SQLite3.Step(SQLiteStatement stmt)
   at System.Data.SQLite.SQLiteDataReader.NextResult()
   at System.Data.SQLite.SQLiteDataReader..ctor(SQLiteCommand cmd, CommandBehavior behave)
   at System.Data.SQLite.SQLiteCommand.ExecuteReader(CommandBehavior behavior)
   at System.Data.SQLite.SQLiteCommand.ExecuteNonQuery(CommandBehavior behavior)
   at Duplicati.Library.Main.BasicResults.LogDbMessage(String type, String message, Exception ex)
   at Duplicati.Library.Main.BasicResults.AddWarning(String message, Exception ex)
   at Duplicati.Library.Main.Operation.BackupHandler.HandleFilesystemEntry(ISnapshotService snapshot, BackendManager backend, String path, FileAttributes attributes)
   at Duplicati.Library.Main.Operation.BackupHandler.RunMainOperation(ISnapshotService snapshot, BackendManager backend)
   at Duplicati.Library.Main.Operation.BackupHandler.Run(String[] sources, IFilter filter)

And some other SQLite errors which are related, I think. Before the SQLite error, I get quite a few “File in use” warnings, so it really seems it is not using VSS at all. Is there somewhere I can check that?

If you set --snapshot-policy=required Duplicati will refuse to run if it cannot start a VSS snapshot.

Thanks for the hint! :slight_smile: So I tried this command line, backing up into a new subdirectory on the target (my earlier attempts started the backup from the web UI):

C:\Program Files\Duplicati 2>Duplicati.CommandLine.exe backup "ssh://192.168.0.133//storage/duplicati-test/test3?auth-username=maarten&auth-password=xxx&ssh-fingerprint=ssh-rsa%202048" C:\Users\ --snapshot-policy=required

And eventually got this result:

I/O error
  98087 files need to be examined (24.44 GB)
  97099 files need to be examined (24.44 GB)
  Uploading file (49.98 MB) ...
  96968 files need to be examined (24.43 GB)
  Uploading file (466.22 KB) ...
  96637 files need to be examined (24.38 GB)
  96053 files need to be examined (24.37 GB)
  95539 files need to be examined (24.34 GB)
Failed to process path: C:\Users\maarten\Desktop\Saved Desktop (24 mei 2017)\cutedog.mp4 => disk I/O error
disk I/O error
  95529 files need to be examined (24.33 GB)
Fatal error => disk I/O error
disk I/O error
Rollback error: unknown error
No transaction is active on this connection => unknown error
No transaction is active on this connection
Failed disposing index volume => Attempted to dispose an index volume that was being written
  0 files need to be examined (0 bytes)

System.Data.SQLite.SQLiteException (0x80004005): disk I/O error
disk I/O error
   at System.Data.SQLite.SQLite3.Reset(SQLiteStatement stmt)
   at System.Data.SQLite.SQLite3.Step(SQLiteStatement stmt)
   at System.Data.SQLite.SQLiteDataReader.NextResult()
   at System.Data.SQLite.SQLiteDataReader..ctor(SQLiteCommand cmd, CommandBehavior behave)
   at System.Data.SQLite.SQLiteCommand.ExecuteReader(CommandBehavior behavior)
   at System.Data.SQLite.SQLiteCommand.ExecuteNonQuery(CommandBehavior behavior)
   at Duplicati.Library.Main.BasicResults.FlushLog()
   at Duplicati.Library.Main.BasicResults.LogDbMessage(String type, String message, Exception ex)
   at Duplicati.Library.Main.BasicResults.AddError(String message, Exception ex)
   at Duplicati.Library.Main.Operation.BackupHandler.Run(String[] sources, IFilter filter)
   at Duplicati.Library.Main.Controller.<>c__DisplayClass16_0.<Backup>b__0(BackupResults result)
   at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)
   at Duplicati.Library.Main.Controller.Backup(String[] inputsources, IFilter filter)
   at Duplicati.CommandLine.Commands.Backup(TextWriter outwriter, Action`1 setup, List`1 args, Dictionary`2 options, IFilter filter)
   at Duplicati.CommandLine.Program.RunCommandLine(TextWriter outwriter, TextWriter errwriter, Action`1 setup, String[] args)

C:\Program Files\Duplicati 2>

Is it somehow possible to show the actual Win32 error code when we get the I/O error? I checked the system log, hardware diagnostics log, etc., and no actual disk errors are logged, so my guess is it is something else.
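
For what it’s worth, when the failure comes from a Win32 call, the low 16 bits of the exception’s HResult should carry the Win32 error code (HRESULT_FROM_WIN32), so something like this could be used to log it – just a sketch, not how Duplicati reports errors today. SQLite’s own “disk I/O error” is an SQLite result code, though, so it may not map cleanly:

    using System;
    using System.ComponentModel;
    using System.IO;

    static class Win32ErrorLogging
    {
        static void Main()
        {
            try
            {
                // Just to provoke an IOException: the page file is normally locked
                using (File.OpenRead(@"C:\pagefile.sys")) { }
            }
            catch (IOException ex)
            {
                int win32Code = ex.HResult & 0xFFFF;                         // low word of HRESULT_FROM_WIN32(...)
                string description = new Win32Exception(win32Code).Message;  // human-readable text for that code
                Console.WriteLine($"I/O error: {ex.Message} (Win32 error {win32Code}: {description})");
            }
        }
    }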

I just checked using vssadmin list shadows and noticed I had 3 older snapshots listed, so I deleted them (vssadmin delete shadows /for=c:) and am now trying to run the backup again.

OK, another update – I hope this helps in troubleshooting what is going wrong. The “Object reference not set to an instance of an object” errors keep coming.

I now tried to start the backup again and wanted to troubleshoot the ‘Cannot process this file’ issue. I did this (copy & paste from the command line follows):

  1. Deleted all shadow copies that I could delete
  2. Started Duplicati with the option to require a snapshot
  3. Destination for the backup already exists, but never finished correctly
  4. I started the backup until the first error occurred, then I Ctrl+C’ed it.
  5. I mounted the created shadow copy manually
  6. I checked if I could access the file from the shadow copy --> I could
Command transcript
C:\Program Files\Duplicati 2>vssadmin list shadows
vssadmin 1.1 - Volume Shadow Copy Service administrative command-line tool
(C) Copyright 2001-2013 Microsoft Corp.

Contents of shadow copy set ID: {3bc921d1-4127-4423-a698-e96089829a72}
   Contained 1 shadow copies at creation time: 9-9-2017 08:58:09
      Shadow Copy ID: {13d4b8e7-befa-45fd-aa11-9a92f4781ff1}
         Original Volume: (C:)\\?\Volume{8dd32c78-d9c9-4cf0-9d00-6a32c6265d8d}\
         Shadow Copy Volume: \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy1986
         Originating Machine: HANNIBAL
         Service Machine: HANNIBAL
         Provider: 'Microsoft Software Shadow Copy provider 1.0'
         Type: Backup
         Attributes: Differential, Auto recovered


C:\Program Files\Duplicati 2>vssadmin delete shadows /for=C:
vssadmin 1.1 - Volume Shadow Copy Service administrative command-line tool
(C) Copyright 2001-2013 Microsoft Corp.

Error: Snapshots were found, but they were outside of your allowed context.  Try removing them with the
backup application which created them.


C:\Program Files\Duplicati 2>Duplicati.CommandLine.exe backup "ssh://192.168.0.133//storage/duplicati-test/test3?auth-username=maarten&auth-password=xxx" C:\Users\ --snapshot-policy=required
Backup started at 9-9-2017 09:42:24

Enter encryption passphrase: ****

Confirm encryption passphrase: ****
Checking remote backup ...
  Listing remote folder ...
removing file listed as Temporary: duplicati-20170909T065805Z.dlist.zip.aes
promoting uploaded complete file from Uploading to Uploaded: duplicati-bc5f57b001771456d92d0e50472d87c4d.dblock.zip.aes
removing incomplete remote file listed as Uploading: duplicati-i4fc474d965bc44f787cbac400daa39bd.dindex.zip.aes
  Deleting file duplicati-i4fc474d965bc44f787cbac400daa39bd.dindex.zip.aes ...
Expected there to be a temporary fileset for synthetic filelist (3, duplicati-i05c70c92d35941409b7dfa45b3359cb5.dindex.zip.aes), but none was found?
Re-creating missing index file for duplicati-bc5f57b001771456d92d0e50472d87c4d.dblock.zip.aes
  Uploading file (22.01 KB) ...
Scanning local files ...
  167740 files need to be examined (67.00 GB)
  150832 files need to be examined (66.78 GB)
  135021 files need to be examined (66.56 GB)
  117638 files need to be examined (66.43 GB)
  101357 files need to be examined (24.71 GB)
  96588 files need to be examined (24.64 GB)
  95928 files need to be examined (24.62 GB)
  95572 files need to be examined (24.58 GB)
  95556 files need to be examined (22.24 GB)
  95259 files need to be examined (22.16 GB)
  95127 files need to be examined (21.98 GB)
  95116 files need to be examined (21.90 GB)
  95114 files need to be examined (19.53 GB)
  95107 files need to be examined (16.47 GB)
  95106 files need to be examined (15.60 GB)
  95105 files need to be examined (14.73 GB)
  95104 files need to be examined (14.49 GB)
  Uploading file (49.98 MB) ...
  Uploading file (21.22 KB) ...
  Uploading file (49.94 MB) ...
  Uploading file (17.97 KB) ...
  Uploading file (49.93 MB) ...
  Uploading file (20.76 KB) ...
  Uploading file (49.94 MB) ...
  Uploading file (24.04 KB) ...
  Uploading file (49.98 MB) ...
  Uploading file (19.34 KB) ...
  Uploading file (49.91 MB) ...
  Uploading file (120.95 KB) ...
  Uploading file (50.00 MB) ...
  Uploading file (17.98 KB) ...
  Uploading file (49.93 MB) ...
  Uploading file (25.72 KB) ...
  95102 files need to be examined (13.67 GB)
  Uploading file (49.95 MB) ...
  Uploading file (94.18 KB) ...
  Uploading file (49.94 MB) ...
  Uploading file (29.78 KB) ...
  Uploading file (49.93 MB) ...
  Uploading file (45.86 KB) ...
  95090 files need to be examined (13.41 GB)
  Uploading file (49.91 MB) ...
  Uploading file (105.87 KB) ...
  95080 files need to be examined (13.34 GB)
  95074 files need to be examined (13.33 GB)
  Uploading file (49.96 MB) ...
  95067 files need to be examined (13.30 GB)
  Uploading file (65.48 KB) ...
  95066 files need to be examined (13.30 GB)
  95065 files need to be examined (13.26 GB)
  Uploading file (49.93 MB) ...
  95063 files need to be examined (13.25 GB)
  Uploading file (44.29 KB) ...
  95059 files need to be examined (13.19 GB)
  Uploading file (49.94 MB) ...
  Uploading file (40.40 KB) ...
  Uploading file (49.96 MB) ...
  95051 files need to be examined (13.11 GB)
  Uploading file (25.03 KB) ...
  95041 files need to be examined (13.09 GB)
  95033 files need to be examined (13.06 GB)
  Uploading file (49.97 MB) ...
  Uploading file (56.89 KB) ...
  95032 files need to be examined (13.02 GB)
  95013 files need to be examined (12.95 GB)
  Uploading file (49.94 MB) ...
  Uploading file (50.48 KB) ...
  94982 files need to be examined (12.91 GB)
  94517 files need to be examined (12.76 GB)
  Uploading file (49.95 MB) ...
  94386 files need to be examined (12.64 GB)
  Uploading file (87.01 KB) ...
  94201 files need to be examined (12.39 GB)
  94083 files need to be examined (12.20 GB)
  93973 files need to be examined (12.02 GB)
  93712 files need to be examined (11.97 GB)
  Uploading file (49.95 MB) ...
  Uploading file (101.81 KB) ...
  Uploading file (49.92 MB) ...
  93695 files need to be examined (11.90 GB)
  Uploading file (37.50 KB) ...
  Uploading file (49.92 MB) ...
  Uploading file (37.90 KB) ...
  93676 files need to be examined (11.82 GB)
  Uploading file (49.94 MB) ...
  Uploading file (37.28 KB) ...
  93629 files need to be examined (11.76 GB)
  Uploading file (49.91 MB) ...
  Uploading file (43.95 KB) ...
  93474 files need to be examined (11.72 GB)
  Uploading file (49.92 MB) ...
  93354 files need to be examined (11.66 GB)
  Uploading file (70.87 KB) ...
  93204 files need to be examined (11.61 GB)
  92979 files need to be examined (11.58 GB)
  Uploading file (49.91 MB) ...
  92967 files need to be examined (11.57 GB)
  Uploading file (103.08 KB) ...
  92895 files need to be examined (11.54 GB)
  92737 files need to be examined (11.52 GB)
  92402 files need to be examined (11.47 GB)
  Uploading file (49.97 MB) ...
  Uploading file (93.20 KB) ...
  92049 files need to be examined (11.46 GB)
  91226 files need to be examined (11.45 GB)
  90551 files need to be examined (11.45 GB)
  89788 files need to be examined (11.44 GB)
  88908 files need to be examined (11.43 GB)
  88341 files need to be examined (11.42 GB)
  87888 files need to be examined (11.42 GB)
  87039 files need to be examined (11.41 GB)
  86185 files need to be examined (11.41 GB)
  85308 files need to be examined (11.40 GB)
  84609 files need to be examined (11.40 GB)
  84007 files need to be examined (11.40 GB)
  83205 files need to be examined (11.39 GB)
  82492 files need to be examined (11.39 GB)
  81650 files need to be examined (11.38 GB)
  80960 files need to be examined (11.38 GB)
  80627 files need to be examined (11.34 GB)
  79892 files need to be examined (11.33 GB)
  79074 files need to be examined (11.32 GB)
  78276 files need to be examined (11.31 GB)
  77687 files need to be examined (11.30 GB)
  77045 files need to be examined (11.29 GB)
  76404 files need to be examined (11.28 GB)
  75794 files need to be examined (11.27 GB)
  75102 files need to be examined (11.26 GB)
  74496 files need to be examined (11.25 GB)
  73767 files need to be examined (11.24 GB)
  73054 files need to be examined (11.22 GB)
  72400 files need to be examined (11.21 GB)
  71695 files need to be examined (11.20 GB)
  71093 files need to be examined (11.19 GB)
  70727 files need to be examined (11.18 GB)
  Uploading file (49.95 MB) ...
  Uploading file (875.98 KB) ...
  Uploading file (49.99 MB) ...
  70724 files need to be examined (11.10 GB)
  Uploading file (17.97 KB) ...
  70719 files need to be examined (11.08 GB)
  70710 files need to be examined (11.06 GB)
  Uploading file (49.94 MB) ...
  Uploading file (76.45 KB) ...
  70708 files need to be examined (10.97 GB)
  70646 files need to be examined (10.89 GB)
  70633 files need to be examined (10.78 GB)
  Uploading file (49.92 MB) ...
  Uploading file (123.29 KB) ...
  70557 files need to be examined (10.59 GB)
  70516 files need to be examined (10.51 GB)
  70419 files need to be examined (10.49 GB)
  Uploading file (49.98 MB) ...
  Uploading file (123.93 KB) ...
  70415 files need to be examined (10.44 GB)
  Uploading file (49.93 MB) ...
  Uploading file (35.33 KB) ...
  70413 files need to be examined (10.39 GB)
  Uploading file (49.91 MB) ...
  Uploading file (34.67 KB) ...
  70411 files need to be examined (10.34 GB)
  Uploading file (49.96 MB) ...
  Uploading file (40.04 KB) ...
  70409 files need to be examined (10.33 GB)
  Uploading file (49.93 MB) ...
  Uploading file (34.36 KB) ...
  70405 files need to be examined (10.23 GB)
  70272 files need to be examined (10.19 GB)
  Uploading file (49.91 MB) ...
  Uploading file (54.40 KB) ...
  70260 files need to be examined (10.17 GB)
  Uploading file (49.93 MB) ...
  70177 files need to be examined (10.15 GB)
  Uploading file (35.53 KB) ...
  69390 files need to be examined (10.07 GB)
  68026 files need to be examined (9.99 GB)
  66972 files need to be examined (9.92 GB)
  66195 files need to be examined (9.89 GB)
  65347 files need to be examined (9.84 GB)
  64171 files need to be examined (9.81 GB)
  Uploading file (49.91 MB) ...
  62800 files need to be examined (9.79 GB)
  Uploading file (194.42 KB) ...
  61119 files need to be examined (9.76 GB)
  59417 files need to be examined (9.73 GB)
  57445 files need to be examined (9.69 GB)
  55451 files need to be examined (9.66 GB)
  Uploading file (49.97 MB) ...
  Uploading file (198.56 KB) ...
  54680 files need to be examined (9.63 GB)
  53353 files need to be examined (9.61 GB)
  51955 files need to be examined (9.58 GB)
  Uploading file (49.92 MB) ...
  Uploading file (155.43 KB) ...
  51081 files need to be examined (9.57 GB)
  49653 files need to be examined (9.54 GB)
  47797 files need to be examined (9.52 GB)
  46058 files need to be examined (9.49 GB)
  Uploading file (49.97 MB) ...
  Uploading file (224.86 KB) ...
  44374 files need to be examined (9.47 GB)
  42685 files need to be examined (9.44 GB)
  42399 files need to be examined (9.40 GB)
  42323 files need to be examined (9.38 GB)
  Uploading file (49.91 MB) ...
  42034 files need to be examined (9.35 GB)
  Uploading file (215.28 KB) ...
  41654 files need to be examined (9.32 GB)
  40924 files need to be examined (9.29 GB)
  40491 files need to be examined (9.29 GB)
  40247 files need to be examined (9.28 GB)
  39992 files need to be examined (9.27 GB)
  39712 files need to be examined (9.26 GB)
  39447 files need to be examined (9.25 GB)
  Uploading file (49.91 MB) ...
  Uploading file (259.31 KB) ...
  39272 files need to be examined (9.24 GB)
  38663 files need to be examined (9.23 GB)
Failed to process path: C:\Users\maarten\AppData\Local\Packages\Microsoft.Microsoft3DViewer_8wekyb3d8bbwe\LocalCache\ => I/O error occurred.
  37211 files need to be examined (9.19 GB)
  Uploading file (49.93 MB) ...
  Uploading file (189.25 KB) ...
  36244 files need to be examined (9.17 GB)
  Uploading file (49.92 MB) ...
  35280 files need to be examined (9.14 GB)
  Uploading file (126.67 KB) ...
  34197 files need to be examined (9.11 GB)
  Uploading file (49.91 MB) ...
  33297 files need to be examined (9.08 GB)
  Uploading file (127.04 KB) ...
  32035 files need to be examined (9.05 GB)
  Uploading file (49.93 MB) ...
  Uploading file (125.50 KB) ...
  31012 files need to be examined (9.02 GB)
  29905 files need to be examined (8.99 GB)
  Uploading file (49.92 MB) ...
  Uploading file (126.89 KB) ...
  29216 files need to be examined (8.97 GB)
^C
C:\Program Files\Duplicati 2>
C:\Program Files\Duplicati 2>vssadmin list shadows
vssadmin 1.1 - Volume Shadow Copy Service administrative command-line tool
(C) Copyright 2001-2013 Microsoft Corp.

Contents of shadow copy set ID: {afb6aaf0-033b-40ba-999f-6545711b2b48}
   Contained 1 shadow copies at creation time: 9-9-2017 09:42:31
      Shadow Copy ID: {a4019d4e-0b6b-4723-b095-f614b58549cd}
         Original Volume: (C:)\\?\Volume{8dd32c78-d9c9-4cf0-9d00-6a32c6265d8d}\
         Shadow Copy Volume: \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy1991
         Originating Machine: HANNIBAL
         Service Machine: HANNIBAL
         Provider: 'Microsoft Software Shadow Copy provider 1.0'
         Type: Backup
         Attributes: Differential, Auto recovered

Contents of shadow copy set ID: {6b3a68ca-c0e2-4667-aefa-7bb24f1bf445}
   Contained 1 shadow copies at creation time: 9-9-2017 09:52:40
      Shadow Copy ID: {631296d8-9a61-4587-93e5-0298805d1b9c}
         Original Volume: (C:)\\?\Volume{8dd32c78-d9c9-4cf0-9d00-6a32c6265d8d}\
         Shadow Copy Volume: \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy1994
         Originating Machine: HANNIBAL
         Service Machine: HANNIBAL
         Provider: 'Microsoft Software Shadow Copy provider 1.0'
         Type: ClientAccessible
         Attributes: Persistent, Client-accessible, No auto release, No writers, Differential


C:\Program Files\Duplicati 2>mklink
Creates a symbolic link.

MKLINK [[/D] | [/H] | [/J]] Link Target

        /D      Creates a directory symbolic link.  Default is a file
                symbolic link.
        /H      Creates a hard link instead of a symbolic link.
        /J      Creates a Directory Junction.
        Link    Specifies the new symbolic link name.
        Target  Specifies the path (relative or absolute) that the new link
                refers to.

C:\Program Files\Duplicati 2>mklink /d C:\faq\bliep \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy1994\
Cannot create a file when that file already exists.

C:\Program Files\Duplicati 2>mklink /d C:\faq\bliep2 \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy1994\
symbolic link created for C:\faq\bliep2 <<===>> \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy1994\

C:\Program Files\Duplicati 2>dir \faq\bliep2\Users\maarten\AppData\Local\Packages\Microsoft.Microsoft3DViewer_8wekyb3d8bbwe\LocalCache\
 Volume in drive C is OS
 Volume Serial Number is 1A45-49F8

 Directory of C:\faq\bliep2\Users\maarten\AppData\Local\Packages\Microsoft.Microsoft3DViewer_8wekyb3d8bbwe\LocalCache

29-05-2017  18:37    <DIR>          .
29-05-2017  18:37    <DIR>          ..
               0 File(s)              0 bytes
               2 Dir(s)  855.477.706.752 bytes free

C:\Program Files\Duplicati 2>dir \faq\bliep2\Users\maarten\AppData\Local\Packages\Microsoft.Microsoft3DViewer_8wekyb3d8bbwe\LocalCache\ /a
 Volume in drive C is OS
 Volume Serial Number is 1A45-49F8

 Directory of C:\faq\bliep2\Users\maarten\AppData\Local\Packages\Microsoft.Microsoft3DViewer_8wekyb3d8bbwe\LocalCache

29-05-2017  18:37    <DIR>          .
29-05-2017  18:37    <DIR>          ..
               0 File(s)              0 bytes
               2 Dir(s)  855.477.706.752 bytes free

C:\Program Files\Duplicati 2>

Another update – when run with verbose logging:

Adding directory C:\Users\Maarten\AppData\Local\Packages\Microsoft.Microsoft3DViewer_8wekyb3d8bbwe\
Storing symlink: C:\Users\Maarten\AppData\Local\Packages\Microsoft.Microsoft3DViewer_8wekyb3d8bbwe\LocalCache\

On the command line:

C:\Users\maarten\AppData\Local\Packages\Microsoft.Microsoft3DViewer_8wekyb3d8bbwe>fsutil reparsepoint query LocalCache
Reparse Tag Value : 0x80000018
Tag value: Microsoft

Reparse Data Length: 0x0000001a
Reparse Data:
0000:  01 00 00 00 00 00 00 00  14 1f a9 c8 4c fd 45 48  ............L.EH
0010:  ab 4d 49 51 e0 e9 f3 23  00 00                    .MIQ...#..

C:\Users\maarten\AppData\Local\Packages\Microsoft.Microsoft3DViewer_8wekyb3d8bbwe>

In BackupHandler.cs:976 I see the call to snapshot.GetSymlinkTarget – the path is translated to the ‘D:’ drive (I added the parameter to mount the snapshot as a subst drive). In WindowsSnapshot.cs the call to Alphaleonis.Win32.Filesystem.File.GetLinkTargetInfo(spath).PrintName fails immediately.

With the exception: “I/O error occurred”.

Anybody have any idea why the AlphaFS library (Alphaleonis.Win32.Filesystem) seems to have trouble querying this path while Windows itself is able to?


OK, another update – apparently the issue with Duplicati being confused by reparse points which are not hardlinks/junctions is a known issue: UnrecognizedReparsePointException · Issue #1727 · duplicati/duplicati · GitHub

I wonder if it would not be possible to inspect the reparse point to see if it is an actual link, as described in “Determining Whether a Directory Is a Mounted Folder” (Windows)?
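
I imagine something along those lines is doable. A minimal sketch (my own names, not Duplicati’s actual code) that reads the reparse tag via FindFirstFile and only treats the two well-known link tags as real links – anything else, like the 0x80000018 tag above, would then be handled as an ordinary directory:

    using System;
    using System.IO;
    using System.Runtime.InteropServices;

    static class ReparseCheck
    {
        const uint FILE_ATTRIBUTE_REPARSE_POINT = 0x400;
        const uint IO_REPARSE_TAG_MOUNT_POINT   = 0xA0000003; // junctions / mounted folders
        const uint IO_REPARSE_TAG_SYMLINK       = 0xA000000C; // symbolic links

        [StructLayout(LayoutKind.Sequential, CharSet = CharSet.Unicode)]
        struct WIN32_FIND_DATA
        {
            public uint dwFileAttributes;
            public System.Runtime.InteropServices.ComTypes.FILETIME ftCreationTime, ftLastAccessTime, ftLastWriteTime;
            public uint nFileSizeHigh, nFileSizeLow;
            public uint dwReserved0;   // holds the reparse tag when FILE_ATTRIBUTE_REPARSE_POINT is set
            public uint dwReserved1;
            [MarshalAs(UnmanagedType.ByValTStr, SizeConst = 260)] public string cFileName;
            [MarshalAs(UnmanagedType.ByValTStr, SizeConst = 14)]  public string cAlternateFileName;
        }

        [DllImport("kernel32.dll", CharSet = CharSet.Unicode, SetLastError = true)]
        static extern IntPtr FindFirstFile(string lpFileName, out WIN32_FIND_DATA lpFindFileData);

        [DllImport("kernel32.dll")]
        static extern bool FindClose(IntPtr hFindFile);

        // True only for reparse points that are actually links (junctions or symlinks);
        // any other reparse tag would just be treated as a normal directory/file.
        public static bool IsActualLink(string path)
        {
            IntPtr handle = FindFirstFile(path.TrimEnd('\\'), out WIN32_FIND_DATA data);
            if (handle == new IntPtr(-1))
                throw new IOException("FindFirstFile failed", Marshal.GetLastWin32Error());
            FindClose(handle);

            if ((data.dwFileAttributes & FILE_ATTRIBUTE_REPARSE_POINT) == 0)
                return false;

            return data.dwReserved0 == IO_REPARSE_TAG_MOUNT_POINT
                || data.dwReserved0 == IO_REPARSE_TAG_SYMLINK;
        }
    }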

(new reply because I am only allowed to put 2 links in a post):

And another update. I am now trying to run a full backup of the system and running into “Access denied” warnings. I checked with Process Explorer and SeBackupPrivilege is not enabled on the process, which is the most likely reason we cannot back up files we do not have access to.

When I look into the code, I do see native code trying to enable the privilege, but I do not think it is actually used anymore. I found this thread on Google Groups where @kenkendk asks whether there is any use for the privilege when using VSS. I think being able (when the privilege is enabled and the user has admin permissions) to back up files without having the appropriate DACL on the file is very useful.
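
For reference, the usual way to enable SeBackupPrivilege on the current process token looks roughly like this (a minimal P/Invoke sketch, not the code that is or was in Duplicati; it only succeeds if the account actually holds the privilege, e.g. Administrators or Backup Operators):

    using System;
    using System.Runtime.InteropServices;

    static class BackupPrivilege
    {
        [StructLayout(LayoutKind.Sequential)]
        struct LUID { public uint LowPart; public int HighPart; }

        [StructLayout(LayoutKind.Sequential)]
        struct TOKEN_PRIVILEGES { public uint PrivilegeCount; public LUID Luid; public uint Attributes; }

        const uint SE_PRIVILEGE_ENABLED    = 0x2;
        const uint TOKEN_ADJUST_PRIVILEGES = 0x20;
        const uint TOKEN_QUERY             = 0x8;

        [DllImport("advapi32.dll", SetLastError = true)]
        static extern bool OpenProcessToken(IntPtr ProcessHandle, uint DesiredAccess, out IntPtr TokenHandle);

        [DllImport("advapi32.dll", CharSet = CharSet.Unicode, SetLastError = true)]
        static extern bool LookupPrivilegeValue(string lpSystemName, string lpName, out LUID lpLuid);

        [DllImport("advapi32.dll", SetLastError = true)]
        static extern bool AdjustTokenPrivileges(IntPtr TokenHandle, bool DisableAllPrivileges,
            ref TOKEN_PRIVILEGES NewState, uint BufferLength, IntPtr PreviousState, IntPtr ReturnLength);

        [DllImport("kernel32.dll")]
        static extern IntPtr GetCurrentProcess();

        // Enables SeBackupPrivilege for the current process token.
        public static bool Enable()
        {
            if (!OpenProcessToken(GetCurrentProcess(), TOKEN_ADJUST_PRIVILEGES | TOKEN_QUERY, out IntPtr token))
                return false;

            if (!LookupPrivilegeValue(null, "SeBackupPrivilege", out LUID luid))
                return false;

            var tp = new TOKEN_PRIVILEGES { PrivilegeCount = 1, Luid = luid, Attributes = SE_PRIVILEGE_ENABLED };
            // AdjustTokenPrivileges can return true even if nothing was changed, so check GetLastError too
            return AdjustTokenPrivileges(token, false, ref tp, 0, IntPtr.Zero, IntPtr.Zero)
                && Marshal.GetLastWin32Error() == 0; // 0 = ERROR_SUCCESS, 1300 = ERROR_NOT_ALL_ASSIGNED
        }
    }

With the privilege enabled, files can then be opened with FILE_FLAG_BACKUP_SEMANTICS and read regardless of their DACL.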

I have re-opened the issue and made a link to your detailed walkthrough of the problem.

Thanks for reopening that issue :slight_smile:

About the null object reference, I think I have collected some additional information. Sometimes SQLite throws a ‘disk I/O error’ (not to be confused with a general ‘I/O error’), and this is the actual stack trace:

   at System.Data.SQLite.SQLite3.Reset(SQLiteStatement stmt)
   at System.Data.SQLite.SQLite3.Step(SQLiteStatement stmt)
   at System.Data.SQLite.SQLiteDataReader.NextResult()
   at System.Data.SQLite.SQLiteDataReader..ctor(SQLiteCommand cmd, CommandBehavior behave)
   at System.Data.SQLite.SQLiteCommand.ExecuteReader(CommandBehavior behavior)
   at System.Data.SQLite.SQLiteCommand.ExecuteNonQuery(CommandBehavior behavior)
   at System.Data.SQLite.SQLiteTransaction.Commit()
   at Duplicati.Library.Main.Operation.BackupHandler.AddBlockToOutput(BackendManager backend, String key, Byte[] data, Int32 offset, Int32 len, CompressionHint hint, Boolean isBlocklistData) in C:\Users\maarten\source\repos\duplicati\Duplicati\Library\Main\Operation\BackupHandler.cs:line 1280
   at Duplicati.Library.Main.Operation.BackupHandler.ProcessStream(Stream stream, CompressionHint hint, BackendManager backend, FileBackedStringList blocklisthashes, FileBackedStringList hashcollector, Boolean skipfilehash) in C:\Users\maarten\source\repos\duplicati\Duplicati\Library\Main\Operation\BackupHandler.cs:line 1177
   at Duplicati.Library.Main.Operation.BackupHandler.HandleFilesystemEntry(ISnapshotService snapshot, BackendManager backend, String path, FileAttributes attributes) in C:\Users\maarten\source\repos\duplicati\Duplicati\Library\Main\Operation\BackupHandler.cs:line 1064

Because in BackupHandler.cs on line 1273 we close the block writer, we also dispose the m_compression handler. Because the Commit() fails, we no longer have any way to compress blocks, but our exception handler just treats it like a temporary error even though the internal state is broken at that point.

After this error, m_compression in BlockVolumeWriter.cs is null, which then starts generating the null object reference exceptions.

So I think we have two issues:

  1. The error in this case is irrecoverable the way the code is currently written. I think we might be better off making the exception handler of HandleFilesystemEntry a bit more specific (e.g. catch UnauthorizedAccessException explicitly, and perhaps other errors as well), and if we get an unknown error, just shut the backup down and exit, as it is unreliable at that point? (Something like the sketch after this list.)

  2. Because this error happens very irregularly, I suspect some sort of race condition in accessing the SQLite library perhaps? When I try to interpret the code, I see we lock in FlushDbMessages, but we only lock a single statement if I understand correctly – could it be that we are committing twice because of a race?
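
For point 1, the shape I have in mind is roughly this (just a sketch of the idea with stand-in names, not a patch against the real HandleFilesystemEntry):

    using System;
    using System.IO;

    class EntryProcessingSketch
    {
        // Stand-in for Duplicati's real result/logging object -- hypothetical
        static void AddWarning(string message, Exception ex) =>
            Console.WriteLine($"Warning: {message} ({ex.GetType().Name})");

        static void HandleEntry(string path)
        {
            try
            {
                using (var s = File.OpenRead(path)) { /* ... hash and upload blocks ... */ }
            }
            catch (UnauthorizedAccessException ex)
            {
                AddWarning($"Access denied to {path}", ex);   // expected per-file problem: warn and continue
            }
            catch (IOException ex)
            {
                AddWarning($"Failed to process {path}", ex);  // locked or vanished file: warn and continue
            }
            // Anything else (like the SQLiteException above) may have left shared state
            // (the transaction, the block writer) broken, so let it propagate and abort the backup.
        }

        static void Main()
        {
            foreach (var path in new[] { @"C:\Windows\notepad.exe", @"C:\pagefile.sys", @"C:\does-not-exist.bin" })
                HandleEntry(path);
        }
    }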

Yes, you are correct, this is a bad way to handle it. If there is an error during AddBlockToOutput it should abort the backup and not try to continue.

That is possible, but the lock should guarantee that we do not lose entries while we switch the list of pending operations. The method should only be activated on the “main” thread.

I have another version mostly ready, in which the database is guarded by a single thread and a kind of request-API that should guarantee that there are no races:

https://github.com/duplicati/duplicati/tree/concurrent_processing

If you are adventurous, maybe you can try that version and see if it solves at least the race issue?
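
For anyone curious, the general shape of that approach – a single dedicated database thread draining a request queue, so the SQLite connection is never used from two threads at once – is roughly this (a heavily simplified sketch with made-up names, not the actual branch code):

    using System;
    using System.Collections.Concurrent;
    using System.Threading;
    using System.Threading.Tasks;

    // Callers never touch the database directly; they post work to the guard
    // and the guard runs everything on its single worker thread.
    class DatabaseGuard : IDisposable
    {
        private readonly BlockingCollection<Action> m_requests = new BlockingCollection<Action>();
        private readonly Thread m_worker;

        public DatabaseGuard()
        {
            m_worker = new Thread(() =>
            {
                foreach (var work in m_requests.GetConsumingEnumerable())
                    work();
            });
            m_worker.Start();
        }

        // The "request API": post an operation and get its result back as a Task
        public Task<T> Run<T>(Func<T> dbOperation)
        {
            var tcs = new TaskCompletionSource<T>();
            m_requests.Add(() =>
            {
                try { tcs.SetResult(dbOperation()); }
                catch (Exception ex) { tcs.SetException(ex); }
            });
            return tcs.Task;
        }

        public void Dispose()
        {
            m_requests.CompleteAdding();
            m_worker.Join();
        }
    }

    class Demo
    {
        static void Main()
        {
            using (var guard = new DatabaseGuard())
            {
                // Pretend "query": all callers funnel through the guard's single thread
                var answer = guard.Run(() => 42).Result;
                Console.WriteLine(answer);
            }
        }
    }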

8 posts were split to a new topic: Building Duplicati with VS fails

Hi,
I am also trying to switch from CrashPlan to Duplicati and I am getting the same error message as maarten. I have successfully created a 400 GB backup on a removable hard drive. I have tried to do the same backup to Backblaze, but it fails after around 200 GB.

I am getting the following error message:
Unexpected number of remote volumes detected: 0!

I have run this test 4 or 5 times. Most of the time I get the same error. Sometimes it just says “IO Error”.

source: Windows 7 Professional
Duplicati version: Duplicati - 2.0.2.1_beta_2017-08-01
Destination: Backblaze B2
Advanced Options: thread-priority = lowest
I have between 400 and 600 GB of free space on my 3 local disk drives.

Here is the log output from the last time it failed.

Oct 1, 2017 12:02 PM: Failed while executing "Backup" with id: 2

System.Exception: Unexpected number of remote volumes detected: 0!
   at Duplicati.Library.Main.Database.LocalDatabase.UpdateRemoteVolume(String name, RemoteVolumeState state, Int64 size, String hash, Boolean suppressCleanup, TimeSpan deleteGraceTime, IDbTransaction transaction)
   at Duplicati.Library.Main.BackendManager.DatabaseCollector.FlushDbMessages(LocalDatabase db, IDbTransaction transaction)
   at Duplicati.Library.Main.BackendManager.WaitForEmpty(LocalDatabase db, IDbTransaction transation)
   at Duplicati.Library.Main.Operation.CompactHandler.DoDelete(LocalDeleteDatabase db, BackendManager backend, IEnumerable`1 deleteableVolumes, IDbTransaction& transaction)
   at Duplicati.Library.Main.Operation.CompactHandler.DoCompact(LocalDeleteDatabase db, Boolean hasVerifiedBackend, IDbTransaction& transaction)
   at Duplicati.Library.Main.Operation.DeleteHandler.DoRun(LocalDeleteDatabase db, IDbTransaction& transaction, Boolean hasVerifiedBacked, Boolean forceCompact)
   at Duplicati.Library.Main.Operation.BackupHandler.CompactIfRequired(BackendManager backend, Int64 lastVolumeSize)
   at Duplicati.Library.Main.Operation.BackupHandler.Run(String[] sources, IFilter filter)
   at Duplicati.Library.Main.Controller.<>c__DisplayClass16_0.<Backup>b__0(BackupResults result)
   at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)
   at Duplicati.Library.Main.Controller.Backup(String[] inputsources, IFilter filter)
   at Duplicati.Server.Runner.Run(IRunnerData data, Boolean fromQueue)

When you say “do the same backup to Backblaze”, how did you do it?

If you took the existing job and just changed the destination from USB to Backblaze, then it would be complaining because it finds none of the expected files there.

If that is what happened then copying the USB files to the destination folder on Backblaze should resolve the issue.

But let us know if that’s NOT how you set up the B2 backup job, so we can keep looking into it.

That is not how I did it. I did not just change the destination. I created a new job and backed up the same files to Backblaze.

I tried it again today. First I unchecked some of the files in my backup so the backup would be smaller and complete sooner.
Then I did a Database Repair. That completed successfully.

I ran a backup and that completed OK.
I then added a directory with 80 GB of data to the backup configuration. It ran for a few hours and came up with a disk I/O error.

Here is the log:

Oct 5, 2017 3:31 PM: Failed while executing "Backup" with id: 2
System.Data.SQLite.SQLiteException (0x80004005): disk I/O error
disk I/O error
   at System.Data.SQLite.SQLite3.Reset(SQLiteStatement stmt)
   at System.Data.SQLite.SQLite3.Step(SQLiteStatement stmt)
   at System.Data.SQLite.SQLiteDataReader.NextResult()
   at System.Data.SQLite.SQLiteDataReader..ctor(SQLiteCommand cmd, CommandBehavior behave)
   at System.Data.SQLite.SQLiteCommand.ExecuteReader(CommandBehavior behavior)
   at System.Data.SQLite.SQLiteCommand.ExecuteNonQuery(CommandBehavior behavior)
   at Duplicati.Library.Main.BasicResults.LogDbMessage(String type, String message, Exception ex)
   at Duplicati.Library.Main.BasicResults.AddWarning(String message, Exception ex)
   at Duplicati.Library.Main.Operation.BackupHandler.HandleFilesystemEntry(ISnapshotService snapshot, BackendManager backend, String path, FileAttributes attributes)
   at Duplicati.Library.Main.Operation.BackupHandler.RunMainOperation(ISnapshotService snapshot, BackendManager backend)
   at Duplicati.Library.Main.Operation.BackupHandler.Run(String[] sources, IFilter filter)
   at Duplicati.Library.Main.Controller.<>c__DisplayClass16_0.<Backup>b__0(BackupResults result)
   at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)
   at Duplicati.Library.Main.Controller.Backup(String[] inputsources, IFilter filter)
   at Duplicati.Server.Runner.Run(IRunnerData data, Boolean fromQueue)

Are you still running CrashPlan as well as Duplicati? If so, then you might be running into the same issue as mentioned here (where it sounds like CrashPlan is locking Duplicati’s SQLite file during the backup, causing Duplicati to not be able to access it):

Yes, I am still running CrashPlan until I have gotten a complete backup with Duplicati to work. I believe I am seeing the same problem with CrashPlan locking the SQLite file as mentioned in the post you referred to. I changed my CrashPlan backup to exclude that file, and I was able to complete a backup with another 150 GB of data uploaded.

I have about 300 GB more to go, but I am adding the data to my backup in small chunks so I can make sure everything is working.

I am hoping this problem is solved. I will run the rest of my backups and post if there any more issues. Thanks for your help.

Thanks for the update - good luck!