Hangs at counting files. How to troubleshoot?

When trying to back up Windows Server 2012 R2 deduplicated files in a StableBit DrivePool, Duplicati hangs and I cannot figure out how to get past it. The strange thing is, it works perfectly if the files are in the DrivePool but not deduplicated, and it also works if the files are deduplicated but not in the DrivePool.

I have tried all of the symlink options as shown on other posts.

Please help me troubleshoot and fix this.

Thanks in advance for any advice,

nlitend1

Does anyone have any ideas or instructions on how to troubleshoot this? I have done some troubleshooting with StableBit, and it appears to be related to the way Duplicati is “requesting” the files…

Duplicati appears to be an awesome product and I want to get my friends and family to use it… please help me show them that support for the product is good as well.

Thank you,

nlitend1

How technical do you want to get? For a non-technical approach, try varying the source selection to see what happens. Can you say anything more about the typical failure, e.g. “It counts for # minutes”, or does it seem random? Google also has some comments about situations like yours; I think some bugs were fixed, others maybe not. What support or ideas can you get from the StableBit end? I have neither that nor any sort of Windows server. Or does the below mean their support gave you the quoted answer (which doesn’t help me out a whole lot…)?

Thank you so much for the reply!

I have tried varying the source selection: files that are deduplicated by 2012 R2 and not in the StableBit DrivePool work, and files that are in the DrivePool and not deduplicated work. I have also tried various files that are both deduplicated and in the DrivePool. The backup process hangs indefinitely at “counting files” (at least hours on end for fewer than 10 files of no more than 100 MB each).

I have tried the different advanced options for symlinks with no change in behavior. StableBit has had me confirm that the file size in the DrivePool and the file size on the physical hard drive are in fact the same.

Is there a way to view any logs to see what is happening when it is hanging? I was not able to find an option for that. I have also tried canary and developer builds and they also did not work.

Again, thanks so much for the reply. I’m itching to replace CrashPlan on all my machines with Duplicati :slight_smile:

nlitend1

There are many ways to log. For this, the server log is probably more useful than the job log, and it can be viewed live in a different browser tab. The Profiling level gets quite wordy. See Viewing the Duplicati Server Logs.

Logging to a file is also a good plan. It’s a better permanent record, and has added settings and capabilities.

C:\Program Files\Duplicati 2>duplicati.commandline.exe help logging

Duplicati provides information for debugging and logging purposes. By default,
all debugging and logging is turned off.

  --debug-output
    Error messages are more verbose.
  --debug-retry-errors
    If something needs to be retried (e.g. upload failed) this will cause an
    entry in the log file.
  --log-file = <path>
    The path to the log file e.g. "D:\duplicati\log.txt".
  --log-file-log-level = Profiling | Verbose | Information | Warning | Error
    Specifies the log level to use for the log file.
  --console-log-level = Profiling | Verbose | Information | Warning | Error
    Specifies the log level to use for the console log.

New log system with filters describes the new system’s advancements over the old one (e.g. the 2.0.3.3 beta).

For a detailed view of Duplicati’s file requests to Windows, you can use Sysinternals Process Monitor, filtered to show only the Duplicati.Server.exe process. Its Windows-level activity would likely mean more to StableBit.
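If clicking around the GUI is awkward, Process Monitor can also be started from a command prompt. A rough sketch (the .pml path is just an example, and /AcceptEula merely skips the first-run license dialog):

procmon.exe /AcceptEula /Minimized /BackingFile C:\temp\duplicati-capture.pml
rem ... reproduce the hang, then stop logging ...
procmon.exe /Terminate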

EDIT: Process Monitor not Process Explorer.

Hi @nlitend1, welcome to the forum!

Duplicati uses standard file access methods for backups, so it shouldn’t be having this issue. That being said, a few users have reported similar oddities when working with third-party tools (like cloud drive aggregators).

What happens if you create a test job with a tiny source (maybe a single folder)?

@ts678 The following is the last of my live log output from a browser window. The job was stuck at “starting”.

Sep 26, 2018 4:13 PM: ExecuteScalarInt64: INSERT INTO "File" ("Path","BlocksetID", "MetadataID") VALUES (?, ? ,?); SELECT last_insert_rowid(); took 00:00:00.000
Sep 26, 2018 4:13 PM: Starting - ExecuteScalarInt64: INSERT INTO "File" ("Path","BlocksetID", "MetadataID") VALUES (?, ? ,?); SELECT last_insert_rowid();
Sep 26, 2018 4:13 PM: ExecuteScalarInt64: SELECT "ID" FROM "File" WHERE "BlocksetID" = ? AND "MetadataID" = ? AND "Path" = ? took 00:00:00.000
Sep 26, 2018 4:13 PM: Starting - ExecuteScalarInt64: SELECT "ID" FROM "File" WHERE "BlocksetID" = ? AND "MetadataID" = ? AND "Path" = ?
Sep 26, 2018 4:13 PM: ExecuteScalarInt64: INSERT INTO "Metadataset" ("BlocksetID") VALUES (?); SELECT last_insert_rowid(); took 00:00:00.000
Sep 26, 2018 4:13 PM: Starting - ExecuteScalarInt64: INSERT INTO "Metadataset" ("BlocksetID") VALUES (?); SELECT last_insert_rowid();
Sep 26, 2018 4:13 PM: ExecuteScalarInt64: INSERT INTO "Blockset" ("Length", "FullHash") VALUES (?, ?); SELECT last_insert_rowid(); took 00:00:00.000
Sep 26, 2018 4:13 PM: Starting - ExecuteScalarInt64: INSERT INTO "Blockset" ("Length", "FullHash") VALUES (?, ?); SELECT last_insert_rowid();
Sep 26, 2018 4:13 PM: ExecuteScalarInt64: SELECT "ID" FROM "Blockset" WHERE "Fullhash" = ? AND "Length" = ? took 00:00:00.000
Sep 26, 2018 4:13 PM: Starting - ExecuteScalarInt64: SELECT "ID" FROM "Blockset" WHERE "Fullhash" = ? AND "Length" = ?
Sep 26, 2018 4:13 PM: ExecuteScalarInt64: SELECT "A"."ID" FROM "Metadataset" A, "BlocksetEntry" B, "Block" C WHERE "A"."BlocksetID" = "B"."BlocksetID" AND "B"."BlockID" = "C"."ID" AND "C"."Hash" = ? AND "C"."Size" = ? took 00:00:00.000
Sep 26, 2018 4:13 PM: Starting - ExecuteScalarInt64: SELECT "A"."ID" FROM "Metadataset" A, "BlocksetEntry" B, "Block" C WHERE "A"."BlocksetID" = "B"."BlocksetID" AND "B"."BlockID" = "C"."ID" AND "C"."Hash" = ? AND "C"."Size" = ?
Sep 26, 2018 4:13 PM: ExecuteScalarInt64: INSERT INTO "Block" ("Hash", "VolumeID", "Size") VALUES (?, ?, ?); SELECT last_insert_rowid(); took 00:00:00.000
Sep 26, 2018 4:13 PM: Starting - ExecuteScalarInt64: INSERT INTO "Block" ("Hash", "VolumeID", "Size") VALUES (?, ?, ?); SELECT last_insert_rowid();
Sep 26, 2018 4:13 PM: ExecuteScalarInt64: SELECT "ID" FROM "Block" WHERE "Hash" = ? AND "Size" = ? took 00:00:00.000
Sep 26, 2018 4:13 PM: Starting - ExecuteScalarInt64: SELECT "ID" FROM "Block" WHERE "Hash" = ? AND "Size" = ?
Sep 26, 2018 4:13 PM: Adding directory P:\Crypto\Zoin\

Unfortunately, I have tried single directories like that without success.
Thanks for the suggestion and reply.
nlitend1

Not being able to back up even a single folder does imply something is getting stalled inside Duplicati somehow.

My GUESS is that Duplicati either waits on a result from Windows that never comes, or gets a result it doesn’t understand / processes wrongly.

Maybe @Pectojin or @kenkendk has run into this deduped-files-in-a-StableBit-DrivePool setup before…

I’d recommend Sysinternals Process Monitor to see what Duplicati.Server.exe is up to at the Windows level, and I’m hoping it shows results in two steps – the file request, then the (possibly never-to-come) Windows response.

I’d also like to know whether P:\Crypto\Zoin\ is special in any way besides the name, and how many files it has. Presumably, from “the last of the live output”, there was some earlier output that got further. This one stalled fast.

How well do other programs work with this configuration, e.g. Windows File Explorer, robocopy, zip archiving, etc.?
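For a quick enumeration test, a list-only robocopy pass walks the whole tree and reads attributes without copying anything. A sketch (because of /L nothing is written, so the destination path is just a formality):

rem /L = list only (no copy), /E = recurse into subdirectories, including empty ones
robocopy "P:\Crypto\Zoin" "C:\temp\probe" /L /E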

I have tried using the Sysinternals Process Monitor, but can’t seem to figure out how to make it show the results you’re hoping to see. It shows information in the bottom panel but does not show log-type output.

I knew that folder name might look suspicious as soon as I posted it… :slight_smile: But no, it’s a simple folder with one exe file in it. I have also tried a folder containing MP3 files from a CD rip, with the same results. The contents don’t seem to make any difference.

I have not had any other issues with this configuration with explorer, zip files etc.

The only thing I know about StableBit is that it does not work with VSS, but again, this issue only happens when deduplication is involved as well. Backup works fine on files within StableBit that aren’t deduplicated.

Below is a Process Monitor screenshot of access to paths beginning with C:\BackThisUp, whose structure looks like:

C:\>tree C:\BackThisUp /f
Folder PATH listing for volume Windows
Volume serial number is 5822-1128
C:\BACKTHISUP
│   file1.txt
│
├───sub1
│       sub1_file1.txt
│
└───sub2
        sub2file1.txt

You can see Duplicati walking through the folder and collecting the file timestamps to see if anything is new:

My hope was that you could watch the counting on your system, to see if the capture gives error or hang clues.
If you (or StableBit) want driver-level information such as IRPs too, check Filter --> Enable Advanced Output.

Other Process Monitor setup you’ll want, if you choose such a path-oriented plan, is to set the Filter to include paths that begin with your path of interest. You can also check “Drop Filtered Events” to stop storing everything else, but this will prevent you from examining the capture with a different filter should you change your mind.

Are you saying you have VSS (snapshot-policy) on or off? Does it make a difference?
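If you want to test with it on, forcing a snapshot should at least fail loudly if the pool driver can’t provide one. A sketch, with placeholder destination and source (the same --snapshot-policy option is also available in the GUI’s advanced options):

rem "required" makes the backup fail with a clear error if a VSS snapshot can't be taken
duplicati.commandline.exe backup "file://D:\DuplicatiTarget" "T:\Data" --snapshot-policy=required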

I have left the default option for snapshots (which is “off” according to documentation).

I have the same setup as you:

Duplicati - 2.0.4.5_beta_2018-11-28
2012 R2 with data deduplication
StableBit DrivePool (latest version)

It hangs on counting files for huge folders but works for small folders with default options.

The workaround I found is to use the DrivePool “PoolPart” folders as the source data instead of the pooled drive letter:

C:\Drives\Data1\PoolPart.XXXXXXXXXXXXXXX\Data\Music\
C:\Drives\Data2\PoolPart.XXXXXXXXXXXXXXX\Data\Music\
C:\Drives\Data3\PoolPart.XXXXXXXXXXXXXXX\Data\Music\

Instead of :

T:\Data\Music\

It’s a bit longer, since you have to select the same folder on every drive in your pool.
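If the pool has many member drives, something like this at a command prompt can list the equivalent PoolPart paths to paste into the source selection. A sketch only; adjust C:\Drives and the subfolder to your layout, and double the % signs if you put it in a batch file:

rem Outer loop walks the mount folders, inner loop finds each PoolPart directory
for /d %D in (C:\Drives\*) do @for /d %P in (%D\PoolPart.*) do @echo %P\Data\Music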

Hi @li442il, welcome to the forum - and thanks for sharing that workaround!

It’s been a while since I read this topic, but it sounds like Duplicati may somehow be “overloading” StableBit DrivePool with requests.

Do you see any difference if you use any (or all) of these? (They’re shown together on a command line in the sketch after the list.)

  • --check-filetime-only=true (simplify what Duplicati is requesting from the filesystem)
  • --thread-priority=idle (tell Duplicati to share CPU more)
  • --use-background-io-priority=true (tell Duplicati to share IO more)
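On a command line, those three together would look something like the sketch below (the destination URL and source path are placeholders):

duplicati.commandline.exe backup "file://D:\DuplicatiTarget" "T:\Data" --check-filetime-only=true --thread-priority=idle --use-background-io-priority=true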

I realize these aren’t great FIXES - but they might help with narrowing down the source of the issue.

Hello JonMikeIV,

Using the parameters you provided didn’t change anything; it’s still stuck at counting files with high CPU usage (60-70%).

When I replace the “pooled” folder with the “PoolPart” folder, it immediately starts counting files and uses only minimal CPU (10-20%).

Thanks for the follow-up. It sounds like you’ve got a bit of a workaround with the PoolPart approach. Have you tested how a restore works with that?

Restoring a single file works great using the file name in “search for files”. (It finds the right “PoolPart” folder where the file is.)

Restoring a folder is a bit trickier, since I have to select the folder I want to restore in all the “PoolPart” folders.

Thanks for letting us know.

While a workaround is nice, maybe adding an issue on the GitHub page might help drive an actual solution.