Duplicati on Windows Gets Stuck

Hi,
I’m running 2.0.6.3_beta_2021-06-17 as a service.

There is little activity from the C:\Program Files\Duplicati 2\Duplicati.Server.exe process.
Process Explorer shows a lot of things in the stack, many UserRequest waits that get terminated.

The log stops with:

2023-03-03 12:17:15 +01 - [Profiling-Timer.Begin-Duplicati.Library.Main.Operation.Backup.UploadRealFilelist-UploadNewFileset]: Starting - Uploading a new fileset
2023-03-03 12:17:16 +01 - [Profiling-Timer.Begin-Duplicati.Library.Main.Operation.Common.DatabaseCommon-CommitTransactionAsync]: Starting - CommitUpdateRemoteVolume
2023-03-03 12:17:16 +01 - [Profiling-Timer.Finished-Duplicati.Library.Main.Operation.Common.DatabaseCommon-CommitTransactionAsync]: CommitUpdateRemoteVolume took 0:00:00:00.718

How can I resolve the problem?

It’s somewhat normal to see activity drop to 0 bytes depending on the current file size, type, and upload state, but that’s mostly on a first run or if the data has changed significantly. Also keep in mind that the progress shown is relative to the whole backup process, not just the current task.

This can also be a symptom of needing a larger -blocksize, but that will depend on how much data you’re backing up (see the sketch after the questions below).

Is this the first time this backup job has run?
What -blocksize are you using?
What -dblocksize are you using?
How much are you backing up and to where?
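
For reference, both are ordinary advanced options on the job or the command line. A minimal sketch of how they would be passed (the values are only illustrative, and the URL and path are placeholders):

"C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe" backup <target-url> <source-folder> --blocksize=1MB --dblock-size=50MB

If I recall correctly, -blocksize (default 100KB) cannot be changed once a backup exists, whereas -dblock-size (the remote volume size, default 50MB) can be adjusted later.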

Hello

I’ll only add that if it’s not an initial backup, there is a possibility that your backup involves a rather large compact operation. That’s not a part I have looked at yet, but my first impression is that there is not much progress reporting for that particular step, that is, it may do a lot of work without much visual indication.

The internals are hard to see, but I think the status bar changes. Some statuses that might be seen during backup:

What are you looking at, e.g. the Performance Graph, Disk and Network, or other views?

Answers to the earlier questions would also help. This might take some troubleshooting.

You may be right; I have not found where in the code this could happen, but half an hour of reading may not be enough to find it :-). Did you actually see this status displayed in real life?

To find in code, you can look for the string. I use Notepad++, but GitHub’s search box does very well.
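
For example, from a clone of the duplicati source tree, Windows’ built-in findstr works too (/s recurses into subdirectories, /c: searches for the literal phrase):

findstr /s /c:"Deleting unwanted files" *.cs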

As evidence that the message occurs, Google for

“Deleting unwanted files” site:forum.duplicati.com

The situation for compacting is less clear, but that comes later.

I’m not sure that this status is reflected in the Web UI. Maybe it could be seen with the Python client, for example, but I have not seen it in the JavaScript. I’ll take another look.

If the profiling log can be trusted (I’d trust it more if it was running awhile and this is the end), maybe it’s

Since Process Explorer is open, it might be worth using Find → Find Handle or DLL to see who’s got the database.
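
A rough command-line equivalent is the Sysinternals handle.exe tool, searching for open handles whose path contains a fragment of the database file name (“sqlite” below assumes the default naming):

handle.exe sqlite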

My C# reading skills are limited. I was wondering about the end of that chunk: why two awaits?
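
Without claiming this is what that particular chunk does: in general C#, a double await appears when something returns a Task<Task>, because the first await only unwraps the outer task. A minimal sketch:

using System.Threading.Tasks;

class DoubleAwaitSketch
{
    static async Task Main()
    {
        // Task.Factory.StartNew with an async lambda returns Task<Task>:
        // the outer task completes when the lambda has started,
        // the inner task is the lambda's actual asynchronous work.
        Task<Task> outer = Task.Factory.StartNew(async () =>
        {
            await Task.Delay(100); // stand-in for real work
        });

        // The first await yields the inner Task; the second waits for the work itself.
        await await outer;
    }
}
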
Because I keep a profiling log and other debug records in case something breaks, here is a sample:

2023-03-02 07:38:13 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.Operation.Backup.UploadRealFilelist-UpdateChangeStatistics]: UpdateChangeStatistics took 0:00:00:00.929
2023-03-02 07:38:13 -05 - [Profiling-Timer.Begin-Duplicati.Library.Main.Operation.Backup.UploadRealFilelist-UploadNewFileset]: Starting - Uploading a new fileset
2023-03-02 07:38:13 -05 - [Profiling-Duplicati.Library.Main.Operation.Backup.BackendUploader-UploadSpeed]: Uploaded 21.31 MB in 00:00:05.6437223, 3.78 MB/s
2023-03-02 07:38:14 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b5e15e007b4d545bf8ea82654e73b6bba.dblock.zip.aes (21.31 MB)
2023-03-02 07:38:14 -05 - [Profiling-Timer.Begin-Duplicati.Library.Main.Operation.Common.DatabaseCommon-CommitTransactionAsync]: Starting - CommitAfterUpload
2023-03-02 07:38:17 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.Operation.Common.DatabaseCommon-CommitTransactionAsync]: CommitAfterUpload took 0:00:00:02.807
2023-03-02 07:38:17 -05 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteScalarInt64]: Starting - ExecuteScalarInt64: INSERT INTO "Remotevolume" ("OperationID", "Name", "Type", "State", "Size", "VerificationCount", "DeleteGraceTime") VALUES (825, "duplicati-i6d465ff4ae49429f9033225d96d0769f.dindex.zip.aes", "Index", "Temporary", -1, 0, 0); SELECT last_insert_rowid();
2023-03-02 07:38:17 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteScalarInt64]: ExecuteScalarInt64: INSERT INTO "Remotevolume" ("OperationID", "Name", "Type", "State", "Size", "VerificationCount", "DeleteGraceTime") VALUES (825, "duplicati-i6d465ff4ae49429f9033225d96d0769f.dindex.zip.aes", "Index", "Temporary", -1, 0, 0); SELECT last_insert_rowid(); took 0:00:00:00.000
2023-03-02 07:38:17 -05 - [Profiling-Timer.Begin-Duplicati.Library.Main.Operation.Common.DatabaseCommon-CommitTransactionAsync]: Starting - CommitUpdateRemoteVolume
2023-03-02 07:38:17 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.Operation.Common.DatabaseCommon-CommitTransactionAsync]: CommitUpdateRemoteVolume took 0:00:00:00.303
2023-03-02 07:38:17 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-i6d465ff4ae49429f9033225d96d0769f.dindex.zip.aes (228.78 KB)
2023-03-02 07:38:18 -05 - [Profiling-Duplicati.Library.Main.Operation.Backup.BackendUploader-UploadSpeed]: Uploaded 228.78 KB in 00:00:00.4808930, 475.74 KB/s
2023-03-02 07:38:18 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-i6d465ff4ae49429f9033225d96d0769f.dindex.zip.aes (228.78 KB)
2023-03-02 07:38:18 -05 - [Profiling-Timer.Begin-Duplicati.Library.Main.Operation.Common.DatabaseCommon-CommitTransactionAsync]: Starting - CommitAfterUpload
2023-03-02 07:38:18 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.Operation.Common.DatabaseCommon-CommitTransactionAsync]: CommitAfterUpload took 0:00:00:00.172
2023-03-02 07:38:18 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.Operation.Backup.UploadRealFilelist-UploadNewFileset]: Uploading a new fileset took 0:00:00:05.343
2023-03-02 07:38:18 -05 - [Profiling-Timer.Begin-Duplicati.Library.Main.Operation.Common.DatabaseCommon-CommitTransactionAsync]: Starting - CommitUpdateRemoteVolume
2023-03-02 07:38:19 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.Operation.Common.DatabaseCommon-CommitTransactionAsync]: CommitUpdateRemoteVolume took 0:00:00:00.142
2023-03-02 07:38:19 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-20230302T122030Z.dlist.zip.aes (915.34 KB)
2023-03-02 07:38:20 -05 - [Profiling-Duplicati.Library.Main.Operation.Backup.BackendUploader-UploadSpeed]: Uploaded 915.34 KB in 00:00:01.1729199, 780.39 KB/s
2023-03-02 07:38:20 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-20230302T122030Z.dlist.zip.aes (915.34 KB)
2023-03-02 07:38:20 -05 - [Profiling-Timer.Begin-Duplicati.Library.Main.Operation.Common.DatabaseCommon-CommitTransactionAsync]: Starting - CommitAfterUpload
2023-03-02 07:38:20 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.Operation.Common.DatabaseCommon-CommitTransactionAsync]: CommitAfterUpload took 0:00:00:00.117
2023-03-02 07:38:20 -05 - [Profiling-Timer.Begin-Duplicati.Library.Main.Operation.Common.DatabaseCommon-CommitTransactionAsync]: Starting - CommitAfterUpload
2023-03-02 07:38:20 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.Operation.Common.DatabaseCommon-CommitTransactionAsync]: CommitAfterUpload took 0:00:00:00.000
2023-03-02 07:38:20 -05 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: Starting - ExecuteReader: SELECT "IsFullBackup", "Timestamp" FROM "Fileset" ORDER BY "Timestamp" DESC
2023-03-02 07:38:20 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: ExecuteReader: SELECT "IsFullBackup", "Timestamp" FROM "Fileset" ORDER BY "Timestamp" DESC took 0:00:00:00.000
2023-03-02 07:38:20 -05 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-StartCheck]: Start checking if backups can be removed

and this can be compared to the OP’s log, if indeed there is more of it. As expected, the backup ends with dblock and dindex uploads, comprising the data, and a dlist, which gives the files and the blocks in each.

Actually, the OP’s log looks like a log-file, so maybe we can see a bit more of it to compare to my sample?
What’s there, though, is enough to see that it’s missing quite a few lines, unless they were edited out.

EDIT:

What are the last couple of files at the destination? That would be another clue of how far things got.

About the log: it seems that at the Verbose level Duplicati does indeed announce what it is about to do (in ReportCompactData()). In the OP’s case the live log seems empty, which would be weird.

There seems to be a chunk of a log-file at log-file-log-level=profiling, which would have been set up in advance.
I’m asking to see a bit more of it, to compare against what mine looks like (I’m also still trying to decode mine).

Maybe the live log screen was only opened later, to get a second opinion that new logging had stopped.

I’m also interested in the home screen, where a running backup shows the file it’s reading and its progress.

The About → System info line lastPgEvent might also give some clue as to what happened most recently.

I don’t think it’s gotten to compact yet. The status line is probably made from progress_state_text here:

and you can also see the lastPgEvent that I mentioned, which might be visible on the GUI screen…

Hello, I don’t know how I can help. I watched Duplicati’s progress for some days and ran some tests. The backup always gets stuck, and there was no progress in 24 hours, so I killed the Duplicati service task. After that I exported and re-imported the backup configs and changed them to back up to a local drive; there was no problem. Then I imported a backup config, deleted all extra options, and set another empty target on the FTP server. But after that, the backup got stuck again at one small file and nothing happened anymore.

It’s not the first time the job has run, but I also started it from scratch as a test.
The target is an FTP server which I connect to via FTPS.
In the later tests I didn’t set any blocksize; dblocksize is 50MB, and I also tested it with 250MB.
The backup is 12GB, and there is no problem when the target is a local drive for testing.

The backup gets completely stuck at one file.

The last entry in the live Verbose log, unchanged for 30 minutes now, is this one: 8. Mrz. 2023 12:36: Checking file for changes D:.…\Content\A\A001\1028.dat, new: True, timestamp changed: True, size changed: True, metadatachanged: True, 12.07.2021 14:38:16 vs 01.01.0001 00:00:00

In the Verbose log, going back in time (it’s reverse-chronological), what are the last upload (put) lines?

Because it works to a local drive but not to FTP, maybe the FTP upload stopped. That causes a backlog; eventually queue space runs out (even a small file can be the one that finally fills it) and work stops.
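
For context, the length of that pending-upload queue is, if I recall the option correctly, governed by the advanced option asynchronous-upload-limit (default 4; 0 disables the limit), and the queued volumes sit on disk as temporary files, which is what the check below looks for. As an option line it would be:

--asynchronous-upload-limit=4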

Try opening a Command Prompt to run dir %TEMP%\dup-* and look for (default) 50 MB files there.

Here are some actual screenshots from Process Explorer.

And this is the command:
"C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe" backup "ftps://xxx:xxx//…/V3?auth-username=…&auth-password=…&accept-any-ssl-certificate=true" "D:.…\" "D:.…\" --backup-name="name…" --dbpath="C:\Program Files\Duplicati 2\data\OKRPK…sqlite" --encryption-module=aes --compression-module=zip --dblock-size=50MB --passphrase=… --retention-policy="1W:1D,4W:1W,12M:1M" --accept-any-ssl-certificate=true --disable-module=console-password-input

There are some files at the FTP destination, but all with the same timestamp: 8. Mrz. 2023 12:36:

* 8. Mrz. 2023 13:27: Backend event: Put - Started: duplicati-b783cafbfcd054f6ab6804f54acadbe27.dblock.zip.aes (49,96 MB)

* 8. Mrz. 2023 13:27: Renaming "duplicati-b75899f4c97a54a758784ab80c02172d0.dblock.zip.aes" to "duplicati-b783cafbfcd054f6ab6804f54acadbe27.dblock.zip.aes"

* 8. Mrz. 2023 13:27: Backend event: Put - Rename: duplicati-b783cafbfcd054f6ab6804f54acadbe27.dblock.zip.aes (49,96 MB)

* 8. Mrz. 2023 13:27: Backend event: Put - Rename: duplicati-b75899f4c97a54a758784ab80c02172d0.dblock.zip.aes (49,96 MB)

dir %TEMP%\dup-* => no file found

The Process Explorer information shown is almost meaningless, because it’s showing the status of the CLR that Duplicati runs under. C# winds up being run by a runtime, so you’re looking at Microsoft’s code.

The thread wait reasons are possibly meaningful inside Windows, but that’s below the Duplicati code.

The file information is a little useful. You’re at the default 50 MB dblock size, but where are the dindex files? Generally one uploads after every dblock upload, though it can finish first because it’s much smaller. Duplicati by default does parallel uploads (which may be bothering FTP). You could try setting the upload concurrency to 1:

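If I recall the option name correctly, it’s asynchronous-concurrent-upload-limit, settable in the job’s advanced options or on the command line:

--asynchronous-concurrent-upload-limit=1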

The log is showing what look like retries for upload errors. Generally there’s some explanation nearby. When a destination problem exists, the live log at Retry level is easier to follow because there’s less info; however, without another run you have to manually ignore the unrelated source-side info in Verbose.

There’s no positive proof that the source side isn’t involved, but you’re definitely seeing odd destination activity.

Duplicati.CommandLine.BackendTester.exe

Before you start using a particular backend as a backup target, you can use the Backend Tester to get an indication of the integrity of that backend.

might be worth trying, but I’m not sure how well it does FTPS. Make an empty folder and adjust the URL.
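
A sketch of an invocation, patterned on your backup URL (the folder name is hypothetical; point it at a new, empty folder and never at the real backup folder, since the tester creates and deletes test files):

"C:\Program Files\Duplicati 2\Duplicati.CommandLine.BackendTester.exe" "ftps://xxx:xxx//…/backendtest?auth-username=…&auth-password=…&accept-any-ssl-certificate=true"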

EDIT:

This is strange. Even if somehow it didn’t have any dblock files in the making, it should have a dlist.
Are you running that command as the user that’s doing the backup? There should be a flow of temporary files.
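
One assumption worth checking: the service probably runs as Local System, whose %TEMP% is not your user’s, so the equivalent check would be roughly:

dir C:\Windows\Temp\dup-*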

Another unrelated comment: FTP is an old beast and not very well specified, with lots of variants, so there could be some incompatibilities. People have reported success replacing the basic FTP backend with the “FTP (alternative)” backend. It should be possible to switch while keeping your existing backup (I hope so, at least).
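
If I remember the details right, the alternative backend appears in a storage URL as the aftp:// scheme instead of ftps://, with its own options for encryption (e.g. aftp-encryption-mode), so the destination would look something like this (placeholders kept from your command):

aftp://xxx:xxx//…/V3?auth-username=…&auth-password=…&aftp-encryption-mode=Explicit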
