Thanks, I have tried to incorporate your findings into the edit of my previous post.
That is correct: the chosen log level does not seem to show it in as much detail as in your example, but I also have the UploadSyntheticFilelist-PreviousBackupFilelistUpload entry at the beginning of the job run:
2024-04-27 00:20:37 +02 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started
2024-04-27 00:22:23 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started: ()
2024-04-27 00:22:27 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed: (2.82 KB)
2024-04-27 00:22:27 +02 - [Information-Duplicati.Library.Main.Operation.FilelistProcessor-RemoteUnwantedMissingFile]: removing file listed as Temporary: duplicati-20240424T134522Z.dlist.zip.aes
2024-04-27 00:22:27 +02 - [Information-Duplicati.Library.Main.Operation.FilelistProcessor-SchedulingMissingFileForDelete]: scheduling missing file for deletion, currently listed as Uploading: duplicati-20240424T134523Z.dlist.zip.aes
2024-04-27 00:22:27 +02 - [Information-Duplicati.Library.Main.Operation.FilelistProcessor-KeepIncompleteFile]: keeping protected incomplete remote file listed as Temporary: duplicati-20240425T103609Z.dlist.zip.aes
2024-04-27 00:22:36 +02 - [Information-Duplicati.Library.Main.Operation.Backup.UploadSyntheticFilelist-PreviousBackupFilelistUpload]: Uploading filelist from previous interrupted backup
The dlist file size increased continuously, reaching 62.8 MB in March when the job was retired. The same size is observable for the dlist file of the interrupted backup. For the successful run afterwards, as you already noted, it decreased considerably to 2.28 MB.
To clarify modifications in the job configuration:
I had removed source folders from the job two times: 1) before un-retiring the job (that is, before the first partial job run), and 2) before the second partial job run, hoping for faster progress. After that, I posted in this forum and, following your advice, did not reduce the source folder list any further. I also did not re-add any source folders, to keep the scenario constant and not introduce another variable into the investigation.
My interpretation of the dlist file sizes is that a change in the source folder list is only reflected in the dlist once a run has completed successfully (at least that is what the data from my case suggests).
The only modifications to the job configuration after posting here in the forum are the logging options (log-file with log-file-log-level set to verbose) and disabling auto-compact, both mentioned further below.
Regarding the odd put:
Well spotted! This is indeed also reflected in the log file (there are no other log messages in between those two):
Correct, I saw that file, and since you already referred to it by its file size in your previous post, I am also pretty sure that it already had a size of 39.88 MB. Even though Duplicati reports the put as completed at 05:45, the file metadata (last modified date) still shows 00:24 (but I can’t tell whether this really means the file was no longer changed on the remote at 05:45).
I would also be interested in developers’ thoughts about that.
If it helps, I can change log-file-log-level from verbose to profiling. The verbose level resulted in 85k lines in a 25 MB log file.
The block sizes of the two jobs are:
Job 1: 250 KB
Job 2: 400 KB
I set the size during the initial configuration, before running the backup for the first time. When creating job 2, I decided to choose a larger block size (mainly because I had read that too small a block size may cause problems once the overall number of blocks grows large).
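To give a rough sense of why I went larger (my own back-of-the-envelope arithmetic, not anything Duplicati reports): for 1 TB of source data, a 250 KB block size means on the order of 4 million blocks to track in the local database, while 400 KB brings that down to roughly 2.5 million.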
I can test both, but to be honest, I do not really know what I would have to do specifically. If I understand correctly, there is an environment variable I could modify for the cache size. If I knew its name and which value format it expects, I could test increasing the cache size.
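To make sure I understand what such a setting would do on the SQLite side, here is a minimal sketch I would run against a copy of the job's local database while Duplicati is stopped. The path is just a placeholder, and whether Duplicati actually passes such a pragma through an environment variable is exactly the part I am unsure about:

```python
import sqlite3

# Path to a *copy* of the job's local database -- placeholder, adjust to your setup.
DB_COPY = r"C:\temp\JOBNAME-copy.sqlite"

con = sqlite3.connect(DB_COPY)

# Current cache size: a positive value is a page count,
# a negative value means "this many KiB" (SQLite convention).
print("cache_size before:", con.execute("PRAGMA cache_size").fetchone()[0])

# Ask for roughly 200 MB of page cache for this connection only.
con.execute("PRAGMA cache_size = -200000")
print("cache_size after:", con.execute("PRAGMA cache_size").fetchone()[0])

con.close()
```

If Duplicati does expose this through an environment variable, I would assume the expected value is in the same pragma-style format (a page count, or a negative number meaning KiB), but that assumption is what I would like to have confirmed.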
For the PRAGMA analyze suggestion I am even more clueless. I recall having read about it from time to time when browsing the forum, but I am not sure what to do precisely: is this something I should run manually, and if so, at which point in time? How would I run it (within the Duplicati GUI, as a command-line option, or by directly accessing the SQLite database while Duplicati is not running)?
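Unless told otherwise, this is roughly what I would try for that part: stop Duplicati, take a copy of the job's local database, and run a plain SQLite ANALYZE on the copy first (as far as I know, ANALYZE collects table and index statistics that the query planner can then use). The paths below are placeholders for my setup, and I have not verified that this is the intended procedure for Duplicati databases:

```python
import shutil
import sqlite3

# Placeholder paths -- adjust to the actual local database of the job.
DB_PATH = r"C:\Users\me\AppData\Local\Duplicati\JOBNAME.sqlite"
DB_COPY = r"C:\temp\JOBNAME-analyze-test.sqlite"

# Work on a copy with Duplicati stopped, so the real database stays untouched.
shutil.copy2(DB_PATH, DB_COPY)

con = sqlite3.connect(DB_COPY)
con.execute("ANALYZE")  # gather table/index statistics for the query planner
con.commit()

# ANALYZE stores its results in sqlite_stat1; print a count just to confirm it ran.
rows = con.execute("SELECT tbl, idx FROM sqlite_stat1").fetchall()
print(f"sqlite_stat1 now has {len(rows)} entries")

con.close()
```

If the statistics are supposed to live in the real database, I assume the same ANALYZE could then be run against it, again only while Duplicati is stopped.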
I think I will rerun the backup job one more time with log-file-log-level set to profiling and, in case it appears stuck, check whether the Sysinternals tools provide any clue about what is actually happening.
EDIT: Corrected reported log file size (it’s MB, not KB)
EDIT 2: The rerun completed in 2 minutes, so processing time was not suspicious this time. As expected, not much data had to be transferred to the remote (one dindex/dblock pair and a single dlist file). I think I will now gradually extend the source folder list back to the configuration prior to the partial backup job runs and may also enable auto-compact again. I will report whether I am successful. If there is anything else I can do to help reproduce the original issue, let me know.