Backup completed with warning

I wonder what exactly this message means, and to what extent I should be concerned.

2020-04-03 19:26:40 +02 - [Warning-Duplicati.Library.Main.Operation.Backup.UploadSyntheticFilelist-MissingTemporaryFilelist]: Expected there to be a temporary fileset for synthetic filelist (10, duplicati-i901c2ae1e4fe40aeac6965025213ee07.dindex.zip.aes), but none was found?

The manual's entry for --disable-synthetic-filelist describes what you're missing. It's a feature, but not an essential one:

If Duplicati detects that the previous backup did not complete, it will generate a filelist that is a merge of the last completed backup and the contents that were uploaded in the incomplete backup session.

The term filelist refers to what the UI shows as a backup version (a file listing) under Restore.
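The merge that the manual describes can be sketched roughly like this. This is a toy illustration, not Duplicati's actual internals; the paths and version labels are made up:

```python
# Illustrative sketch (not Duplicati's real code): a synthetic filelist is
# roughly the last completed backup's listing, overlaid with whatever the
# interrupted session managed to upload before it stopped.
last_completed = {"docs/a.txt": "v1", "docs/b.txt": "v1", "docs/c.txt": "v1"}
uploaded_partial = {"docs/b.txt": "v2"}  # only b.txt made it before the interruption

# Entries from the partial session win over the older completed backup.
synthetic = {**last_completed, **uploaded_partial}
print(synthetic["docs/b.txt"])  # the newer, partially-uploaded version
```

So a restore from the synthetic version would see the newest uploaded data without waiting for the next full successful backup.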

v2.0.5.104-2.0.5.104_canary_2020-03-25

Fixed storing dlist file after interrupted backup, thanks @seantempleton

is the fix, but it's not in any Beta yet. What Duplicati version are you on, and was there an interrupted backup prior to this warning? On the 2.0.5.1 Stop button, "Stop after current file" is the slower but safer stop; however, there have been some fixes to the other ways of stopping (again, only in a Canary release so far).

Even though you found the warning after the backup, I would expect 19:26:40 to be closer to the start, as that's when a synthetic filelist would be created (were it not for bugs). If subsequent backups are quiet, I wouldn't worry too much that you missed the bonus of the synthetic filelist one time.

If by “exactly” you really want lots of details, please see Fix synthetic filelist not being uploaded #4114.

I’m running v2.0.5.1_beta_2020-01-18.
Before this warning was issued, the connection to the server was lost, and I had to restart Duplicati as well as the backup task.

The thing is, the Source is about 19 GB but the backup only 13 GB. That's why I'm wondering if something's wrong, as if the 6 GB difference means that much data has actually not been uploaded.

So updating to v2.0.5.104-2.0.5.104_canary_2020-03-25 should fix it?

Possibly, if the server connection was lost before the upload finished. In the normal case, the size of a backup can be smaller than the source due to compression and deduplication, or larger as many versions accumulate.

That's the theory, if you really need the synthetic filelist; but Canary is always very new, and users who decide to run it (preferably on non-critical backups) are the first to get the new fixes, plus any new bugs.

If you're up for that, feel free to give it a try, either via a fresh install or by changing the Update channel in Settings.

After I updated and added more files to the source, the backup operation seems to flow smoothly enough. However, the roughly 6 GB difference is still there. How can I let Duplicati know there's something left to be uploaded?

Alternatively, is there any way to get a full list of the backed-up files, so as to enable an easier comparison with the source and somehow add the missing files?

You're presupposing that something is missing; however, a small size is really not proof of that, because compression will make individual files smaller (unless they're already compressed), and deduplication will turn identical portions of files (or even whole files) into roughly zero extra space beyond some record-keeping.
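That size argument can be made concrete with a toy model. The assumptions here are mine (fixed-size blocks, hash-keyed dedup, per-block compression) and do not reflect Duplicati's real block format:

```python
import hashlib
import zlib

# Toy model of why a backup can be much smaller than the source:
# files are split into fixed-size blocks, identical blocks are stored
# only once (keyed by hash), and each stored block is compressed.
files = {
    "report.txt": b"chapter one " * 5000,
    "report_copy.txt": b"chapter one " * 5000,  # duplicate content
}
BLOCK = 1024
stored = {}  # block hash -> compressed bytes
for data in files.values():
    for i in range(0, len(data), BLOCK):
        block = data[i:i + BLOCK]
        stored.setdefault(hashlib.sha256(block).hexdigest(), zlib.compress(block))

source_size = sum(len(d) for d in files.values())
backup_size = sum(len(b) for b in stored.values())
print(source_size, backup_size)  # backup is a small fraction of the source
```

Highly compressible, repetitive documents shrink a lot under this scheme; already-compressed media files (photos, video) hardly shrink at all, which matches the different ratios you're seeing between backup jobs.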

The Features article in the manual covers this and more.

The FIND command with * for <filename> should do; quote the * if needed (especially on Linux and similar shells). However, Using the Command line tools from within the Graphical User Interface is easier, and avoids having to Export As Command-line to carry the job's options over into true OS command-line usage.

It's easier if you do it in the GUI (change the Command and Commandline arguments boxes; no quotes are required there to protect against the shell), but the output in the browser might not be quite what you want.

Restoring files from a backup is always a good idea, to be sure you can do it, and you could then compare the files. A default restore should not even suffer from Internet speed, because local file blocks (e.g. from the source) will be used in the restore when available (use a different folder; don't test-restore on top of the originals). Setting --no-local-blocks true will disable that optimization if you prefer to test using the actual remote backup.
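A comparison like that could be scripted roughly as follows. The function names and the hash-by-relative-path approach are my own sketch, not a Duplicati tool; the directories are placeholders for your source and your separate restore folder:

```python
import hashlib
from pathlib import Path

def tree_hashes(root):
    """Map each file's path (relative to root) to a SHA-256 of its content."""
    root = Path(root)
    return {p.relative_to(root).as_posix(): hashlib.sha256(p.read_bytes()).hexdigest()
            for p in root.rglob("*") if p.is_file()}

def compare(source_dir, restore_dir):
    """Return (files missing from the restore, files whose content differs)."""
    src, rst = tree_hashes(source_dir), tree_hashes(restore_dir)
    missing = sorted(set(src) - set(rst))
    changed = sorted(p for p in src.keys() & rst.keys() if src[p] != rst[p])
    return missing, changed
```

If both returned lists are empty, the test restore reproduced the source exactly, regardless of what the backend sizes suggest.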

Indeed, the list was so long that it scrolled off the top of the page, so I need to pipe the result to a file. Maybe I'll have a go at the command line, but first I must read that part of the manual.

It turned out to be straightforward enough, and I got that list. I checked for missing files and was surprised that only very few (and small) files were missing, maybe some that were added or modified after their respective folders had already been uploaded.
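For the record, a check like that can be scripted. This sketch assumes the FIND output was saved as one full path per line, which may need adjusting to the actual output format of your Duplicati version:

```python
from pathlib import Path

def missing_from_backup(source_root, listing_file):
    """Return source files that do not appear in the saved backup listing."""
    backed_up = {line.strip()
                 for line in Path(listing_file).read_text().splitlines()
                 if line.strip()}
    current = {str(p) for p in Path(source_root).rglob("*") if p.is_file()}
    return sorted(current - backed_up)
```

Anything this reports should be picked up by the next backup run anyway, as long as it isn't excluded by a filter.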

Still, I'm surprised by the 6 GB difference out of a total of 32 GB. The source was documents of all sorts (Office, text, PDF, etc., but no multimedia-type files). Other backups I've run have only very minor differences; for example, a photo backup shows a 0.5 GB difference.