That was just the rough writeup, to gauge interest. There are other steps, but ultimately it’s still an experiment, and changing only the blocksize may still leave you with a performance problem.
Please get a log as described earlier. If you prefer, verbose level is more informative without the completely overwhelming size that profiling produces, but you’d have to sanitize it before posting.
Logs at retry level might be enough, and are likely postable. OTOH I’d sure hate to repeat this drill.
While it’s backing up, you could watch the DB grow, and you should see dup-* files in the temp area.
Most will probably be transient files that are created, uploaded, then deleted. Some are long-term.
I’m curious what type of file the one from "Could not find file /tmp/dup-6f255783-2945-47fe-8786-8f3f19ece462" was; unfortunately the naming doesn’t distinguish. There are some extreme measures you can take to gather info, but they don’t scale up. I’ve used Sysinternals Process Monitor to watch temp file activity, and also used a batch file to copy all dup-* files to another folder in case I wanted to look at one later…
If you have extra TBs of space available, that might work, but in any case we’d still want the log file.
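The batch-file idea above can be sketched in Python too. This is a hypothetical illustration, not anything Duplicati ships: the paths are assumptions, and it just grabs a copy of every dup-* file it sees before the transient ones get deleted.

```python
import shutil
from pathlib import Path

def snapshot_dup_files(temp_dir, keep_dir):
    """Copy any dup-* temp files into keep_dir so they survive later deletion."""
    src = Path(temp_dir)
    dst = Path(keep_dir)
    dst.mkdir(parents=True, exist_ok=True)
    copied = []
    for f in sorted(src.glob("dup-*")):
        target = dst / f.name
        if target.exists():
            continue                 # already grabbed on an earlier pass
        try:
            shutil.copy2(f, target)  # transient files can vanish mid-copy
            copied.append(f.name)
        except FileNotFoundError:
            pass
    return copied

# Example (illustrative paths): poll this in a loop while the backup runs,
# pointing temp_dir at wherever your Duplicati puts its dup-* files.
# snapshot_dup_files("/tmp", "/mnt/spare/dup-archive")
```

This is exactly why the extra TBs matter: on a big backup you’d be keeping a copy of nearly everything that passes through temp.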
For anything very valuable, two backups (done differently) are a good idea. Software is never perfect.
Do these videos have valuable old-version info? If it’s all just-the-latest, maybe direct copies will do.
That would give you more choice in how to get the copies, although the 100 GB size may pose issues. Duplicati’s strengths include deduplication and compression to keep multiple versions compactly, but video is good at defeating both of those. Unless you edit and want to undo, versions may be overkill.
OTOH versions are handy if ransomware clobbers things; you don’t want the backup clobbered as well.
Yes, and I picked 5 MB because my initial test results showed a slowdown at around 1 million entries.
There’s not much solid data, except that large backups get slow in their SQL queries and Recreates.
Currently the blocks are inserted one at a time, and recreate has to do every one of them that way…
Backups are better at (sometimes) hiding the slowness because they start with an existing database.
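If I have the default right (blocksize is 100 KB), the arithmetic behind the 5 MB suggestion looks roughly like this — a sketch, not measured data:

```python
def block_count(source_bytes, blocksize_bytes):
    """Rough number of block entries the local database has to track."""
    return source_bytes // blocksize_bytes

KB, MB, GB = 1024, 1024**2, 1024**3

# ~100 GB of source at the (I believe) default 100 KB blocksize lands
# right around the ~1 million entries where my tests showed the slowdown:
assert block_count(100 * GB, 100 * KB) == 1_048_576

# The same source at a 5 MB blocksize is only ~20 thousand entries:
assert block_count(100 * GB, 5 * MB) == 20_480
```

That factor-of-50 drop in rows to insert and query is the whole point of raising blocksize for a big backup.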
How does Duplicati Resume After Interruption? is an old post from the Duplicati author suggesting not using --auto-cleanup; possibly that option being set explains why repair ran (and maybe escalated to recreate).
That also describes the experiment that we’re not trying, which is to patch up the DB for continuation.
The initial backup is especially troublesome (as you saw) because there’s no dlist file yet, so one way to proceed is to do a smaller subset backup first (most important files first, maybe?) to get at least the initial backup into the files.
If something breaks later, at least you can recreate the DB, but we hope the DB loss issue is over now.
Processing is parallel. At higher log levels, you first see a file getting past the exclude and other filters. Actually reading through the file is (I think) log-file-silent. Data blocks collect in temp files, which upload as each fills to 50 MB (the default dblock-size, a.k.a. Remote volume size on the options screen), but they queue…
asynchronous-upload-limit controls how many queue. It’s a good question, but difficult to answer without knowing whether the wait is tied to a source file, a file made for upload, or the uploading itself.
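As a rough sketch of the temp-space side of that queue (assuming I have the defaults right — asynchronous-upload-limit of 4 and 50 MB volumes):

```python
def pending_upload_bytes(async_upload_limit, dblock_size_bytes):
    """Worst-case space held in temp by filled-but-not-yet-uploaded volumes."""
    return async_upload_limit * dblock_size_bytes

MB = 1024**2

# 4 queued uploads (the default, I believe) x 50 MB default volumes:
print(pending_upload_bytes(4, 50 * MB) // MB, "MB")  # prints: 200 MB
```

So even when uploads stall, the queue of finished dblock files should cap out fairly small at the defaults; a bigger dblock-size raises that cap proportionally.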
What’s interesting is that SpillCollectorProcess is trying to do something. Ordinarily its job is to collect the leftovers after the concurrent file processors have finished filling dblock files, since the filling doesn’t come out even at the end. Your backup was seemingly nowhere near the end, so I’m not sure why that code is running where it looks.
Here’s an example Verbose log of a small backup. I’m posting the lines with three backticks above and below, which helps with the formatting (and scrolling), but for a really big file you can zip it up and drag it to the window.
2020-10-06 16:03:48 -04 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started
2020-10-06 16:03:51 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started: ()
2020-10-06 16:03:51 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed: ()
2020-10-06 16:03:51 -04 - [Verbose-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess-IncludingSourcePath]: Including source path: C:\backup source\length1.txt
2020-10-06 16:03:51 -04 - [Verbose-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess-IncludingSourcePath]: Including source path: C:\backup source\short.txt
2020-10-06 16:03:51 -04 - [Verbose-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess-IncludingSourcePath]: Including source path: C:\backup source\length1.txt
2020-10-06 16:03:51 -04 - [Verbose-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess-IncludingSourcePath]: Including source path: C:\backup source\short.txt
2020-10-06 16:03:51 -04 - [Verbose-Duplicati.Library.Main.Operation.Backup.FilePreFilterProcess.FileEntry-CheckFileForChanges]: Checking file for changes C:\backup source\length1.txt, new: True, timestamp changed: True, size changed: True, metadatachanged: True, 10/4/2020 1:42:38 AM vs 1/1/0001 12:00:00 AM
2020-10-06 16:03:51 -04 - [Verbose-Duplicati.Library.Main.Operation.Backup.FilePreFilterProcess.FileEntry-CheckFileForChanges]: Checking file for changes C:\backup source\short.txt, new: True, timestamp changed: True, size changed: True, metadatachanged: True, 10/4/2020 6:56:47 PM vs 1/1/0001 12:00:00 AM
2020-10-06 16:03:51 -04 - [Verbose-Duplicati.Library.Main.Operation.Backup.FileBlockProcessor.FileEntry-NewFile]: New file C:\backup source\length1.txt
2020-10-06 16:03:51 -04 - [Verbose-Duplicati.Library.Main.Operation.Backup.FileBlockProcessor.FileEntry-NewFile]: New file C:\backup source\short.txt
2020-10-06 16:03:51 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b2c52d2a1185c4cd280e6f6b14133f540.dblock.zip (1.11 KB)
2020-10-06 16:03:51 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b2c52d2a1185c4cd280e6f6b14133f540.dblock.zip (1.11 KB)
2020-10-06 16:03:51 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-20201006T200351Z.dlist.zip (748 bytes)
2020-10-06 16:03:51 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-ib2eaa999e8d44fe08d64af2e88704c82.dindex.zip (688 bytes)
2020-10-06 16:03:51 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-20201006T200351Z.dlist.zip (748 bytes)
2020-10-06 16:03:51 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-ib2eaa999e8d44fe08d64af2e88704c82.dindex.zip (688 bytes)
2020-10-06 16:03:51 -04 - [Verbose-Duplicati.Library.Main.Database.LocalDeleteDatabase-FullyDeletableCount]: Found 0 fully deletable volume(s)
2020-10-06 16:03:51 -04 - [Verbose-Duplicati.Library.Main.Database.LocalDeleteDatabase-SmallVolumeCount]: Found 1 small volumes(s) with a total size of 1.11 KB
2020-10-06 16:03:51 -04 - [Verbose-Duplicati.Library.Main.Database.LocalDeleteDatabase-WastedSpaceVolumes]: Found 0 volume(s) with a total of 0.00% wasted space (0 bytes of 285 bytes)
2020-10-06 16:03:51 -04 - [Information-Duplicati.Library.Main.Database.LocalDeleteDatabase-CompactReason]: Compacting not required
2020-10-06 16:03:51 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started: ()
2020-10-06 16:03:51 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed: (3 bytes)
2020-10-06 16:03:52 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-20201006T200351Z.dlist.zip (748 bytes)
2020-10-06 16:03:52 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-20201006T200351Z.dlist.zip (748 bytes)
2020-10-06 16:03:52 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-ib2eaa999e8d44fe08d64af2e88704c82.dindex.zip (688 bytes)
2020-10-06 16:03:52 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-ib2eaa999e8d44fe08d64af2e88704c82.dindex.zip (688 bytes)
2020-10-06 16:03:52 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b2c52d2a1185c4cd280e6f6b14133f540.dblock.zip (1.11 KB)
2020-10-06 16:03:52 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-b2c52d2a1185c4cd280e6f6b14133f540.dblock.zip (1.11 KB)