Tree traversal can be done in many ways, with potentially differing impacts on the caching layers below.
Can you describe Duplicati's visit order using the terminology in that article, or in your own phrasing?
My other worry is the Channel Pipeline spreading access out over time, maybe upsetting the cache.
Does the Duplicati test-filters command, run without filters, do the same walk as a regular backup?
If it hasn't already been run, a pure walk such as tree /f or Get-ChildItem might be informative.
NetWare is probably used to seeing those run by now, so how exactly is Duplicati upsetting it?
What's the difference in total size, folder structure (name length, depth), or anything else?
If some volumes work but Q: doesn't, the question is why, and then whether we can give any help.
I have one question: HOW does Duplicati determine which files to back up? Let's assume that the tree walking works; how does Duplicati decide whether to back up a file? On most OSs, the "backed up" bit is set on the file after a backup. Then, when the file is modified, the backed-up bit is cleared and the file timestamp is updated. I have long suggested that some special extended file status is being returned that Duplicati is not handling properly.
I ask because the standard backups that don't work are NOT seeing files that have changed on the volume; they are now only backing up a very specific group of files.
Finally, on a stopped job, the software is NOT releasing disk space; there seem to be files somewhere taking up space. You mentioned previously that the compressor stores files on the volume while it waits to transmit them to the remote server. How do we "clean up" after an abandoned run?
Any questions on the earlier answer? I might not be able to explain it all, but it's a start.
What are standard backups? Do you mean some non-Duplicati backups that work differently? The language around "don't work" and "NOT" also confuses me; I'm not following that paragraph well.
I thought of suggesting you have Duplicati back up a specific group, maybe some deep subfolder.
Stopped how?
Space where?
Not on Q: but on C:. Did you ever figure out, refute, etc. the theory that Q: was being written to?
If you mean files in the Temp folder on C: (or wherever the tempdir option points, if used), most start with dup-.
You might also find some SQLite temporary files with etilqs in the name, but be careful, as it's harder to tell who those belong to; SQLite is used by a lot of other programs too. The dup- prefix is an attempt to be more specific, but it's certainly not guaranteed. I think there might be an automatic cleanup eventually, as I have no buildup except an empty file from Dec. Manual cleanups should ideally be done with Duplicati down, to avoid deleting any files that might still be in use.
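In case it helps with reviewing before any manual deletion, here is a rough sketch (not anything Duplicati ships, just an illustration in Python) that lists dup- files in the system temp folder with their size and modification date; adjust temp_dir if you use the tempdir option, and only delete with Duplicati stopped:

```python
import os
import tempfile
from datetime import datetime

# Assumed location: the system temp folder (change this if the tempdir option is used).
temp_dir = tempfile.gettempdir()

# List files that look like Duplicati leftovers (dup- prefix) with size and date,
# so they can be reviewed before any manual cleanup. Run with Duplicati stopped.
for name in sorted(os.listdir(temp_dir)):
    if not name.startswith("dup-"):
        continue
    path = os.path.join(temp_dir, name)
    if os.path.isfile(path):
        size = os.path.getsize(path)
        modified = datetime.fromtimestamp(os.path.getmtime(path))
        print(f"{name}\t{size:,} bytes\tmodified {modified:%Y-%m-%d}")
```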
A bit off-topic here, but it is closest to "Breadth-first search". Since we have two different kinds of nodes (files and folders), though, the terminology does not really match.
Duplicati will start by adding all source folders to a queue to process.
When visiting a folder, Duplicati will first process all files in the folder. Any folders discovered will be added to the queue. Once a folder has been fully processed, the next folder is taken from the queue. Since the queue is FIFO, folders with the same parent will be processed before any of their children, giving a BFS-like traversal.
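For illustration only (this is not Duplicati's actual code, just a minimal Python sketch of the queue-based order described above, with a made-up example path), the traversal looks roughly like this:

```python
import os
from collections import deque

def walk_like_described(source_folders):
    """Yield file paths in the order described above: all files in a folder
    are handled first, discovered subfolders go to the back of a FIFO queue,
    so same-parent folders finish before any of their children (BFS-like)."""
    queue = deque(source_folders)      # start by queueing all source folders
    while queue:
        folder = queue.popleft()       # FIFO: take the oldest queued folder
        try:
            with os.scandir(folder) as entries:
                for entry in entries:
                    if entry.is_dir(follow_symlinks=False):
                        queue.append(entry.path)   # visit later
                    else:
                        yield entry.path           # process files now
        except OSError as err:
            print(f"cannot list {folder}: {err}")

# Example (hypothetical path): prints files roughly level by level.
for path in walk_like_described([r"C:\SomeSource"]):
    print(path)
```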
That is certainly possible, but we could not see the effects in performance testing.
Yes. It is the same traversal routine.
It is possible that it would work better in this case, but in general, the .NET enumeration should work the same. It does not help if the files cannot be accessed anyway though.
That is what I still do not know.
Duplicati is not using attributes on the files as a marker; essentially it treats every file as one that should be backed up. There are only two cases that can make it skip a file: filters, and an unchanged timestamp + size.
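As a sketch of that rule only (not Duplicati's actual implementation; the function and parameter names here are invented for the example), the decision reduces to roughly this:

```python
import os

def should_back_up(path, recorded_mtime_size, is_excluded):
    """Sketch of the rule above: no archive/backup bit is consulted.
    recorded_mtime_size is the (mtime, size) pair stored by the previous
    backup, or None if the file has never been seen before."""
    if is_excluded(path):                         # case 1: excluded by a filter
        return False
    stat = os.stat(path)
    current = (stat.st_mtime, stat.st_size)
    if recorded_mtime_size is not None and recorded_mtime_size == current:
        return False                              # case 2: timestamp + size unchanged
    return True                                   # everything else gets backed up

# Example: a file with no filter match and no prior record is always backed up.
print(should_back_up(__file__, None, lambda p: False))
```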
Interesting observation. It could be something related to the file metadata then? At the time the exception PathProcessingFailed is thrown, the metadata has already been read, and Duplicati is just opening the file.
But! When the exception FileAccessError is thrown, Duplicati is just trying to read attributes.
I see both errors happening in your listing above.
The files should be removed automatically as @ts678 describes, even when stopped. If not, they are in the system temp folder and can be removed with “Disk Cleanup” or Windows → Storage → Temporary files.
Based on this, it seems that the list command should return 24252 entries for previously backed-up files. I went to the original backup, and this was the log.
Are you using the “–full-result” option on the “list”? I forget exactly which operations need “–full-result” to see everything, but “list” may be one. I'm also pretty sure that the web GUI output window is limited, so even with “–full-result”, some results will “scroll” off the top. This is something you'll want to use the command line for, throwing the results in a file for analysis.
So if you don’t use that option you see about 1900 entries, but when you do use it you don’t get anything?
I just realized that the forum decided to turn two dashes into a single big dash. So just in case that’s what bit you, the option is --full-result (two dashes at the beginning)
If you mean in Command Prompt, redirect output by putting > name.txt at the end of the command line.
Command Line Interface CLI gives general usage; e.g., provide at least your GUI job's dbpath.
Possibly you'll need more. One trick for composing the command is to edit the job's Export As Command-line output.
If you go back to the threads I have been involved with, it all boils down to making sure that the backup is actually backing up all the files. Since we are unable to get a complete log of all backed-up files, I moved to the list command and now the find command. Unfortunately, running it from the DOS prompt is not a solution, as the program will not run. THERE MUST be some way to see all the backed-up files. Otherwise there is NO way to a) validate the backup and b) validate that Duplicati actually does what it claims! Sorry, but when I had an SQA department, this is what we needed to validate new versions of code.
Thanks for your time, but at this point, with over 100 hours invested in this, I don't know what else to do.
“Unsupported 16-Bit Application” (at least as searchable by that text) was seen in 2018 and 2020, mysteriously, with both cases fixed by an uninstall and reinstall. If you do that, it's best to stop Duplicati first.
Although it probably won’t be today, maybe the developer will have some better ideas on that.
Your next problem, if you get CommandLine.exe going, is that you need to double-quote strings that contain spaces. This is standard command-line practice, and it would already have been done if you had edited the Export.
EDIT:
You might also have --disable-module attached, with no space, to the text before it. If so, add a space.