When running a local backup, on the status bar the following line is displayed, for example:
… 359207 files (72.82 GB) to go at …
That’s fine. But when the backup process is approaching its end the status line suddenly looks like this:
… 76673 files (-78945379 bytes) to go at …
Eh? That looks like an uninitialized variable.
And a minute later the status line changes to:
… 0 files (-177084 bytes) to go at …
Am I supposed to understand that number? Or is this just a Duplicati bug?
By the way, this symptom was also present in version 220.127.116.11.
This can happen when files are added or increase in size during the backup. At the start of the backup, Duplicati scans the filesystem for a list of files to back up and calculates the expected size of data to back up. By the time it gets to certain folders, if files have gotten larger it throws off the “bytes remaining” calculation.
One solution is to use snapshots, so that Duplicati works with a frozen view of the filesystem, but that may not be an option for you unless you’re using LVM.
I think the web UI should just display a question mark or not show the bytes to go at all if the calculated value goes negative.
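As a minimal sketch of that suggestion (a hypothetical helper, not Duplicati’s actual web UI code): clamp the display so a stale, negative byte count becomes a question mark instead of a nonsense number.

```python
def format_remaining(files_left: int, bytes_left: int) -> str:
    """Format the status line, hiding a negative byte count.

    Hypothetical helper for illustration; Duplicati's real UI code differs.
    """
    if bytes_left < 0:
        # The scan-time estimate is stale (files grew during the backup),
        # so don't show a misleading negative number.
        return f"{files_left} files (? bytes) to go"
    return f"{files_left} files ({bytes_left} bytes) to go"

print(format_remaining(76673, -78945379))  # → "76673 files (? bytes) to go"
print(format_remaining(359207, 72820000000))
```

This keeps the fast code path unchanged and only touches presentation.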
I wonder if it’s possible to recalculate it instead?
Does Duplicati not keep a record of the files to upload (or could that be added to fix this)? Couldn’t each file be matched against that record to see if it’s already known, and the total adjusted accordingly? Isn’t that how it gets a total size in the first place?
Mind you, I didn’t look into the code, but if it has a total size it should be possible to adjust it as needed. Doing so would probably slow it down, though, which might be something to avoid.
Duplicati does not wait for the full file scan to finish before it starts processing files, as that would slow things down.
Channel Pipeline was my attempt (maybe wrong) to document how data flows through the processing.
There might be a heuristic possible that does a last-second check for file changes, but it’d need writing.
A file could grow or shrink while you’re reading through it, so maybe you take the final read size instead.
The problem is I don’t know whether the code still knows what size the size estimator originally recorded. You’d have to read the code.
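To make the “take the final read size” idea concrete, here’s a sketch of a tracker that corrects the total when a file’s actual size differs from its scan-time estimate. The class and method names are hypothetical; this is a design sketch, not Duplicati’s implementation.

```python
class ProgressTracker:
    """Sketch: adjust the running total when a file's actual read size
    differs from the size recorded during the initial scan.
    Hypothetical design, not Duplicati's actual code."""

    def __init__(self) -> None:
        self.total_bytes = 0      # sum of estimated sizes from the scan
        self.processed_bytes = 0  # bytes actually read so far
        self.estimated = {}       # path -> size recorded at scan time

    def record_scan(self, path: str, size: int) -> None:
        self.estimated[path] = size
        self.total_bytes += size

    def record_processed(self, path: str, actual_size: int) -> None:
        # Correct the total by the difference between the actual read size
        # and the original estimate, so "bytes to go" can't go negative
        # just because a file grew after the scan.
        self.total_bytes += actual_size - self.estimated.get(path, 0)
        self.processed_bytes += actual_size

    @property
    def bytes_to_go(self) -> int:
        return self.total_bytes - self.processed_bytes
```

For example: a file scanned at 100 bytes that has grown to 150 by processing time bumps the total by 50, so the remainder stays non-negative. The extra dictionary lookup per file is the cost being debated above.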
I agree. It would definitely slow things down, and there might not be a fast way of doing it. I can’t think of a way that doesn’t involve storing and retrieving extra information per file.
The end result might dictate what gets done, though: slow it down for a nicer experience, or keep it as fast as possible and accept an okay one.
I looked at this code a while back, and from what I recall Duplicati uses a very simple approach for the web UI progress bar: the total bytes from the file-scan phase are stored in one variable, and the total bytes processed so far in another. This can’t adapt to files that change during the backup. It could be fixed, but I’m not sure it’s worth the development time (I think some other issues are higher priority, but that’s just my opinion).
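That two-variable approach can be illustrated in a few lines (my reconstruction of the behavior described above, not the actual code): if a file grows between the scan and processing, processed bytes overshoot the fixed total and the remainder goes negative.

```python
# Reconstruction of the simple progress model: one total fixed at scan
# time, one running counter of processed bytes. Not Duplicati's code.
scanned_sizes = {"a.bin": 100, "b.bin": 200}
total_bytes = sum(scanned_sizes.values())    # 300, frozen at scan time

actual_sizes = {"a.bin": 100, "b.bin": 450}  # b.bin grew after the scan
processed_bytes = 0
for path, size in actual_sizes.items():
    processed_bytes += size
    print(f"{total_bytes - processed_bytes} bytes to go")
# The last line prints "-250 bytes to go" -- the same kind of negative
# number as in the original report.
```

This is exactly why the negative value only shows up near the end of the backup: the overshoot isn’t visible until the processed counter passes the frozen total.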
I’m not arguing with that.
Going below 0 isn’t even that big of a deal, even though it looks bad. It doesn’t keep Duplicati from running or backing up correctly.