Backup folder much larger than reported in Duplicati

I have Duplicati tasks backing up a bunch of different folders, but one backup task is behaving strangely. Duplicati reports that backup as “8.23 GB / 7 Versions”, but the destination folder itself contains almost 300 GB of files. This is very different from all the other tasks, where the folder size matches the reported size.

The settings are all exactly the same except for the folders each Duplicati task works on. They are all set to smart backup retention. A little investigation showed that 99% of those files were generated on a single day, and none of them show up as part of any backup in Duplicati. There was an error the day those files appeared:

System.IO.IOException: Backup aborted since the source path V:\ does not exist.  Please verify that the source path exists, or remove the source path from the backup configuration, or set the allow-missing-source option.
   at Duplicati.Library.Main.Controller.ExpandInputSources(String[] inputsources, IFilter filter)
   at Duplicati.Library.Main.Controller.<>c__DisplayClass14_0.<Backup>b__0(BackupResults result)
   at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)
   at Duplicati.Library.Main.Controller.Backup(String[] inputsources, IFilter filter)
   at Duplicati.Server.Runner.Run(IRunnerData data, Boolean fromQueue)

I am running on Windows and the folder is being stored on my OneDrive.

Any recommendation on what steps I should be taking next?

Welcome to the forum @Possum8863

System.IO.IOException: Backup aborted since the source path V:\ does not exist. Please verify that the source path exists, or remove the source path from the backup configuration, or set the allow-missing-source option.

What’s supposed to be in the backup? Files on the internal C:\ drive? What’s the V:\ that’s in the configuration?

You could get a second opinion from your job logs. Find the last successful one and see its Complete log:

      "KnownFileSize": 9631686882,
      "LastBackupDate": "2024-10-07T07:19:31-04:00",
      "BackupListCount": 28,

9631686882 / 1024 / 1024 / 1024 = 8.97
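The conversion above can be checked with a couple of lines of Python (a sketch; it assumes Duplicati's "GB" figure is a 1024-based GiB conversion of KnownFileSize):

```python
# Convert the KnownFileSize value (bytes) from the Complete log into the
# figure shown on the home page. 1 GiB = 1024**3 bytes; Duplicati labels it "GB".
known_file_size = 9631686882  # bytes, from the "Complete log" above

size_gib = known_file_size / 1024**3
print(f"{size_gib:.2f} GB")  # prints "8.97 GB"
```

If your job's KnownFileSize comes out near 8.23 GB while the destination folder holds ~300 GB, that confirms the excess files are not part of what this job's database knows about.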

Home page:

Last successful backup: Today at 7:25 AM (took 00:05:47)
Next scheduled run: Tomorrow at 5:20 AM
Source: 7.58 GB
Backup: 8.97 GB / 28 Versions

An error might prevent the home page from updating. It might also mean a job log gets skipped. Do newer backups run?

How are you looking? The Destination files never show up on (for example) the Restore screen; however, it would be worth looking there for any unusual Source files that shouldn’t be in the backup.

You could also use the AFFECTED command, giving it an old dblock filename for practice. After it reports which source files are in that dblock, give it a name from the pile of excess dblocks.

GUI Commandline is a good place to do this. Are the Source files in the excess dblocks what you’d expect?
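For reference, the true-CLI form looks roughly like this (a sketch: the storage URL and database path are placeholders for your job's actual values; the dblock name is the one from your log). In GUI Commandline you instead pick `affected` from the Command dropdown and put just the filename in the arguments box:

```shell
REM Sketch only: substitute your own destination URL and database path.
Duplicati.CommandLine.exe affected "file://X:\OneDrive\Backups\<folder>" ^
  duplicati-b883136c39a0f4df7bb8c720e47a59d28.dblock.zip.aes ^
  --dbpath="C:\path\to\job-database.sqlite"
```

Passing `--dbpath` explicitly matters here: without it, a CLI run may not use the same database as the GUI job, and the answer will come from the wrong place.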

V: is a mounted Cryptomator drive hosting a large number of document and text-type files, such as PDFs.

This error occurred when I accidentally started a manual backup without unlocking the Cryptomator drive. I then forgot about it and did not run another backup afterwards. The first thing in the log for the day in question is a list that looks the same as the one I just generated today. This is followed by a massive list of entries like “Oct 7, 2024 2:52 PM: put duplicati-b883136c39a0f4df7bb8c720e47a59d28.dblock.zip.aes”.

The backup source was previously assigned a different drive letter, so there is a chance the old drive was never fully removed from the job.

I have also attempted to run a new backup with that job, and it failed with the following error:

System.UnauthorizedAccessException: Access to the path '\\?\X:\OneDrive\Backups\<folder>\duplicati-ide561ec3be5441d0956b8d0db284e124.dindex.zip' is denied.
   at Duplicati.Library.Main.BackendManager.Delete(String remotename, Int64 size, Boolean synchronous)
   at Duplicati.Library.Main.Operation.FilelistProcessor.RemoteListAnalysis(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, IEnumerable`1 protectedFiles)
   at Duplicati.Library.Main.Operation.FilelistProcessor.VerifyRemoteList(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, IEnumerable`1 protectedFiles)
   at Duplicati.Library.Main.Operation.BackupHandler.PreBackupVerify(BackendManager backend, String protectedfile)
   at Duplicati.Library.Main.Operation.BackupHandler.<RunAsync>d__20.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at CoCoL.ChannelExtensions.WaitForTaskOrThrow(Task task)
   at Duplicati.Library.Main.Controller.<>c__DisplayClass14_0.<Backup>b__0(BackupResults result)
   at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)
   at Duplicati.Library.Main.Controller.Backup(String[] inputsources, IFilter filter)
   at Duplicati.Server.Runner.Run(IRunnerData data, Boolean fromQueue)

I double-checked that my Windows account has ownership of all those files and that the files are actually on my computer locally, not just stored in OneDrive.

I have a backup set up for another Cryptomator drive. I tested that other backup today and confirmed it is working as expected.

The AFFECTED command on the excess blocks showed that no files are associated with those dblocks.

Unfortunately, the AFFECTED command for files from previous backups either refers to a completely different database file that does not exist for that job, or brings up 0 related log messages even with --full-result. Making a copy of the database currently attached to the job and renaming it has not helped at all.

I suspect that at some point I mauled the settings while trying to switch the drive letters around, and that troubleshooting on my own may have made things worse. The fact that I pointed the job at the regular OneDrive folder instead of using the direct OneDrive connection (I didn’t know about it at the time) probably did not help either. I think my best option right now is to cut my losses and start fresh with a completely new backup task, since I still have multiple backups elsewhere.

Is this posted or described here? Are you saying you keep your own external Duplicati log-file?

So after the mysterious “looks the same” list, there’s another list where all entries are the same?
A normal backup can produce a list of put entries, but to different files and with a mix of dblock/dindex. There’s a dlist at the end, and at that point the backup has probably finished writing to the destination.

I’d note that many of my questions weren’t answered, including whether backups finish after this.

I don’t run Cryptomator, but an assigned drive letter would be more stable than its random one.

is vague about time. If it’s just the same day, could something else have been at V:\ to back up? You can look in your job logs to see when the backup grew, e.g. the KnownFileSize mentioned above; however, the easier clue to see is on the job log summary. A big jump in Source Files might indicate an accident.

It seems like some steps were skipped.

however, if you don’t use Commandline and you don’t specify --dbpath yourself, you can get:

because a true CLI, such as one run from Command Prompt, invents a dbpath based on the destination you give.
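A toy illustration of why that matters (this is NOT Duplicati's actual algorithm, just a sketch of the general idea): if the database path is derived from the destination as given, any change in how the destination is spelled, or a GUI job that keeps its own separately named database, means a different .sqlite file gets opened, and queries like AFFECTED then answer from the wrong database.

```python
# Toy sketch (not Duplicati's real code): derive a database filename from the
# destination URL. Different spellings of the destination yield different files.
import hashlib

def derived_dbpath(destination_url: str) -> str:
    digest = hashlib.sha1(destination_url.encode("utf-8")).hexdigest()[:16]
    return f"{digest}.sqlite"

# Same destination -> same database; changed drive letter -> different database.
print(derived_dbpath("file://X:/OneDrive/Backups/job"))
print(derived_dbpath("file://V:/OneDrive/Backups/job"))
```

This is why supplying --dbpath explicitly (pointing at the GUI job's database) is the reliable way to make a CLI query look at the same records the GUI job uses.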

Not sure what is really happening here. The message above seems to indicate that Duplicati has indeed uploaded the additional 300 GB of files. But since the overview shows “8.23 GB”, it looks like the files are from another backup (or at least recorded in another database).

Did you kill the process at some point, so the records are partial? Did you create new backup configurations?

I did not kill the process (at least not intentionally), but the backup configurations did change at some point to reflect the change in drive letters. I suspect I changed the drive letters after Duplicati had made one backup, which may have contributed to the issue.