No backup without "--no-backend-verification=true"

Hi,

I have the following problem when backing up 300 GB to Microsoft SharePoint (V1):

Using “--no-backend-verification=true”, the job is successful.

Using “--no-backend-verification=false” ends with a message like “the database cannot be used because another process is using it”:

Failed: The process cannot access the file because it is being used by another process.
Details: System.IO.IOException: The process cannot access the file because it is being used by another process.
   at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
   at System.IO.File.InternalMove(String sourceFileName, String destFileName, Boolean checkHost)
   at Duplicati.Library.Main.Operation.RepairHandler.Run(IFilter filter)
   at Duplicati.Library.Main.Operation.BackupHandler.PreBackupVerify(BackendManager backend, String protectedfile)
   at Duplicati.Library.Main.Operation.BackupHandler.<RunAsync>d__19.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at CoCoL.ChannelExtensions.WaitForTaskOrThrow(Task task)
   at Duplicati.Library.Main.Controller.<>c__DisplayClass13_0.<Backup>b__0(BackupResults result)
   at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)

Log data:
2019-03-17 20:08:20 +01 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingFile]: Missing file: duplicati-b49042829b0684ee68352c795f5417bd4.dblock.zip
2019-03-17 20:08:20 +01 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingFile]: Missing file: duplicati-b750c16c636f248cd9fd77ecaaa8c0df3.dblock.zip
2019-03-17 20:08:20 +01 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingFile]: Missing file: duplicati-b6514213f699b47638338a313613defc6.dblock.zip
2019-03-17 20:08:20 +01 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingFile]: Missing file: duplicati-b587d6af597fa4f51b98160de481acb2f.dblock.zip
2019-03-17 20:08:20 +01 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingFile]: Missing file: duplicati-ic391c195a3b840ebaffdb9b9b0ad9848.dindex.zip
2019-03-17 20:08:20 +01 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingFile]: Missing file: duplicati-b8121be92a18c42a39a2afc048fa57a7f.dblock.zip
2019-03-17 20:08:20 +01 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingFile]: Missing file: duplicati-icb7c3e9dec8b482990829a78fb75c0b9.dindex.zip
2019-03-17 20:08:20 +01 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingFile]: Missing file: duplicati-bce85529728bf4dd78f302fd4d80ea5bb.dblock.zip

Any ideas what to do?

These two things may well be related. The “in use” error is likely due to trying to back up your Duplicati folder while it’s running a backup. Try adding an exclusion for your Duplicati database (or Duplicati folder).

The missing dblock and dindex files are due to Duplicati not finding some expected files at the destination.

This might be due to the files actually being missing (something deleted them), the “file list” feature making them appear missing, or Duplicati deleting them during retention policy cleanup and then “forgetting” they had been deleted.

Are you using a custom retention policy?

Policy is “keep a specific number of backups = 1”.

Duplicati runs as a service, so the database (now 800 MB) is found here:

C:\Windows\System32\config\systemprofile\AppData\Local\Duplicati

I am backing up drive “D:” only…

I have done one backup (300 GB, taking about three days). After this, the problem came up.

The first backup was done with “no-backend-verification=true” because I had read this works better with Duplicati. The second backup, with “no-backend-verification=false”, shows the problem.

“Better” is relative - yes, the backup may run, but you are telling Duplicati to ignore any errors it finds in already backed-up content.

With a retention policy of keep 1, it’s possible this means you have some files that are not restorable. More likely, though, it means there’s a bug in the retention policy process where it deletes no-longer-needed files, then forgets it deleted them, so it complains it can’t find them.

One code fix for this might be to check pending deletes when a file is missing from the destination. If the missing file was already flagged for deletion, just mark it as deleted (as if Duplicati had completed the deletion) and move on - a rough sketch of that idea is below.
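
A minimal sketch of that idea, assuming a simple name-to-state mapping for remote volumes (the function and state names here are purely illustrative, not Duplicati’s actual code):

```python
# Hypothetical sketch (not actual Duplicati code): reconcile files the backend
# reports as missing against files the local database already flagged for deletion.
def reconcile_missing(remote_files, db_files):
    """remote_files: set of names actually listed at the destination.
    db_files: dict of name -> state, e.g. "Uploaded" or "Deleting"."""
    still_missing = []
    for name, state in db_files.items():
        if name in remote_files:
            continue
        if state == "Deleting":
            # Already flagged for deletion: treat it as deleted and move on.
            db_files[name] = "Deleted"
        else:
            # Genuinely missing: keep reporting it so a repair can be attempted.
            still_missing.append(name)
    return still_missing

# Example: the file flagged "Deleting" is quietly reconciled, the real loss is reported.
print(reconcile_missing({"a.dblock.zip"},
                        {"a.dblock.zip": "Uploaded",
                         "b.dblock.zip": "Deleting",
                         "c.dblock.zip": "Uploaded"}))   # -> ['c.dblock.zip']
```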

Of course such a code change would improve user experience at the expense of covering up for a bug somewhere else, so it’s not really the best option…

OK, I understand it a little bit now. But how can I fix the problem? Running without no-backend-verification=true brings the error.

One additional point: the server with the problem is a “weak” one. It is one year old, has 32 GB RAM, and only five users… but the CPU utilization is very high when using Duplicati, and the RAID is only onboard RAID (two volumes, 2x RAID 1). I can feel that the server is not very powerful.

On the other hand, we are using CRM with Microsoft SQL, which works fine. Maybe Duplicati’s database is the point of failure?

Server “power” shouldn’t cause issues - and we can certainly look at why so much CPU is in use after we resolve the errors.

Have you confirmed the files are actually missing from the destination? If so, we can rule out a communications problem.

Oh, and what version of Duplicati are you using? I recall an older version had a bug where it would include additional folders in certain cases - possibly if a root folder was involved.

Thank you, JonMikelIV. I am using 2.0.4.15_canary_2019-02-06.

Via Commandline “list-broken-files”:

Running commandline entry
Finished!

            
The operation ListBrokenFiles has started
No broken filesets found in database, checking for missing remote files
Backend event: List - Started:  ()
  Listing remote folder ...
Backend event: List - Completed:  ()

ErrorID: CannotPurgeWithNoRemoteVolumes
No remote volumes were found, refusing purge
Return code: 100

Additional info:
--log-retention=20h
--retry-delay=3m
--backup-name=Daten zu SharePoint
--dbpath=C:\Windows\system32\config\systemprofile\AppData\Local\Duplicati\73686582906775696684.sqlite
--encryption-module=
--compression-module=zip
--dblock-size=50mb
--keep-versions=1
--no-encryption=true
--snapshot-policy=Required
--auto-cleanup=true
--concurrency-max-threads=1
--rebuild-missing-dblock-files=true
--no-backend-verification=true
--log-file=C:\duplicati.log
--log-file-log-level=Profiling
--console-log-level=Verbose
--disable-module=console-password-input

Inability to see files (hinted at in the original post, but more clearly visible in this message) could be the infamous Microsoft restriction of 5000 files (at their default) in a single list. Although two experimenters have seemingly avoided the limit by changing to Duplicati’s OneDrive v2 storage (which uses a newer Microsoft API), I can’t find any claims that it is actually supposed to avoid the limit.

From a math point of view, 300 GB at 50 MB per file means about 6000 dblock files of source file data, but each dblock also has a dindex file to index its contents, so you might be at 12000 files on SharePoint. It may refuse to list 12000, which makes Duplicati believe there are none. Do you have another way to check it?
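
The back-of-envelope arithmetic behind that estimate (decimal units, and ignoring dlist files, which are few):

```python
# Rough file-count estimate for a 300 GB backup with 50 MB dblocks.
source_gb = 300
dblock_mb = 50
dblocks = source_gb * 1000 // dblock_mb   # ~6000 dblock files
total = dblocks * 2                       # each dblock has a matching dindex file
print(dblocks, total)                     # 6000 12000 -- well past a 5000-item list view
```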

Switching to OneDrive v2 and allowing backend verification can test whether it helps any in your case.

Manage large lists and libraries in SharePoint describes workarounds which an administrator may do.

How to resolve the 5000 item limit on a list discusses the rationale for the limit, and mentions using multiple lists.

Logic and use case for remote subfolders is a Duplicati discussion of how someday that may be done.
This post within it references seeming successes where switching to OneDrive v2 fixed the file listing. Another workaround mentioned in the topic is to use a larger --dblock-size to stay below 5000 files (a rough size calculation is sketched below), and yet another might be to split your 300 GB backup into several smaller sections. This may also improve performance, especially for local database Recreate in the event of loss. Smaller goes faster.
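
A rough calculation of the --dblock-size needed, assuming the default 5000-item list view threshold and one dindex per dblock:

```python
# Smallest --dblock-size (in MB) that keeps dblock + dindex counts under the item limit.
import math

def min_dblock_mb(source_gb, item_limit=5000):
    max_dblocks = item_limit // 2               # dindex files count toward the limit too
    return math.ceil(source_gb * 1000 / max_dblocks)

print(min_dblock_mb(300))   # 120 -> e.g. --dblock-size=150mb leaves some headroom
```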

Choosing sizes in Duplicati gets into --dblock-size and --blocksize selection. --dblock-size can be changed whenever you like, but it applies only to future dblock creation, so it won’t easily resolve the current small dblocks. Restarting the backup would be easiest, but if that’s not possible there’s a chance the current files can be repackaged blindly.

Thank you, ts678. Problem solved:

I am using bigger dblocks (500 MB), so I have fewer files in SharePoint.

I am using “Microsoft SharePoint v2” now. It was difficult to configure because I have a special document library, not the default one.

I thought the “Destination path” should be “frank.sharepoint.com/sites/Duplicati” or something like this. It actually works using the “site-ID” and only a short path; see the screenshot.

This was helpful for finding the “site-ID”: sharepoint online - How can I get the siteId of the current site with Microsoft Graph API? - Stack Overflow
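
A minimal sketch of the kind of Graph request that Stack Overflow answer describes, assuming you already have an access token (the token, hostname, and site path below are placeholders taken from this thread, not a tested configuration):

```python
# Look up a SharePoint site ID via Microsoft Graph by hostname and site path.
import requests

ACCESS_TOKEN = "<your-graph-access-token>"   # placeholder: obtain via your own app registration
hostname = "frank.sharepoint.com"            # example host from this thread
site_path = "/sites/Duplicati"               # server-relative path of the site

resp = requests.get(
    f"https://graph.microsoft.com/v1.0/sites/{hostname}:{site_path}",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
resp.raise_for_status()
print(resp.json()["id"])   # the site-ID that can be used in the destination settings
```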

Duplicati often causes small problems, but overall it is a great program!
