- The process cannot access the file because it is being used by another process


Even if I change the job to back up just a single file, like a Word document, I still get the “locked file” error.

So this is not a locked file among the files to be backed up, which is the more common case to come across.

I can confirm that the error “System.IO.IOException: The process cannot access the file because it is being used by another process.” occurs because the database file is locked by Duplicati itself, and that it can be solved by:

  • First run a repair job
  • Then run the backup job

If I run the backup job directly, the repair starts first in that job and then fails on the locked database file. Locked by Duplicati itself :slight_smile:

This problem started with the beta. The error started popping up when restarting jobs I had stopped mid-run. I think, but I’m not 100% sure, that the jobs I stopped mid-run were jobs without a local database but with a complete set of backup files.

Log entry:

System.IO.IOException: The process cannot access the file because it is being used by another process.
   at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
   at System.IO.File.InternalMove(String sourceFileName, String destFileName, Boolean checkHost)
   at Duplicati.Library.Main.Operation.RepairHandler.Run(IFilter filter)
   at Duplicati.Library.Main.Operation.BackupHandler.PreBackupVerify(BackendManager backend, String protectedfile)
   at Duplicati.Library.Main.Operation.BackupHandler.<RunAsync>d__19.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at CoCoL.ChannelExtensions.WaitForTaskOrThrow(Task task)
   at Duplicati.Library.Main.Controller.<>c__DisplayClass13_0.<Backup>b__0(BackupResults result)
   at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)
   at Duplicati.Library.Main.Controller.Backup(String[] inputsources, IFilter filter)
   at Duplicati.Server.Runner.Run(IRunnerData data, Boolean fromQueue)
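The trace shows `BackupHandler.PreBackupVerify` invoking `RepairHandler.Run`, whose `File.InternalMove` fails because some other handle still has the local database open. Here is a minimal Python sketch of that failure mode, using POSIX advisory `flock()` to stand in for Windows’ mandatory share locks; the file name and the “leftover handle” scenario are illustrative assumptions, not Duplicati’s actual code:

```python
import fcntl
import os
import tempfile

# Sketch: "repair" tries to take over the local database while another
# handle still holds it. POSIX flock() stands in for Windows sharing
# semantics; the file name is made up for illustration.
db_path = os.path.join(tempfile.mkdtemp(), "backup.sqlite")
open(db_path, "w").close()

holder = open(db_path, "r+")          # e.g. a handle left over from a stopped job
fcntl.flock(holder, fcntl.LOCK_EX)    # exclusive lock on the database file

repairer = open(db_path, "r+")
try:
    fcntl.flock(repairer, fcntl.LOCK_EX | fcntl.LOCK_NB)
    locked = False
except BlockingIOError:               # the "used by another process" case
    locked = True

holder.close()                        # once the first handle is released (as in a
fcntl.flock(repairer, fcntl.LOCK_EX)  # standalone repair run), the lock succeeds
print("locked while held:", locked)
```

This matches the observed behavior: the move fails while the handle is alive, and the same operation succeeds once the holder goes away.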

I am having the same issue. Computer restart does not fix it. Just updated to

“System.IO.IOException: The process cannot access the file because it is being used by another process.”


Found this from @magnust


Yeah, first I verified it wasn’t a file in my backup job that was locked, by temporarily changing the backup source to a single Word file that for sure wasn’t locked. Still got the error.

But I fixed the problem by first running a repair job only. When that was done, I started the backup job. No errors for me after that.


There’s a problem with somewhat similar symptoms that was being chased for a while in Error “database is locked” when trying to run a backup after a force stop #3445, and it was mentioned in Restore fails with “Failed to connect: The database file is locked database is locked”, although there was concern over a version difference. @mikaelmello did a historical version test for the GitHub issue and narrowed down the version in which the issue arrived. While I wouldn’t suggest doing this in anything but a test environment, testing the issue in those versions might give some clue as to whether these problems are related, or maybe someone else will offer their guess.


I’ve got a theory and related question…

Multi-threading is somehow allowing the backup process to start BEFORE the test process has completed. This would explain why even a test backup of ONLY a single non-Duplicati document causes the issue: as different threads, one would end up blocking the other, as we are seeing.
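The theory above can be sketched as a toy Python model: a “verify” thread still holds the database when a “backup” thread tries to grab it, so the backup fails exactly the way repair’s `File.Move` does, while running the phases to completion one at a time avoids the clash. The phase names and timings here are invented for illustration, not Duplicati’s actual scheduling:

```python
import threading
import time

# Toy model: "verify" holds the database while "backup" races in.
db_lock = threading.Lock()
result = {}

def verify():
    with db_lock:          # verify holds the database...
        time.sleep(0.2)    # ...for a while

def backup():
    time.sleep(0.05)       # starts before verify has finished
    # Like File.Move on a locked file, a non-blocking grab fails:
    result["got_db"] = db_lock.acquire(blocking=False)

t_verify = threading.Thread(target=verify)
t_backup = threading.Thread(target=backup)
t_verify.start(); t_backup.start()
t_verify.join(); t_backup.join()
print("backup could open database:", result["got_db"])   # False: still held

# The workaround serializes the phases: finish verify, then back up.
t2 = threading.Thread(target=verify); t2.start(); t2.join()
print("after verify completes:", db_lock.acquire(blocking=False))  # True
```

If something like this is happening, it would also explain why forcing a single thread (or running repair separately first) makes the error go away.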

Why is this not happening to “everybody”?

@magnust, is the issue still happening if you don’t run a repair job first?

If so, what happens if you:

  1. repair
  2. backup
  3. backup again

Does it still happen if you use any of these?

  • --no-backend-verification=true
  • --backup-test-samples=0
  • --snapshot-policy=on (or required)
  • --concurrency-max-threads=1

This sounds backwards, but: sorry, the problem is gone. Hehe.

What I mean is: sorry I can’t test and verify your theory, which by the way sounds very logical! Running a combined repair+backup failed with the error, but first running just a repair and, when that was done, starting a backup worked fine. And after that, backups run just fine.


I had the same issue (details on my setup follow). In my case, simply using --concurrency-max-threads=1 and running repair before running the backup solved the issue for me.

The original issue was that I was getting the same error when trying to run a backup or a repair. I rebooted the computer to try to release the lock before making the change; no difference. I will note that the backup process’s message at the top of the web page progressed to “Deleting unwanted files” before it would error out.

After changing the setting to 1 thread, I tried a backup, which failed with the same error (didn’t make sense to me), but then I decided to do a repair (which succeeded) and then a backup (which also succeeded).

Backup destination: SFTP


Thanks a lot for posting this! I’d been postponing fixing this issue for months, and when I finally sat down to fix it, your post solved it in two clicks! Thanks a bunch!