I am in the process of switching from CrashPlan to Duplicati. Unfortunately I have run into a problem.
I have been struggling to complete a backup of ~500 GB to a local USB-connected drive. I’ve tried multiple times, and each time the job has ended with a disk I/O error and a stack trace like the one below:
~~~
System.Data.SQLite.SQLiteException (0x80004005): disk I/O error
disk I/O error
   at System.Data.SQLite.SQLite3.Reset(SQLiteStatement stmt)
   at System.Data.SQLite.SQLite3.Step(SQLiteStatement stmt)
   at System.Data.SQLite.SQLiteDataReader.NextResult()
   at System.Data.SQLite.SQLiteDataReader..ctor(SQLiteCommand cmd, CommandBehavior behave)
   at System.Data.SQLite.SQLiteCommand.ExecuteReader(CommandBehavior behavior)
   at System.Data.SQLite.SQLiteCommand.ExecuteNonQuery(CommandBehavior behavior)
   at Duplicati.Library.Main.BasicResults.LogDbMessage(String type, String message, Exception ex)
   at Duplicati.Library.Main.BasicResults.AddError(String message, Exception ex)
   at Duplicati.Library.Main.Operation.BackupHandler.Run(String[] sources, IFilter filter)
   at Duplicati.Library.Main.Controller.<>c__DisplayClass16_0.<Backup>b__0(BackupResults result)
   at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)
   at Duplicati.Library.Main.Controller.Backup(String[] inputsources, IFilter filter)
   at Duplicati.Server.Runner.Run(IRunnerData data, Boolean fromQueue)
~~~
After some additional investigation I think I know what the root cause is. As I stated at the top, I am in the process of moving from CrashPlan to Duplicati. I have two backup destinations: the cloud and a local USB drive. CrashPlan used to handle both, but I have disabled the local backup in CrashPlan and tried to replace it with Duplicati. This means that CrashPlan is still active and runs a backup job to their cloud every hour. That job backs up the entire user folder, including the AppData\Local folder where Duplicati keeps its SQLite database.
Unfortunately CrashPlan locks the SQLite database file, causing Duplicati to get a disk I/O error.
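To illustrate the failure mode: when another process holds a lock on the database file, SQLite operations start failing. A minimal sketch using Python's sqlite3 module is below; note it uses SQLite's own transaction locking rather than a Windows file lock, so the error surfaces as "database is locked" instead of the mandatory-lock "disk I/O error" seen in my stack trace. The file name is a made-up stand-in for Duplicati's database.

```python
import sqlite3
import tempfile
import os

# Scratch database standing in for Duplicati's local SQLite DB (hypothetical path).
path = os.path.join(tempfile.mkdtemp(), "backup.sqlite")

# isolation_level=None puts the connection in autocommit mode so the explicit
# BEGIN EXCLUSIVE below is passed straight through to SQLite.
holder = sqlite3.connect(path, isolation_level=None)
holder.execute("CREATE TABLE log (msg TEXT)")
holder.execute("BEGIN EXCLUSIVE")  # grab the database lock, as another process would

# A second connection (playing the role of Duplicati) now cannot even read.
duplicati = sqlite3.connect(path, timeout=0.1)  # fail fast instead of waiting
try:
    duplicati.execute("SELECT * FROM log").fetchall()
    outcome = "ok"
except sqlite3.OperationalError as e:
    outcome = str(e)

print(outcome)  # the held lock surfaces as "database is locked"
holder.execute("ROLLBACK")
```

On Windows the situation is worse: file locks taken by another process (like a backup agent reading the file) are mandatory, which is presumably why Duplicati got a hard disk I/O error rather than a retryable busy state.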
I’m pretty confident this is the problem and I am taking steps to work around it right now (stop CrashPlan from backing up that folder). I just wanted to post here so that someone else in the same situation might find it.
To explain how I came to the conclusion that CrashPlan is causing the problem:
When I got the I/O error earlier this evening I almost immediately tried to delete the database from within Duplicati. This resulted in an error message telling me the file was locked by another process. I then used Process Explorer from Sysinternals to see what process was holding that lock, and it turns out it was CrashPlanService.exe.
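For anyone who wants to check the same thing without the Process Explorer GUI: Sysinternals also ships a command-line tool, handle.exe, which lists the processes holding open handles whose path matches a name fragment. Something like this (the fragment is just an example):

~~~
handle.exe Duplicati
~~~

In my case the output pointed straight at CrashPlanService.exe.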
This is why I am so confident this is the root cause.
Well, most people will probably not have a backup of AppData\Local running. I happen to have an app or two that puts useful info there so I went out of my way to enable backup of that folder. It requires you to enable showing hidden folders in CrashPlan.
Most backup solutions will probably only backup AppData\Roaming (if even that).
Welcome to the forum! I edited your post to improve the formatting. (I just added ~~~ before and after the output you pasted; please see here for details.)
You can count me out of “most people” then - I have pretty much the same setup, except my CrashPlan is once-daily so it hasn’t intersected with my Duplicati backup yet. (lucky me).
You seem to have found the cause but didn’t mention if you resolved it yet - assuming you haven’t, two things I would suggest trying are:
- Take a look at the --snapshot-policy parameter - setting it to On or Auto might help (I’m not sure).
- Add a “Duplicati/*.sqlite” exclusion to CrashPlan - either through the folder selector or with a regular expression.
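If you go the regular-expression route, a pattern along these lines should match Duplicati’s database files - this is an assumption on my part about the exact syntax CrashPlan expects, so double-check it against your own paths:

~~~
.*[/\\]Duplicati[/\\].*\.sqlite
~~~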
It’s odd that CrashPlan is the process that keeps the file locked. When starting a backup, Duplicati opens the associated SQLite databases exclusively. There are several reports of Duplicati logging errors for these files, even with --snapshot-policy set to On or Required.
In the latest Canary, a new advanced option --default-filters is added:
~~~
--default-filters
  Exclude files that match the given filter sets. Which default filter sets
  should be used. Valid sets are "Windows", "OSX", "Linux", and "All". If this
  parameter is set with no value, the set for the current operating system
  will be used.
~~~
This option excludes most file types and folders that do not need to be backed up (like temp files and folders). @kenkendk had the idea of also adding the files that Duplicati itself writes to, like the SQLite databases, to that filter.
If you use the latest Canary, does it help to add the --default-filters option?
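For reference, on the command line the option is appended to the backup invocation roughly like this (the storage URL and source path are made-up examples):

~~~
Duplicati.CommandLine.exe backup "file://E:\DuplicatiBackup" "C:\Users\example" --default-filters=Windows
~~~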
I ended up removing the AppData\Local\Duplicati folder from my CrashPlan backup job. To verify the fix, I wiped my local Duplicati backup and started from scratch. The ~500 GB backup takes 5-6 hours to complete, and early this morning, for the first time ever, it finished without any errors.
A single success doesn’t prove anything, but if I was confident before, I am now super-confident that I found the root cause.