I can’t tell if that sqlite file is the one for the current backup, but if so I think even snapshots may not help there, and you might just have to exclude it from its own backup.
There’s a chance that snapshots will avoid the #1400 error, but a backup of an actively-changing database is a poor candidate for a restore: even if it’s not seen as corrupted on the next run, it will be rather obsolete. People who really intend to back up the database for their main job (e.g. to avoid the hassle of DB recreation if their drive breaks) sometimes back up one job’s database (find its path on the Database tab) in a second job. I’m not sure if @rocco is trying to do that, or just backing up a large set of files, inviting Duplicati to trip here…
Obviously Duplicati should not trip this way, though there are other failures (such as locked files) that can happen due to OS limitations. For example, a file held open by some other program might require snapshots in order to be read at all.
If you don’t have a specific desire to back up the Duplicati databases, try something like –snapshot-policy=On to work around Duplicati’s issue. If you do have a specific intent to be able to restore Duplicati databases, make another job so that you can back up while neither the database nor the destination is in the middle of updates.
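As a rough sketch of the two approaches from the command line (the backup destination and source paths here are just illustrative placeholders, not from the original post):

```
# Use a snapshot so open files can be read (Windows VSS needs Administrator):
Duplicati.CommandLine.exe backup "file://D:\Backups\main" "C:\Data" --snapshot-policy=on

# Or exclude the sqlite database from its own backup, so the job never
# tries to read the file it is actively writing:
Duplicati.CommandLine.exe backup "file://D:\Backups\main" "C:\Data" --exclude="*.sqlite"
```

The same `–snapshot-policy` and exclude filter settings can be set in the GUI job’s Options and Filters screens; the exclude pattern above is a broad example and would need narrowing if you have other .sqlite files you do want backed up.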