Stuck on *.sqlite-journal file that disappeared

My backup has been stuck for several days. I ended the backup with Task Manager and restarted. Things were chugging along fine, and then the speed started to slow down noticeably. Right now it is stuck on the following file:

Current file: C:\Users\XXXXX\AppData\Local\Duplicati\VCHJHDUSPI.sqlite-journal

Funny thing is, when I looked for the file with File Explorer, it was less than 100 KB in size. I looked back later and the file wasn’t even there anymore, even though Duplicati was still hung up on it. What is the sqlite-journal file anyway? Is it some kind of temp file that shouldn’t even be backed up?

Yeah, definitely avoid backing up the Duplicati databases directly. There is no value in doing so (as they can be regenerated from the backend data).

A future version of Duplicati will blacklist these files from backup. But for now just don’t include them.
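
In the meantime a plain exclude filter does the job. A minimal sketch for the command line, assuming the default per-user database location shown in the path above (<target-url> is a placeholder for your destination; in the GUI the equivalent exclusions go on the Source Data filter screen):

    Duplicati.CommandLine.exe backup <target-url> C:\Users\XXXXX\ ^
      --exclude=C:\Users\XXXXX\AppData\Local\Duplicati\ ^
      --exclude=*.sqlite-journal

The trailing backslash on the folder exclude is what marks it as a folder filter rather than a file name.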

2.1. Rollback Journals describes it, and as you can see it’s quite dynamic, which might be one way it leads to your situation. I suspect Duplicati still has an occasional problem with certain file change patterns during backup; however, the situation has not been reproducible by me, even when backing up the Duplicati databases.

What would be excellent is to find a way to reproduce the hang for the developers so that it can be looked at. Yours has already been cleared, and debugging in the field can be hard. Does your problem happen easily or rarely?

Creating a bug report and posting it along with the hang time is one way that the past hang can maybe be looked at. Chances are that there’s not much in the way of other logs, but if you can find any, those would also help.

Always exclude job-specific database files from backup #4390 was in progress before being delayed to see if we could understand the hang better. You can see some discussion there of the pros and cons of doing workarounds.

Do you think we ought to just do it (and ideally document it somewhere, but where?) while looking for the root cause?

One do-it-yourself workaround is to set snapshot-policy so the backup uses VSS, after setting up Administrator privileges.
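
For reference, that’s the snapshot-policy advanced option. On the command line it looks roughly like this (a sketch; <target-url> is a placeholder, and the backup must run elevated for VSS to work):

    Duplicati.CommandLine.exe backup <target-url> C:\Users\XXXXX\ --snapshot-policy=required

In the GUI the same thing is Advanced options → snapshot-policy, set to on or required.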

I went back and looked, and the only way that I can see Duplicati should even have tried to back up that file is if I had checked my Windows C: drive, which I haven’t. Why is it even trying to back it up?

I have stopped and restarted the backup several times and it continues to stop on the same file EVERY TIME. I’m wondering if it may be because of the way that I installed Duplicati on my laptop. When I installed it, I had it installed in:

C:\Program Files\Duplicati 2

Could it be that, because I did it that way, when it backs up the User Data\Application Data it goes to that folder that I created to hold the program files? If that is the issue, how should I properly install Duplicati on my laptop?

I guess that’s as reproducible a problem as one can get (which is good), but other things confuse me.

I’m confused. C:\Program Files\Duplicati 2 is the installation spot; you don’t create it, and typically there’s no Duplicati data in there unless you use --portable-mode to make a data folder there to hold the databases.

Typically, databases are in C:\Users\<user>\AppData\Local\Duplicati, unless it’s a Windows service install.

Could you say more about how you installed? Installs can be either quite simple, or elaborate user setups.

You can also look at your job’s Database page to see where the database is for that job. Is it as expected?

The original post had the path above, which looks like a Duplicati database file, so how did the question below arise?

You don’t create the AppData folder either, and when I hear the term “program files” I think of C:\Program Files.

Please clarify. You can also get a log via About → Show log → Live → Verbose to see where the program is getting its files from. It “should” use your Source data configuration. If it’s not, something is very off.

A more permanent and somewhat nicer log is possible with log-file=<path> and log-file-log-level=verbose.
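
For example (a sketch; any writable path works, and the same two settings can be added as advanced options on the job itself):

    Duplicati.CommandLine.exe backup <target-url> C:\Users\XXXXX\ ^
      --log-file=C:\tmp\duplicati.log --log-file-log-level=verbose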

EDIT: After figuring out the basics of the situation, a higher log level might help, but those logs can get enormous. The Verbose level does show some paths, which might be sensitive, but for a small log maybe you can redact them.

How are these repeated stops done? Are they hard stops, such as killing Duplicati (not good, but sometimes needed)? Regardless, I think a restart tries not to repeat backup work already done, but to pick up where it was interrupted.

When I originally installed Duplicati on my laptop, all of the files installed in the root directory of C:. When that happened, I deleted all of the files and created a new folder in C:\Program Files\ named “Duplicati 2” and then reinstalled Duplicati and directed the install to that folder.

There are three folders and a couple of files at C:\Users\<user>\AppData\Local\Duplicati, but there are a LOT more folders and files at the installation location, C:\Program Files\Duplicati 2. It appears that this is where the databases are, as there are a LOT of .dll files.

They are not hard stops. When it gets to that file, the transfer to the hard drive stops, processor usage by the program goes to zero, and the Google Chrome Duplicati interface freezes with the number of files remaining, GBs remaining, and the last transfer speed still showing. In order to get it working again, I have to use Task Master to stop the program and restart it. While I do have the ability to select “Stop after Current File” or “Stop Now”, neither of these commands does anything.

Will run it again and look at the verbose log live.

Installing Duplicati on Windows shows the install, assuming you did it that way. Are you saying that the:

[screenshot: installer destination folder prompt]

either didn’t come up that way, or did but wasn’t respected, so you had to Browse? I’ve never tried that…

The .dll file location does not determine the database location. The .dll files are program code and belong there.

If you want to know where the databases are, look at your Database screen, or look for the .sqlite files.
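
For example, from a Command Prompt something like this lists them, assuming the default per-user location (substitute your actual user name for <user>):

    dir /s /b "C:\Users\<user>\AppData\Local\Duplicati\*.sqlite"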

That’s a hard stop. You’re simply killing it… Proper shutdown is to right-click the tray icon and select Quit, and that would come after “Stop after current file”, but those don’t work for you, so you hard stop and risk damage.
Presumably you mean “Task Manager”, but if a “Task Master” exists, I expect it also only does a hard stop.

Thank you.

I just installed Duplicati a few weeks ago, and this problem is happening for me EVERY TIME.
I’ve killed and restarted it a few times, and it’s still happening. Wasn’t this supposedly fixed?

Duplicati - 2.0.7.1_beta_2023-05-25

I figured out how to exclude that file, so I can hope the backup will work this time.

Got a citation? I don’t think it’s even been formally reported as an Issue, which is the usual first step.

is what’s wanted in a good issue, provided you can write easy steps that happen every time for a dev. Observing a problem is a good first step to fixing it, otherwise one is just guessing about the problem.

was one such guess, which could perhaps be reopened, but there are drawbacks there to consider.

is still the advice AFAIK, and I see that there were words written about “A future version of Duplicati”, which might have been going out on a limb a bit. Regardless, About → Changelog doesn’t show a fix.

Ok, that issue 4390 marked as closed is what I thought was supposed to be the fix.
As for steps to reproduce:
A) Install Duplicati on a Win 10 system, using defaults.
B) Create backup spec for C: D: (and 2 remote drives) - going to a locally attached 8TB USB drive.
C) Run it.

doesn’t quite qualify as

Presumably you’re just describing your setup. Got anything smaller and simpler to reproduce this?
I’m pretty sure I’ve tried a bit, but maybe I went too simple whereas your steps seem kind of large.
Helping the too-few volunteer developers can help with solutions, but it’s always the user’s choice.

Yeah, that is the ONE backup I’ve set up, and it hits it every time.
No Encryption. 500M size.

I’ll see if I can make a smaller one that does it.
Maybe you guys could add a log line that has a stack trace when it takes more than 1 hour for 1 file…

Thanks.

I also got one going, starting from a smaller base that I had handy. Basically C: to a USB drive; however, the database was on the drive and I was using VSS via snapshot-policy=Required, which I think would avoid your collision if it’s Duplicati’s work and SQLite’s work colliding. The snapshot resides at a different path. The database got moved from USB to C: for this test. It had been on USB to share the load and to ease disaster recovery.

This is still not what I’d call small, but if I can get it to fail, then I can try trimming. Good luck on your test.

EDIT 1:

A little off topic, but this USB drive used to have occasional image backups from Macrium Reflect Free, which is mere hours away from its January 1, 2024 end of support. The new plan for USB backup is unclear. Frequent backups are carefully selected (this can be challenging) things of interest, going to cloud destinations.

Backing up everything can be a safety net, but it is likely to hit locked files and back up more than needed (saying this while watching Duplicati back up C:\Windows for quite a while now). Some relief is available from filters, but I’m not sure how widely used filter-groups are. Both have nice help text available.
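
For what it’s worth, a filter group can be used directly in an exclude expression. Roughly like this (a sketch; I haven’t rechecked every group name, so verify them against the filter-groups help text):

    Duplicati.CommandLine.exe backup <target-url> C:\ ^
      --exclude="{SystemFiles}" --exclude="{TemporaryFiles}"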

EDIT 2:

Backup finished with 282 warnings (most on locked files, because I had turned VSS off in hopes of provoking the hang).
It picked up its own database journal file fine, but couldn’t get the database itself because of file locking:

2024-01-01 09:59:07 -05 - [Warning-Duplicati.Library.Main.Operation.Backup.FileBlockProcessor.FileEntry-PathProcessingFailed]: Failed to process path: C:\tmp\VAULQQSUZR.sqlite
System.IO.IOException: The process cannot access the file because another process has locked a portion of the file.

   at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
   at System.IO.FileStream.ReadCore(Byte[] buffer, Int32 offset, Int32 count)
   at System.IO.FileStream.Read(Byte[] array, Int32 offset, Int32 count)
   at System.IO.Stream.<>c.<BeginReadInternal>b__39_0(Object <p0>)
   at System.Threading.Tasks.Task`1.InnerInvoke()
   at System.Threading.Tasks.Task.Execute()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at System.IO.Stream.EndRead(IAsyncResult asyncResult)
   at System.Threading.Tasks.TaskFactory`1.FromAsyncTrimPromise`1.Complete(TInstance thisRef, Func`3 endMethod, IAsyncResult asyncResult, Boolean requiresSynchronization)
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at Duplicati.Library.Utility.Utility.<ForceStreamReadAsync>d__31.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at Duplicati.Library.Main.Operation.Backup.StreamBlockSplitter.<>c__DisplayClass2_0.<<Run>b__0>d.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Duplicati.Library.Main.Operation.Backup.StreamBlock.<ProcessStream>d__5.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at Duplicati.Library.Main.Operation.Backup.FileBlockProcessor.<>c__DisplayClass1_0.<<Run>b__0>d.MoveNext()

Here’s a different style of locked-file problem, which might be this Duplicati trying to back up its own temp file:

2023-12-31 21:28:32 -05 - [Warning-Duplicati.Library.Main.Operation.Backup.FileBlockProcessor.FileEntry-PathProcessingFailed]: Failed to process path: C:\Windows\Temp\dup-151d8d46-524e-4c61-b1e4-ba892e42f4e9
System.IO.IOException: The process cannot access the file '\\?\C:\Windows\Temp\dup-151d8d46-524e-4c61-b1e4-ba892e42f4e9' because it is being used by another process.
   at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
   at System.IO.FileStream.Init(String path, FileMode mode, FileAccess access, Int32 rights, Boolean useRights, FileShare share, Int32 bufferSize, FileOptions options, SECURITY_ATTRIBUTES secAttrs, String msgPath, Boolean bFromProxy, Boolean useLongPath, Boolean checkHost)
   at System.IO.FileStream..ctor(String path, FileMode mode, FileAccess access, FileShare share)
   at Duplicati.Library.Common.IO.SystemIOWindows.FileOpenRead(String path)
   at Duplicati.Library.Main.Operation.Backup.FileBlockProcessor.<>c__DisplayClass1_0.<<Run>b__0>d.MoveNext()

This sort of problem is normal when VSS isn’t enabled to allow access to the locked files. A hang isn’t.
Still looking for a good recipe.