One missing file (Dropbox destination)

I’ve been running Duplicati 2.0.3.3_beta_2018-04-02 backing up a NAS folder to Dropbox for many months now without a problem. Suddenly this morning I saw a fatal error:

Failed: Found 1 files that are missing from the remote storage, please run repair

Examining the log, I see this:

  • Sep 18, 2018 1:53 AM: Message

Found 1 files that are missing from the remote storage, please run repair

  • Sep 18, 2018 1:53 AM: Message

Missing file: duplicati-ib615555c64ba49669434976c0905494d.dindex.zip.aes

I don’t touch the destination folder on Dropbox. I have selective sync set up on all my devices, which means I can’t even see the destination folder; I have to sign into the web interface to access it, so there’s no way I could have deleted something in there by mistake. Duplicati is showing that duplicati-ib615555c64ba49669434976c0905494d.dindex.zip.aes was Deleted at 12:13pm on 9/17.

The Duplicati job says the last successful run was 9/16 and took 48 seconds. So clearly something went wrong on 9/17. The server log shows this at 12:13pm on 9/17:

Newtonsoft.Json.JsonReaderException: Error reading JObject from JsonReader. Path '', line 0, position 0.
  at Duplicati.Library.Main.BackendManager.Delete (System.String remotename, System.Int64 size, System.Boolean synchronous) [0x00081] in <ae134c5a9abb455eb7f06c134d211773>:0
  at Duplicati.Library.Main.Operation.CompactHandler+<PerformDelete>d__7.MoveNext () [0x0006f] in <ae134c5a9abb455eb7f06c134d211773>:0
  at System.Collections.Generic.List`1[T].InsertRange (System.Int32 index, System.Collections.Generic.IEnumerable`1[T] collection) [0x000ea] in <b0e1ad7573a24fd5a9f2af9595e677e7>:0
  at System.Collections.Generic.List`1[T].AddRange (System.Collections.Generic.IEnumerable`1[T] collection) [0x00000] in <b0e1ad7573a24fd5a9f2af9595e677e7>:0
  at Duplicati.Library.Main.Operation.CompactHandler.DoCompact (Duplicati.Library.Main.Database.LocalDeleteDatabase db, System.Boolean hasVerifiedBackend, System.Data.IDbTransaction& transaction, Duplicati.Library.Main.BackendManager sharedBackend) [0x004b6] in <ae134c5a9abb455eb7f06c134d211773>:0
  at Duplicati.Library.Main.Operation.DeleteHandler.DoRun (Duplicati.Library.Main.Database.LocalDeleteDatabase db, System.Data.IDbTransaction& transaction, System.Boolean hasVerifiedBacked, System.Boolean forceCompact, Duplicati.Library.Main.BackendManager sharedManager) [0x00350] in <ae134c5a9abb455eb7f06c134d211773>:0
  at Duplicati.Library.Main.Operation.BackupHandler.CompactIfRequired (Duplicati.Library.Main.BackendManager backend, System.Int64 lastVolumeSize) [0x000a5] in <ae134c5a9abb455eb7f06c134d211773>:0
  at Duplicati.Library.Main.Operation.BackupHandler.Run (System.String[] sources, Duplicati.Library.Utility.IFilter filter) [0x008e0] in <ae134c5a9abb455eb7f06c134d211773>:0
  at Duplicati.Library.Main.Controller+<>c__DisplayClass17_0.<Backup>b__0 (Duplicati.Library.Main.BackupResults result) [0x00036] in <ae134c5a9abb455eb7f06c134d211773>:0
  at Duplicati.Library.Main.Controller.RunAction[T] (T result, System.String[]& paths, Duplicati.Library.Utility.IFilter& filter, System.Action`1[T] method) [0x0014b] in <ae134c5a9abb455eb7f06c134d211773>:0
  at Duplicati.Library.Main.Controller.Backup (System.String[] inputsources, Duplicati.Library.Utility.IFilter filter) [0x00068] in <ae134c5a9abb455eb7f06c134d211773>:0
  at Duplicati.Server.Runner.Run (Duplicati.Server.Runner+IRunnerData data, System.Boolean fromQueue) [0x00494] in <670b0e4a9ae144208688fcb192d20146>:0

Does this JSON error mean there was a communication problem between my Duplicati machine and Dropbox? I’d like to understand what happened before I run the repair.
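For what it’s worth, the “line 0, position 0” part suggests the parser was handed an empty response body where JSON was expected. A minimal Python sketch of the same failure mode (illustrative only; this is not Duplicati’s actual Dropbox client code):

    import json

    # Simulate what the Dropbox API client saw: an empty HTTP response
    # body where a JSON object (JObject) was expected.
    empty_response_body = ""

    try:
        json.loads(empty_response_body)
    except json.JSONDecodeError as e:
        # Python reports "Expecting value: line 1 column 1 (char 0)" --
        # the same "nothing to parse at position 0" condition that
        # Newtonsoft.Json reports as "Path '', line 0, position 0".
        print(f"Parse failed: {e}")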

Edit: I came across another post on this forum which leads me to believe I might not be alone in this, though the OP in that thread determined some gateway changes were the cause. In my case I haven’t made any network changes, and my “test connection” works fine.

Edit 2: Since this seems to be a communication issue between Dropbox and my server rather than a problem with my server itself, I ran repair:

  • Sep 18, 2018 8:51 AM: Running Repair took 00:00:18.379

  • Sep 18, 2018 8:51 AM: RemoteOperationTerminate took 00:00:00.000

  • Sep 18, 2018 8:51 AM: Starting - RemoteOperationTerminate

  • Sep 18, 2018 8:51 AM: RemoteOperationPut took 00:00:01.934

  • Sep 18, 2018 8:51 AM: Backend event: Put - Completed: duplicati-ib615555c64ba49669434976c0905494d.dindex.zip.aes (541 bytes)

  • Sep 18, 2018 8:51 AM: Uploaded 541 bytes in 00:00:01.9338730, 279 bytes/s

  • Sep 18, 2018 8:51 AM: Backend event: Put - Started: duplicati-ib615555c64ba49669434976c0905494d.dindex.zip.aes (541 bytes)

  • Sep 18, 2018 8:51 AM: Starting - RemoteOperationPut

Now I’m running the backup and it appears to be working normally. Very strange; I wish I knew what happened.

So you confirmed that the file really was missing from Dropbox; that rules out a simple file-listing issue.

My GUESS is that something went wrong during the compact step of the previous run, so a file that was (probably correctly) deleted from the destination never got recorded as deleted in the local database.
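In other words, if the remote delete succeeds but the delete response can’t be parsed, the exception fires before the database is updated. A hypothetical sketch of that ordering (illustrative names, not Duplicati’s actual internals):

    class RemoteDeleteError(Exception):
        """Stands in for the JsonReaderException in the stack trace above."""

    def delete_remote(name):
        # Suppose Dropbox actually deletes the file but returns an empty
        # body, so parsing the response fails AFTER the file is gone.
        print(f"remote: deleted {name}")
        raise RemoteDeleteError("Error reading JObject from JsonReader")

    # Local database state: which files the DB believes exist remotely.
    db_expects = {"duplicati-ib615555c64ba49669434976c0905494d.dindex.zip.aes"}

    try:
        name = next(iter(db_expects))
        delete_remote(name)
        db_expects.discard(name)  # never reached: the DB stays stale
    except RemoteDeleteError as e:
        print(f"compact aborted: {e}")

    # The next backup compares the DB against the remote listing and reports:
    # "Found 1 files that are missing from the remote storage, please run repair"
    print(f"DB still expects: {db_expects}")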

Unfortunately, confirming stuff like that is hard without detailed logs from both runs.

I wonder if we could add a setting to allow detailed(ish) datetime-stamped log files to be created and automatically cleaned up when the next run is error-free…

I don’t see that in the Delete stack trace. Was it somewhere nearby? Does this mean you log at the Information level? That adds quite a bit. You could also search for other mentions of that file so we can see more of its history.

If you don’t have that level of logging, the job database also has per-file history, if you’re willing to look at that file. The RemoteOperation table is what you want, plus a decent UNIX timestamp converter (I found quite a few online).
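If you’d rather script it than eyeball it, something like this Python sketch would pull the history for one remote file and convert the timestamps. It assumes the RemoteOperation table has Timestamp (UNIX seconds), Operation, and Path columns, and that you point it at a copy of the job database:

    import sqlite3
    from datetime import datetime, timezone

    # Hypothetical path: use a COPY of your job database. Note that on
    # Windows the database may be unreadable unless Duplicati runs with
    # --unencrypted-database (see below).
    DB_PATH = r"C:\path\to\copy-of-job-database.sqlite"
    TARGET = "duplicati-ib615555c64ba49669434976c0905494d.dindex.zip.aes"

    con = sqlite3.connect(DB_PATH)
    rows = con.execute(
        "SELECT Timestamp, Operation, Path FROM RemoteOperation "
        "WHERE Path = ? ORDER BY Timestamp",
        (TARGET,),
    )
    for ts, op, path in rows:
        when = datetime.fromtimestamp(ts, tz=timezone.utc)
        print(f"{when:%Y-%m-%d %H:%M:%S} UTC  {op:8}  {path}")
    con.close()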

For an SQLite browser, you can try http://sqlitebrowser.org/. If you’re on Windows, start Duplicati with --unencrypted-database or you won’t be able to get in. I think the Linux build, for some reason, doesn’t apply this rather weak encryption.

I’m positive I saw it in the logs, but I can’t find it now. I paged through a hundred pages of logs going back to the 17th with no luck. I have trouble following where Duplicati logs information (General vs. Remote, Live vs. Stored).

I’ll try the sqlite browser and see if that helps.

The job is trying to compact now but it’s taking days for some reason; I’m not sure what changed.

Wow, thanks for paging that much. On Windows, the “find” command might make that easier, or “grep” on Linux.

I’m currently using glogg to rummage around in the big log files that Duplicati can create (especially at the Profiling level).
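If neither find/grep nor a GUI viewer is handy, a few lines of Python will do the same search (the log path here is hypothetical; the file name is the one from this thread):

    import sys

    # Grep-style search of a big Duplicati log file for one remote file name.
    log_path = sys.argv[1] if len(sys.argv) > 1 else "duplicati.log"
    needle = "duplicati-ib615555c64ba49669434976c0905494d.dindex.zip.aes"

    with open(log_path, "r", errors="replace") as f:
        for lineno, line in enumerate(f, 1):
            if needle in line:
                print(f"{lineno}: {line.rstrip()}")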

I tried the SQLite browser, but it crashed when I ran it from my Windows laptop and tried to access the job database on the machine running Duplicati. Thank you for the suggestions.

I didn’t even know that HAD a remote option (although one that crashes isn’t much use). Can you run it locally?

Note that even if you find some sort of SQLite browser, there’s no guarantee the history we’re after is there…

Running out of ideas here, and you’re certainly free to stop when you want (however you did sound curious).

Definitely curious - just short on time right now 🙂

I don’t recall if I’ve requested this as a feature yet, but this is the time when an “export DB logs to files” feature would shine.

That, and a warning that doing a database recreate will reset your job-specific logs… That one bit me a few times during my testing. 🙂

Or maybe even a button that would expand everything on the log page. I just found that you can save the page as a file, which makes everything visible for searching without having to open the entries by hand one by one. Note we’re probably talking about different logs: this comment is about the logs in the job DB, not the long server log…

A “show all” button would work, but I worry some logs could get too big for a browser to handle well.

I imagined an export button for each of the 4 (!) log sources, as well as a single “export all” that would produce 4 files.

At this point, either option would be an improvement. 🙂