"...files...missing from the remote storage, please run repair"


#1

I don’t understand how to fix this. I searched the forum but found nothing immediately helpful, and a lot that was frankly beyond my understanding. I’m now into my fourth hour trying to deal with this - obviously in need of help.

It’s likely relevant that this sort of error is occurring on all 3 of my backups. In the beginning (the first two runs on the first backup), clicking the “repair” link provided in the error message appeared to help. But now when I do this it quickly exits and seems to do nothing. A rerun simply regenerates the same error, in all three cases.

Environment:
I’m backing up from my laptop to a USB hard drive that has plenty of room. Using 2.0.4.5_beta_2018-11-28.
Running on x86_64 hardware, GNU/Linux, kernel 4.15.0-39-generic, Linux Mint 19 (tara) (Ubuntu bionic)

Is there some relatively simple way to make progress with this error? Is it a known problem with the current beta version?


#2

How large are your backup sets? If any of them are fairly small (less than 10 gigabytes, perhaps), you might try a “delete and repair” to test. I’ve done this recently for one of my backups (totalling around 6 gigs) and it only took a few minutes to complete the rebuild. Others here can advise if there are more precise measures you can take to try to recover your existing database, though.


#3

Thanks for the quick response. I’m certainly willing to try what you suggest, but I don’t find a “delete and repair” functionality anywhere in the interface. Can you advise me how to do this?


#4

first:

[screenshot]

then:

[screenshot]

#5

Ahh. Thanks! I had no idea that was there. This is a new interface for me, and I’m returning to duplicati after many months away.


#6

OK, that worked. After applying your fix, my backups ran to completion, albeit with 2 errors in one case. But in the main, all’s well.

I’m grateful for your help. Thank you!


#7

I’ll be marking @drakar2007’s reply as the solution. If there are any more problems, please let me know.


#8

I’d like to call @kenkendk’s attention to this, if only to point out that the normal “repair” function probably needs to be tweaked to handle more cases, and/or to give more meaningful feedback when it can’t complete successfully.


#9

I too had this error recently, although it now seems to have rectified itself. I’m only reporting it in case a pattern emerges, as it hasn’t happened before.


#10

It’s happened again, just like yesterday. The scheduled backup produced an error message about ~150 files “…missing from remote storage”, etc., for each of my 3 backups.

I ran the delete-and-repair function, and then the backups ran OK when asked to. Having to do this every day will defeat the idea of an automatic backup that just runs in the background. I hope this won’t keep happening, but I fear it will.


#11

And again, the same problem as before, today. Getting these backups to run is pretty time-costly for me right now. Am I doing something wrong? Do I need to provide more information?

Backup 1 - 3.33 GB - no errors today

Backup 2 - 6.91 GB - “Found 18 remote files that are not recorded in local storage, please run repair”

  1. Clicked the “repair” button in the error message - “Got 8 error(s)”.
  2. Ran database “Recreate (delete and repair)”.
  3. Ran backup to completion but got 13 warnings:
    [Warning-Duplicati.Library.Main.Operation.Backup.UploadSyntheticFilelist-MissingTemporaryFilelist]: Expected there to be a temporary fileset for synthetic filelist (4, duplicati-b01c7281a3baf48d4b8e472a52eff9908.dblock.zip.aes), but none was found?
    2018-12-05 15:17:19 -08 - [Warning-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess-FileAccessError]: Error reported while accessing file: /home/tomc/.cache/doc/
    2018-12-05 15:17:19 -08 - [Warning-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess-FileAccessError]: Error reported while accessing file: /home/tomc/.cache/doc/
    2018-12-05 15:17:19 -08 - [Warning-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess-FileAccessError]: Error reported while accessing file: /home/tomc/.cache/dconf/
    2018-12-05 15:17:19 -08 - [Warning-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess-FileAccessError]: Error reported while accessing file: /home/tomc/.cache/dconf/
    … ]

Backup 3 - 6.51 GB - “Found 181 remote files that are not recorded in local storage, please run repair”

  1. Clicked the “repair” button in the error message - a large number of lines were output into a modal window which scrolled off the top of the screen.
  2. I clicked “dismiss”, then ran database “Recreate (delete and repair)”.
  3. Ran backup to completion, with no errors or warnings.

#12

@JonMikelV or @ts678 do you have any idea of what might be happening?


#13

Starting with the (maybe) easier messages: what permissions do the /home/tomc/.cache/dconf/ and /home/tomc/.cache/doc/ directories have, according to stat or ls -ld? On my system, dconf is root-only, and I don’t have the other one (or know what it is, after some searching). Is Duplicati running as root? If not, it has access limits.
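As a self-contained sketch of that permission check (a mktemp directory stands in here for /home/tomc/.cache/dconf, which on many systems is mode 700; on the real machine you’d point stat and ls -ld at the paths from the warnings instead):

```shell
#!/bin/sh
# Illustration only: create a throwaway directory and give it the
# root-only style mode that ~/.cache/dconf often has.
d=$(mktemp -d)
chmod 700 "$d"

mode=$(stat -c '%a' "$d")   # octal permission bits, e.g. 700
echo "mode=$mode"
ls -ld "$d"                 # same information in ls form, plus owner

rmdir "$d"
```

A mode of 700 owned by another account means Duplicati (if not running as root, or as that account) cannot enumerate the directory, which would explain the FileAccessError warnings.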

The thread “What does this warning signify?” points to synthetic fileset explanations. Possibly these warnings were just a byproduct of the original problem causing the “interrupted backup” treatment to be used. You can see if that fits your history.

I think that to know what might be happening, it would be helpful to distinguish between files not being put to remote storage, versus them being put then disappearing, versus them being there but somehow not found.

This is a case where use of backslashes might have caused a problem seeing files in WebDAV; however, on a Linux laptop I’d expect forward slashes. I doubt this is the cause of the current problem, but I’ll mention it.

Another question would be how and when the USB drive is mounted then unmounted. Do missing files occur when a backup is re-run in quick succession (maybe with intentional file change) while drive stays attached?

Reports seem to show varying numbers of missing files, and perhaps this is related to the amount of change, meaning this two-in-a-row plan might have a small number. Even better would be to run the backup by using the Commandline option of the job, checking that Command is backup, then adding --console-log-level=Retry before pushing the Run button. For a small number of changes, this should show a small number of files that get Put to the backend, and should also name any files it thinks are missing. Any file that gets Put should be actually on the drive (which can be confirmed), and reported missing files on the drive can be confirmed too.

Setting up a --log-file and --log-file-log-level is an alternative to doing tests on Commandline, if it’s preferred.
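For example, those two options could go into the job’s advanced options like this (the log path here is just an illustration; --log-file and --log-file-log-level are the actual option names):

```
--log-file=/home/tomc/duplicati-backup.log
--log-file-log-level=Retry
```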

Example output from my system (where I’ve been inserting errors on purpose; note also that you won’t see Profiling-level messages unless you set the Profiling log level, which does create a large amount of output):

2018-12-05 10:18:39 -05 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingFile]: Missing file: duplicati-20181204T202210Z.dlist.zip
2018-12-05 10:18:39 -05 - [Error-Duplicati.Library.Main.Operation.FilelistProcessor-MissingRemoteFiles]: Found 1 files that are missing from the remote storage, please run repair

2018-12-05 18:41:53 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-20181205T233955Z.dlist.zip (723 bytes)
2018-12-05 18:41:53 -05 - [Profiling-Duplicati.Library.Main.Operation.Common.BackendHandler-UploadSpeed]: Uploaded 723 bytes in 00:00:00.0761505, 9.27 KB/s
2018-12-05 18:41:53 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-20181205T233955Z.dlist.zip (723 bytes)

If you don’t want to jump directly into looking at file names, you could start with file counts. For any job, there should be a “Show log” button leading to summary information. Some sampled lines from one of my backups:

    BackendStatistics:
        RemoteCalls: 7
        BytesUploaded: 3569
        BytesDownloaded: 2100
        FilesUploaded: 2
        FilesDownloaded: 3
        FilesDeleted: 0
        FoldersCreated: 0
        RetryAttempts: 0
        UnknownFileSize: 2702
        UnknownFileCount: 1
        KnownFileCount: 14
        KnownFileSize: 11208
        LastBackupDate: 12/4/2018 6:59:14 PM (1543967954)
        BackupListCount: 6

so you can match things like KnownFileCount and KnownFileSize against what your Linux file tools think you actually have for your dlist, dblock, and dindex files. My UnknownFileCount is probably 1 because I have set --upload-verification-file, which creates a duplicati-verification.json file after the backup listing what should be there. The match can then be checked with the DuplicatiVerify tools provided in the Duplicati utility-scripts folder. You’d presumably use the Python (not PowerShell) one, and a USB drive is easy because it’s already there.
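A self-contained sketch of that file-tools comparison (a temporary folder with fake files stands in for the real USB destination; on the actual drive you’d run the two find commands against the backup target folder instead):

```shell
#!/bin/sh
# Hypothetical stand-in for the USB destination: three tiny files with
# Duplicati-style names (dlist, dblock, dindex) totalling 12 bytes.
dest=$(mktemp -d)
printf 'aaaa'   > "$dest/duplicati-20181205T000000Z.dlist.zip.aes"
printf 'bb'     > "$dest/duplicati-b1234.dblock.zip.aes"
printf 'cccccc' > "$dest/duplicati-i1234.dindex.zip.aes"

# Count the backend files -- compare this to KnownFileCount.
count=$(find "$dest" -name 'duplicati-*' -type f | wc -l)
# Sum their sizes in bytes -- compare this to KnownFileSize.
size=$(find "$dest" -name 'duplicati-*' -type f -printf '%s\n' \
       | awk '{s += $1} END {print s}')

echo "files=$count bytes=$size"
rm -rf "$dest"
```

If the counts or sizes on the drive disagree with the job’s Show log numbers, that helps distinguish files never uploaded from files that were uploaded and later disappeared.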


#14

Thanks for that long, detailed response. Much appreciated! It’s also a lot for me to digest. I won’t have a moment to do that until tomorrow evening at the earliest, but it’s definitely on my schedule. Thanks again!