Found files that are missing from the remote storage, please run repair

I’m getting this error but I don’t know how to resolve it. I run repair and the next page says Finished, but no files are listed.
Here’s the text version of the options it runs:
--auto-vacuum=true
--asynchronous-concurrent-upload-limit=1
--backup-test-samples=0
--list-verify-uploads=false
--no-auto-compact=false
--threshold=0
--backup-name=G
--dbpath=C:\Users.…\AppData\Local\Duplicati\ARBWAMRBSP.sqlite
--encryption-module=aes
--compression-module=zip
--dblock-size=50MB
--passphrase=…
--keep-versions=1
--disable-module=console-password-input

When I execute a run, I get the same missing file error.

I tried purge and I see this:
Listing remote folder …
Missing file: duplicati-b094370756b6041dfbaeb381b751ddfde.dblock.zip.aes
Found 1 files that are missing from the remote storage, please run repair

ErrorID: MissingRemoteFiles
Found 1 files that are missing from the remote storage, please run repair
Return code: 100

It’s an endless cycle!

Is there a definitive set of steps to resolve this? How do I prevent it?

Source is Windows 11 folder. Target is Storj.

Welcome to the forum @KryptDo0x

You’re turning off an awful lot of the checks on the backup. Any reason? That hurts reliability.

Setting backup-test-samples=0 saves some download expense but hurts file integrity checking, although a small sample check only helps a little anyway.

What’s the intent of threshold=0? That tells it to compact automatically like crazy, possibly in an infinite loop.

keep-versions=1 kind of defeats the whole purpose of Duplicati, which is to store multiple versions compactly.

Features

Incremental backups
Duplicati performs a full backup initially. Afterwards, Duplicati updates the initial backup by adding the changed data only. That means, if only tiny parts of a huge file have changed, only those tiny parts are added to the backup. This saves time and space and the backup size usually grows slowly.
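
If keeping only one version was mainly about saving space, a thinning retention schedule gets most of that saving while still keeping history. A sketch using the --retention-policy option instead of keep-versions=1 (the schedule values here are just an illustration, not a recommendation):

  --retention-policy=7D:1D,4W:1W,12M:1M

Roughly: one backup per day for a week, one per week for a month, one per month for a year, with older ones removed.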

2.0.7.101_canary_2024-03-08

Updated Uplink for Storj to 2.12, thanks @gpatel-fr and @kenkendk

possibly helps, but Canary releases are always a bit of an unknown when initially released.
This one looks to be pretty good from testing so far; however, it has not yet made its way to Beta.
There’s always a question of how much more stuff to keep throwing in, but I don’t decide that.

I sort of recall problems with Storj in the current Beta, but I don’t have a pointer to specifics,
beyond the pull request and release note that I mentioned. Maybe a Storj user knows more.

EDIT 1:

https://docs.storj.io/dcs/third-party-tools/duplicati might be guiding you on some of those, but

list-verify-uploads=true: If a file upload fails for any reason, a final listing will catch it.

is what they suggested. That’s also mentioned in the cited pull request. Your config has it off.
There should still be a final check for missing files, but that’s too late to retry the uploading…
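
If you want that safety net back, it’s a one-line change in the job’s Edit as text options (a sketch; everything else stays as you have it):

  --list-verify-uploads=true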

EDIT 2:

  --threshold (Integer): The maximum wasted space in percent
    As files are changed, some data stored at the remote destination may not
    be required. This option controls how much wasted space the destination
    can contain before being reclaimed. This value is a percentage used on
    each volume and the total storage.
    * default value: 25

I checked to see if 0 is special, e.g. equivalent to never. It doesn’t look like it, per the help text.
If you set the threshold too aggressively, it can create a tremendous compact load.
I’ve seen it get into loops somehow, maybe because it can never achieve what you told it to…
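
If compact churn is the worry, a gentler setup is to leave the threshold at its default and, if you like, turn auto-compact off and run compact yourself when convenient. A sketch of the job options:

  --threshold=25
  --no-auto-compact=true

and of a manual run from the Commandline screen or the standalone CLI, where <storage-url> and <dbpath> are placeholders for your Storj URL and local database path, plus your usual passphrase option:

  Duplicati.CommandLine.exe compact <storage-url> --dbpath=<dbpath>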

EDIT 3:

[Storj DCS] Update to latest uplink.NET #4755 wasn’t the final update, but it has the larger discussion.
The update is now in Canary, but I’m not sure whether we have heard feedback from users on it helping.

I tried making changes. None of those fix the missing files issue, right? I went to the Database setting and clicked Repair. There was an error “The backup storage destination is missing data files. You can either enable --rebuild-missing-dblock-files or run the purge command to remove these files. The following files are missing: duplicati-b094370756b6041dfbaeb381b751ddfde.dblock.zip.aes.” I tried both repair and purge with the flag set to true and that didn’t resolve anything.

Another thing I notice is that commands don’t stick. I changed threshold=25 and clicked “Run Backup now”. The command runs but still says missing file. I confirmed the setting is there in Edit as text, but when I go back to the edit page, all the settings are what they were before, e.g. threshold=0.

EDIT:

I have to edit the job itself, not use the Commandline option, for the settings to stick.

If you have an occasional missing-files issue, Recovering by purging files is the thing to do.
Adding console-log-level=information will help get back the headers lost in the new log design.

The *-broken-files commands are for missing dblock files. If you lose dlist or dindex,
database information (assuming a good database) can upload those again with a Repair run.

purge is a different command, and that message might be aiming at purge-broken-files.
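
A sketch of that flow from the Commandline screen or the standalone CLI, listing first and then purging; <storage-url> and <dbpath> are placeholders for your Storj URL and local database path, and your usual passphrase option applies. The console-log-level option mentioned above is included so the output keeps its headers:

  Duplicati.CommandLine.exe list-broken-files <storage-url> --dbpath=<dbpath> --console-log-level=Information
  Duplicati.CommandLine.exe purge-broken-files <storage-url> --dbpath=<dbpath> --console-log-level=Information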

About → Changelog says:

Removed automatic attempts to rebuild dblock files as it is slow and rarely finds all the missing pieces (can be enabled with --rebuild-missing-dblock-files).

and I think it was removed from Repair, but that isn’t documented on the command. The help text is:

  --rebuild-missing-dblock-files (Boolean): Rebuild dblock files when missing
    If dblock files are missing from the destination, you can attempt to
    rebuild them using local source data. However, since the local data may
    have changed, it may not be possible to retrieve all the required data
    and the process may be slow. Use this option to attempt to rebuild
    missing dblock files.
    * default value: false
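
For what it’s worth, if you want to try that path (whether or not Repair still honors it, per the changelog note above), the run would look something like this sketch, with the same placeholders as before:

  Duplicati.CommandLine.exe repair <storage-url> --rebuild-missing-dblock-files=true --dbpath=<dbpath>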

Developer input would be nice. The message shown appears weak in several different ways.

Yes. Commandline is for one-time runs, like a true operating system command line would do.
For convenience over the OS version, it’s pre-populated from the job settings (but only one way; changes made there are not saved back to the job).