[RESOLVED] Keep getting 'remote files' error with AWS

I’m trying out Rockstor and want to use the Duplicati Rock-On to keep backups in AWS.

I first set this up a couple of months ago and ran a test backup, which failed, but I didn’t have time then to investigate further. I’ve just tried again and I’m getting the error message “Found 179 files that are missing from the remote storage, please run repair”.
Running ‘Repair’ doesn’t seem to do anything.
Since the last backup had failed, I amended the job settings to delete any previous backups more than a week old, so that it would be a ‘full’ backup rather than a corrupted incremental. Same result.

If I run the ‘Test Connection’ on the backup job settings it succeeds, so I don’t think I have a problem connecting to AWS.
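
For what it’s worth, the Test Connection button presumably only checks that the bucket is reachable with the supplied credentials. A quick boto3 check along these lines (the bucket name and prefix are placeholders, not my actual settings) should show whether the files Duplicati uploaded are really there:

```python
# Rough sanity check outside Duplicati: list what is actually in the bucket.
# "my-backup-bucket" and "duplicati/" are placeholders for the real job settings.
import boto3

s3 = boto3.client("s3")  # uses credentials from the usual AWS config/environment

resp = s3.list_objects_v2(Bucket="my-backup-bucket", Prefix="duplicati/")
for obj in resp.get("Contents", []):
    # Each entry reports its storage class alongside the key and size.
    print(obj["Key"], obj["Size"], obj.get("StorageClass", "STANDARD"))
```

(list_objects_v2 only returns up to 1,000 keys per call, but that’s plenty to see whether anything is there at all.)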

I’m using ‘built-in AES-256 encryption’, the ‘Amazon S3’ server, the ‘Glacier’ storage class and the ‘Amazon AWS SDK’ client library.
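
As far as I understand, objects written with the ‘Glacier’ storage class can’t be downloaded straight away: a restore has to be requested first and can take hours to complete, which presumably matters for anything Duplicati needs to read back during verification or repair. A rough boto3 illustration (bucket and key names are placeholders):

```python
# Illustration only: reading a Glacier-class object fails until it has been restored.
# Bucket and key names below are placeholders, not real values.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")
bucket = "my-backup-bucket"
key = "duplicati/duplicati-20240101T000000Z.dlist.zip.aes"  # placeholder name

try:
    s3.get_object(Bucket=bucket, Key=key)
    print("object is readable")
except ClientError as err:
    code = err.response["Error"]["Code"]
    print("get_object failed:", code)
    if code == "InvalidObjectState":
        # The object is in Glacier: retrieval has to be requested explicitly
        # and completes asynchronously (typically hours for the Standard tier).
        s3.restore_object(
            Bucket=bucket,
            Key=key,
            RestoreRequest={"Days": 1, "GlacierJobParameters": {"Tier": "Standard"}},
        )
```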

The only thing that looks odd to me is that, when picking the files to be backed up under the ‘Computer’ icon, there is a ‘config’ folder (which I don’t remember setting up) that doesn’t appear on the server’s CLI (on the CLI it’s /mnt2). Conversely, /mnt2 doesn’t appear as a folder when selecting the source files.

Have I somehow picked inaccessible files or missed something obvious?

Thanks.

I found another post that said there were limitations with the implementation for Glacier on AWS and suggested Backblaze or Wasabi instead. After a quick look, Backblaze seemed a good match for my requirements, and the same test backup worked a treat!

And I discovered/remembered where the ‘config’ folder came from so all good.
