Backup failed: Found 3 remote files that are not recorded in local storage

I use Duplicati on a Linux server via the CLI to send compressed database dumps to an S3 bucket.
The script dumps the database into a temp folder and then moves it to the Duplicati backups folder, alongside the dumps of the other databases on the server.
Finally, the whole folder is compressed and sent to the bucket.
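A minimal sketch of that flow, assuming a POSIX shell; the paths, database name, and the AWS CLI upload step are placeholders, not the actual script:

```shell
#!/bin/sh
# Sketch of the described flow:
#   1. dump the database into a temp folder
#   2. move the dump into the shared backups folder
#   3. compress the whole folder
#   4. send the archive to the S3 bucket
set -eu

TMP_DIR="${TMP_DIR:-/tmp/db-dumps}"          # hypothetical temp folder
BACKUP_DIR="${BACKUP_DIR:-$HOME/backups}"    # hypothetical backups (Source) folder
ARCHIVE="${ARCHIVE:-/tmp/backups.tar.gz}"

mkdir -p "$TMP_DIR" "$BACKUP_DIR"

# 1. dump (the real script would use mysqldump/pg_dump; echo stands in here)
echo "-- dump of appdb --" > "$TMP_DIR/appdb.sql"

# 2. move the dump next to the other databases' dumps
mv "$TMP_DIR"/*.sql "$BACKUP_DIR"/

# 3. compress the whole folder
tar -czf "$ARCHIVE" -C "$BACKUP_DIR" .

# 4. upload (assumes the AWS CLI; bucket name is a placeholder)
# aws s3 cp "$ARCHIVE" "s3://my-bucket/$(hostname)/"
```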
It all worked until recently, when I got a “Failed: Found 3 remote files that are not recorded in local storage, please run repair”.
The 3 files are duplicati-20240228T000029Z.dlist.zip.aes, duplicati-[alphanumeric_string].dblock.zip.aes, and duplicati-[alphanumeric_string].dblock.zip.aes. I looked in the backup directory but the files are not there, and I even ran a find command for the cited names, but it returned no results.

Any help is appreciated, thanks in advance.

Welcome to the forum @adobr

Terminology is a little confusing. The GUI uses Source (the files to back up) and Destination (where the backup goes).

This sounded like the Duplicati Source folder.

This sounded like the Duplicati backup, except the backup itself wasn’t mentioned.

Do you mean Source? The backup is at the Destination, which is why the files are “remote”.
Can you look at the S3 Destination? Are any other backups going there?
If so, use a separate folder in the bucket for each one; don’t mix them all in the bucket root.
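For example, a per-job folder layout keeps each job’s duplicati-*.dlist/dblock/dindex files apart; the bucket and folder names below are made up:

```shell
# One folder per backup job, instead of everything in the bucket root:
#   s3://my-bucket/server1/db-dumps/   <- this server's job
#   s3://my-bucket/server2/db-dumps/   <- another server's job
# Listing one job's folder should then show only that job's files
# (assumes the AWS CLI is installed and configured):
aws s3 ls s3://my-bucket/server1/db-dumps/
```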

By

> duplicati backups folder

and

> backup directory

I mean the Source; the Destination of the backup in this case is the S3 bucket.

> Can you look at S3 Destination?

Yes, I can, and

> Any other backups going there?

Yes, there are: backups from different servers are each sent to their own folder inside the bucket, but dumps from different databases on the same server go to the same folder.
I’ll go ahead and create a folder for each dump to keep them from mixing.

Sorry for the late response and thanks for the answer.

You should use a different folder for each backup job, unless you set a prefix to tell the jobs’ files apart.
This has nothing to do with how many dumps a single job can back up, though.
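If two jobs must share a folder, Duplicati’s `--prefix` option (the default prefix is `duplicati`) renames each job’s files. A sketch, with placeholder bucket, paths, and command name (on some installs the CLI is `Duplicati.CommandLine.exe` rather than `duplicati-cli`):

```shell
# Two hypothetical jobs sharing one destination folder, told apart by prefix:
duplicati-cli backup "s3://my-bucket/dumps/" /srv/backups/db1 --prefix=db1
duplicati-cli backup "s3://my-bucket/dumps/" /srv/backups/db2 --prefix=db2
# Their files at the Destination then start with db1-*... and db2-*...
# instead of the default duplicati-*... name.
```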
