S3 backup (Minio) - remote file list misses first character

I’ve been a happy user of Duplicati as a docker container on Unraid, but recently I ran into a problem for which I’m looking for some assistance.

I’m backing up data from an Unraid server running Duplicati as a docker container to another local server running Minio. The backup itself seems to run fine: a new bucket is created and populated with many new files. However, Duplicati reports that the backup job failed because it cannot find the remote files, apparently because it doesn’t read the first character of the file names.

Found 21 files that are missing from the remote storage, and no files with the backup prefix duplicati, but found the following backup prefixes: uplicati

When I check the Minio server, all files are present and have the correct name, starting with duplicati (and not “uplicati”, without a “d”).
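To make the error message concrete: the backup prefix is the part of the file name before the first dash. A minimal sketch (my own illustration, not Duplicati’s actual code) shows how a listing that drops the first character of each name would turn the prefix duplicati into uplicati:

```python
# Illustration only: how a backup prefix relates to a remote file name.
# Duplicati's real parsing lives in its own source; this just mimics
# "everything before the first dash".

def backup_prefix(filename: str) -> str:
    return filename.split("-", 1)[0]

name = "duplicati-i50036ed7e81e49b2a16877e7d086f844.dindex.zip.aes"

print(backup_prefix(name))      # "duplicati" - the name as stored on Minio
print(backup_prefix(name[1:]))  # "uplicati"  - same name missing its first character
```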

The remote log shows the following for the put commands:

* Aug 30, 2021 3:17 PM: put duplicati-i50036ed7e81e49b2a16877e7d086f844.dindex.zip.aes

So the file names being uploaded don’t seem to be the problem.

Deleting the job and creating it again doesn’t solve the issue.
My Minio server is running on Ubuntu 20.04, but I got the same issue with Minio running on FreeNAS (built-in service).

Any thoughts? Thanks!


Hello and welcome! What a weird issue. If you don’t get any other responses, I can try setting up a test environment to see if I can reproduce the issue.

I have exactly the same issue, with a Minio backend on Ubuntu 18.04. It cuts off the first letter of the name for some reason.

Nov 22, 2021 3:33 AM: list
Nov 22, 2021 3:33 AM: put duplicati-20211122T023302Z.dlist.zip
Nov 22, 2021 3:33 AM: put duplicati-i68c29439260f497bb13e71e3f1dc0ebe.dindex.zip
Nov 22, 2021 3:33 AM: put duplicati-bb80d5c220e38416c9e8c3375372f344d.dblock.zip

I should also mention that this error only appears when a subfolder is selected in the Minio bucket. When the root folder of the bucket is chosen, the issue does not occur.

Are you possibly using backslashes in your path in the Duplicati config? For S3 and most other back ends you want to use forward slashes.

Sometimes no slashes are necessary; for a single-level subfolder you can just type the folder name.
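For illustration, a target URL with a subfolder would look something like this (the bucket, folder, and server name here are made-up placeholders, not values from this thread):

```
s3://mybucket/backups/unraid?s3-server-name=minio.example.com:9000
```

Note the single forward slash between bucket and folder: in S3, a backslash is just another character in the object key, not a path separator.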

If you want to see what an S3 URL should look like:

S3 Compatible


and you can compare that to the Target URL on the Commandline screen. I “think” the only double slash is supposed to be the one in s3://; anything in the middle is a single forward slash, and I’m not sure about a trailing slash.

You can also take that URL (Export As Command-line can also be used if you want it quoted for you) to:

Duplicati.CommandLine.BackendTool.exe, to see if you can do operations without losing the first character…
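A sketch of such a check (the target URL below is a placeholder; substitute your real Target URL from the Commandline screen, and note that on Linux the tool is typically invoked through mono):

```
# List remote files exactly as Duplicati's backend sees them; if the
# first character is being lost, it should be visible here too.
mono Duplicati.CommandLine.BackendTool.exe list "s3://mybucket/backups/unraid?s3-server-name=minio.example.com:9000&auth-username=ACCESSKEY&auth-password=SECRETKEY"
```

Quoting the URL matters, since the shell would otherwise interpret the & characters in the query string.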
