Wasabi and Duplicati

I am new to both of these applications/companies. I am trying to configure the Backup destination. I am using Wasabi Hot Storage (s3.wasabisys.com) as the server, and the bucket create region is US East (Northern Virginia) (us-east-1). What is the AWS Access Key and where do I find it? I have tried my Wasabi User ID, Wasabi Account Number, Access Key, and Secret Key, and none of them worked, though that could be because of an incorrect AWS Access ID.

The AWS Access ID is your Wasabi Access Key ID, and the AWS Access Key is your Wasabi Secret Key.
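
If it helps, here is roughly how the Duplicati destination fields map for Wasabi (field labels can vary a little by Duplicati version, and the bucket and folder names below are just placeholders):

Server: s3.wasabisys.com
Bucket name: my-backup-bucket
Bucket create region: us-east-1
Folder path: duplicati-backups
AWS Access ID: <your Wasabi Access Key ID>
AWS Access Key: <your Wasabi Secret Key>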


You need to log into the Wasabi console and generate an access/secret key pair. I believe the standard account allows you to have up to two of them at a time.

I have an access/secret key pair. Do I need to run both applications on the same computer?

To Whom It May Concern,

Found my problem: I could not distinguish between a lowercase L and an uppercase I, since they both looked like straight lines. OK, I got the connection to work, and now I get the following error:

The database was attempted repaired, but the repair did not complete. This database may be incomplete and the backup process cannot continue. You may delete the local database and attempt to repair it again.

I tried to repair the database and got the following error:

No files were found at the remote location, perhaps the target url is incorrect?

If the target URL were wrong, I would not have been able to successfully run a test on my configuration, correct?

There are no files at the remote location because I have not uploaded any files yet. The Wasabi console said that if I was going to upload more than 5,000 files, I should use something like Duplicati, and that’s what I am trying to do now. Do I have to upload a small number of files using Wasabi first and then use Duplicati?

Not necessarily. The “Test connection” button typically does just a very basic test: a file list.
There’s no attempt to validate the list against other records, because there might be none (new backup).

If you get to where you have an existing backup, you could edit the destination folder to an empty one to check what happens under Test connection (will probably be happy) and Verify files (complains).
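
If you want to see for yourself what a plain file list returns, a generic S3 tool pointed at the Wasabi endpoint can do the same kind of listing. As a sketch (the bucket and folder names are placeholders, and this assumes the AWS CLI is installed and configured with your Wasabi access/secret keys):

aws s3 ls s3://my-backup-bucket/duplicati-backups/ --endpoint-url https://s3.wasabisys.com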

Not really, although it’s good because it allows some interruption recovery and might speed up the database.

OK, I think I finally have it working. I used CyberDuck to upload files to my bucket in Wasabi, and I also used Duplicati to upload files to the same bucket. I tried a backup last night using Duplicati, checked this morning, and no new files showed up in my folders. I did notice that there are a large number of files at the root of my bucket that look like the following:

duplicati-20210304T102312Z.dlist.zip.aes
duplicati-b0043ed008d714bb2bea7eed65c47a826.dblock.zip.aes

Can you please explain to me what these files are?

I also noticed that my Wasabi utilization overview showed 57.23 GB on 3/2/21 and 117.90 GB on 3/3/21. Is it backing up the whole drive again instead of just what was changed? Like I said earlier, no new or changed files showed up in the backup.

I have read most of the documentation I could find on Duplicati and Wasabi. Am I missing something? Is Duplicati an application that can back up a machine to the cloud and restore a file or files in the event of corruption or deletion?

The term “upload” concerns me if this was more than just an upload test; Duplicati is a backup program.
While it can share a bucket with other uses, each backup should have its own folder path within the bucket.
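
For example (hypothetical bucket and folder names), two backup jobs sharing one bucket could each point at their own Folder path, something like:

my-backup-bucket/duplicati-documents/
my-backup-bucket/duplicati-photos/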

That’s a Duplicati backup; see “How the backup process works” in the documentation. You should also have one dindex file per dblock file.
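
Roughly speaking, one backup run leaves three kinds of files at the destination (the names below are patterns, not your actual file names):

duplicati-<timestamp>.dlist.zip.aes - records which source files/versions were in that backup run
duplicati-b<random>.dblock.zip.aes - the encrypted, compressed blocks of your actual file data
duplicati-i<random>.dindex.zip.aes - a small index describing what is inside the matching dblock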

You can move them to some other folder if you like and adjust the Folder path on the Destination screen.
I don’t know what you currently have set that caused your files to wind up in the root, but try something different…

In terms of when they arrived, see if Wasabi or CyberDuck can show you a date, but beware of time zones. The timestamp in the dlist file name is when that backup started, expressed in UTC (GMT). I don’t know your zone.
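
As an example, the file name you posted starts with 20210304T102312Z, i.e. 2021-03-04 10:23:12 UTC. On Linux, GNU date can convert that to your local time (the option differs on macOS/BSD date):

date -d "2021-03-04T10:23:12Z"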

There is one dlist file per backup, and it’s uploaded near the end of the uploads to say what’s in the backup. Sorting by date or searching by name (whatever your tools allow) will let you find where your backups are.
Possibly you managed to leave Duplicati backups in several folders while you were trying to get this going.
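
One way to check, again assuming the AWS CLI is set up with your Wasabi keys (the bucket name is a placeholder), is to list the whole bucket recursively and filter for dlist files:

aws s3 ls s3://my-backup-bucket/ --recursive --endpoint-url https://s3.wasabisys.com | grep dlist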

Additional runs of a given backup will only upload changed data and the dlist file to record the latest view.

Yes, it’s a file-oriented backup and restore system, meaning it is not a “bare metal” system-image restore tool.