Newbie question

I’ve set up Duplicati 2 with my Amazon S3 account to do some one-time backups to Glacier Deep Archive.

I started with a 4-byte test file, but instead of seeing my text file in the S3 management console, I found it had been turned into a .dlist, a .dblock, and a .dindex ZIP file.

I was expecting to just see the text file in the console.

Where have I gone wrong?

This is normal behavior for Duplicati; it won’t copy the exact file as you see it on your computer.

It’s all part of the deduplication, compression, and encryption processes. Files will NOT be in native format on the back end (S3). But what you get from this is efficient storage (dedupe, compression), versioning, and retention.

For example, on one of my PCs I protect 35 GB of data. I now have 152 backup versions, yet it only takes 69 GB on the back end.

If you want direct replication of your data in native format, you can look at tools like rclone.
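For a sense of the difference: a plain native-format upload (the rclone-style approach) keeps your file name and contents as-is, and the storage class is just a per-object parameter on the upload. Here is a minimal sketch using the AWS SDK for Python (boto3); the bucket and file names are placeholders, not anything Duplicati does:

```python
import boto3

s3 = boto3.client("s3")

# Upload the file as-is; it shows up in the S3 console under its own name.
# The storage class is just an extra argument on the upload.
s3.upload_file(
    "test.txt",              # placeholder local file
    "my-backup-bucket",      # placeholder bucket name
    "test.txt",              # object key as seen in the console
    ExtraArgs={"StorageClass": "DEEP_ARCHIVE"},
)
```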

Sorry for reviving an old thread, but I am missing the option to choose the “Deep Archive” storage class. With the latest Duplicati Docker container (2.0.4.23_beta_2019-07-14), I cannot select it.

So, how can I do that?

Thank you for your support,
best regards,

Connor

It shows up for me. I checked the source code, and it looks like Duplicati pulls the list of AWS storage classes automatically; maybe that failed for some reason.

That being said, even regular Glacier support seems to be iffy. Deep Archive could be even more problematic.

Here’s a recent thread: Glacier backup support

I also tried on my Mac, with no luck. Interestingly, not all regions show up either (in my case, Stockholm is missing).
Thank you for providing the link - I’ll read through it.

Note that v2.0.4.23-2.0.4.23_beta_2019-07-14 is basically v2.0.4.5-2.0.4.5_beta_2018-11-28, so the AWS feature New Amazon S3 Storage Class – Glacier Deep Archive, announced Mar 27, 2019, is not available. Duplicati queries the Amazon .dll it uses, and unsurprisingly you need a new enough .dll to see new features.
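The same principle applies to other AWS client libraries: the set of storage classes comes from the locally installed library, not from a live call to S3. A rough illustration in Python with botocore (not what Duplicati itself does; Duplicati reads its bundled AWS .dll), assuming a reasonably recent botocore is installed:

```python
import botocore.session

# The S3 service model that ships with the installed botocore release
# defines which storage classes the client even knows about.
model = botocore.session.get_session().get_service_model("s3")
print(model.shape_for("StorageClass").enum)
# A library released before Mar 2019 won't list 'DEEP_ARCHIVE' here,
# no matter what the S3 service itself supports.
```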

Ignoring the early unstable canary releases, a big .dll update is in v2.0.4.28-2.0.4.28_canary_2019-09-05; however, it has only been out three days, so it is a bit of a risk. I’d also consider Duplicati + cold storage risky… Duplicati is intended for backup, not archiving, and it likes to interact with storage to ensure things are OK.

If you go this route, please be sure you have a plan for restoring or for verifying the integrity of the backup.

Thank you for the suggestion. My idea was to have a disaster-recovery copy in the cloud: files that only need to be accessed when the house is burning down or flooding… On the other hand, I would need access to some metadata to ensure that “new” files are added to the archive without needing to upload the whole 10 TB again…
If you feel that Duplicati isn’t the right piece of software for that, what would you recommend?

You can dig through Glacier and other cold storage posts here and on the Internet. Opinions from here:

I don’t know if any backup programs give you any help in figuring out what files you need to get back from cold storage or even making the request for you, or if it’s always a manual process of analysis & retyping.

If it’s up to you, the fewer files the better; for disaster recovery you may prefer to do something like an image backup with Macrium Reflect Free to a USB hard drive, then upload and track the large images. Between image backups you can use Duplicati or something similar with inexpensive hot storage such as Backblaze B2 or Wasabi, whose cost might not be unbearable for smaller amounts of data.

Some people even try a hybrid solution, with older files backed up in a more economical way and newer files backed up on hot storage. Duplicati does not yet have built-in support for that, but there are scripting solutions in the forum if you’re interested, for example:

Backup files only if modification date > certain date in the past

One drawback of big images in cold storage is that downloading can take a long time. Keeping some local backup can help with that, provided it’s not somewhere the same disaster is likely to take out the local backup too…

EDIT: Is this lots of data or multiple computers? Do you prefer simple backups or more configurable ones?

It is simply a bunch of data (roughly 10 TB) that I want to recover in case of emergency. These are my home videos, which are currently sitting on my NAS. While my photos are duplicated to OneDrive and Adobe Cloud, I haven’t found a reasonable solution for the videos. If I place another NAS in my basement, I gain nothing if something happens to my house, and I am unable to place a secondary NAS somewhere else. Economically, roughly 10 USD/month (assuming I never touch the files) sounded better to me than an investment of roughly EUR 1,300 for a NAS and two HDDs…

Rclone v1.47 release

Add support for “Glacier Deep Archive” storage class (Manu)

Have you looked at rclone to just clone whatever video file tree your NAS has (and never rearrange it…)?

The typical backup program (or at least Duplicati) tries to do better than just copying files, with things like block-level deduplication, uploading only changes, compression, encryption, etc. Most of those are a poor fit for videos, which are typically already compressed and hugely different from each other, unless a direct copy exists or a file tree is rearranged, in which case the new file would reuse all the blocks from the old one.

Assuming videos are usually written once and not changed, Duplicati and cold storage fit a bit better than a backup of a highly changing file environment, where cold storage would get in the way of recycling space. The disaster-recovery use case might also make Duplicati’s not-human-friendly filenames less of an issue, provided you can ask for “all” of them. Still, a simple clone might be all you need, and the simpler the software you run, the less that might go wrong.
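To make “simple clone” concrete, here is a rough sketch in Python (boto3) of a write-once sync that only uploads files not already in the bucket, so the existing 10 TB is never re-sent. The bucket name and NAS path are placeholders, and this is just an illustration of the idea, not a recommendation over rclone:

```python
import os
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")
BUCKET = "my-video-archive"     # placeholder bucket name
LOCAL_ROOT = "/volume1/videos"  # placeholder NAS path

def already_uploaded(key):
    """Return True if the key already exists in the bucket."""
    try:
        s3.head_object(Bucket=BUCKET, Key=key)
        return True
    except ClientError as err:
        if err.response["Error"]["Code"] == "404":
            return False
        raise

# Walk the local tree and upload only files not yet in the bucket,
# straight into Deep Archive, keeping the native file layout.
for dirpath, _dirs, files in os.walk(LOCAL_ROOT):
    for name in files:
        path = os.path.join(dirpath, name)
        key = os.path.relpath(path, LOCAL_ROOT).replace(os.sep, "/")
        if not already_uploaded(key):
            s3.upload_file(path, BUCKET, key,
                           ExtraArgs={"StorageClass": "DEEP_ARCHIVE"})
```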

For retrieval, if the time ever comes for that, you’ll probably still need to use the Amazon S3 console, since rclone seemingly doesn’t know how to ask. I don’t keep close enough track of software to suggest something that might.

How Do I Restore an S3 Object That Has Been Archived?

General S3 FAQs would also be worth a read, and S3 Glacier Commands points to some rclone forum discussions.
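If you’d rather script the restore than click through the console, the same request can be issued through the AWS SDK. A hedged sketch with boto3 (bucket and key names are placeholders):

```python
import boto3

s3 = boto3.client("s3")

# Ask S3 to stage an archived (Deep Archive) object for download.
# 'Days' is how long the restored copy stays available;
# 'Bulk' is the cheapest (and slowest) retrieval tier.
s3.restore_object(
    Bucket="my-video-archive",        # placeholder bucket name
    Key="videos/holiday-2018.mp4",    # placeholder object key
    RestoreRequest={
        "Days": 7,
        "GlacierJobParameters": {"Tier": "Bulk"},
    },
)

# Check progress: the Restore header flips to ongoing-request="false"
# once the object is ready to download.
head = s3.head_object(Bucket="my-video-archive", Key="videos/holiday-2018.mp4")
print(head.get("Restore"))
```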

The list of providers and the storage classes is baked into the code. I have not seen any service that allows us to dynamically pull the list of supported regions and storage classes.

The source is here: https://github.com/duplicati/duplicati/blob/master/Duplicati/Library/Backend/S3/S3Backend.cs#L58

I only skimmed it but I thought this section was dynamically pulling a list of storage classes:

Yes and no. It pulls the list of storage classes (not the storage regions/hosts) from the S3 library, not from the S3 server. It is updated when the libraries are updated (and maybe that is recent enough).

Ok, thanks for the clarification!