AWS S3 Backup - Upload files as is

I have now installed Duplicati on my NAS, as it was the only tool I found that can upload to AWS S3 and supports continuous backups. The problem is that everything is uploaded in “chunks” – in the general settings, the upload volume size defaults to 50 MB. Is there a way to just upload the files as-is? My customer wants to be able to grab individual files now and then without going through the more complex restore process.

Hi @frilogg - welcome to the forum!

Duplicati only works by “chunking” the files, then compressing / encrypting the chunks before sending them to the destination. There is no option to keep files “as is”.

If the normal Duplicati restore process (which can allow for single file restores) is too complex for your client then unfortunately you’ll have to find another solution elsewhere.

If you do, please feel free to post here what you found just in case other visitors might want to do the same thing! :slight_smile:


It sounds like you want a sync tool rather than a backup tool. I hear there are good syncing tools out there, though I haven’t personally used any of them.

I use Syncthing (with minimal versioning) where a full backup tool like Duplicati is overkill, but it requires Syncthing to be installed at both ends, so it won’t work with S3.

Plus, it’s a sync tool - not backup - so “restoring” a file means manually copying it out of the versioning folder. Not hard, but not exactly GUI-wizard easy either. :slight_smile:


I don’t quite understand “the complex way” in terms of user experience (internal operation complexity frightens some people away). From the user’s point of view, it’s a tree-view navigation and pick-a-file either way, isn’t it?

How about rclone? I’ve never used it, but a quick check shows it supports S3.
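For reference (I haven’t verified this myself), a basic one-way copy to S3 with rclone might look roughly like this; `s3remote` and the paths are placeholders for a remote and bucket you’d set up yourself:

```shell
# Create (or edit) an S3 remote interactively first:
#   rclone config
# Then mirror a local folder into the bucket, keeping each file as-is:
rclone sync /volume1/customer-data s3remote:my-bucket/customer-data --progress
```

Because the files aren’t chunked or encrypted, they stay individually browsable and downloadable straight from the S3 console.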

I guess what you need to decide first is whether it’s more important to you to have a robust backup system with deduplication and version history (combined with less reupload burden when some files change only slightly), or whether you need a solution that allows easy and/or direct access to specific files on an occasional or frequent basis. In the former case, I’d go with Duplicati; in the latter, I’d consider a sync tool as mentioned before.

That being said, Duplicati doesn’t make it overly difficult to restore an individual file here or there. Assuming you have access to the GUI on the system that does the routine backups, it’s pretty straightforward and quick. Just not as quick as, say, navigating to a certain Dropbox folder and copying a single file out.

I keep forgetting about rclone - yes, I’ve heard that works well. Though as you mentioned in your other post, I believe it’s purely a sync tool (no versioning like Duplicati and Syncthing have) so deleting a file in your source means it gets deleted in your sync destination.


Yeah if you want an exact mirror, don’t use Duplicati. rclone is great.
But with rclone you won’t get any versioning, deduplication, etc.

I’ve been using rclone for some time; it works very well and is a very powerful and reliable tool.

It has no deduplication – it synchronizes files as-is.

But using the `--backup-dir` option with folder names generated from the current date, you can keep previous versions of changed or deleted files.
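A sketch of that approach (the remote name `s3remote` and paths are placeholders):

```shell
# Sync current files to one prefix; any file this run would overwrite or
# delete is moved into a dated archive prefix instead of being lost:
rclone sync /volume1/customer-data s3remote:my-bucket/current \
  --backup-dir s3remote:my-bucket/archive/$(date +%Y-%m-%d)
```

Each run then leaves `current/` as an exact mirror, while `archive/2024-01-15/` etc. hold the superseded versions as plain files.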

You could also use a data lifecycle management policy on the bucket (S3 object versioning plus lifecycle rules) to keep previous versions of files.
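For example (a sketch with the AWS CLI; `my-bucket` and the 90-day retention are placeholders), you could enable versioning and add a rule that expires old versions after a while:

```shell
# Turn on object versioning so overwritten/deleted files keep old versions:
aws s3api put-bucket-versioning --bucket my-bucket \
  --versioning-configuration Status=Enabled

# Expire non-current versions 90 days after they are superseded:
aws s3api put-bucket-lifecycle-configuration --bucket my-bucket \
  --lifecycle-configuration '{
    "Rules": [{
      "ID": "expire-old-versions",
      "Status": "Enabled",
      "Filter": {},
      "NoncurrentVersionExpiration": { "NoncurrentDays": 90 }
    }]
  }'
```

That way rclone can keep syncing files as-is while S3 itself retains the version history.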