How can Duplicati be used to move folders to a storage location?

First, really nice piece of work. I’ve got Duplicati working well on my Mac (High Sierra), backing up files to Amazon Drive, and it’s working much better than Amazon’s own Amazon Drive backup client. So, great work there, thanks.

Second, can someone provide guidance on how Duplicati might be used to move folders to storage: not just a backup, but a move?

What options would I use to accomplish this? To start, I’m going to turn off encryption because I want to see it working, that is, compare the moved files against the source to verify that everything is there and nothing is missing. And I’m hoping there is something in Duplicati I can use to verify moves going forward.

But other than that, how do I configure a move? I’ll settle for a simple move for now, but ideally, long term, it would be great if folders were replicated, then confirmed/verified, and the source deleted only on successful verification.

Thanks,
KWL

Duplicati isn’t a cloning/migration tool, so there is no way to use it to move files to a new location.

Additionally, you’ll find that even if you turn off encryption, the destination files are not readable. This is due to the block-based algorithm used to deduplicate the files.
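To illustrate why, here is a toy sketch of block-based deduplication in Python (this is not Duplicati’s actual storage format, just the general idea): files get sliced into blocks that are stored by hash, so the destination holds opaque, content-addressed chunks rather than readable copies of your files.

```python
import hashlib

def store_file(data: bytes, block_store: dict, block_size=100 * 1024):
    """Toy block-based dedup: slice a file into fixed-size blocks and
    store each block under its hash. Duplicate blocks are stored once,
    and the destination ends up holding opaque blocks, not your files."""
    recipe = []  # the hash list needed to reassemble this file later
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        block_store[digest] = block  # already-seen blocks are stored once
        recipe.append(digest)
    return recipe
```

Without the recipe (in Duplicati’s case, its local database and index files), the blocks on the remote are just unnamed binary chunks.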

There are lots of tools that can help you sync, mirror, or move files, but Duplicati is not one of them.

Something like rsync, SyncToy, or maybe Syncthing might be relevant here.
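If you end up scripting the move yourself, the replicate, verify, then delete-source workflow you described is straightforward against a locally mounted destination. A minimal sketch in Python (the paths are placeholders, and it assumes the destination is reachable as a normal filesystem):

```python
import hashlib
import shutil
from pathlib import Path

SOURCE = Path("/Users/kwl/Archive")      # placeholder source folder
DEST = Path("/Volumes/Storage/Archive")  # placeholder mounted destination

def sha256(path: Path) -> str:
    """Hash a file in 1 MB chunks so large files don't exhaust memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for block in iter(lambda: f.read(1 << 20), b""):
            h.update(block)
    return h.hexdigest()

for src in SOURCE.rglob("*"):
    if not src.is_file():
        continue
    dst = DEST / src.relative_to(SOURCE)
    dst.parent.mkdir(parents=True, exist_ok=True)
    shutil.copy2(src, dst)          # replicate (copy2 preserves timestamps)
    if sha256(src) == sha256(dst):  # verify the copy by content hash
        src.unlink()                # delete the source only after verification
    else:
        print(f"Verification failed, keeping source: {src}")
```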

Hi Pectojin,
Thanks for the quick reply.

And the note regarding the block-copy mechanism makes sense; I didn’t think about it before, but I’ll keep it in mind.

As for clone/migration, I don’t quite get the reference; storage management is the thing, I think: moving folders to various storage repositories for long-term safekeeping.

But regardless, point taken. I’ll look into another tool. It’s just that it would be great to take advantage of the Duplicati connection to get it done, and avoid adding yet another tool that does essentially the same thing: getting files to other places.

Thanks again,
KWL

Hi @KWL, welcome to the forum!

If/when you find a tool that does what you want, feel free to stop back over here and share what worked for you. :slight_smile:

Thanks, JonMikeIV,
Will do.
And just some background for future reference, for when I do post back on the topic:

My interest in having Duplicati do the work is that Duplicati doesn’t seem to have any issue maintaining the connection to Amazon Drive and writing its files. In fact, it went through 199 GB without issue, whereas every other means I’ve tried simply fails or stops reconnecting, and the operation never completes.

In fact, the Amazon Drive client itself has this problem.

And every sync product I’ve tried has the same issue, but they also fundamentally won’t work, because if I delete the source they delete the destination, so it’s not a proper move.

Drive mounts, i.e. mounting Amazon Drive as a local volume? Same issue: they eventually lose the connection and never remount, which means whatever I’m running to do the move also fails (a Finder move, a Path Finder move, etc.).

So, as it stands, Duplicati is simply the #1 product for this currently, and if I could work out how to use it that way, I would. And if I could take advantage of deduplication too, what a great storage manager that would make!

I’ll post again if I have any updates.

Thanks,
KWL

Thanks for the clarification. My guess is you’re uploading large files over a possibly shaky connection. Since Duplicati chops the files up into smaller chunks, you’re less likely to run into a connection issue.
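That’s roughly the mechanism: when a transfer fails, only the current chunk has to be re-sent, not the whole file. A rough sketch of the idea in Python (not Duplicati’s actual code; `upload_chunk` is a stand-in for whatever transport is in use):

```python
import time

CHUNK_SIZE = 50 * 1024 * 1024  # Duplicati's default remote volume size is 50 MB

def upload_in_chunks(path, upload_chunk, retries=5):
    """Send a file in fixed-size chunks. If the connection drops, only
    the current chunk is retried instead of restarting the whole file."""
    with open(path, "rb") as f:
        index = 0
        while chunk := f.read(CHUNK_SIZE):
            for attempt in range(retries):
                try:
                    upload_chunk(index, chunk)  # caller-supplied transport
                    break
                except ConnectionError:
                    time.sleep(2 ** attempt)    # back off, then retry this chunk
            else:
                raise RuntimeError(f"chunk {index} failed after {retries} attempts")
            index += 1
```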

I don’t know if it would do what you want, but Duplicacy is another tool that does similar chunking and deduplication.

While I wouldn’t recommend it, you COULD use scripts to remove local files that Duplicati has backed up, combined with a “keep all” retention policy so the files aren’t removed from the backup just because they’re gone from the source.

But with a scenario like that, restores become annoying, because you have to hunt for the most recent backup version that still contains the file from before it was deleted from the source.
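For completeness, here’s roughly what such a cleanup script could look like, assuming Duplicati’s --run-script-after option and the DUPLICATI__PARSED_RESULT environment variable it exposes to those scripts (the source path is a placeholder, and again, I wouldn’t recommend this without careful testing):

```python
import os
import sys
from pathlib import Path

SOURCE = Path("/Users/kwl/Archive")  # placeholder: whatever the backup job covers

# Duplicati sets DUPLICATI__PARSED_RESULT for scripts run via
# --run-script-after; only delete local files if the backup succeeded.
if os.environ.get("DUPLICATI__PARSED_RESULT") != "Success":
    sys.exit(0)

for f in SOURCE.rglob("*"):
    if f.is_file():
        f.unlink()  # the only remaining copy is now in the backup
```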

Thanks, JonMikeIV,

I’ve checked out Duplicacy, but it doesn’t support Amazon Drive; they were set to add it, but some issue during development in 2017 apparently brought them to a full stop.

If I’m following you on the script approach: since it’s a move, I wouldn’t have versions; there would be just the one version that was created by the move.

And since I’m just trying to get folders of stuff that see very low use, once or twice a year for the next 5-10 years, off of my laptop to free up the space, all I need is a good upload that gets 300-400 GB up, and then 10-20 GB down when needed. In fact, if I find something that works well, I may even upload a bunch of stuff that I might need every month; at least it wouldn’t be taking up my laptop’s space.

And of course, since the original moved files will still be on Amazon Drive, I’ll be able to simply delete them locally when I’m done with the files, or worst case, create a new version then with a much smaller upload.

If you hear of anything else, please ping me here; otherwise, I’ll be looking to get this done somehow.

Thanks,
KWL

Hi @KWL,

Possibly that’s similar to the rclone Amazon Drive issue. rclone supports many storage systems, along with commands like move, check, and mount, and it’s one of the suggestions that often appears in Duplicati cloud-to-cloud migration threads. While cloud storage services generally haven’t added support for legacy file transfer methods, some of the legacy clients have added cloud support; for example, FileZilla added it to the paid Pro version.
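For what it’s worth, if rclone (or a similar tool) does work with your remote, the replicate/verify/delete flow maps onto its copy and check subcommands. A small illustrative wrapper in Python, with placeholder paths and remote name:

```python
import shutil
import subprocess

SOURCE = "/Users/kwl/Archive"  # placeholder local path
REMOTE = "amazon:Archive"      # placeholder rclone remote name

# Replicate, verify, and only then delete the source, mirroring the
# copy/verify/delete flow discussed earlier in the thread.
subprocess.run(["rclone", "copy", SOURCE, REMOTE], check=True)   # upload
subprocess.run(["rclone", "check", SOURCE, REMOTE], check=True)  # compare files
shutil.rmtree(SOURCE)  # check=True raises on failure, so we only get here on success
```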

Hi TS678,

Possibly. I know rclone ran into an issue whereby they stored their encryption secrets in their code, and Amazon took them out of the partner/developer program because of it.

Whereas in this case, and with a couple of other products, it appears they simply can’t get added to the Amazon partner/developer program in the first place.

Thanks,
KWL