One-way sync without mirroring?

Hi all,

I only just found out about Duplicati and started using it this week. It’s a fantastic tool, but the documentation is a bit lacking, so I’m hoping someone can help me out with a few questions…

I understand that Duplicati does a 2-way sync, i.e. it mirrors the source to the destination. Is there a way to configure the settings so that if a file/folder gets deleted from the source, Duplicati doesn’t automatically delete the same file/folder on the destination? I back up my data to Google Drive (I have unlimited storage), so I don’t really care about storage space on the server side. On the local side, however, I sometimes download files to my M.2 SSD (32GB), which is of course tiny. I want to just back up whatever files I have on this 32GB SSD and wipe it so I can get more free space. Why not swap it for a bigger SSD, or download files to a NAS, or whatever? Well, that’s beside the point…

And as a side question, what is the recommended block/chunk size for large files? I’ve currently set it to 100MB for video files (they are around 80~150MB each, and the total size of the folder is about 49GB). I also have some Veeam backups that are single files of around 25GB; for those, I’ve configured the chunk size to be 500MB. I’ve read some forum posts about this, but they get a bit technical. I just need a rough recommendation / template to follow.
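
For reference, I’m setting these through the advanced options in the GUI. If I understand the command-line docs correctly, the equivalent would be roughly something like this, where --dblock-size is what the GUI calls the remote volume size (the authid, passphrase, and paths below are placeholders, not my real values):

    duplicati-cli backup "googledrive://Backup/Videos?authid=..." "/data/videos" --dblock-size=100MB --passphrase=...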

It’s not a 2-way sync. Files are chopped into bits, compressed, and encrypted before being put onto the backend. It’s impossible to use the backup files without basically restoring them somewhere else. And even if you could modify the backup in place, your backup software would detect the tampering and refuse to continue until it’s been resolved.

That’s exactly what it does. It keeps the old versions around forever unless you add a retention policy or limit the number of backup versions to keep.
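
If you do want older versions thinned out over time, the GUI retention settings map to command-line options. A rough sketch, with placeholder destination and source (this keeps daily backups for a week, weekly for a month, and monthly for a year):

    duplicati-cli backup "googledrive://Backup/Photos?authid=..." "/home/user/Photos" --retention-policy="1W:1D,4W:1W,12M:1M"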

This seems like a really bad use case. It can be done like that, but it’s not at all the intention of the software, and you will run into all sorts of trouble. Besides, if you just want to store the data in Google’s cloud, why not just use their 2-way sync client and unsync the files you don’t need? Much easier and more accessible.

Ok, let me put it this way. I have certain folders that I need to be backed up in specific ways:

  1. Photos folder - I dump photos taken from my phone and camera into this folder. Duplicati is configured to run a backup on the Photos folder at a certain time every week. Every now and then, I will go into the Photos folder on my local drive and delete photos that I don’t like. I want this to be reflected on the server as well. I don’t want to have to go into my Google Drive, search for the photo, and delete it there.

Why don’t I just delete the photos I don’t like before backing them up? Well, habit is one thing, plus I don’t always have time to do this every time I add photos to the folder. I just want to dump my photos onto my local disk and have them backed up automatically so that I have a copy. When I have time later on, I can go through and remove the ones I don’t like.

  2. Movies folder - compared to photos, movies aren’t important to me. I don’t care if I have a movie collection full of AAA and meh movies. If I’ve downloaded a movie and it’s somewhat decent, I back it up and add it to my collection. If I delete the movie from my local disk, I don’t want Duplicati to also delete it from the server. I only delete it from my local disk to free up space, and I don’t need to do that on the server side.

These are just some examples of 1-way and 2-way (mirror) sync. I need a bit of flexibility here.

Yes, I understand that Duplicati chops the original file into smaller ones, compresses them, and encrypts them. But I don’t understand how that is relevant to 1-way or 2-way sync. If I back up a 20GB file, it gets chopped into, say, 1000 smaller ones that are uploaded to the server. If I delete the file on the local disk, then I presume the 1000 files on the server will get deleted if it’s a 2-way sync. If it’s not a 2-way sync, then the 1000 files will just stay on the server. If I were to restore that 20GB file, I’m guessing it’ll either let me download a single 20GB file or download 1000 files and merge them into one. I haven’t actually restored a file before, but I did go through the restore menu to see what it’s like.

The main reason I’m using Duplicati is the encryption it does. My Google Drive storage is a business account, and I don’t want the IT admins snooping on my data. The fact that the files get chopped up and encrypted means that I don’t have to worry about admins having access to my files. Furthermore, I use Duplicati to back up to other clouds and my NAS as well, so there are multiple reasons here.

That’s fine. That makes sense as it’s actually a backup use case.

That’s what makes no sense to me. If you back them up with Duplicati and then delete them from the local disk, you will have to write down or remember when you backed up each file just to find it in your backup. It’s going to be ridiculously difficult to find these movies without knowing when they were backed up.

You have two completely different use cases, and you’re trying to force them into one piece of software that’s made for the first use case. Use two different tools.

Because it literally cannot do 2-way sync. Duplicati, by definition, is a 1-way sync tool.
https://www.tgrmn.com/web/kb/item34.htm

In one-way sync, files are copied only from a primary location (source) to a secondary location (target) in one direction, but no files are ever copied back to the primary location. Replication and Backup (=Mirroring) are one-way sync methods in ViceVersa.

1-way sync can delete files from the destination; the difference is that no changes made to the destination will ever be copied back to the source.

I would look into a solution that encrypts the files on disk before uploading them to Google Drive. Encryption is not unique to Duplicati, and there are plenty of tools that can encrypt your files while still letting you browse them in a more useful way than trying to figure out which Duplicati backup your file is in.

Thanks for your reply.

That’s what makes no sense to me. If you back them up with Duplicati and then delete them from the local disk, you will have to write down or remember when you backed up each file just to find it in your backup. It’s going to be ridiculously difficult to find these movies without knowing when they were backed up.

I’m not quite sure what you mean. If I go into the Restore menu, I can select to restore from a specific Google Drive folder. Once inside that folder, I can choose to restore a specific file or folder, and there is also a search field for finding files. The next step asks me “Where do you want to restore the files to?”, and I can pick the original location or save to a new location.

That all seems pretty straightforward to me. If I need to search for a specific movie, I can just search by the file name, or by whatever subfolder category it is under. The entire root folder gets backed up, so it retains all my subfolder structures.

1-way sync can delete files from the destination; the difference is that no changes made to the destination will ever be copied back to the source.

How would I configure 1-way sync to delete files from the destination? Is there something in the advanced settings that I need to set?

Yes, but when you have multiple backup runs, you have to pick the right one when restoring. If you make two backups and delete files in between, then you won’t be able to see the deleted files in the newest snapshot.
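
If it helps, I believe you can at least list the snapshots and peek into a specific one from the command line; roughly like this, with a placeholder URL (check `duplicati-cli help list` for the exact syntax):

    duplicati-cli list "googledrive://Backup/Movies?authid=..."
    duplicati-cli list "googledrive://Backup/Movies?authid=..." "*.mkv" --version=2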

Duplicati deletes files on the backend only when they’re removed by the retention policy or manually purged using the command-line purge option.
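
A manual purge would look something along these lines (placeholder URL and file name; --dry-run only reports what would be removed, and `duplicati-cli help purge` has the exact syntax):

    duplicati-cli purge "googledrive://Backup/Photos?authid=..." "*/IMG_1234.jpg" --dry-run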

It sounds like you’re trying to use a backup tool as a sync tool. As @Pectojin says, you might be better off using an actual sync tool (with encryption) for at least one of your scenarios. But if you really want to use Duplicati, it sounds like you’d want to set up two backup jobs.

One job would back up your photos and have a Backup retention of “Keep a specific number of backups” set to 1, meaning that if you delete a photo from your local drive, it would be flagged for deletion at your next backup. Once enough photos are flagged, Duplicati will do the actual chunk removals from the destination (in your case Google Drive).
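
If you prefer the command line, I believe that first job would look roughly like this (the destination URL and source path are placeholders):

    duplicati-cli backup "googledrive://Backup/Photos?authid=..." "/home/user/Photos" --keep-versions=1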

Of course, this basically eliminates one of the great features of Duplicati, which is the ability to back up different versions of a file. For example, if you edit your photos, Duplicati could allow you to restore both the original AND the modified versions of the file.

Your second scenario sounds like you’re trying to set up a “cold storage” solution rather than a backup. It’s not a great fit, but your second job could back up your videos with a Backup retention of “Keep all backups”, which means nothing will ever be automatically deleted from the backup.


You should also know that, depending on how you set up the OAuth access to your Google account (such as Google Drive vs. Google Docs), I believe Google may limit access to those files to only the app that created them. So if you set up Duplicati to back up to Google Drive, you won’t actually be able to find the files by browsing through your Google Drive - but if you tell Duplicati to back up to Google Docs, then you COULD browse to the files.

Of course I don’t work at Google or manage business accounts, so I can’t say for sure what happens in that scenario.


I don’t know a whole lot about OAuth, but FWIW, I run a ~4GB backup job of my work laptop to my corporate account’s Google Drive and I never ran into any option along these lines - I can just browse into the Duplicati backups folder from my Drive root directory. Not that this is the only way, of course.

Honestly, I don’t have a way of testing this with a business account - I was pretty much going off of what has been said in earlier posts.

Let’s say I back up 100 movies on Monday and delete all 100 movies from my local drive once the backup has completed. Then I download 50 movies on Wednesday and add them to my backups. Assume the backup runs every day and the retention policy is set to keep unlimited copies. If I understand what you’re saying correctly, since this is an incremental backup, I would only be able to see the 50 movies in the latest backup set, and I’d have to select the Monday backup set to view the other 100 files? There isn’t a way to combine all the incrementals and view them as one big backup set for restore, so that I can just search for keywords against one single backup set?

One job would back up your photos and have a Backup retention of “Keep a specific number of backups” set to 1, meaning that if you delete a photo from your local drive, it would be flagged for deletion at your next backup. Once enough photos are flagged, Duplicati will do the actual chunk removals from the destination (in your case Google Drive).

If I were to set “Keep a specific number of backups” to 2 and then delete the photos from my local drive, would I have to go through 2 backup runs before Duplicati marks the photos on my Google Drive for deletion?

I think the main things I’m concerned about are storage space and disaster recovery. I’ve only ever had 1 or 2 hard drive failures, and that was a long time ago. But because I’ve had these experiences, I don’t trust my NAS or any of my external hard drives / local storage. Hence I want to back up to the cloud, as I believe their servers are more reliable and probably do some sort of replication within data centers and across the globe, etc.

I don’t want to continuously spend money on local storage. If I have unlimited storage in the cloud (Google Drive), then I’d prefer to just store most of my stuff there and use it as cold storage, keeping the stuff that I access often (e.g. my photos) on local storage. However, I still want some flexibility on my Google Drive, such as keeping only one copy of each movie in a continuously growing collection, and keeping 1~2 copies of my photos that simply mirror / replicate my local drive structure.

So basically, I download stuff, back it up, and then delete it from local storage so I can download more and repeat. I want to make sure that my backups in the cloud can retain specific data with unlimited copies, but at the same time retain other data as a mirror of my local drive instead.

Exactly. There is currently no way to show files across multiple snapshots, so while it might not be so bad for the first couple of backups, you will eventually have a very hard time finding your files if you make many of these backup runs.

I would consider using Google Drive’s built-in sync for the less important media content that you just want to store off the local system. To secure the files, I think 7-Zip or VeraCrypt would work nicely to encrypt them before uploading them to Google. Both use AES when encrypting, so they’re just as secure as Duplicati in terms of encryption.
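
For example, with 7-Zip, something like this creates an AES-256 encrypted archive (-p prompts for a passphrase, and -mhe=on also encrypts the file names so they can’t be browsed; the path is a placeholder):

    7z a -t7z -mhe=on -p Movies-Archive.7z "/data/Movies/SomeMovie"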

Thank you very much for your help.