I’m a complete newbie in Duplicati, I just installed it on my OMV.
I would love to have the following setup: my NAS downloads the files I need from the internet (using JDownloader, NZBget or whatever), processes them (unpacks, cleans, things like that) and throws them into the “complete” directory. Then Duplicati picks up those ready-to-eat files and uploads them to my Google Drive. After it does that successfully, it removes the uploaded files, leaving space for new downloads.
- Is it possible? How do I achieve that? Is Duplicati the right tool for my job?
- If so, how should I set up my backups? Can I make one backup, name it “downloaded things” and let Duplicati incrementally upload any new files, making one giant backup? Would I be able to download only a few files from this backup easily?
Thank you in advance for your help, good people.
Hello @anonymous, welcome to the forum!
Theoretically, you could use a --run-script-after parameter to generate a list of files added during the backup and delete those files from the source, but there’s no built-in feature to do this.
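As a rough sketch of that idea (the source path is hypothetical, and the exact environment-variable names are an assumption, so check your Duplicati version’s scripting documentation before relying on them):

```shell
#!/bin/sh
# Hypothetical --run-script-after hook: wipe the source folder once a
# backup finishes successfully. DUPLICATI__OPERATIONNAME and
# DUPLICATI__PARSED_RESULT are variables Duplicati exports to hook
# scripts (names assumed here; verify against your version's docs).
SOURCE_DIR="/srv/downloads/complete"   # hypothetical source path

if [ "$DUPLICATI__OPERATIONNAME" = "Backup" ] && \
   [ "$DUPLICATI__PARSED_RESULT" = "Success" ]; then
    # Everything in the folder was just backed up, so free the space.
    find "$SOURCE_DIR" -mindepth 1 -delete
fi
```

Note that this deletes everything in the folder, not just what the last backup uploaded, so you’d want to be sure nothing new lands there mid-backup.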
So chances are Duplicati is not the right tool for what you want to do.
Yes, one giant backup would be created, but with caveats listed below that I think will underscore why Duplicati isn’t the best tool for what you want to do. On the plus side, yes - you would be able to restore just a few files (though ease is often in the eye of the beholder).
Perhaps if you flesh out your whole usage scenario some other users might have suggestions on more appropriate tools.
I’m guessing from here on as to what you might be wanting to do with Duplicati, so please forgive me if I’m incorrect…
It should be mentioned that when Duplicati backs up files, it (by default) chops them into little 100 KB blocks, gathers enough of those together to make a 50 MB zip file, and encrypts that before uploading it to the destination (in your case, Google Drive).
So the backed up files are only useful to Duplicati - for example, you couldn’t play an mp3 that Duplicati had backed up without first restoring it.
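To illustrate the block-then-volume idea (a conceptual sketch only, not what Duplicati actually runs internally):

```shell
#!/bin/sh
# Conceptual illustration of the pipeline described above, NOT
# Duplicati's actual implementation: chop a file into 100 KB blocks,
# then bundle the blocks into a compressed "volume" that would be
# encrypted and uploaded.
cd "$(mktemp -d)"
dd if=/dev/urandom of=sample.bin bs=1024 count=300 2>/dev/null  # 300 KB stand-in file
split -b 100k sample.bin block_        # yields three 100 KB blocks
tar -czf volume-1.tar.gz block_*       # bundle blocks into one "volume"
# (Duplicati would then AES-encrypt the volume before uploading it)
```

The point being: what lands on Google Drive is those opaque volumes, not your original files.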
On top of that, Duplicati (by default) stores the current state of the source files along with previous versions. So if you did have a batch job delete all the files that Duplicati had backed up, when the next backup ran Duplicati would no longer see those files in the source folder and would flag them as deleted in the backup as well.
Even though the backups could still have the previous (undeleted) versions of the files in the backups, depending on your retention settings those would eventually be cleared out of the backup completely.
Well, the usage scenario is plain old data hoarding. I want my NAS, with the help of JDownloader, NZBget and whatever it takes, to chew up the pile of links that I feed to it. Then, after each of the movies, music albums or TV series is downloaded, wait a bit for me to clean up the directory – I want to have a similar directory structure and naming scheme for everything. Then I would move things I’m done with to another directory, let’s call it “final”. And what I need is a tool that would encrypt and upload things from the “final” directory to Google Drive or another cloud service and remove the files on my NAS after doing that, automagically.
I want to be able to restore only some of the files, of course – for example, if I’m in the mood for some of the movies that I have there – but I still want the encryption, so it’s not like I can’t wait.
So, with the appropriate --run-script-after and the default behaviour of “storing the current state of the source files along with previous versions” disabled, Duplicati would seem like a decent tool for my job.
Does anyone have a better solution in mind?
Hopefully some other options will be offered, but it sounds to me like you would do better with an on-the-fly cloud storage encryption tool like Cryptomator or BoxCryptor. Both tools have videos on YouTube that help explain how they work.
One of those combined with a simple scheduled script to move stuff from your “incoming” folder to the encrypted BoxCryptor folder would probably be a simpler setup to use for this than Duplicati.
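A minimal sketch of such a scheduled move script, assuming hypothetical paths for the download folder and the encrypted vault folder:

```shell
#!/bin/sh
# Hypothetical cron job: move finished downloads into the folder that
# Cryptomator/BoxCryptor encrypts and syncs to the cloud. Both paths
# are assumptions; adjust them for your NAS layout.
SRC="/srv/downloads/final"
VAULT="/srv/vault/media"

mkdir -p "$VAULT"
if [ -d "$SRC" ]; then
    # Move only top-level items; -n avoids clobbering anything with
    # the same name already in the vault.
    find "$SRC" -mindepth 1 -maxdepth 1 -exec mv -n {} "$VAULT/" \;
fi
```

Run it from cron every few minutes and the encryption tool handles the rest; no retention settings or restore steps to think about.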
However, if you really do want to use Duplicati, you’ll want to look at settings like “Retention Policy”, which can allow you to tell Duplicati to never delete a version of a file.
Note that when it comes time to restore you may find yourself having to hunt through backup versions to find one that includes the file you want to download.