Use Duplicati with cold cloud storage

Hello,

I would like to use Duplicati with cold cloud storage (“OVH Public Cloud Archive” in particular), i.e. files are stored frozen (not directly accessible) and a request must be made before the files can be accessed, with a delay of several hours between the request and the access. It is a very cheap solution.

I have tried it, and it seems Duplicati is not able to handle this kind of storage. Can you please confirm?

The only solution I have is to send the unfreeze request manually before the backup, but that means I can’t use automatic scheduled backups.

Thank you for your help

According to their website, OVH supports uploading data via SFTP and HTTP, so technically Duplicati should support it. You could also get away with cold storage if you disabled the automatic verification after each job. I’m not sure how restores would work, but it seems you would need to “wake up” all your files to perform a restore. I personally wouldn’t recommend this for backups, but it seems well suited to file system archiving.
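If you want to try it, something along these lines should switch off the automatic checks (a sketch only: the destination URL is a placeholder, and the option names come from Duplicati’s advanced options list, so double-check them against your version):

```
# Sketch, not a tested configuration. "duplicati-cli" is the Linux wrapper
# for Duplicati.CommandLine.exe; the ssh:// URL and source path are placeholders.
duplicati-cli backup "ssh://user@archive.example.net/backups" /home/user/data \
  --no-backend-verification=true \
  --backup-test-samples=0
# --no-backend-verification skips listing/verifying remote files around each run,
# --backup-test-samples=0 skips downloading sample volumes for testing afterwards.
```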

@samw is right about the cumbersome restore process cold storage would entail. There’s some more info about it here:

Thank you

Having to “wake up” the files before doing a restore is not a problem for me. I will only do a restore once, so I can afford this manual task.

I think that in order to make the next incremental backup, Duplicati needs to access the previously uploaded files to know what has already been backed up. This is the biggest problem in my view.

I will try this https://sites.google.com/a/duplicati.com/duplicati/news/howtouseglaciertostorebackups with Amazon Glacier. What do you think about it?

If the files are stored on tape in the back end, they can take up to a few days to recall. Most cold storage solutions use a combination of MAID and LTO tape, which is the reason they are so cheap compared to anything else. If that’s not a problem for you and you have a very reliable transmission medium, then it will work just fine.

It doesn’t. A record of what has been backed up is kept locally in a SQLite database, so Duplicati knows which files have already been backed up without reading the remote files.

You can use any backend that is supported by Duplicati. Personally, for my remote backups I use an offsite FreeNAS box running FTP over SSH. For the money you’d spend on cloud storage you could just set this up.

OK great! So it works.

  • I have just set the following options: --no-auto-compact and --backup-test-samples=0 (see the sketch after this list).
  • The restore process is simple. I follow the normal process; it obviously does not work the first time, but as soon as Duplicati tries to access a file, OVH starts “waking up” that file. A second attempt several hours later should then work. I will test it.
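For reference, my job boils down to something like this on the command line (a rough sketch: the destination URL, source path, and passphrase are placeholders for my real settings):

```
# Rough command-line equivalent of my scheduled job (placeholders throughout).
duplicati-cli backup "ssh://user@archive.example.net/backups/antoine" \
  /home/antoine/data \
  --passphrase="changeme" \
  --no-auto-compact=true \
  --backup-test-samples=0
# --no-auto-compact avoids downloading and rewriting old volumes on the cold storage,
# --backup-test-samples=0 avoids downloading sample volumes for verification.
```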

Thanks
Antoine

That depends on how the back end storage is configured. Most likely you will need to leave the restore process to run for a few hours and hope that Duplicati doesn’t timeout or anything like that. Either way test the restore process thoroughly to make sure everything is working as it should.

I have just run into another problem:
When Duplicati uploads big files, it splits the file into several parts, uploads them, and reassembles them on the backend. With a cold cloud, the reassembly cannot be done. Therefore, after the upload is finished, I still have a folder “nameofthefolder_segments” containing files xxxxx/000001, xxxxx/0000002 and so on.
Is there a solution?
Thank you

I’m not sure what’s doing that, but it isn’t Duplicati, as it only uploads files one at a time as a single chunk (it doesn’t split things up). Additionally, its files all start with duplicati- (unless you’re using the --prefix parameter) and should end with your chosen compression and/or encryption extension (so .zip, .7z, .aes, etc.).

OK, good to know that it is not done by Duplicati. Maybe it is done by the back end itself. I saw that the size of the xxxxxxxxxxx/0000001 files was 256 MB. I will try to limit the size of the dblock files to 200 MB; maybe that will solve the issue.
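Roughly what I plan to run (a sketch; I believe the remote volume size is controlled by --dblock-size, or “Remote volume size” in the interface, to be confirmed):

```
# Sketch: cap remote volumes at 200 MB so they stay below the backend's
# apparent 256 MB segmentation threshold (URL and path are placeholders).
duplicati-cli backup "ssh://user@archive.example.net/backups/antoine" \
  /home/antoine/data \
  --dblock-size=200MB
```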

Sounds like a good test, let us know how it goes.

Hi,
I have tested it: setting the size to 200 MB removes the issue with the xxxxx/00001 files that I described above.
But now I have another issue with an upload of 37 GB (i.e. an upload spread over several runs). Sometimes I get the error message "Found 1 remote files that are not recorded in local storage, please run repair".
The remote file that causes the issue (a dblock) is much smaller than 200 MB, so the transfer was probably interrupted in a bad way? I am surprised that an interrupted transfer is not handled correctly by Duplicati.
I don’t know of any “repair” button, so I delete the file manually and then it works.
Do you think this issue is linked to the fact that it is cold storage? How can I solve it?
Thanks

Good to hear! And that makes sense if the provider’s upload tool was automatically splitting files larger than a certain size into chunks for the upload.

It’s possible to have dblock files smaller than your volume size (in this case 200MB) such as at the end of a backup when there’s not a full volume’s worth of data left to upload or after a compacting operation when multiple “small” files are re-compressed into a single larger one.

However, those files should still be recorded in the local Duplicati database, so if an unexpected remote dblock file is found, the most likely reasons include:

  • upload failed but destination didn’t delete the partial file
  • upload succeeded but Duplicati didn’t record it in the local database for some reason
  • multiple backups are pointing to the same destination folder

I don’t know how they handle cold storage, so I suppose it’s also possible that a file Duplicati used to use and then deleted (likely due to version control or compacting) never got removed from cold storage and has somehow been moved back to live storage again…

One way to try and narrow this down might be to use the --log-file and --log-level parameters to store a detailed log of the backup. If the error happens again, look at the previous run’s log file for the name of the “unexpected” dblock. If it’s there, then there should also be a record of it being saved into the local database.
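For example, something like this should produce a verbose log you can search later (a sketch: the path and URL are placeholders, and the available --log-level values may vary between Duplicati versions):

```
# Sketch: write a detailed per-run log to a file for later inspection.
duplicati-cli backup "ssh://user@archive.example.net/backups/antoine" \
  /home/antoine/data \
  --log-file=/var/log/duplicati-ovh.log \
  --log-level=Profiling
```

Then search that log for the name of the unexpected dblock to see whether Duplicati ever uploaded or registered it.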

Thanks JonMikeIV
In my case the backup is running for the first time and has not finished yet, so the small file can’t be explained by the end of a backup. Also, I have disabled compacting because of the cold storage, and I have only one backup pointing to this destination. And because it is the first run, version control can’t explain the issue.
To summarize, the two remaining possibilities from what you say are the upload that failed without the destination deleting the partial file, and the upload that succeeded without Duplicati recording it in the local database.

Thank you for the idea to use the log. I will do it.

Indeed there go half my ideas. :slight_smile: (Sorry if I forgot you mentioned that & disabling compacting earlier.)

Please let us know what comes out with your log. While I described what I believe Duplicati is SUPPOSED to do, that doesn’t mean that’s what it’s ACTUALLY doing (in other words, you very well might have found a bug). :smiley: