Database backup question

I looked through the manual and forums but can’t seem to find an answer to this question. Is there a way for an unzipped version of the database to be automatically included in the backup? I realize in some scenarios, this perhaps wouldn’t be desirable from a security standpoint but in others, having an unzipped version of the database with any necessary config files etc would make a bare metal restore much faster I think.

For example, doing a BMR where the backup was on a portable drive would mean little more than booting a USB stick with Duplicati installed (or setting up a Windows PE environment recovery partition), pointing it to the USB drive as the source and the wiped hard drive as destination and it would all happen pretty quickly rather than doing the same thing but having to rebuild the database from scratch. Unless I’m misunderstanding something?


You wouldn’t want it right in the backup destination, because Duplicati would complain about it being an unknown file, but you could put it somewhere else on the remote, e.g. a parallel, similarly-named folder, using --run-script-after, which is how some people do a database backup as a second job. I’m not sure if a script gets any helpful environment variables set, e.g. “%REMOTEURL%”, but you can look. Possible crude transfer tool:

C:\Program Files\Duplicati 2>Duplicati.CommandLine.BackendTool.exe help
Usage: <command> <protocol>://<username>:<password>@<path> [filename]
Example: LIST ftp://user:pass@server/folder

Supported backends: aftp,amzcd,azure,b2,box,cloudfiles,dropbox,file,ftp,googledrive,gcs,hubic,jottacloud,mega,msgroup,onedrive,onedrivev2,sharepoint,openstack,rclone,s3,od4b,mssp,sia,ssh,tahoe,webdav
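For example, a --run-script-after script could push a copy of the local database to that parallel folder with the BackendTool. This is only a sketch: the DUPLICATI__* environment variable names are assumptions based on Duplicati’s example scripts, so check them on your install before relying on this.

```shell
#!/bin/sh
# Sketch of a --run-script-after helper. The DUPLICATI__* variable names
# are assumptions -- verify against the example scripts in your version.

# Build a parallel, similarly-named destination for the DB copy, e.g.
# ftp://user:pass@server/backup -> ftp://user:pass@server/backup-dbcopy
db_copy_url() {
    printf '%s-dbcopy' "${1%/}"
}

upload_db_copy() {
    # Only act after a backup run, and never write into the backup
    # folder itself, or Duplicati will flag an unknown remote file.
    if [ "${DUPLICATI__OPERATIONNAME:-}" = "Backup" ]; then
        Duplicati.CommandLine.BackendTool.exe PUT \
            "$(db_copy_url "$DUPLICATI__REMOTEURL")" "$DUPLICATI__dbpath"
    fi
}

upload_db_copy
```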

I solved this by moving Duplicati’s database to a folder in Nextcloud (one could also do this with Dropbox or Google Drive I guess). Since Nextcloud automatically syncs the folder to a drive on my server, the database itself now exists both on the machine being backed up as well as on a completely separate machine.
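The relocation step might look roughly like this (paths and the random-looking database file name are hypothetical examples; after moving the file, update the location on the job’s “Database …” screen in the Duplicati UI):

```shell
#!/bin/sh
# Sketch with hypothetical paths: move the job database into a folder
# the sync client (Nextcloud, Dropbox, etc.) watches, then point the
# job's "Database ..." setting at the new location.

move_db() {
    db=$1 synced_dir=$2
    mkdir -p "$synced_dir"
    mv "$db" "$synced_dir/"
    # Print the new path to paste into the Duplicati UI
    printf '%s/%s\n' "$synced_dir" "$(basename "$db")"
}

# Example call (hypothetical database name):
# move_db "$HOME/.config/Duplicati/CXEJFBKATL.sqlite" "$HOME/Nextcloud/duplicati-db"
```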

Thanks for sharing your solution, but I see several possible disadvantages.
The Nextcloud client can only sync the whole file, not just the changed parts like Seafile or Dropbox can, so after any change the entire DB gets transferred to Nextcloud. If your Nextcloud is on the LAN, you are probably fine.
You may also come across some file locking problems.

Also, a restore without the DB is not that slow; it’s definitely faster than a whole DB rebuild. You can test it yourself by running “Restore” and “Direct restore from backup files …”

My use case is a worst-case-scenario BMR of a laptop, either at home (LAN) or on the road (WAN). My DB file isn’t huge, a few hundred MB, so I could quickly download it along with a Windows PE environment onto a USB stick and restore from there.

And Nextcloud simply retries locked files until they’re unlocked so I haven’t seen a problem in that regard so far.

Can Duplicati even perform BMRs? Even if you had the entire drive backed up, it’s file level, not block level. Wouldn’t that still necessitate rebuilding the MBR? Or am I just not understanding what’s happening here?

Better to use a product that does image level backups or is designed for BMR. Duplicati is awesome for protecting file level user data. But if you need image level backups use another product on top of Duplicati. (I use Macrium Reflect)

WinPE and duplicati as Disaster recovery solution is a thread about taking WinPE plus Duplicati and turning it into a BMR solution; it previously used BURP, which is also file-level (see Windows disaster recovery with WinPE and burp). Initially I was skeptical that Duplicati could create all the odd files, but apparently it does. Not sure about Linux.

Personally I take a Macrium Reflect image on a portable drive (especially before a Windows version upgrade), however my only remote backup is file-level of my data only, the theory being that the rest is easier to replace.

Boot into Duplicati via CD or disc on key? is a current discussion from someone wanting something close to BMR, and there’s also a discussion of how to get Duplicati to do a fast file-level restore without the big database wait. Please feel free to toss in any ideas I missed. To me, this is an impediment for those who want fast recoveries.

I use Veeam Agent as my imaging backup. It works by creating one big file then a series of smaller files for the incremental backups which it merges into the main backup on whatever schedule you set. The problem is that it needs free disk space that’s double the size of the big file in order to do the merge. In other words, if you have a 750G file on a 1TB drive, it won’t work. Additionally, if you’re backing up over a WAN, it means backing up 750G (in my example) plus a smaller incremental file every single day, which is a huge amount of bandwidth.

With Duplicati, everything is smaller and designed for network backups. My hope was that I could use Duplicati in lieu of Veeam but it sounds like I can’t at this point?

I guess the other alternative is to have an image backup of my laptop and then a Duplicati backup of the image files, which would allow me to more easily restore the image files if my laptop dies on the road.

For upload, the amount Duplicati transfers depends on how extensive the changes are in the files compared to previous versions. Duplicati only uploads changed data, but some file formats change a lot even after small edits. You can test yours.

For dropping Veeam, that depends on how you prefer to restore the OS and apps, but that may depend on whether this is a corporate image or a personal system. It used to be that retail PC vendors would include a CD to get things back up, but these days most seem to use a recovery partition. Can’t use it if drive breaks.

Drive breaking on the road is even tougher unless you’re actually willing to carry (and risk) a portable drive, because 1 TB or 750 GB over WAN is going to be slow (but speed may vary depending on where it’s done).

I didn’t quite understand the Duplicati restore of the image file on the road. You’d need Duplicati, a place to restore the image to, and Veeam to restore it to the blank drive. Sounds hard, but all solutions may be hard.

Sorry to speak in generalities. If you can say more about the specifics of your situation, thoughts may come.

Veeam is incremental meaning I’d have to download the initial full backup plus all incremental backups to restore the drive, all in huge file sizes meaning if internet conked out, I’d have to start the download from scratch. Duplicati would only have me download the files needed for the latest backup (or whatever I wanted) and in small file sizes where an internet glitch wouldn’t be the end of the world.

Neither scenario is great but better than being on the road, having the drive die or computer get stolen or something and having no recourse whatsoever.

Block-based storage engine explains how Duplicati 1.3 was incremental in that way (a full backup, then a series of incrementals). The current version is still sort of incremental, but not in the standard way: the first backup is a block-based incremental-from-nothing. A restore still needs to download all the blocks of a file (an issue for huge files), but it won’t download a series of huge files to get to the latest one, and you can set --number-of-retries to ride out connection glitches.
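The block-based idea can be sketched roughly like this. It’s a toy model, not Duplicati’s actual format: fixed 1 KiB blocks and a plain directory standing in for the remote store, with each block stored under its hash so only blocks never seen before get “uploaded”:

```shell
#!/bin/sh
# Toy sketch of block-based backup: split a file into fixed-size blocks,
# hash each block, and "upload" (store) only blocks not already present.

STORE=$(mktemp -d)   # stands in for the remote destination

backup_file() {
    src=$1
    tmp=$(mktemp -d)
    split -b 1024 "$src" "$tmp/blk."   # fixed 1 KiB blocks
    uploaded=0
    for blk in "$tmp"/blk.*; do
        h=$(sha256sum "$blk" | cut -d' ' -f1)
        if [ ! -f "$STORE/$h" ]; then
            cp "$blk" "$STORE/$h"      # new block: "upload" it
            uploaded=$((uploaded + 1))
        fi
    done
    rm -rf "$tmp"
    echo "$uploaded"                   # how many blocks were uploaded
}
```

The first run of a file uploads every block it hasn’t seen (the incremental-from-nothing); a second run over an unchanged file uploads nothing, which is why there’s no chain of full-plus-incremental files to replay on restore.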