Best practices for backing up virtual machines?

I’m considering either running Duplicati inside a Windows VM running in VMware, or just backing up the VM files from the host. There will be a lot of Windows system-y stuff that isn’t the typical “user files” backup use case, but getting this VM up and running after a failure would require significant time, and there is a definite attraction to having a backup of the whole thing.

Are there any best practices for backing up a VMware VM? It seems like if I could figure out the block size VMware uses for the VMDK file and match that with Duplicati I would have a better chance of deduplication working. Anything else?

Hey kbyrd,

I don’t think Duplicati is the appropriate tool for what you want to do. Duplicati doesn’t support backing up and restoring entire Windows machines, which would require backing up the “System State”. If you install Duplicati within each VM, you also forgo deduplication across VMs; it would only dedupe the data within each one. Depending on how your VMs are hosted, it’s best to grab the .vmdk files. I personally back up my Hyper-V machines while they are cold (shut down during the backup), and I’ve achieved the best performance by setting the block size to 700KiB and reducing the compression level to 1 when using the default zip compression. How are you hosting the VMs — ESXi or Workstation — and what operating system is the host running?
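For reference, a minimal sketch of what those two settings look like on the Duplicati command line. The source and destination paths here are placeholders for your own setup, and `duplicati-cli` is the Linux/macOS name of the CLI (on Windows it’s `Duplicati.CommandLine.exe`):

```shell
# Hypothetical paths; swap in your own VM folder and backend URL.
# --blocksize controls the dedup block size, --zip-compression-level
# the zip effort (0 = store, 9 = max). Values below mirror the post.
duplicati-cli backup \
  "file:///mnt/backup-target" \
  "/vms/win10-vm/" \
  --blocksize=700KB \
  --zip-compression-level=1 \
  --passphrase="your-passphrase-here"
```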


> I’ve achieved best performance from setting the block size to 700KiB and reducing the compression level to 1 when using the default zip compression. How are you hosting the VMs, on ESX or Workstation and which operating platform is used for the host?

This is just for personal backups running VMware Fusion on OS X. I would only do the backups with the VMs shut down. How did you pick that 700KiB block size?

Mainly due to the sizes of the Hyper-V machines themselves and the amount of change within the container file. When I was testing, a simple power-on and power-off would yield around 30MB of changes. I could have pushed it to 1MB since I’m backing up locally to a directly attached storage device, but it didn’t really matter much in terms of performance. Maybe do a few runs and see what you get in terms of compression and dedupe ratios. I suspect it won’t be much, but it all comes down to how the data is laid out in the container file (.vmdk).
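If you want to compare candidate block sizes before committing to one, a rough sketch of the arithmetic: count how many fixed-size blocks a set of changed byte ranges would touch, since every touched block gets re-uploaded. The regions below are made-up illustration data, not real .vmdk deltas:

```python
def upload_bytes(regions, block_size):
    """Bytes re-uploaded for one run: every block touched by any
    changed region (start, length) must be sent again."""
    touched = set()
    for start, length in regions:
        first = start // block_size
        last = (start + length - 1) // block_size
        touched.update(range(first, last + 1))
    return len(touched) * block_size

# Hypothetical example: ~30 MiB of changes spread over three regions.
regions = [
    (0, 10 * 2**20),            # 10 MiB near the start of the file
    (500 * 2**20, 15 * 2**20),  # 15 MiB in the middle
    (2000 * 2**20, 5 * 2**20),  # 5 MiB near the end
]
for bs in (100 * 1024, 700 * 1024, 1024 * 1024):
    mib = upload_bytes(regions, bs) / 2**20
    print(f"{bs // 1024:>5} KiB blocks -> {mib:.1f} MiB uploaded")
```

With large contiguous changes the block size barely matters; it starts to matter when the same 30MB is scattered across many small writes, which is why measuring your own workload is worthwhile.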

You might use the free Veeam virtual machine backup (or other software) to back up to a NAS or USB drive, and let Duplicati handle uploading the changed Veeam images from the NAS to the cloud.

There’s that, or changing the storage, which is what I did. I now have my VMs sitting on a FreeNAS box and back them up through snapshots. I then send the snapshots to another FreeNAS at a remote location.
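For anyone curious what that snapshot replication looks like, a minimal ZFS sketch; the pool/dataset names, snapshot labels, and remote host are placeholders for your own layout:

```shell
# Take a snapshot of the dataset holding the VMs (names are examples).
zfs snapshot tank/vms@daily-2

# First transfer: send the full snapshot to the remote box.
zfs send tank/vms@daily-1 | ssh backup-host zfs receive tank-backup/vms

# Subsequent transfers: send only the incremental delta between snapshots.
zfs send -i tank/vms@daily-1 tank/vms@daily-2 \
  | ssh backup-host zfs receive tank-backup/vms
```

The incremental send only transfers blocks that changed between the two snapshots, so it plays the same role block-level dedup does in Duplicati, but at the filesystem layer.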