New to Duplicati, looking for ideas on how to manage a backup of a large amount of data (4 TB HDD)

I'm looking for a backup solution for a large amount of data. My current scenario is 2x 4 TB HDDs in RAID 1 on SATA 3 (6 Gb/s) using an Intel RAID controller. Every day, SQL tables and other important data get backed up to a remote folder (1 Gbps network), and once a month I perform a manual backup (a full server system image) to an external HDD via a USB 3.0 interface. The problem is speed: the current image, using Clonezilla with no compression or verification, takes almost 48 hours to complete, and the daily backups take approximately 1 hour.

The server runs software with a database, and it is not accessible while I perform a full backup. I know a SATA HDD as a destination would be a lot faster, but the server software blocks itself when it detects a hardware change at boot, so using an internal HDD for the backup is not an option (if the PC accidentally boots into Windows, the software will detect the hardware change and lock itself, and I'd have to wait for a new license from the provider).
At the current growth rate I'll need to add 2 more 4 TB HDDs in March or April and 2 more by the end of the year; the current backup approach will no longer be useful, since the procedure will take more than a weekend to complete.

I am looking for ideas for a backup method that, in case of failure, lets me get the server operational as fast as possible while maintaining the relatively high security I have now (moving the image HDD off-site and storing it in a safe box).

Any ideas on how to make the backup fast enough will be appreciated. The internet connection here is slow, so using the cloud is not an option (1 Mb/s upload only).

Hello @Ramiro85, welcome to the forum!

It looks like you’ve got a pretty good setup going right now, other than the manual intervention and downtime.

Let me start by saying that Duplicati is a file-level backup tool, so if you have a hard drive failure you'll still have to manually install the OS, programs, etc. (or restore from Clonezilla) and then restore from Duplicati before you'd be back online.

If that's still adequate for you then you might want to consider how much data changes in the various parts of your backup source. If it's all pretty static EXCEPT for one area (such as the database) you might want to consider two backup jobs - one for the static data and one for the more dynamic stuff.
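For example, from the command line the two jobs might look something like the sketch below (a rough sketch only: the install path, destination share, source folders, and option values are all placeholder assumptions, and most people would set up the equivalent jobs through the web UI instead):

```powershell
# Sketch only - every path, share name, and option value here is a placeholder.

# Job 1: the bulky, mostly-static data - larger upload volumes
& "C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe" backup `
    "\\backup-pc\duplicati\image-data" "D:\ImageData" `
    --backup-name=ImageData --dblock-size=200MB

# Job 2: the smaller, fast-changing database dumps, run on a tighter schedule
& "C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe" backup `
    "\\backup-pc\duplicati\sql" "D:\SQLDumps" `
    --backup-name=SQLDumps
```

Splitting them that way also lets the small, frequent job finish quickly without waiting on the bulky one.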

Alternatively, if you've got critical and non-critical data (as in as-fast-as-possible restores vs. can wait a day or two) then splitting them up that way might make more sense.

If reducing destination disk space usage is of primary importance then a single big backup would give the best deduplication, though honestly exactly how much benefit that would provide is hard to estimate and would depend on your data.

One other thing to consider is versioning - are you looking to keep multiple versions of files or just a single one?


The idea behind using Clonezilla is to avoid reinstalling the OS and software, since every other PC in the facility uses the server to store data and run the software. If the server is offline, the medical staff can't work, the company can't take payments, etc., so minimizing downtime is my priority now.
Basically two areas change: the image data, which grows over time (there is little change to already-recorded data, but new data gets added at approximately 10 GB/day), and the database, where changes are frequent over short time periods.

> you might want to consider two backup jobs - one for the static data and one for the more dynamic stuff.

Actually, I already have 2 separate backup tasks, one for SQL and one for the image data folder.

Currently I keep 2 versions of the full backup, one (the latest Clonezilla image) on the external HDD and one on my own home server (both off-site from the server's location), plus the incremental backup on the remote PC at the working site (30 days of incrementals and then a new full).
So the idea of the current setup is to get the server operational ASAP in case of failure, while keeping the backup procedure as fast as possible.
I was thinking of adding a fiber network card to the server and to the remote PC to increase the backup speed. The question is how fast it would be, and whether that is the logical next step or there are other options I'm missing. Does anyone have any experience performing backups over fiber optic?
The current Clonezilla image is limited by the USB 3.0 speed; I'm thinking moving to fiber would give me more speed for the backups…

I'm not sure on this, but I suspect you'll find Duplicati backing up 10GB of new data per day to be slower than you need.

If you don't already know them, I'd suggest checking the speeds at all your communication points before installing a fiber card. For example, if the remote PC you're backing up to is reached over the internet and your bandwidth is 20 MBps, then having a 2 GBps internal network won't give you any improvement, since the internet leg would still be 20 MBps.
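To put some rough numbers on that (pure back-of-the-envelope math, assuming the link is the only bottleneck and roughly 4 TB has to move for a full copy; real-world throughput will be lower):

```powershell
# Rough transfer-time estimates; the slowest hop in the chain sets the pace.
$full  = 4TB    # approximate size of a full copy
$daily = 10GB   # approximate daily growth mentioned above

"{0:N1} h for a full copy at 20 MB/s (slow hop)"      -f ($full / 20MB / 3600)
"{0:N1} h for a full copy at ~110 MB/s (1 Gbps LAN)"  -f ($full / 110MB / 3600)
"{0:N1} h for a full copy at ~1 GB/s (10 Gbps link)"  -f ($full / 1GB / 3600)
"{0:N1} min for the daily delta at ~110 MB/s"         -f ($daily / 110MB / 60)
```

Those are wire-speed numbers only; the disks on both ends and the backup tool's own hashing and compression add overhead on top, which is why it's worth measuring each link before buying hardware.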

While I personally use Clonezilla for backing up one of my servers, that server can handle being offline for 5 or so hours while the backup is made. In your case, you might want to consider some other imaging tools that allow for live images. I've seen 3 or 4 of them, though I'm pretty sure that, unlike Clonezilla, they're all non-free for business use.