Performance issue with large database dump

I have to back up an MS-SQL database dump file (64GB), created by a backup script, from a Microsoft server to a Buffalo NAS over 1Gb LAN every night. Duplicati runs on the server; the NAS is mapped over SMB. When I copy the file with Explorer it takes around 1.5h. The first Duplicati backup session took around 7h. I thought the next incremental backups would be much shorter, but they aren't. The last backup (the 7th) took 8.3h. I'm using default settings; I only changed the volume size to 200MB. Any tips?

Hello @redbull99 and welcome to the forum!

Are you familiar with what a dump file contains? If it's a sequential dump, an insertion will mis-align everything after that point, which will prevent it from being recognized for deduplication, leading to what is effectively a complete backup each time…
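To illustrate the misalignment effect: Duplicati splits files into fixed-size blocks (100KB by default) and deduplicates by block hash. A rough sketch, using a tiny blocksize purely for illustration:

```python
# Sketch: why a single insertion defeats fixed-blocksize deduplication.
# Blocks are cut at fixed offsets, so inserting even one byte shifts
# every block boundary after the insertion point.
import hashlib

BLOCKSIZE = 4  # bytes, for illustration only; Duplicati's default is 100KB

def block_hashes(data: bytes):
    """Hash each fixed-size block, as a dedup engine would."""
    return [hashlib.sha256(data[i:i + BLOCKSIZE]).hexdigest()
            for i in range(0, len(data), BLOCKSIZE)]

old = b"AAAABBBBCCCCDDDD"
new = b"AAAAXBBBBCCCCDDDD"  # one byte inserted after the first block

old_hashes = set(block_hashes(old))
new_hashes = block_hashes(new)

# Only blocks before the insertion point still match; everything after
# is shifted, hashes differently, and must be stored/uploaded again.
shared = sum(h in old_hashes for h in new_hashes)
print(f"{shared} of {len(new_hashes)} blocks deduplicated")
```

Here only the first block deduplicates; the other four are "new" even though almost all of the data is unchanged. Scaled up to a 64GB dump, that is why an incremental run can take as long as the first one.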

If MS-SQL means SQL Server, there's quite a bit of backup advice out there worth searching for. For example:

Back Up and Restore of SQL Server Databases

Choosing sizes in Duplicati describes the usual tradeoff between heaviest deduplication and other goals, but in an insertion situation the deduplication design (which uses a fixed blocksize) isn't effective no matter which size you pick.
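One side of that tradeoff is bookkeeping cost: every block produces a hash that has to be computed, stored, and looked up. A rough back-of-the-envelope sketch (the numbers are illustrative; 100KB is Duplicati's default blocksize):

```python
# Sketch: block-count arithmetic behind the blocksize tradeoff.
# Larger blocks mean fewer hashes to compute and track (faster, smaller
# local database) but coarser deduplication granularity.
file_size = 64 * 1024**3  # the 64GB dump from the question

for blocksize_kb in (100, 1024, 10240):
    blocks = file_size // (blocksize_kb * 1024)
    print(f"{blocksize_kb:>6} KB blocksize -> {blocks:,} blocks to hash and track")
```

With the default 100KB blocksize a 64GB file is roughly 670,000 blocks, which is a lot of hashing and database work per run; but as noted above, no blocksize choice helps if an insertion has shifted all the block boundaries anyway.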