I’ve been using Duplicati on a Windows machine for a couple of years, but only for backups of small files (about 1 TB of document-type files and photos). Now I’ve set up a new computer with a lot more storage and want to start backing up my NAS, which currently holds around 20 TB of large media files that are between 1 GB and 10 GB each (a small percentage are larger, plus an equal number of smaller metadata files that go with each media file).
I’ve currently set up, and successfully run, a backup using blocksize 50 MB and dblock-size 5 GB. I set the “remote volume size” to 5 GB as well, since I believe it’s the same thing as dblock-size, but it gave a warning (though it still let me do it). Note that the backups are all on my local LAN; both the NAS and the backup box have 10GbE. These files are not very compressible (source and backup end up the same size), as they are mostly x264- and x265-encoded data. The small metadata files (one or two per media file) probably add up to less than 10 GB total.
My question is this: do my blocksize and dblock-size seem reasonable? Are remote volume size and dblock-size the same thing? My goal is to avoid 20 TB turning into 20,000 files in one folder, so I went with 5 GB volumes, which works out to about 4,000 files, which still seems a bit excessive (rough math below). Currently I’m backing up individual sections; the largest was 2 TB, but I now have one section that is 10 TB and will take a day to back up, so I’d like to get it right the first time if possible. Any suggestions appreciated.
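For reference, here’s the rough math I’m doing on file counts, as a quick Python sketch. The sizes are just my numbers, and it only counts dblock volumes (ignoring dindex/dlist files, compression, and deduplication), so the real count will differ a bit:

```python
# Rough estimate of how many dblock (remote volume) files land in the
# destination folder for a given source size and remote volume size.
# Ignores dindex/dlist files, compression, and dedup, so treat the
# output as a ballpark, not an exact count.

def estimated_dblock_count(source_bytes: int, volume_bytes: int) -> int:
    """Ceiling division: volumes needed to hold the source data."""
    return -(-source_bytes // volume_bytes)

TB = 1000**4
GB = 1000**3

source = 20 * TB  # ~20 TB of media on the NAS

for volume_gb in (1, 2, 5, 10):
    count = estimated_dblock_count(source, volume_gb * GB)
    print(f"{volume_gb} GB volumes -> ~{count:,} dblock files")

# 1 GB volumes -> ~20,000 dblock files
# 2 GB volumes -> ~10,000 dblock files
# 5 GB volumes -> ~4,000 dblock files
# 10 GB volumes -> ~2,000 dblock files
```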
Oh, and lastly: since these are large media files, they don’t change often. I’m not too worried about a few blocks needing to be redone when a file changes, since that will be infrequent. I understand restores will take longer, but I don’t plan to restore much unless there’s a total failure, in which case I’ll be restoring everything. I mean, I’d hate for a single-file restore to take 5 hours, but this is on a local machine that is otherwise performant. Any tips appreciated!
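To put a number on that single-file-restore worry, here’s the same kind of back-of-envelope estimate, assuming the file’s blocks were packed more or less contiguously into volumes when it was backed up (fragmentation or later changes would add volumes to read):

```python
# Back-of-envelope for restoring one large file: how many blocks it has
# and roughly how many remote volumes would need to be read.
# Assumes the file's blocks sit contiguously in volumes, which seems
# plausible for a big file backed up in a single run.

GB = 1000**3
MB = 1000**2

def restore_estimate(file_bytes: int, block_bytes: int, volume_bytes: int):
    blocks = -(-file_bytes // block_bytes)        # ceiling division
    volumes = -(-file_bytes // volume_bytes) + 1  # +1 for partial volumes at the edges
    return blocks, volumes

blocks, volumes = restore_estimate(10 * GB, 50 * MB, 5 * GB)
print(f"~{blocks} blocks, spread across roughly {volumes} volumes")
# ~200 blocks, spread across roughly 3 volumes
```

So for my sizes it looks like a handful of 5 GB volumes per file, which over 10GbE doesn’t sound like a 5-hour job, but I’d welcome corrections if that reasoning is off.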