I’m looking to define a backup for a drive that has a lot of fairly large (>= 10MB) files.
The other important thing about the files is that they don’t change often (in most cases, no more than once). In fact, the larger the file, the less frequently it changes (e.g., most of the larger files are Photoshop PSD files I’ve created/optimized and don’t edit further).
Here’s the breakdown:
| File Size Range | Number of Files |
|---|---|
| <= 1 MB | 18,854 |
| >1 and <=2 MB | 5,358 |
| >2 and <=5 MB | 15,028 |
| >5 and <=10 MB | 2,333 |
| >10 and <=25 MB | 15,499 |
| >25 and <=50 MB | 4,697 |
| >50 and <=100 MB | 254 |
| >100 and <=250 MB | 298 |
| >250 and <=500 MB | 82 |
| >500 and <=1000 MB | 86 |
| >1000 and <=2500 MB | 108 |
| >2500 and <=5000 MB | 8 |
| >5000 and <=10000 MB | 3 |
| >10000 and <=100000 MB | – |
| >100000 MB | 1 |
What would be a good block size to consider? And by block size I mean anything size related that I can specify when defining a backup (I’m new to Duplicati and don’t know the terminology all that well yet).
Choosing sizes in Duplicati covers some of this. Beyond that, try to set the blocksize large enough that your backup ends up with fewer than a million (or at most a few million) blocks; otherwise database operations can become slow.
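As a rough sanity check, you can estimate the block count from the table above. The midpoint-per-bucket estimate below is an assumption (actual sizes within each bucket are unknown, and the single >100000 MB outlier is excluded), but it gives a ballpark for picking a blocksize:

```python
# Estimate total data size from the size-distribution table using bucket
# midpoints (an assumption), then see how many blocks each candidate
# blocksize would produce. The one >100000 MB file is left out.
buckets_mb = [
    (0, 1, 18854), (1, 2, 5358), (2, 5, 15028), (5, 10, 2333),
    (10, 25, 15499), (25, 50, 4697), (50, 100, 254), (100, 250, 298),
    (250, 500, 82), (500, 1000, 86), (1000, 2500, 108),
    (2500, 5000, 8), (5000, 10000, 3),
]

total_mb = sum((lo + hi) / 2 * n for lo, hi, n in buckets_mb)
print(f"estimated total: {total_mb / 1024:.0f} GB")  # ~921 GB

# 0.1 MB (100 KB) is Duplicati's default blocksize.
for blocksize_mb in (0.1, 1, 5, 10):
    blocks = total_mb / blocksize_mb
    print(f"blocksize {blocksize_mb:>4} MB -> ~{blocks / 1e6:.1f} million blocks")
```

With ~900+ GB of data, the default 100 KB blocksize lands around nine million blocks, while a blocksize of 1 MB brings it under a million, so something in the 1–5 MB range would fit the "fewer than a million blocks" guideline.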
Thanks very much for your reply. I’d read that article, but it wasn’t clear, at least to me, how to apply its contents to the issue I raised.
In any event, I’ve abandoned Duplicati in favor of Arq Backup, which does everything I need and is much, much easier, at least for me, to configure and use.
Interestingly, since I posted the same info in a reply this afternoon, I’m now forbidden from posting links to the Arq Backup site. How weird.
Your posts were automatically flagged by the system as suspicious activity (making multiple posts containing the same URL). It looked like spammer behavior to the system. I restored your posts because nothing was wrong with them. Thanks!