I’m trying to back up my full Ubuntu VPS, which has around 20GB of data. At some point during the backup, the total size jumps to a huge number: "399290 files (128.01 TB) to go". I have excluded some folders, so I don’t really understand why this is happening. Excluded: "/dev/*","/proc/*","/sys/*","/tmp/*","/run/*","/mnt/*","/media/*","/lost+found"
Did the backup finish despite that (i.e., was it a false alarm), or did you have to kill it somehow?
It looks like the trick is to set dry-run=true (which seems a bit odd to me) to get the file sizes logged.
Also set log-file=<path> and log-file-log-level=verbose. Look for lines saying “Would add” and “size”.
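For reference, here’s roughly what that could look like from a terminal. This is only a sketch: the `duplicati-cli` command name, target URL, and log path are assumptions for illustration; the dry-run, log-file, and log-file-log-level options are the ones mentioned above.

```bash
# Dry-run the backup and write a verbose log so per-file details get recorded
duplicati-cli backup "file:///path/to/backup-target" / \
  --dry-run=true \
  --log-file=/var/log/duplicati-dryrun.log \
  --log-file-log-level=verbose

# Then look for the per-file entries and their sizes
grep -i "would add" /var/log/duplicati-dryrun.log
```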
Note that Linux sparse files can have huge apparent sizes without containing much actual data. An example:
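Something like this (a quick sketch; the path and size are arbitrary) shows how a sparse file’s apparent size can dwarf what it actually stores:

```bash
# Create a 1 TB sparse file: no data is written, only the size is set
truncate -s 1T /tmp/sparse_demo

ls -lh /tmp/sparse_demo   # apparent size: 1.0T (what a backup job may count)
du -h  /tmp/sparse_demo   # allocated blocks: 0 (almost nothing on disk)

rm /tmp/sparse_demo
```

If apparent sizes are what get counted, a single sparse file like that could account for most of a 128 TB figure on a 20GB system.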
Do what? That’s a big quote. If you mean just back up your VPS, it seems like many providers can do that.
The ones I looked at in a web search seemed to do drive-image backups, not file-level backups the way Duplicati does.
Frequency of backups/snapshots also varied. You’ll probably need to decide on your needs, then go looking.