Hello everyone,
I want to back up about 3 TB of data. The backup process itself works fine: after tweaking the configuration and trying various suggestions from GPT, I've reached a relatively stable setup, with a full backup taken every two days. The problem is restoring. When I start a restore, the process stays stuck on loading and eventually fails with an error.
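For reference, what I'm attempting is roughly the CLI equivalent of the following restore (the backend URL, paths, and passphrase here are placeholders, not my real values):

```bash
# Run inside the Duplicati container; all values are placeholders.
duplicati-cli restore file:///backups/myjob "*" \
  --restore-path=/restore \
  --passphrase="my-secret" \
  --dbpath=/config/myjob.sqlite
```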
Is Duplicati simply not suited to handling this amount of data? If so, what is its recommended limit?
Both the backup source and destination are on the same server but stored on separate SSDs, so speed shouldn’t be an issue.
I'd appreciate any guidance. I'm on Linux, running Duplicati in Docker, and I updated to the latest version a few days ago.
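For completeness, the container is started with something like this (the linuxserver.io image and the mount paths are generic stand-ins for my actual setup):

```bash
# Web UI on port 8200; /config holds the job databases,
# /source is the 3 TB data SSD, /backups is the destination SSD.
docker run -d \
  --name=duplicati \
  -e PUID=1000 \
  -e PGID=1000 \
  -p 8200:8200 \
  -v /srv/duplicati/config:/config \
  -v /mnt/ssd1/data:/source \
  -v /mnt/ssd2/backups:/backups \
  lscr.io/linuxserver/duplicati:latest
```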
Thanks in advance!