I configured Duplicati successfully.
The issue now is that the data transfer from my server to OneDrive for the backup is ridiculously slow: 621848 folders to go at 2.61 KB/s!
Also, it says there is 128 TB of data, but I only have 20 GB on the server, of which only 13 GB are used!?
If anyone has an idea of what to do, I would be thankful!
It would help if we had an idea of what you are doing. At the moment all that is known is that you have a 'server', which is about as vague as it can get. That your drive is reported as having 128 TB suggests it's not a vanilla SATA drive, so an exotic, non-standard backup source could explain the low speed. Also, you 'configured Duplicati'. How, more precisely? Duplicati can run in many different contexts.
That’s a lot of folders. Where are you seeing a folder count? On the status bar, I see a file count.
Written where? And if it’s something that changes, any idea what it was saying at an earlier point?
Status bar numbers (if that’s the source) are many, and are constantly changing. Need info…
When it says “Counting”, it’s figuring out its workload. After that, the work backlog goes down.
At some point there’s enough data for first upload (and a speed), but uploads later might vary.
It’s certainly logically possible to have file sizes that exceed space used, e.g. with a sparse file.
Files with apparent 86TB file size, how to shrink them? was a recent real-life question on Linux.
Maximum logical file size varies with OS and the filesystem on the OS. What are you running?
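On Linux, you can see the sparse-file effect directly: apparent size and allocated size are reported separately. A minimal sketch (the `/tmp/sparse_demo` path is just an example; `truncate`, `du`, and `stat` here are GNU coreutils, and the filesystem must support sparse files, e.g. ext4):

```shell
# Create a file with a 10 GB apparent size but no allocated blocks
truncate -s 10G /tmp/sparse_demo

# Apparent (logical) size: 10 GB
ls -lh /tmp/sparse_demo

# Actual disk usage: ~0, since no data blocks were written
du -h /tmp/sparse_demo

# stat shows both: %s is apparent bytes, %b is allocated 512-byte blocks
stat -c 'apparent=%s bytes, allocated=%b blocks' /tmp/sparse_demo

rm /tmp/sparse_demo
```

A backup tool that reads files by their apparent size will still walk through all those zero-filled gaps, which is one way a 20 GB disk can report terabytes of "data".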
That sounds like the upload rate on the status bar. Are its other numbers going down? How fast?
It would be worth looking at the job on the home screen, to see if it’s slowly reading a big sparse file.
The gaps in a sparse file are zero-filled, so would not require any uploading, and speed will drop.
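If you suspect a sparse file in the backup source, GNU find can locate candidates: its `%S` directive prints sparseness (allocated size divided by apparent size), so values well below 1 flag sparse files. A sketch, with `/data` standing in for whatever your source folder actually is:

```shell
# List sparse files under /data (hypothetical source path):
# %S = sparseness ratio, %s = apparent size in bytes, %p = path.
# Keep files that are noticeably sparse and bigger than 1 MiB.
find /data -type f -printf '%S\t%s\t%p\n' 2>/dev/null \
  | awk '$1 < 0.8 && $2 > 1048576'
```

Any file this prints is one Duplicati would read at its full apparent size while uploading almost nothing for the zero-filled gaps.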