It was not Docker hanging; it was Docker using so many CPU cycles and so much memory that at one point the whole NAS slowed to a crawl.
That being said:
A backup to USB uses a lot less memory and runs at around 6.7 MB/s while the first 670 MB MOV file is processed. That is rather slow, but at least it works. Here is the Docker status right after the first MOV file is processed:
Seeing this, I deleted the Sharepoint backup job and created a new one. It still used too much memory and ran slower than the USB job at only around 2 MB/s, but it did work and even started uploading several files before I quit the job. Here is the Docker status right after the first MOV file is processed:
Looks like uploading to Sharepoint uses more memory, although the Docker number does not correspond to the real NAS memory usage anyway. At least it works now and even leaves the NAS with about 1 GB of free memory to work with.
So why did my last backup job not work when this one does (albeit slowly)? I suspect that the old backup job's file got corrupted. It was an import of the job I had exported from my Windows PC (a JSON file) to begin with. Then I added and removed a couple of advanced options, and in the end it all went down the drain.
This should not have happened, but at least we now know that old backup jobs can get corrupted and that creating new ones can fix some issues.
And here we go again. I changed one (1!) option by switching to the Sharepoint "binary-direct-mode" and my NAS was drained of memory again. The Docker status claims only 600 MB is being used, but my NAS's memory was maxed out nonetheless.
Next I shut down the Docker container, restarted it, and removed the "binary-direct-mode" option, and this time it is working again. So either I had that option set in the old backup job without noticing (most likely under the "Destination" advanced options) or it got stuck in the job file after being removed.
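For the record, here is roughly what the two variants look like as Duplicati target URLs. The tenant, site, and path are placeholders for my actual setup, and I am going by the usual Duplicati convention of appending advanced options to the destination URL as query parameters:

```
# Hypothetical Sharepoint destination with the problematic option enabled:
mssp://contoso.sharepoint.com/sites/backup/Documents/duplicati?binary-direct-mode=true

# The same destination without the option (the configuration that works for me):
mssp://contoso.sharepoint.com/sites/backup/Documents/duplicati
```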
This time it is running at about 2.8 MB/s, which roughly matches my router's throughput monitor. An advantage of using the NAS is that I can prioritize it more easily in my router.
I just noticed that Docker's "Overview" tab lists the real CPU and memory load, as opposed to the numbers listed for each container. So I will take another look at the real differences between the USB and Sharepoint targets.
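If SSH is enabled on the NAS, the live per-container numbers can also be pulled straight from the Docker daemon, which makes comparing the two runs easier (the container name below is just what I would call mine):

```
# One-shot snapshot of CPU and memory usage for the Duplicati container:
docker stats --no-stream duplicati
```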
Overall, Docker seems a bit memory-hungry, starting at 240 MB without any container running. HyperBackup is surely less demanding on memory.
There is going to be some overhead with the Docker engine. If your NAS is really that memory-starved, then maybe it's not a good idea. You could go back to using the Synology packages for Duplicati and Mono; perhaps that's a bit less memory-intensive.
Unfortunately, the Mono + Duplicati combination keeps crashing even with the latest Mono 5 and Canary packages installed.
I searched a bit for the "Lost connection" error, but only found the advice to make sure the Synology user is part of the "http" group. That was already the case, so I am currently out of ideas as to why it keeps throwing me out of the UI.
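For anyone hitting the same error: the group membership can at least be checked over SSH. I believe the DSM tool for this is synogroup, though I have not double-checked its flags, so treat this as a sketch:

```
# List the members of the "http" group on DSM (run via SSH):
sudo synogroup --get http
```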
Since my last Windows-based backup failed due to the file number limit (i.e. too small a volume size), I am now in the process of doing another upload via Docker. Memory usage is currently 440 MB according to the Docker Overview, and the backup just passed the first 670 MB MOV file.
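On the volume size itself: as far as I understand, every remote volume produces a dblock plus a matching dindex file, so the destination ends up with roughly 2 x (source size / volume size) files. Here is a sketch of raising the size on the command line, assuming the duplicati-cli wrapper is available and with the URL, source path, and value as placeholders:

```
# Raise the remote volume size so fewer files land on the destination.
# 200MB is just an example; larger volumes mean fewer files, but restores
# have to download more data per volume.
duplicati-cli backup "mssp://contoso.sharepoint.com/sites/backup/Documents/duplicati" \
    /volume1/photo --dblock-size=200MB
```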
No idea why the Sharepoint "binary-direct-mode" option sucked the life out of my NAS's memory, but it seems I will have to live with lower upload speeds and rather high memory usage if I want to use Duplicati on the NAS.
I have to admit that I would prefer being able to just use HyperBackup at this point, but there is no way of making it connect to Sharepoint Germany (not even via WebDAV).
Duplicati still seems too unreliable, and the Docker solution is too resource-hungry and slower than my upload bandwidth. So for the time being I have decided to try a workaround: uploading my already present Synology HyperBackup files to Onedrive Business Germany (aka Sharepoint) instead.
Because HyperBackup and CloudSync do not support connecting to Sharepoint, I am using the Onedrive app on my desktop PC, pointed at the NAS share with the HyperBackup files via a symbolic link. That way the Onedrive desktop client is responsible for uploading any changes. It costs a bit of CPU load on the NAS for both the SMB and NTFS services (the HyperBackup files live on an NTFS-formatted USB drive attached to the NAS), but overall it seems rather resource-friendly and fast (enough) on first tries.
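For reference, the link on the Windows side is just a directory symlink created in an elevated Command Prompt; both paths below are placeholders for my setup:

```
:: Create a directory symlink inside the local OneDrive folder that points
:: at the NAS share holding the HyperBackup files:
mklink /D "C:\Users\me\OneDrive - Contoso\NAS-Backup" "\\DiskStation\backup\HyperBackup"
```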
Glad you found a solution that works for you!
Not sure why you had so many issues. I ran the Duplicati Synology package on my NAS when it only had 2 GB… I was using B2 for the back end though.
Well, the solution is not working as hoped. After a reboot, OneDrive turns the symbolic link into a real directory, which means that it is not synchronized to the NAS anymore. Would have been too easy…
The alternative is to create a CIFS mount point on the NAS (sketched below) and then automatically copy the files via USB Copy. But that stupid program only allows setting up one job per destination, so it only works for one large HyperBackup source, not for several smaller ones.
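For completeness, this is roughly what the mount step would look like over SSH (DSM can also do this from File Station via "Mount Remote Folder"); the server, share, local path, and credentials are placeholders:

```
# Mount a remote CIFS share on the NAS so a copy job can use it as a destination:
sudo mount -t cifs //remote-server/backup /volume1/cloudmount \
    -o username=backupuser,password=secret,vers=3.0
```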
I specifically looked into Duplicati because of its Sharepoint integration, but from what I have seen I would still prefer HyperBackup (for both setup and reliability). The Docker solution is one that works (as long as binary upload mode is not used), but several hundred MB of memory usage just for a backup seems a bit excessive, and there are still the reliability issues.