Something odd is going on. I'm using Duplicati 22.214.171.124_canary_2021-03-10. I have three backups on the same machine, going to the same destination, running daily. If I click Restore on two of them, a second or two later I get the list of backup versions and the directories. On the third backup, clicking Restore sits a very long time on "Getting file versions" (more than 10 minutes), eventually gives me a missing XSRF token error, and then shows the list of backup versions. I presume the error is due to how long it takes to build the file version list. I have already deleted and recreated the database. The backup destination is a remote machine over SFTP. The backup itself works fine; only the restore has this problem.

While the file version listing is running, I can see the upload link from my site to the remote machine is saturated. To list file versions I would have expected downloads (or reads from the local database), but maybe that somehow makes sense. I don't know what else to look at. Upload can't possibly go any faster where I live (3-3.5 Mbps max), and why should listing file versions use so much upload bandwidth?
Both of the other backups list file versions in a couple of seconds. I don't understand why this one is different; backup sizes and version counts are about the same across the three. The SQLite database for the one that fails is about a quarter the size of the database for a backup set that restores fine, and it's the smallest SQLite file of the three backup sets.
Additional info: the source is 19 GB, the backup is 30 GB, and there are 19 versions.
UPDATE: This is a non-issue; it was my fault. I found an error in my after-backup job: it was doing work when it shouldn't have been, because it never checked which operation was being run.
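For anyone hitting something similar, this is roughly the kind of guard my after-backup job was missing. It's only a sketch: it assumes the script is hooked up with Duplicati's --run-script-after option and that the current operation name is exposed in the DUPLICATI__OPERATIONNAME environment variable (as in Duplicati's run-script examples). The rsync command and paths are made-up placeholders for whatever the real post-backup work is.

```python
#!/usr/bin/env python3
# Sketch of a --run-script-after script that only does its work after a real
# backup run. Without the operation check, the same script fires on restores
# and listings too, which is what was eating my upload bandwidth.
import os
import subprocess
import sys

# Duplicati's run-script examples expose the operation name in this variable.
operation = os.environ.get("DUPLICATI__OPERATIONNAME", "")

# Anything other than a backup (Restore, List, Repair, ...) should be a no-op.
if operation != "Backup":
    print(f"Skipping post-backup work for operation: {operation or 'unknown'}")
    sys.exit(0)

# Placeholder for the real post-backup work, e.g. mirroring the backup folder
# to a second location. Command and paths here are hypothetical.
result = subprocess.run(
    ["rsync", "-a", "/backups/duplicati/", "remotehost:/backups/duplicati/"],
    check=False,
)
sys.exit(result.returncode)
```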