Was something actually failing? If you are hitting the infamous 5000 file list view threshold (courtesy of the underlying SharePoint), switching to a destination type that uses the Microsoft Graph API seems to resolve that.
The guidance for Storage Providers is a bit vague on which one to use, but the expert in this area says:
and just above that you can see my concern about such huge dblock files: to restore even one file whose data is spread across several dblock files, you might have to download many of them. See Choosing sizes in Duplicati.
How big is this backup? Before you reach the 5000 file list view limit you would be at roughly 12 TB, and performance would be slow unless you also used a blocksize of at least 10 MB (up from the 100 KB default). Unfortunately you can't change blocksize for an existing backup. You can change Remote volume size, but a smaller value only affects newly uploaded dblock files; that should be enough to test things, though.
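As a rough sanity check on those numbers, here's a small back-of-the-envelope sketch (plain Python; the 5 GB Remote volume size is only an assumed example picked to roughly match the 12 TB figure, not necessarily your actual setting). It estimates how many dblock files land on the destination and how many blocks the local database has to track:

```python
# Back-of-the-envelope estimate of remote file count and local block count.
# The 5 GB Remote volume size is an illustrative assumption, not your config.

def estimate(backup_tb, remote_volume_gb, blocksize_kb):
    backup_bytes = backup_tb * 1024**4
    dblocks = backup_bytes / (remote_volume_gb * 1024**3)
    # Each dblock is paired with a small dindex file, and each backup version
    # adds one dlist file, so the destination's file listing is roughly:
    remote_files = 2 * dblocks  # + one dlist per backup version
    blocks = backup_bytes / (blocksize_kb * 1024)
    print(f"{backup_tb} TB at {remote_volume_gb} GB volumes, {blocksize_kb} KB blocks:")
    print(f"  ~{dblocks:,.0f} dblock files, ~{remote_files:,.0f} remote files before dlists")
    print(f"  ~{blocks:,.0f} blocks for the local database to track")

estimate(12, 5, 100)    # default 100 KB blocksize: ~129 million blocks
estimate(12, 5, 10240)  # 10 MB blocksize: ~1.3 million blocks
```

The point of the second call is the blocksize effect: the remote file count stays the same, but the local database goes from tracking roughly 129 million blocks down to about 1.3 million, which is where the slow performance at default blocksize comes from.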
In your backup log's Complete log, the "KnownFileCount" value will show how many files you now have. If you aren't at risk of hitting 5000 soon, you could try reducing Remote volume size to see if the error clears.
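If you'd rather pull that number out of a saved copy of the Complete log than scroll for it, a rough sketch like this can dig it out, assuming the log text you copied parses as JSON and that you saved it to a file (the result.json name is just an example):

```python
import json

# Recursively search a saved Complete log (assumed JSON) for a key,
# so the exact nesting of the log doesn't matter.
def find_key(node, key):
    if isinstance(node, dict):
        if key in node:
            return node[key]
        for value in node.values():
            found = find_key(value, key)
            if found is not None:
                return found
    elif isinstance(node, list):
        for item in node:
            found = find_key(item, key)
            if found is not None:
                return found
    return None

with open("result.json") as f:  # example filename, adjust to wherever you saved the log
    log = json.load(f)

print("KnownFileCount:", find_key(log, "KnownFileCount"))
```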