It is possible to have more bytes in the buffer than the maximum buffer size: 2147483647 (the largest 32-bit signed integer).

Giga is billions. Mega is millions. Your thousand-fold increase is probably the cause of your problem.
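To put rough numbers on the difference, here is a minimal sketch; the 50 is only an illustrative value (substitute whatever you entered), and whether the units are counted in powers of 1000 or 1024 doesn't change the outcome:

```python
# Rough arithmetic behind the GB-vs-MB mixup (the 50 is only an example, not your setting)
INT32_MAX = 2_147_483_647           # the 2147483647 maximum buffer size mentioned above

volume_mb = 50 * 1000**2            # 50 MB  = 50,000,000 bytes
volume_gb = 50 * 1000**3            # 50 GB  = 50,000,000,000 bytes (a thousand-fold larger)

print(volume_mb <= INT32_MAX)       # True  - a 50 MB volume fits within the limit
print(volume_gb <= INT32_MAX)       # False - a 50 GB volume exceeds 2147483647 bytes
```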

An answer to this would be nice. I found a OneDrive problem, but now it looks like it may be more general.

These are unrelated. Please follow the link on the Remote volume size option that you raised a thousand-fold.
Things seem a little messed up after the web site changes, but what the link was aiming for was probably this section:

Choosing sizes in Duplicati - articles - Duplicati

Rather than storing the chunks individually, Duplicati groups data in volumes, which reduces the number of remote files and calls to the remote server. The volumes are then compressed, which saves storage space and bandwidth.

and earlier on, it talks about "The block size, a.k.a. chunk size in this article." A large file gets lots of chunks.
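As a rough illustration of the scale, assuming a 100 KB block size and a 50 MB remote volume size (commonly cited values; your actual settings may differ):

```python
# Back-of-the-envelope chunk and volume counts for a 10 GB file
# (assumes a 100 KB block size and 50 MB remote volume size; adjust to your settings)
file_size   = 10 * 1000**3          # 10 GB
block_size  = 100 * 1000            # 100 KB chunks
volume_size = 50 * 1000**2          # 50 MB remote volumes

blocks  = file_size // block_size             # ~100,000 chunks
volumes = -(-file_size // volume_size)        # ~200 remote files (ceiling division)

print(blocks, volumes)              # 100000 200
```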

You didn’t need to change anything to back up your 10 GB file, but now you’re probably trying to put the whole backup into one file that is too big, and that will cause all kinds of trouble. Can you look at the destination files to confirm that some of them are around 2 billion bytes (2 GB)? If so, restoring even a tiny file will require downloading that huge file, even though only a tiny bit of its data is needed. That’s why the article tells you not to do that.
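If you can list the destination directly (or a locally synced copy of it), a quick sketch like this shows the largest files; the folder path is a placeholder, and it simply looks for "dblock" in the file names:

```python
# List the five largest destination files to see whether any dblock is near 2 GB.
# The path is a placeholder - point it at your destination folder or a synced copy of it.
from pathlib import Path

dest = Path("/path/to/backup/destination")
sizes = sorted((f.stat().st_size, f.name) for f in dest.glob("*dblock*"))
for size, name in sizes[-5:]:
    print(f"{size:>14,}  {name}")
```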