Buffer Error Backup to OneDrive Business

I get the following error when I try to back up to OneDrive Business:

No more bytes can be written to the buffer than the configured maximum buffer size: 2147483647.

I do not know which buffer to increase or what is happening here. Sometimes everything works fine, but most of the time I get this error message.

Thank you.

That message isn’t in Duplicati source code, and Google can’t find it on the Internet (except for here).
2 GB is very large. Do you have anything set that high, e.g. Options screen 5 Remote volume size?
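
For reference, 2147483647 is exactly 2^31 − 1, the maximum value of a signed 32-bit integer (`Int32.MaxValue` in .NET), so the message points at an internal 2 GB buffer cap rather than something you configured by that number. A quick check:

```python
# The limit in the error message is exactly Int32.MaxValue (2**31 - 1),
# i.e. a signed 32-bit cap of just under 2 GiB.
limit = 2**31 - 1
print(limit)            # 2147483647 -- matches the error message
print(limit / 2**30)    # just under 2.0 (GiB)
```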

My Remote volume size for OneDrive is 5 GB because otherwise there would be too many files in the OneDrive directory. But when I look at the OneDrive directory after the first backups, there are Duplicati files ranging from 1 MB to 10 GB.

What is your recommendation?

Steffen

Was something actually failing? If you are hitting the infamous 5000 file list view threshold (courtesy of the underlying SharePoint), switching to a destination that uses the Microsoft Graph API seems to resolve that.

The guidance for Storage Providers is a bit vague on which to use, but the expert in the area says that:

Just above that you can see my concern about such huge dblock files: you might download many of them to restore even one file, if its data is gathered from several dblock files. See Choosing sizes in Duplicati.

How big is this backup? Before you reach the 5000 file list view limit you would be at roughly 12 TB, and you would be having slow performance unless you also use a blocksize of at least 10 MB (up from the default 100 KB). Unfortunately you can't change blocksize for an existing backup. You can change Remote volume size, but a smaller value will only affect newly uploaded dblock files; that should be enough to test things.
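
To show where the rough 12 TB figure comes from: each uploaded dblock volume gets a small companion dindex file, so about half of the remote files are dblocks. A back-of-envelope sketch, assuming one dindex per dblock and ignoring the handful of dlist files:

```python
# Rough estimate of backup size at the 5000-file list view limit,
# assuming each 5 GB dblock volume has one companion dindex file.
list_view_limit = 5000
dblock_size_gb = 5                   # Remote volume size used in this thread
dblocks = list_view_limit // 2       # roughly half the files are dblocks
print(dblocks * dblock_size_gb)      # 12500 GB, i.e. roughly 12 TB
```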

In your backup log's Complete log the "KnownFileCount" will show how many files you now have. If you aren't at risk of hitting 5000 soon, you could try reducing Remote volume size to see if the error clears.

Deleted everything and started with:

--blocksize=500KB
--dblock-size=5GB

The first backup ran fine, and on OneDrive there are 369 GB and 149 files.
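
Those 149 files line up with the expected layout: 369 GB at 5 GB per volume gives about 74 dblock files, each with a companion dindex, plus one dlist for the backup version. A rough check, assuming one dindex per dblock:

```python
# Sanity check of the remote file count, assuming one dindex per dblock
# and one dlist per backup version.
import math

data_gb = 369
volume_gb = 5
dblocks = math.ceil(data_gb / volume_gb)   # 74 dblock volumes
dindex = dblocks                           # one index file per dblock
dlist = 1                                  # one file list for this version
print(dblocks + dindex + dlist)            # 149 -- matches what OneDrive shows
```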

> In your backup log's Complete log the "KnownFileCount" will show how many files you now have. If you aren't at risk of hitting 5000 soon, you could try reducing Remote volume size to see if the error clears.

I could not find the Complete log with KnownFileCount. Where is it?

Regards, Steffen

If the earlier problem complaining about the 2 GB size was related to your 5 GB setting, it may still fail.

Go to Viewing the log files of a backup job, then click Complete log, and you can search it in your web browser.
Or, if you would rather read through it, it's under the BackendStatistics section along with many other stats.

Thank you, now I found the Complete log. I also filtered out some directories, so now I have one running backup. After some updates I again get other errors, like orphaned files and failing to connect to SQL.

Backing up to OneDrive for Business with:
--blocksize=500KB
--dblock-size=5GB

Steffen

I'm confused. Is this one backup that ran for a while then broke, or multiple backups?
Is an "update" a run of a backup job? If not, what are you doing? Any exact errors?

“Unable to start the purge process as there are {0} orphan file(s)”

is the closest I can find to your “orphaned files” note, but what’s purging the files?

Often the first backup runs fine and then I get various errors. I think OneDrive is not a good destination for Duplicati.

This time the first backup ran fine, and then it shows this error:
2022-03-10 17:46:13 +01 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-b503e0f9de9184a179cedf0037c6d2328.dblock.zip.aes

Another point is that there are 0-byte copies of the DB, and Duplicati switches to one of the 0-byte versions, so nothing works until I manually switch the DB back.

Many problems, and I do not know what is going on…

Terminology is still confusing, but I’ll assume you have not run this backup before (or did, then deleted it completely including database and destination). If so, you’ve made it through the backup but all was not running fine (in actuality), and you’re finding out about problems at Verifying backend files, described as:

At the end of each backup job, Duplicati checks the integrity by downloading a few files from the backend. The contents of these file is checked against what Duplicati expects it to be.
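
The check itself is conceptually simple: the hash of each downloaded file is compared against the hash recorded when the volume was uploaded, and a mismatch produces a FailedToProcessFile error like the one above. A rough sketch of the idea (the function names here are my own illustration, not Duplicati's actual internals):

```python
# Conceptual sketch of post-backup verification: hash a downloaded file
# and compare it with the hash recorded at upload time.
import hashlib

def file_sha256(path):
    """Hash a file in chunks so large volumes don't need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(path, expected_hex):
    """Return True if the file's hash matches what was recorded at upload."""
    ok = file_sha256(path) == expected_hex
    if not ok:
        print(f"Failed to process file {path}")  # analogous to the TestHandler error
    return ok
```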

I recently asked for opinions, after my terrible experience. Nothing arrived. Maybe yours is feedback?

One difference is that my test was on OneDrive Personal, but yours is OneDrive Business. What exact Storage Type on the Destination screen is in use when you get your warnings? I had proposed a change.

If you are technical enough to try to compare your "bad" file with my detailed description, that'd be ideal, because I might be able to call in the OneDrive expert if there's a (weird) issue multiple people can get.

Hi,

I started completely fresh (deleting all files and the database), but it does not work anymore the way it did in the past.

I'm an IT professional and have tried many things, but I cannot use OneDrive v2 as a storage type. Every time there are different errors after backing up some GBs of data.

Here is my backup config:

Settings:
--accept-any-ssl-certificate=true
--asynchronous-concurrent-upload-limit=2
--asynchronous-upload-limit=2
--auto-cleanup=true
--blocksize=500KB
--compression-module=zip
--concurrency-block-hashers=16
--concurrency-compressors=8
--dblock-size=5GB
--exclude-files-attributes=Temporary
--log-retention=60D
--no-encryption=false
--snapshot-policy=On
--thread-priority=normal
--use-block-cache=true
--dbpath=c:\ProgramData\Duplicati\Data
--http-operation-timeout=10m
--http-readwrite-timeout=10m
--all-versions=true
--asynchronous-upload-folder=d:\Temp\Duplicati
--auto-vacuum=true

Backup:
onedrivev2://_Backup/Hostname-Duplicati?authid=12345&fragment-retry-count=10&fragment-retry-delay=10000&http-operation-timeout=10m&http-readwrite-timeout=10m

--accept-any-ssl-certificate=true
--asynchronous-concurrent-upload-limit=1
--asynchronous-upload-limit=1
--auto-cleanup=true
--blocksize=500KB
--compression-module=zip
--concurrency-block-hashers=16
--concurrency-compressors=8
--dblock-size=5GB
--exclude-files-attributes=system,hidden,temporary
--log-retention=60D
--no-encryption=false
--snapshot-policy=On
--thread-priority=normal
--use-block-cache=true
--dbpath=c:\ProgramData\Duplicati\Data\AHUMPOCGZH.sqlite
--http-operation-timeout=10m
--http-readwrite-timeout=10m
--all-versions=true
--asynchronous-upload-folder=d:\Temp\Duplicati
--auto-vacuum=true
--backup-name=OneDrive_Hostname-Duplicati
--encryption-module=aes
--passphrase=xxxxx
--keep-time=60D
--zip-compression-level=5
--zip-compression-method=Deflate
--zip-compression-zip64=true
--disable-module=console-password-input

At this moment I am trying again (maybe for the last time) a completely new backup (everything was cleaned beforehand).

We will see…

I’m sorry, but this is too vague to comment on. Please be very specific, providing actual messages.

Clarify this, for example did it fail mid-backup or fail during verification as in my recent explanation?
If the latter, can you look at one bad file as I showed? Do you have Linux? Analysis tools are better.

Generally it’s best to stay near defaults until things stabilize, however I’ll make selected comments.

Somewhat dangerous. Is there a need? Is this Windows? Any recent one should be fine without this.

“The process cannot access the file because it is being used by another process” potentially occurs, however I don’t know if that’s one of the errors you saw. If it is, try leaving this one at default of false.

This looks odd. Is it edited? This is supposed to be a file path, as in the second list of your settings.
Why two somewhat similar but not identical lists? Is the first one your Default options for some reason? Confused.