It is not possible to have more bytes in the buffer than the maximum configured buffer size: 2147483647.

Hello, I have some machines running backups (BKP) and this error started to appear; the backups no longer run.

How do I resolve this?

Welcome to the forum @Renato

On job Destination screen, what is Storage Type?

On job Options screen, what is Remote volume size (and if not default 50 MB, please read the note).


Since I have some 5 GB and 10 GB files to back up, I opted for 50 GB in:

General options
Remote volume size

In both cases, the error comes back within 7 days of backups.
I have already swapped to these settings:

General options
Remote volume size

All values are larger than the backup.

But I have some doubts. This had been working for a long time; is this a different problem?

Is there any difference between 50 GB and 50 MB?

Giga means billions. Mega means millions. Your thousand-fold increase is probably the cause of your problem.
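To make the difference concrete, here is a small arithmetic sketch (the 2147483647 limit is quoted from the error at the top of the thread; that it happens to be the signed 32-bit integer maximum is my own observation, not something the thread states):

```python
# Comparing the default and chosen Remote volume sizes against the
# buffer limit quoted in the error message.

INT32_MAX = 2_147_483_647          # "maximum configured buffer" from the error

mb = 1000 ** 2                     # "mega" = millions
gb = 1000 ** 3                     # "giga" = billions

default_volume = 50 * mb           # Duplicati's default Remote volume size
chosen_volume = 50 * gb            # the value set in this thread

print(default_volume <= INT32_MAX)      # True: 50 MB fits in the buffer
print(chosen_volume <= INT32_MAX)       # False: 50 GB is ~23x the limit
print(chosen_volume // default_volume)  # 1000: the thousand-fold increase
```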

An answer to this would be nice. I found a OneDrive problem, but now it looks possibly more generalized.

These are unrelated. Please follow the link on the Remote volume size option you raised a thousand-fold.
Things seem a little messed up in the web site changes, but what it was aiming for was probably this section:

Choosing sizes in Duplicati - articles - Duplicati

Rather than storing the chunks individually, Duplicati groups data in volumes, which reduces the number of remote files and the number of calls to the remote server. The volumes are then compressed, which saves storage space and bandwidth.

and earlier on, it talks about the block size, a.k.a. chunk size, in this article. A large file gets lots of chunks.

You didn’t need to change anything to back up your 10 GB file, but now you’re probably trying to put the whole backup into one file, which is too big and will cause all kinds of trouble. Can you look at the destination files to confirm that they’re sometimes about 2 billion bytes (2 GB)? If so, restoring files will need to download that huge file even if it only needs a tiny bit of data for a tiny file. That’s why the article tells you not to do that.
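If the destination is synced to a local folder (or you have a local copy), a tiny script can do that size check. This is an illustrative helper I wrote for this reply, not a Duplicati tool; the 2147483647 limit is the number quoted in the error:

```python
# Scan a local copy of the destination folder and flag volume files
# that are at or over the buffer limit quoted in the error.
from pathlib import Path

LIMIT = 2_147_483_647  # the "maximum configured buffer" from the error

def oversized_volumes(folder, limit=LIMIT):
    """Return (name, size) pairs for files of at least `limit` bytes."""
    return [(p.name, p.stat().st_size)
            for p in sorted(Path(folder).iterdir())
            if p.is_file() and p.stat().st_size >= limit]

# Example (path is a placeholder, not from this thread):
# for name, size in oversized_volumes(r"C:\OneDrive\Backups\Machine01"):
#     print(name, size)
```

Anything it lists is a volume that restores would have to download whole.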

I read it here, thank you.

Just one more question.

I have to back up 50 computers. I have a OneDrive subscription and was using OneDrive directly, but due to data security and privacy I am migrating to Duplicati.

Do you think it is possible to do this for 50 computers?
Maybe I can leave everyone at the default Block Size.

If each machine is running a separate backup, there are no problems on Duplicati’s side, as they cannot see each other.


Another question: is there any monitoring I can do?

Ex: a dashboard where I can see whether backups are running on all machines?


Yes, we have recently launched a hosted monitoring portal:

Other solutions could be the dupReport tool or

What exactly is your concern? There are several possibilities, depending on how separate you want them. The minimum requirement is: don’t point several Destinations at the same folder without some other way to separate them.

I’m not sure what this means. If there are multiple users on the multiple computers, all using one OneDrive, interference becomes possible. It’s better if there’s one user or all users are fully trusted to not harm things.

There are also some in-between approaches, such as encrypting differently for each user, but an untrusted malicious user could still delete backups for other computers. A safer use case would be a lab where staff already have access to everything. An unsafe use would be one where users are unknown and therefore less trusted.

To separate the backups I am using this folder structure.


I have a central folder within OneDrive; within this folder I have the sectors, and within each sector there is a folder for each machine.

So it’s very separate.


The destination folders are separate enough not to confuse Duplicati, but a user can reconfigure Duplicati. What keeps the sectors and users from reading or deleting each other’s backups? Does OneDrive handle that? The problem would exist even without Duplicati; e.g. if everyone shares a login, then everything is connected.

Just wondering about information security. If you’ve got it as you want it, then great.

This is a separate question. The default blocksize is badly suited to large backups because it creates too many blocks while breaking up source files for deduplication. I usually consider 100 GB the limit for the default; others say higher. Big backups run faster with a raised blocksize, but I don’t know if yours are big enough to worry about.

Also, keep in mind that earlier we were talking about Remote volume size, which corresponds to dblock-size, meaning the size of a bundle of blocks. Above, I’m talking about blocksize, since you wrote Block Size.

blocksize sometimes needs to be raised for larger backups.
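As a rough illustration of why (100 KB is Duplicati’s default blocksize; the 100 GB source and the 1 MB alternative are made-up examples, and KB/GB are treated as decimal for simplicity):

```python
# Rough block-count arithmetic: how many blocks a source splits into
# for deduplication at different blocksize settings.
KB, GB = 1000, 1000 ** 3

def block_count(source_bytes, blocksize_bytes):
    """How many blocks the source is split into for deduplication."""
    return -(-source_bytes // blocksize_bytes)  # ceiling division

source = 100 * GB  # a backup around the size where I start to worry

print(block_count(source, 100 * KB))   # default blocksize: 1,000,000 blocks
print(block_count(source, 1000 * KB))  # 1 MB blocksize: 100,000 blocks
```

Every block is tracked in the local database, so ten times fewer blocks means a much smaller database and faster queries during backup and restore.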

dblock-size sometimes merits a higher Remote volume size, and the tradeoffs are explained in the document. An extremely large Remote volume size is hard to fix, I think. Existing huge volumes aren’t instantly broken up. Eventually it might happen during Compacting files at the backend (or by a forced compact, but that’s awkward). Restarting fresh might be easier unless there’s some history you want to keep from the runs done so far.

Since I run the company’s IT department, I created a new account and shared storage with that account.

So the only one who has access to all the files in this account is me, and only I have the master key.

This way, no one other than me will ever be able to access the files.


I might misunderstand. This might work if you’re running one Duplicati with backups over SMB (not advised). Putting Duplicati on a computer means putting credentials on that computer, and it’s hard to keep the user out.


It’s easier to keep the user out if the user never touches Duplicati. You can even do command-line backups for a disaster-recovery case and not even tell the users. These can be hidden and run by Task Scheduler, etc.
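A minimal sketch of what that could look like on Windows, using Duplicati’s command-line client and the built-in Task Scheduler. All paths, the task name, the AuthID, and the passphrase below are placeholders I made up, not from this thread; check Duplicati.CommandLine.exe help for the exact options your version supports:

```shell
:: The backup itself; run it by hand once to verify it works:
"C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe" backup ^
  "onedrivev2://Backups/SectorA/Machine01?authid=PLACEHOLDER" ^
  "C:\Data\" --passphrase=PLACEHOLDER --dblock-size=50MB

:: Then register it with Task Scheduler to run daily as SYSTEM,
:: so ordinary users never touch Duplicati:
schtasks /Create /TN "Duplicati Backup" /SC DAILY /ST 02:00 /RU SYSTEM ^
  /TR "\"C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe\" backup ..."
```

Running as SYSTEM keeps the credentials out of the user’s profile, which fits the lockdown point below.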


Lockdown is harder if the user is allowed administrator access, since then file permissions can’t help.


I sometimes ask people if they prefer users to do self-service file restores, or get somebody else involved.

I saw it here and liked the solution more. I’ll join too.

One more question about monitoring with Duplicati:

Do I need to use the same OneDrive account?

Or can I have one account to store the backups and another, for example Google (or Outlook), to monitor all the backups?

The monitoring does not care where you store the data. The account you log in to the portal with does not have to be related to the storage account.