First backup fails: "uploaded with size 0 but should be..."

I’m new to Duplicati - my first attempted backup to remote storage is consistently failing. I’m backing up about 30GB to WebDAV storage (pCloud).

The connection tests fine using the test connection button. The backup seems to do its local stuff fine, then some Duplicati files appear in the WebDAV storage (so clearly Duplicati can write there), then it fails. The logs give me the following error message, which means nothing to me:

Warnings: [
2020-01-16 09:49:46 +00 - [Warning-Duplicati.Library.Main.Operation.Backup.UploadSyntheticFilelist-MissingTemporaryFilelist]: Expected there to be a temporary fileset for synthetic filelist (1, duplicati-20200116T093249Z.dlist.zip.aes), but none was found?,
2020-01-16 17:02:07 +00 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingRemoteHash]: remote file duplicati-b96f7964d2e5249b2a0def35ca7ef790c.dblock.zip.aes is listed as Uploaded with size 0 but should be 23467393229, please verify the sha256 hash “e+WEP/QWNRucZLn/gfUeuhVUZY9GxWbF/BrJvMk05og=”
]
Errors: [
2020-01-16 17:04:48 +00 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-b96f7964d2e5249b2a0def35ca7ef790c.dblock.zip.aes
]

Does anyone have any clue what this means?

I’ve tried increasing the volume size in case it was too many files being put on the remote storage, but even with the volume size set to 50GB, the same thing happens.

Thanks,
Ben.

Welcome to the forum @benhen31

Did you find a file of that name? If so, what size is it? I think this is saying it’s 0 and should be 23 GB, which I’d call way too large for reasonable performance, but you did say you were testing 50 GB.

What size were the earlier dblock files? The default is 50 MB. Choosing sizes in Duplicati has advice. Ordinarily you should have a flow of dblock files at about the size limit, and much smaller dindex files.
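As a rough sanity check (just arithmetic on the sizes mentioned in this thread, not anything Duplicati measures itself), a sketch like this estimates how many dblock volumes a ~30 GB source would produce at a given remote volume size:

# Rough estimate only: ignores compression and deduplication, so the real
# count of dblock files will usually be lower than this.
source_bytes = 30 * 1024**3          # the ~30 GB source in this thread

for label, volume_bytes in [("50 MB (default)", 50 * 1024**2),
                            ("50 GB", 50 * 1024**3)]:
    volumes = -(-source_bytes // volume_bytes)   # ceiling division
    print(f"{label}: roughly {volumes} dblock files, each with a much smaller dindex")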

WebDAV, pCloud, Duplicati is a recent topic where pCloud WebDAV broke and then they fixed something. Whether or not this is relevant here is unknown, but signs seem to point to upload/download issues…

The Error above would be a download issue, probably during Verifying backend files where something verified wrong.
Given the Warning above it talking about length and hash, I’d guess it’s an empty file. Can you look?
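If you can get a copy of that dblock file onto a local disk, you could also check it against what the warning reports. Assuming the hash in the warning is the base64 of the raw SHA-256 digest (which is how it reads), a sketch like this would tell you whether the bytes pCloud hands back match what was uploaded (the path is just a placeholder for wherever you downloaded the file):

import base64
import hashlib

# Placeholders: the downloaded copy of the dblock file, plus the values from the warning.
path = "duplicati-b96f7964d2e5249b2a0def35ca7ef790c.dblock.zip.aes"
expected_size = 23467393229
expected_hash = "e+WEP/QWNRucZLn/gfUeuhVUZY9GxWbF/BrJvMk05og="

sha = hashlib.sha256()
size = 0
with open(path, "rb") as f:
    for chunk in iter(lambda: f.read(1024 * 1024), b""):  # stream it; the file is ~22 GiB
        sha.update(chunk)
        size += len(chunk)

print("size matches:", size == expected_size)
print("hash matches:", base64.b64encode(sha.digest()).decode() == expected_hash)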

For any future tests, it might be useful to back the remote volume size down, and watch the live log while backing up some subset of the 30 GB. About → Show log → Live → Retry should give more info on any errors.

Thanks for the suggestions @ts678 !

Did you find a file of that name? If so, what size is it? I think this is saying it’s 0 and should be 23 GB, which I’d call way too large for reasonable performance, but you did say you were testing 50 GB.

I do see that file in pCloud and it has a size of 22GB reported by pCloud, which is about right (not the exact size Duplicati is looking for, but close; that could just be the way pCloud displays file sizes in its UI, I suppose).

I tried with the default 50MB dblock setting initially and got the same issue, then I tried 500MB and then 50GB. Same result every time. I was trying to rule out it being an issue with too many files listed on pCloud, since the source is a lot of small files.

WebDAV, pCloud, Duplicati is a recent topic where pCloud WebDAV broke and then they fixed something. Whether or not this is relevant here is unknown, but signs seem to point to upload/download issues…

I read through that but it seems to be a different issue - I can connect to pCloud fine and files get written fine. That OP had issues getting any connection at all.

The Error above would be a download issue, probably during Verifying backend files where something verified wrong.
Given the Warning above it talking about length and hash, I’d guess it’s an empty file. Can you look?

As above, the file is there on pCloud with roughly the correct file size, so something appears amiss if Duplicati thinks it’s an empty file.

For any future tests, it might be useful to back the remote volume size down, and watch the live log while backing up some subset of the 30 GB. About → Show log → Live → Retry should give more info on any errors.

I’m going to try this today.

Thanks for the pointers!

21.85571308154613 would be the answer if using binary units, a.k.a. gibibytes: 23467393229 / 1024 / 1024 / 1024, which rounded to the nearest tenth is 21.9 GiB.
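For anyone who wants to check the arithmetic, it’s just a unit conversion (plain Python, nothing Duplicati-specific):

size_bytes = 23467393229                 # the size Duplicati expects
gib = size_bytes / 1024**3               # binary units (gibibytes)
print(round(gib, 1), "GiB")              # 21.9 GiB, which a UI could well display as "22GB"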

Possibly so. A better error message might give a better clue, if live log can get one.

A somewhat technical way to see what Duplicati is seeing is to look at its interpreted results from the List command it issued. You need an SQLite browser such as DB Browser for SQLite to do a read-only open of the database at Local database path on the Database screen, then go to the RemoteOperation table and see what list is returning in its Data column. An example of the output from a rather small backup that I made is below:

[
{"Name":"duplicati-20200114T222151Z.dlist.zip","LastAccess":"2020-01-14T17:23:18.4224117-05:00","LastModification":"2020-01-14T17:23:18.4224117-05:00","Size":832,"IsFolder":false},
{"Name":"duplicati-ba86ac863b1dd42cab26f681841ab77e4.dblock.zip","LastAccess":"2020-01-14T17:21:51.3949661-05:00","LastModification":"2020-01-14T17:21:51.3949661-05:00","Size":1952,"IsFolder":false},
{"Name":"duplicati-i51e8045bb7be413f95565ba3c697dd6d.dindex.zip","LastAccess":"2020-01-14T17:21:51.6888307-05:00","LastModification":"2020-01-14T17:21:51.6888307-05:00","Size":1322,"IsFolder":false}
]
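
If you’d rather not install a DB browser, a small script can pull out the same thing. This is only a sketch: it assumes the table and column names above (RemoteOperation, Data) plus an Operation column whose value is "list" for these rows, and that the database isn’t locked by a running backup:

import json
import sqlite3

# Placeholder: the job's database, from "Local database path" on the Database screen.
db_path = "/path/to/duplicati-job.sqlite"

con = sqlite3.connect(f"file:{db_path}?mode=ro", uri=True)   # read-only open
try:
    rows = con.execute(
        "SELECT Data FROM RemoteOperation WHERE Operation = 'list'"
    ).fetchall()
finally:
    con.close()

if rows:
    # Data holds a JSON array like the sample above; take the most recent listing.
    for entry in json.loads(rows[-1][0]):
        print(entry["Name"], entry["Size"])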

Getting deeper than this will probably be difficult. Encryption makes watching network traffic hard, although at least on Windows I think one can log the raw unencrypted pCloud response if one wants. This is getting really deep, though. What OS are you on? Do you have any unusual optional settings?