This sounds like a delay of sorts, where the storage is slow in updating the listing.
How long apart are the runs? When it complains about a file that is missing, can you manually verify that the file exists on the remote destination?
I am guessing that this has to do with caching in a backend that is misbehaving. With 2.1.0.8 I changed the way backends are (re-)used, and if the caching does not work correctly across instances, that could explain the problem.
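For illustration only (this is not Duplicati's actual code, and the class and file names are made up): a minimal sketch of how per-instance caching of the remote listing could make a freshly uploaded file look "missing" to another instance until that instance refreshes its cache.

```python
# Hypothetical sketch: two backend instances, each caching the remote
# file listing independently. A file uploaded through instance "a" is
# invisible to instance "b", whose cached listing is stale.

class CachingBackend:
    def __init__(self, remote):
        self.remote = remote          # shared remote storage (dict of name -> data)
        self._listing_cache = None    # per-instance cache of the remote listing

    def list(self):
        if self._listing_cache is None:
            self._listing_cache = set(self.remote.keys())
        return self._listing_cache

    def put(self, name, data):
        self.remote[name] = data
        if self._listing_cache is not None:
            self._listing_cache.add(name)  # only this instance's cache is updated


remote = {}
a = CachingBackend(remote)
b = CachingBackend(remote)

b.list()                               # b caches an empty listing
a.put("dblock-0001.zip", b"")          # upload goes through instance a
print("dblock-0001.zip" in b.list())   # False: b still sees its stale listing
```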
If this is indeed the problem, I am very interested in knowing which backend is causing it.
The option --asynchronous-concurrent-upload-limit=1 can be set to avoid using multiple backend instances; if it is indeed a cache issue, this should fix it.
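If it helps, this is one way the option could be passed on a CLI backup run (assuming the standard backup syntax; replace the placeholders with your own storage URL and source folder). In the GUI the same option can be added under the job's advanced options.

```
duplicati-cli backup <storage-url> <source-folder> --asynchronous-concurrent-upload-limit=1
```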
The files that are “missing” on the remote storage are always there when I check the names from the logs, so I guess they just show up “late” for the verification step.
From the next version of Duplicati, we will be giving a warning when using Mega.nz, as they do not have a public API for use by non-Mega products and there is no longer a maintained library for it. Without a public API, they could change the implementation at any moment and break the backend.
That confirms my guess then. I will try to make a common solution for this problem across backends.