Mega backup fails (+ MFA setup)

Hi there,
I’m currently running Duplicati and just now received a report that one of my backups failed.
It’s a backup of a local directory to Mega (with the “new” MFA option set). On Friday (two days ago) it worked without a problem; now it doesn’t.

Just a few minutes ago I received the error message “At least one error has occurred.” Unfortunately, there is no entry in the log for the backup attempt at all.

I created a third backup to the same Mega account, and it works without a problem.

I tried verifying the files, repairing the database, and recreating the database; all of them ran without any errors.

I tested quite a lot, and the problem seems to be linked to three video files. Without them in the source, the backup works. The files are 423 MB, 353 MB and 387 MB in size.
It’s not a problem with the storage left on the account I used: I tried with my other account, which also has 30 GB of storage left, and got the same problem.


Please try to gather more information; as it is, there is not much to go on.

To do that, export the job as a command line, open a terminal (in admin mode if you are running Duplicati as a service), paste the export into it, add the flag --console-log-level=verbose, and run it.

When it has crashed, copy, say, the 200 preceding lines, zip them and attach the result to a reply.
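If the verbose run produces a very long log, a small helper like the sketch below can trim it to the lines preceding the first failure. This is my own illustration, not part of Duplicati; the `marker` string is an assumption you would adjust to the actual failure text in your log.

```python
# Sketch: keep the N log lines preceding (and including) the first line
# that matches an error marker. Adjust "marker" to your real failure text.

def lines_before_error(lines, context=200, marker="error"):
    for i, line in enumerate(lines):
        if marker.lower() in line.lower():
            # include the matching error line itself
            return lines[max(0, i - context):i + 1]
    return lines[-context:]  # no marker found: fall back to the tail

log = ["Backend event: Put - Started", "Uploading file set",
       "Operation failed: ResourceExpired"]
print(lines_before_error(log, context=200))
```

Running the trimmed output through a quick censor pass before attaching it is easier than censoring the whole log.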

In the server log (on the about page) there should at least be the error message that caused the failure.

It seems to be related to the API error “ResourceExpired”.

Jul 2, 2023 12:18 PM Failed while executing Backup with id (20.9 KB)

Thank you for your answer.
Here is the Output:
I recorded the whole output; if you need more than the last 200 lines, I’ll need to censor them, but I can get them to you.
The API response “ResourceExpired” seems to be related.

Here they are
output (16.7 KB)

This issue had the same error and stack trace, but it was said to be resolved with an update:

There is also this forum post:

Neither of those seems to indicate a connection to the file sizes, but maybe nobody ever noticed. If you can reliably reproduce the error, you might be onto something.

It might help others to reproduce if you post your job configuration (without passwords and other sensitive data).

Maybe you could look at the log to see if Backend event: Put worked for a while before the error section.
Or, if you can see file dates in MEGA, check for how long new files were going up.

Maybe the quotes around “new” mean the feature is really two years old, and maybe Canary didn’t give it much testing.

Added support for 2fa, thanks @vfrz

  --auth-two-factor-key (Password): The shared secret used to generate
    two-factor TOTP codes.
    For accounts with two-factor authentication enabled, this is the shared
    secret used to generate the two-factor TOTP codes.

and it seemingly really does mean the original shared secret, not the usual six-digit time-based code.
I see it generates the code before giving it to MEGA. Does the system time here seem correct?
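For context on why the system time matters: TOTP is standardized in RFC 6238, and the code is an HMAC over the current 30-second time counter, keyed by the base32 shared secret, so a skewed clock produces a wrong code even with the correct secret. A stdlib-only sketch of the algorithm (an illustration, not Duplicati’s actual implementation):

```python
import base64, hmac, struct, time

def totp(secret_b32, timestamp=None, digits=6, step=30):
    """RFC 6238 TOTP: HMAC-SHA1 over the 30-second time counter."""
    key = base64.b32decode(secret_b32.upper().replace(" ", ""))
    counter = int((time.time() if timestamp is None else timestamp) // step)
    mac = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
    offset = mac[-1] & 0x0F  # dynamic truncation (RFC 4226)
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % 10 ** digits
    return str(code).zfill(digits)

# RFC 6238 test vector: secret "12345678901234567890" (base32 below), T=59
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", timestamp=59))  # 287082
```

If the machine’s clock is more than a step or so off, the `counter` value changes and MEGA would reject the otherwise-correct code.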

If any of the failing accounts doesn’t use 2FA, then that’s a good way to kill this line of inquiry…

The problem is that it’s rather unclear what’s going on. The protocol was reverse-engineered by a third-party library, and MEGA support generally doesn’t even want to get started on supporting it.

There are a few similar uses of that protocol, one major one being rclone, and I’m not finding such problems there.
I did find a complaint about concurrent uploads, so especially since some of your early uploads succeeded, setting asynchronous-concurrent-upload-limit to 1 instead of its default 4 might possibly avoid the issue.

It might be best to set the --auth-two-factor-key option on screen 5 too, as screen 2 loses options.
The PR “Fix bug that removes advanced target options when editing backups #4972” is meant to fix this.

While the option is on Destination screen 2, you could seize the opportunity to Export As Command-line.
Duplicati.CommandLine.BackendTool.exe and Duplicati.CommandLine.BackendTester.exe could then use the target URL (maybe edited to point to a different folder, since the automatic test will insist on having one).

It’s true that the error begins earlier; I did not take the retries into account when I asked for the last 200 lines. Anyway, it’s possible that there is not much more interesting data before that. If that is the case, it could be worth following all the good advice you received, particularly posting your job config. Did you by any chance raise the maximum file size above the default value of 50 MB? If so, the problem could simply be a timeout due to blocks that are too big: there is often a maximum time allowed for a single HTTP transfer (PUT).

Edit: I forgot to say that the advanced option for that in Duplicati is only active if Duplicati manages the HTTP transfer itself. When an external third-party driver is used, the option needs to be passed through its specific API, something that doesn’t seem to be done currently.
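As a rough illustration of why bigger blocks risk hitting a fixed per-request time limit: the duration of a single PUT grows linearly with the block size. The numbers below are assumptions for the sake of arithmetic, not measured MEGA or Duplicati limits.

```python
# Back-of-envelope: duration of one PUT at a given upload speed.
# 10 Mbit/s and the 400 MB size are illustrative assumptions.

def put_seconds(size_mb, upload_mbit_per_s):
    return size_mb * 8 / upload_mbit_per_s

for size in (50, 400):  # default dblock size vs. a much larger one
    print(f"{size} MB at 10 Mbit/s: {put_seconds(size, 10):.0f} s")
```

An eight-fold block size means an eight-fold transfer time per request, which is where a server-side or driver-side timeout would start to bite.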

I was able to reproduce it for the last two days. But just to confirm it again, I ran the backup once more and… it worked without any problem. I even tried creating copies of the previously problematic files, but the backup worked anyway. I’ll try to replicate it over the next few days.

Here you go (I hope that’s what you meant.):

"C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe" backup "mega://duplicati/gdrive?" "C:\Users\Username\Google Drive(\\" --backup-name="Google Drive" --dbpath="C:\Users\Username\AppData\Local\Duplicati\GXEPBEUIUA.sqlite" --encryption-module=aes --compression-module=zip --dblock-size=50mb --retention-policy="30D:1D,5W:1W,12M:1M,99Y:1Y" --disable-module=console-password-input

Will do once the error occurs again.

I’ll try disabling MFA once the problem comes up again.

Nope, I left it at the default 50 MB.

Thank you for all your help.
I’m sorry that I don’t know how to provide any lead on what caused the problem.
From what I’ve found so far regarding problems with Duplicati, Mega doesn’t seem to be a good choice for a backup target. I’m using it because I’ve got a grandfathered 50 GB free account.

I just tried it with my test install of the next Duplicati (it uses the same Mega driver as the current beta) and succeeded with a 480 MB file, so it’s not a general problem. (However, I did not try it with 2FA; I have no idea how to use it, as I have only set up a test account with Mega to test Duplicati.)

Yeah, I don’t know.
This is my second post screaming for help; last time, the transfer limit of Mega was reached for some reason (I guess due to my ISP cycling public IPs on a daily basis).

Anyway, here is a short guide on how to use Mega with MFA (multi-factor authentication) in Duplicati:

  1. Set up MFA in Mega in the security section of the settings and copy the seed code that is shown to you.
    1a. If you already have it activated and use an MFA tool that lets you view the secret, you can also derive it from the TOTP URI: otpauth://totp/MEGA:*username*@*provider*.com?secret=herewouldbeacodethatyouneedtocopy&issuer=MEGA

  2. In the second step (Destination) of the Duplicati backup setup, enter the credentials and expand the advanced options. In the dropdown, select the option auth-two-factor-key and enter the previously copied herewouldbeacodethatyouneedtocopy into the new field.

  3. Test the connection and see whether you were successful.
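For step 1a: the otpauth:// URI is an ordinary URL, so the secret parameter can be pulled out mechanically instead of by eye. A stdlib-only sketch with a made-up example URI (the account name and secret below are placeholders):

```python
from urllib.parse import urlparse, parse_qs

def secret_from_otpauth(uri):
    """Extract the base32 shared secret from an otpauth:// provisioning URI."""
    parsed = urlparse(uri)
    if parsed.scheme != "otpauth":
        raise ValueError("not an otpauth URI")
    return parse_qs(parsed.query)["secret"][0]

uri = "otpauth://totp/MEGA:user@example.com?secret=HEREWOULDBEACODE&issuer=MEGA"
print(secret_from_otpauth(uri))  # HEREWOULDBEACODE
```

The value printed is exactly what goes into Duplicati’s auth-two-factor-key field.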

Yeah, there is something rotten with 2FA. I test the connection and it is “successful”, yet trying to do a backup (an empty one, never mind a 400 MB file) fails immediately. I can see the code in FreeOTP+ and it’s the same as in Duplicati :-/ Trying a second time, the backup succeeds, yet I can see retries in the live log, even for small files.
With 2FA removed, an empty backup succeeds without retries (it always downloads at least a 50 MB file for verification purposes).
For the sake of research, if you disable 2FA, does it succeed?


As the backup currently just works, I can’t test it right now. But I will keep trying to replicate the behavior.