Out-of-memory in Docker

Environment info

  • Duplicati version:
  • Operating system: docker duplicati/duplicati:
  • Backend: mega.nz


I tried to use mega.nz as the backup destination, but:

  • from the web interface I follow the steps in the web wizard; when I press “Test the connection”, after a while I receive: “Failed to connect: API response: ResourceNotExists”
  • from the CLI (inside the docker container) I run:
duplicati-cli backup "mega://XXXXXXXXXXXX?auth-username=YYYYY@ZZZZZ.HH&auth-password=WWWWWWWWWWWWWWW" /backups/  --no-encryption --upload-verification-file=false --dblock-size=10mb --debug-output=true

I receive the message: Checking remote backup ... Listing remote folder ... but nothing further happens


I tested the mega login:

  • with an empty account (no data inside) => NO PROBLEM, works fast
  • with an account holding ~800 GB of data => PROBLEM, docker exit code 137

Duplicati starts the operation correctly.
Monitoring the duplicati container with docker stats and iftop, I can see the process running and downloading the data correctly from the mega API endpoint.
But after a while the docker container terminates: Exit Code 137 => indicates failure because the container received SIGKILL (manual intervention or the ‘oom-killer’ [OUT-OF-MEMORY])
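Exit codes above 128 follow the shell convention “128 + signal number”, so 137 means the process was killed by signal 9 (SIGKILL), which is exactly what the kernel’s oom-killer sends. A minimal sketch of the arithmetic:

```shell
# Exit codes > 128 mean "terminated by signal (code - 128)".
sig=$((137 - 128))
echo "$sig"   # prints 9, i.e. SIGKILL
```

On the host you can confirm whether it really was the kernel (and not manual intervention) with `docker inspect -f '{{.State.OOMKilled}}' duplicati`, or by looking for oom-killer entries in `dmesg` (the container name is taken from the `docker stats` output above).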

docker stats duplicati output:

CONTAINER ID        NAME                CPU %               MEM USAGE / LIMIT     MEM %               NET I/O             BLOCK I/O           PIDS
05b61e17ba72        duplicati           21.72%              2.797GiB / 7.772GiB   35.98%              281MB / 6.64MB      259MB / 94.2kB      30

After this docker stats output, the docker container terminates with exit code 137.


Is it possible to enlarge the memory available to the /usr/bin/mono-sgen process?
I found this: https://stackoverflow.com/a/19375857. Is this a possible solution?

I modified the duplicati-server file in the docker image:

export MONO_GC_PARAMS=max-heap-size=500m,mode=balanced
export MONO_LOG_LEVEL=debug
exec -a "$APP_NAME" mono "$EXE_FILE" "$@"  

to add MONO_GC_PARAMS and test whether it is possible to limit the amount of memory.
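One way to check whether the exported variable actually reaches the running mono process is to read its NUL-delimited /proc/&lt;pid&gt;/environ. A sketch, with a synthetic environ string standing in for the real file (inside the container you would replace the printf with `cat /proc/$(pgrep mono)/environ`):

```shell
# /proc/<pid>/environ separates variables with NUL bytes;
# translate them to newlines and filter for the mono GC setting.
printf 'PATH=/usr/bin\0MONO_GC_PARAMS=max-heap-size=500m\0' \
  | tr '\0' '\n' | grep MONO
# prints: MONO_GC_PARAMS=max-heap-size=500m
```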

I also tested this configuration:

export MONO_GC_PARAMS="max-heap-size=1500m,nursery-size=128m,soft-heap-limit=600m"

I map the file in the docker-compose file as follows:

version: "3"
services:
  duplicati:
    image: duplicati/duplicati:
    container_name: duplicati
    environment:
      - TZ=Europe/Rome
    volumes:
      - ./data:/data
      - ./backups:/backups
      - ./source:/source
      - ./duplicati-server:/usr/bin/duplicati-server
    ports:
      - 8200:8200
    deploy:
      resources:
        limits:
          memory: 2500M
        reservations:
          memory: 1500M

The results are the same.
The docker exit code is 137, occurring after Duplicati starts to process the metadata downloaded from the mega API.

Hello and welcome!

I run Duplicati in docker on my Synology NAS and back up about 900GiB of data. All I can say is I didn’t have to tweak any of those mono memory settings at all. Duplicati works fine, no out of memory issues or anything.

How much RAM do you have on your host machine? What kind of host is it?

The “failed to connect” when testing your connection is troubling - you shouldn’t get that.

Hi @drwtsn32 ,

thanks for your message, I really appreciate it!

My host machine is a VM:

  • 6 vCPUs
  • 8 GB RAM
free -h
              total        used        free      shared  buff/cache   available
Mem:           7.8G        400M        2.0G        9.7M        5.3G        7.2G
Swap:            0B          0B          0B

As specified in the docker-compose file, I capped the container RAM usage at 2.5 GB.

  • Docker version 18.09.5, build e8ff056

In my mega business account (not the free subscription, though I don’t think that changes anything) I have 936 GB of stored data.
I think the problem is the decryption phase after the initial metadata download. As suggested in other posts I specified the parameter --upload-verification-file=false, but nothing seems to change.

Do you have any suggestions?
Thanks in advance!

Ok, on my setup I don’t restrict memory usage of the container. Maybe 2.5GiB isn’t enough for some operations (not sure). Could you try increasing that limit to see if it helps?

The container shouldn’t be using much RAM all the time. I just checked my container memory usage and it’s only about ~200MiB (but Duplicati is currently idle).

Hi @drwtsn32,

I followed your suggestions but I still have the same issue…

I commented out the following lines:

          memory: 2500M
          memory: 1500M

This is the last docker stats output before the crash:

CONTAINER ID        NAME                CPU %               MEM USAGE / LIMIT    MEM %               NET I/O             BLOCK I/O           PIDS
0b09efc20da7        duplicati           203.17%             7.13GiB / 7.772GiB   91.75%              340MB / 8.27MB      58.5MB / 98.3kB     28

The exit code is:

  Name                 Command                State     Ports
duplicati   /usr/sbin/tini -- /usr/bin ...   Exit 137

The duplicati command is:

Input command: backup
Input arguments: 

Input options: 
backup-name: test
dbpath: /data/Duplicati/GUBVRKYATG.sqlite
encryption-module: false
compression-module: zip
dblock-size: 10MB
no-encryption: true
debug-output: true
accept-any-ssl-certificate: true
no-backend-verification: true
disable-module: console-password-input
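For reference, the option dump above corresponds roughly to a single CLI invocation like the following (a sketch; the mega URL, username, and password are placeholders, not values from the log):

```shell
duplicati-cli backup \
  "mega://FOLDER?auth-username=USER&auth-password=PASS" \
  /backups/ \
  --backup-name=test \
  --dbpath=/data/Duplicati/GUBVRKYATG.sqlite \
  --compression-module=zip \
  --dblock-size=10MB \
  --no-encryption=true \
  --debug-output=true \
  --accept-any-ssl-certificate=true \
  --no-backend-verification=true \
  --disable-module=console-password-input
```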

Thanks for your support!

Is Duplicati actually placing files in your mega storage?

No folders and no files inside the mega storage (under the selected folder).

It may be something silly, but I have no idea what’s wrong…

The error you get in the web UI is a clue… Failed to connect: API response: ResourceNotExists

Seems like some detail in your connection settings is incorrect.

I fixed it thanks to the parameter: no-backend-verification=true.

Just to summarize: in the web interface I have this:

I click on: Commandline and reach this:

I click on the button: Run "backup" command now; this is the output:

as you can see, at the end the docker container crashes and loses the connection.

Can you go to About → Show log → Live and set the dropdown to Retry? Watch those log entries while a backup is in progress, and report the exact warning messages.

To me this still looks like it cannot talk to the mega backend properly.