I have been running Duplicati for a while now, and the OneDrive (personal) backend has always worked very well. Around the 10th of January, MS phased out their older API, and I had to switch to the “v2” version in Duplicati. It works, the backups are running, but after a few backups I run into “Failed: Insufficient memory to continue the execution of the program.” and Duplicati stops working. Either I restart the service, or Duplicati crashes and restarts on its own. If I disable the OneDrive backups, Duplicati and all the other backends work fine.
I am running Debian 9 (fully updated), Xamarin Mono 5.18.0 Stable, and Duplicati 126.96.36.199 (Beta).
To my surprise, I have not seen anybody else report this problem. Is this fixed in a Canary release?
Can you share what your source, block, and dblock (Upload Volume) sizes are?
Also, can you tell where in the process this happens? For example, if the backup this error happened on is listed as restorable, that would indicate the issue is in the cleanup (retention, compacting, etc.) code instead of the backup code.
I have 6 backup jobs that are using OneDrive, and they range from a few KB to 12 GB of source data. The block size is usually set to 100 MB; backups run daily and are kept for 12 months.
It does not crash immediately; it may work for a few backups, and the crash may actually happen during a backup to a different backend, but the root cause is the OneDrive jobs. The crashes are most frequent during those jobs, and when all OneDrive jobs are disabled there are no memory problems. After each OneDrive backup the Duplicati service takes up more memory; before the last crash, 75% of the system memory was allocated by Duplicati.
Thanks for the confirmation - sounds like you’ve got things pretty well pinned down!
I don’t know when I’ll find time but I’m thinking it might be useful to see if repeated runs of a smaller test job also cause the memory issue to appear.
I did a quick test of a dozen runs on a tiny job with no changes and the memory in my Docker container would go up for a few runs but then drop down again.
A similar result happened with 30 “Test connection” runs. Of course neither of these accurately reflect your scenario, so they aren’t all that important. I was just hoping for a quick confirmation test.
@rickard, if you disable backend testing (--no-backend-verification=true) for a while does that change the apparent leak rate?
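If it helps, a backup with that option could be run from the command line as a one-off test. This is only a sketch, assuming a typical Linux install path and a placeholder target URL/authid — substitute your own job's exported URL and source folder:

```shell
# Hypothetical paths and authid — adjust to your install and job.
# Runs a backup with the post-backup verification step disabled,
# to see whether the apparent leak rate changes.
mono /usr/lib/duplicati/Duplicati.CommandLine.exe backup \
  "onedrivev2:///PC Backup/Example?authid=xxx" \
  /home/user/data \
  --no-backend-verification=true
```

The same option can also be set per-job in the GUI under Advanced options.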
“In your verification test was only one fileset (1 dlist + 1 dindex + 1 dblock) tested?”
I would not know, but it sounds reasonable. I have not changed any parameters for the verification.
I will run a restore test tonight from OneDrive and one other backend and see what differences I see.
I would like to point out that I am running Duplicati on Debian, which may work slightly differently. For example, OD4B does not work for me on Debian, and I have seen others have the same problems on OSes derived from Debian. So, the behaviour of ODv2 on Debian may differ from other Linux distributions?
If you’ve got the older log format, you can see it as “VerificationsActualLength” in “TestResults”:
Just to be clear, OD4B doesn’t work for you on your Debian system, so what we’re talking about here is “Microsoft OneDrive v2” (not “Microsoft OneDrive for Business” or “Microsoft SharePoint” or “Microsoft SharePoint v2” or “Microsoft Office 365 Group”), right?
I want to make sure we’re looking into the right thing.
Yes, we’re talking about OneDrive (Personal) v2. I gave up on OD4B, which unfortunately doesn’t work on Debian.
I did restore tests from OneDrive and Google Drive. OneDrive went up to 80% memory and then crashed, while the Google Drive restore never allocated more than 6% memory. I have 2 GB RAM, which means that 5% should mean about 100 MB, which is surprisingly low given the block sizes, but I am not 100% sure that the process viewer is showing that accurately?
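For reference, the percentage-to-megabytes conversion here is straightforward (a quick sanity check, assuming 2 GB = 2048 MB):

```shell
# 5% of 2 GB (2048 MB), using integer shell arithmetic
echo $(( 2048 * 5 / 100 ))
# → 102, so "5% ≈ 100 MB" checks out
```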
Yes, checking the logs a test run is three files, *.dlist.zip.aes, *.dindex.zip.aes, and *.dblock.zip.aes.
If that’s Job --> Show log --> Remote, could you click on the file names to see how big the files were?
This may also clarify what the 100 MB actually is. You refer to it as block size (or chunk size for Jotta), but there are several distinct settings with similar names, and the distinction could matter here.
Duplicati.CommandLine.BackendTool.exe would let you do a get of one of your larger files (possibly a dblock unless you have enough files that the dlist is larger), and see what that does to memory usage.
C:\Program Files\Duplicati 2>Duplicati.CommandLine.BackendTool help
Usage: <command> <protocol>://<username>:<password>@<path> [filename]
Example: LIST ftp://user:pass@server/folder
Supported backends: aftp,amzcd,azure,b2,box,cloudfiles,dropbox,file,ftp,googledrive,gcs,hubic,jottacloud,mega,msgroup,onedrive,onedrivev2,sharepoint,openstack,rclone,s3,od4b,mssp,sia,ssh,tahoe,webdav
Supported commands: GET PUT LIST DELETE CREATEFOLDER
C:\Program Files\Duplicati 2>
The target URL can be obtained using job Commandline or Export (which will probably quote it for shell).
For heavier testing involving changes or special files, you should use a separate folder (not your backup).
I’m not really familiar with Mono’s behavior, but I guess I’ll ask if you ever see memory usage going down.
I have run a number of tests and checked the logs and the memory. There is a strong correlation between the size of the verified file and the rise in memory allocation.
Yes, I have been sloppy with my terminology. What I most often refer to is the “Remote volume size”, which is set to 100 MB to 250 MB depending on the backup.
In my tests, if a file of 3 MB was tested, no visible impact on memory allocation could be seen, while if a 100 MB file was tested, the memory that the Duplicati processes use increased from 5% (my normal value, which is around 100 MB physical RAM) to 20%, and it jumps up for every large file that is verified. In the end, Duplicati crashes.
Once the memory has started to grow, it can suddenly drop by up to 20 percentage points; in one case it went from 75% to 50%. I assume that either Duplicati or the OS finds an unused part of the memory and frees it?
The third player here is Mono, which is actually what would return memory to the OS. As it executes the Duplicati code, it keeps track of what has become unused, so that it can be used elsewhere or returned. Generational GC describes this in (heavy) detail. The OS is too far removed from Duplicati to know what memory can be freed, but it can move pages of memory from most programs to swap, if unused awhile. There’s quite a bit of accounting available via programs like ps and top, if you want any. A move-to-swap will probably reduce “resident” size while total “size” remains unchanged. Mono release likely cuts “size”.
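As a sketch of that accounting (the process name is assumed to be mono here; column availability varies slightly between ps implementations):

```shell
# VSZ = total virtual size, RSS = resident (in-RAM) portion.
# A move-to-swap lowers RSS while VSZ stays put; a Mono release
# of memory back to the OS would shrink VSZ as well.
ps -o pid,vsz,rss,comm -C mono
```

Watching those two columns over repeated backups should distinguish "swapped out" from "actually freed".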
What I’d encourage first, though, is the previously suggested test of a pure file download without all of the other things Duplicati will do on a verification. If you’d rather stay in the GUI a bit longer, you could add the --disable-streaming-transfers option as a test to see if it helps. This will download to a temporary file, and not attempt to do quite as much in parallel in memory. If you have set --full-remote-verification, test without it.
To be more general, you might turn off (or mention) any Advanced options that are set and relevant. Your --dblock-size isn’t hugely more than the default. On the other hand, your system memory isn’t large.
I tried “--disable-streaming-transfers” with no success (Duplicati crashed). I haven’t tried the command-line download, as the example lists an FTP backend on a Windows machine; I need to test OneDrive on a Linux machine, which I suspect is a bit trickier? Can I find an example of how it’s done somewhere?
My suspicion is Mono for Debian, as OD4B also behaves differently there than in other environments.
“The target URL can be obtained using job Commandline or Export (which will probably quote it for shell).”
… let me suggest Export in case quoting is better, and for some reason I had to add a / before the folder.
C:\Users\xxx\AppData\Local\Temp>"C:\Program Files\Duplicati 2\Duplicati.CommandLine.BackendTool.exe" list "onedrivev2:///PC Backup/DELLXPS8500?authid=xxx"
Name Dir/File LastChange Size
duplicati-20190101T005253Z.dlist.zip File 1/1/2019 12:52:53 AM 521 bytes
duplicati-20190120T201428Z.dlist.zip File 1/20/2019 8:16:41 PM 714 bytes
duplicati-b718a3e0ae1ab419d89ef5f7d8e0562d1.dblock.zip File 1/20/2019 8:16:02 PM 49.95 MB
duplicati-b400108d8bda7485e9ac8923fab91e3e1.dblock.zip File 1/20/2019 8:16:41 PM 23.56 MB
duplicati-bdd1a0e3a9cc7437789a2585aa3462c2a.dblock.zip File 1/1/2019 12:52:52 AM 675 bytes
duplicati-i65be9aba4dd54d2ab6b2000eab35c5d4.dindex.zip File 1/1/2019 12:52:53 AM 611 bytes
duplicati-i082387c7936f4930bfdfc8b429212100.dindex.zip File 1/20/2019 8:16:03 PM 43.17 KB
duplicati-ifd947e91a8804e62b7d60ba3be216bc9.dindex.zip File 1/20/2019 8:16:41 PM 9.18 KB
duplicati-verification.json File 1/20/2019 8:16:42 PM 1.54 KB
C:\Users\xxx\AppData\Local\Temp>"C:\Program Files\Duplicati 2\Duplicati.CommandLine.BackendTool.exe" get "onedrivev2:///PC Backup/DELLXPS8500?authid=xxx" duplicati-b718a3e0ae1ab419d89ef5f7d8e0562d1.dblock.zip
Volume in drive C is Windows
Volume Serial Number is 5822-1128
Directory of C:\Users\xxx\AppData\Local\Temp
01/28/2019 12:36 PM 52,377,390 duplicati-b718a3e0ae1ab419d89ef5f7d8e0562d1.dblock.zip
1 File(s) 52,377,390 bytes
0 Dir(s) 24,719,097,856 bytes free
On Linux, you probably don’t have a convenient /usr/bin script for this, but you could look in one of the existing scripts to find the code location (e.g. /usr/lib/duplicati), then run mono /<directories>/Duplicati.CommandLine.BackendTool.exe …
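Putting that together, the Windows session above might look something like this on Debian. This is only a sketch: /usr/lib/duplicati is an assumed install path (check where your package put the .exe files), and the target URL/authid are placeholders from the earlier example:

```shell
# Assumed install path — verify by reading the /usr/bin/duplicati wrapper.
# List the remote files:
mono /usr/lib/duplicati/Duplicati.CommandLine.BackendTool.exe list \
  "onedrivev2:///PC Backup/DELLXPS8500?authid=xxx"

# Download one large dblock into the current directory while
# watching memory usage in another terminal:
mono /usr/lib/duplicati/Duplicati.CommandLine.BackendTool.exe get \
  "onedrivev2:///PC Backup/DELLXPS8500?authid=xxx" \
  duplicati-b718a3e0ae1ab419d89ef5f7d8e0562d1.dblock.zip
```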
I increased my OneDrive v2 test backup to include 275M of source files and use a 200MB dblock (Upload volume size) and ran a backup.
I then ran “Verify files” while monitoring the memory usage and while it did spike from ~1.4% to 5.1% usage it dropped right back down to normal after the verification was complete.
Below is a summary of what was downloaded and the memory usage (note the file Size numbers confirming 200+ MB was transferred).
Based on this MINIMAL test trying to replicate your scenario, it looks like my Ubuntu Docker running 188.8.131.52 on mono 184.108.40.206 does NOT have the same memory usage problem you are seeing.
- Jan 28, 2019 2:22 PM: get duplicati-b8244b8b3c1ab4f3e820601f0cc214855.dblock.zip.aes
- Jan 28, 2019 2:22 PM: get duplicati-i984901ae0bed478a9031cab644e43237.dindex.zip.aes
- Jan 28, 2019 2:22 PM: get duplicati-20190124T183246Z.dlist.zip.aes
- Jan 28, 2019 2:21 PM: list
I have been away for a few days, hence no answers. I have tested a bit on my Linux server but couldn’t really get it going. The suggested “get” command is not listed among the usable commands, so I tried the “test” command, but there were some other problems and it didn’t really get anywhere. I also tried running the CLI commands from the GUI, but that caused another problem that has to be fixed first…