RAM leak on OneDrive v2

#1

Hello. I have a problem with a backup on Debian 9 (Duplicati 2.0.4.18_canary_2019-05-12): while verification is in progress, memory keeps leaking, and once the server runs out of memory, Duplicati crashes.

mono /usr/lib/duplicati/Duplicati.CommandLine.exe backup onedrivev2://backup/doc /mnt/media/doc/ --backup-name=doc --dbpath=/root/.config/Duplicati/71877079907668816966.sqlite --encryption-module=aes --compression-module=zip --dblock-size=500MB --exclude-files-attributes=temporary --disable-module=console-password-input

And I cannot restore selected files:

{
  "RestoredFiles": 0,
  "SizeOfRestoredFiles": 0,
  "RestoredFolders": 0,
  "RestoredSymlinks": 0,
  "PatchedFiles": 0,
  "DeletedFiles": 0,
  "DeletedFolders": 0,
  "DeletedSymlinks": 0,
  "MainOperation": "Restore",
  "RecreateDatabaseResults": null,
  "ParsedResult": "Error",
  "Version": "2.0.4.18 (2.0.4.18_canary_2019-05-12)",
  "EndTime": "2019-05-14T06:58:15.085786Z",
  "BeginTime": "2019-05-14T06:47:53.765557Z",
  "Duration": "00:10:21.3202290",
  "MessagesActualLength": 16,
  "WarningsActualLength": 0,
  "ErrorsActualLength": 3,
  "Messages": [
    "2019-05-14 10:47:54 +04 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Restore has started",
    "2019-05-14 10:48:58 +04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  ()",
    "2019-05-14 10:49:08 +04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (186 bytes)",
    "2019-05-14 10:49:11 +04 - [Information-Duplicati.Library.Main.Database.LocalRestoreDatabase-SearchingBackup]: Searching backup 0 (5/9/2019 10:50:27 AM) ...",
    "2019-05-14 10:49:14 +04 - [Information-Duplicati.Library.Main.Operation.RestoreHandler-RemoteFileCount]: 1 remote files are required to restore",
    "2019-05-14 10:49:14 +04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b685d9add203f4056b7cc7c8315fd5c77.dblock.zip.aes (360.26 MB)",
    "2019-05-14 10:50:54 +04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Retrying: duplicati-b685d9add203f4056b7cc7c8315fd5c77.dblock.zip.aes (360.26 MB)",
    "2019-05-14 10:51:04 +04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b685d9add203f4056b7cc7c8315fd5c77.dblock.zip.aes (360.26 MB)",
    "2019-05-14 10:52:44 +04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Retrying: duplicati-b685d9add203f4056b7cc7c8315fd5c77.dblock.zip.aes (360.26 MB)",
    "2019-05-14 10:52:54 +04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b685d9add203f4056b7cc7c8315fd5c77.dblock.zip.aes (360.26 MB)",
    "2019-05-14 10:54:34 +04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Retrying: duplicati-b685d9add203f4056b7cc7c8315fd5c77.dblock.zip.aes (360.26 MB)",
    "2019-05-14 10:54:44 +04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b685d9add203f4056b7cc7c8315fd5c77.dblock.zip.aes (360.26 MB)",
    "2019-05-14 10:56:24 +04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Retrying: duplicati-b685d9add203f4056b7cc7c8315fd5c77.dblock.zip.aes (360.26 MB)",
    "2019-05-14 10:56:34 +04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b685d9add203f4056b7cc7c8315fd5c77.dblock.zip.aes (360.26 MB)",
    "2019-05-14 10:58:14 +04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Failed: duplicati-b685d9add203f4056b7cc7c8315fd5c77.dblock.zip.aes (360.26 MB)",
    "2019-05-14 10:58:15 +04 - [Information-Duplicati.Library.Main.Operation.RestoreHandler-RestoreFailures]: Failed to restore 2 files, additionally the following files failed to download, which may be the cause:\nduplicati-b685d9add203f4056b7cc7c8315fd5c77.dblock.zip.aes"
  ],
  "Warnings": [],
  "Errors": [
    "2019-05-14 10:58:14 +04 - [Error-Duplicati.Library.Main.Operation.RestoreHandler-PatchingFailed]: Failed to patch with remote file: \"duplicati-b685d9add203f4056b7cc7c8315fd5c77.dblock.zip.aes\", message: A task was canceled.",
    "2019-05-14 10:58:15 +04 - [Error-Duplicati.Library.Main.Operation.RestoreHandler-RestoreFileFailed]: Could not find file \"/mnt/media/doc/books/programming/prepro/Khaggarti_R_-_Diskretnaya_matematika_dlya_progr.pdf\".",
    "2019-05-14 10:58:15 +04 - [Error-Duplicati.Library.Main.Operation.RestoreHandler-RestoreFileFailed]: Could not find file \"/mnt/media/doc/books/programming/assembler/Туториалы Iczelion`а на русском (MASM для Windows).chm\"."
  ],
  "BackendStatistics": {
    "RemoteCalls": 6,
    "BytesUploaded": 0,
    "BytesDownloaded": 0,
    "FilesUploaded": 0,
    "FilesDownloaded": 0,
    "FilesDeleted": 0,
    "FoldersCreated": 0,
    "RetryAttempts": 4,
    "UnknownFileSize": 0,
    "UnknownFileCount": 0,
    "KnownFileCount": 186,
    "KnownFileSize": 47833600082,
    "LastBackupDate": "2019-05-09T14:50:27+04:00",
    "BackupListCount": 2,
    "TotalQuotaSpace": 5497558138880,
    "FreeQuotaSpace": 4769961182450,
    "AssignedQuotaSpace": -1,
    "ReportedQuotaError": false,
    "ReportedQuotaWarning": false,
    "MainOperation": "Restore",
    "ParsedResult": "Success",
    "Version": "2.0.4.18 (2.0.4.18_canary_2019-05-12)",
    "EndTime": "0001-01-01T00:00:00",
    "BeginTime": "2019-05-14T06:47:53.765584Z",
    "Duration": "00:00:00",
    "MessagesActualLength": 0,
    "WarningsActualLength": 0,
    "ErrorsActualLength": 0,
    "Messages": null,
    "Warnings": null,
    "Errors": null
  }
}

Could you help me? I think I should reduce the block size to 100 MB, but that doesn’t help with restoring the existing backup.
Thanks.

#3

Well, that was rude… Memory is very unlikely to be leaking, since everything is managed code: we don’t handle memory ourselves, so we can’t mess it up.

If your system cannot handle holding multiple 500 MB blocks in memory, then you need to recreate your backup with a smaller, more compatible block size.
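As a sketch of what recreating the backup might look like (this reuses the poster’s original command with only `--dblock-size` changed; the old backup data at the destination would have to be removed first, or a new destination used):

```shell
# Hypothetical re-run of the backup from post #1 with a smaller remote
# volume size. All paths and options are copied from the original
# command; only --dblock-size differs.
mono /usr/lib/duplicati/Duplicati.CommandLine.exe backup \
  onedrivev2://backup/doc /mnt/media/doc/ \
  --backup-name=doc \
  --dbpath=/root/.config/Duplicati/71877079907668816966.sqlite \
  --encryption-module=aes --compression-module=zip \
  --dblock-size=100MB \
  --exclude-files-attributes=temporary \
  --disable-module=console-password-input
```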

#4

So the problem is with mono-sgen?

#5

Everything in Duplicati runs under mono-sgen, because that’s how your system executes C# code.

But there isn’t a way to fix the backup if you already backed up with 500 MB blocks and your system can’t verify them without running out of memory.

Your memory usage looks completely reasonable for a 500MB block size.

#6

OK, I will try to redo the backup and restore again with a 100 MB block size.
I have a question: why don’t I see the download progress and the list of files while a restore is in progress?

UPDATE: I was able to restore the files after a “delete and repair” of the database.
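From the command line, the “delete and repair” step might look roughly like this (a sketch, assuming the database and destination paths from post #1; `repair` rebuilds a missing local database from the remote files):

```shell
# Remove the local database, then let "repair" recreate it by reading
# the filelists and indexes from the backend. Paths are the ones used
# in the backup command earlier in this thread.
rm /root/.config/Duplicati/71877079907668816966.sqlite
mono /usr/lib/duplicati/Duplicati.CommandLine.exe repair \
  onedrivev2://backup/doc \
  --dbpath=/root/.config/Duplicati/71877079907668816966.sqlite
```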

#7

I’m not hugely familiar with the restore code, but I think it’s oriented around backend files rather than source files the way backup is (and even there, I think the nice per-source-file UI wasn’t always present; UI displays sometimes improve over time).

There’s far from a one-to-one mapping of source files to backup files, and a source file that’s updated many times will only have its changed blocks uploaded during backup, so those blocks may be aggregated with blocks from other files in a dblock. Restore downloads whatever dblocks are needed for the requested files, then takes them apart to get the blocks. Choosing sizes in Duplicati discusses the tradeoffs. 500 MB may be reasonable; some go even higher to avoid the perhaps-still-affecting-us 5,000-file list view limit in OneDrive (go beyond that and you see nothing…).

I assume the database recreate took the place of repeating the backup, and that the two troublesome files now restore. That reassures me, because I’ve seen some odd upload behavior to OneDrive on a recent canary, and I’m hoping it’s just me somehow. The duplicati-b685d9add203f4056b7cc7c8315fd5c77.dblock.zip.aes download error wasn’t well explained last time, but you could probably have seen the cause in About --> Show log --> Live at Information level or above. If there was a hash mismatch, you might have “fixed” it by importing the bad hash into the database during the recreate. To be extra cautious, you can try downloading the file and opening it with AES Crypt or the CLI tool SharpAESCrypt.exe in the Duplicati folder. But successful file restores are also good proof.
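A manual decrypt check on the suspect dblock might look like this (a sketch: the `d password from to` argument order is how the bundled SharpAESCrypt tool is usually invoked, but check its built-in help first; `$PASSPHRASE` and the output filename are placeholders):

```shell
# Hypothetical integrity check of a downloaded dblock file: try to
# decrypt it with the SharpAESCrypt tool shipped in the Duplicati
# folder ("d" = decrypt). Corruption typically fails here.
mono /usr/lib/duplicati/SharpAESCrypt.exe d "$PASSPHRASE" \
  duplicati-b685d9add203f4056b7cc7c8315fd5c77.dblock.zip.aes \
  decrypted-dblock.zip
```

If the decrypt succeeds, the downloaded copy is at least a valid AES Crypt file, which points suspicion elsewhere (transfer, hash recorded in the database, etc.).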

#8

Thanks for the response!
As far as I know, OneDrive for Business has a 300,000-file limit, but I’m not sure and can’t find confirmation. I hope that will be enough for 2 TB of data.
I’ve changed the block size to 100 MB and am now waiting for the backup to finish so I can check for the RAM leak.

#9

OneDrive for Business - files in folder limited ? mentioned the 300,000-file limit and also the 5,000-per-folder limit, which some people seem to hit. In Failed Backup with id: 3 (Missing Files), switching to OneDrive v2 did successfully work around it, but I think I’ve also seen some reports where the Microsoft limit remained in effect.

For a 2 TB backup at the default 100 KB --blocksize, your database will have to track 20 million blocks, so unless you prefer deduplication effectiveness over database speed (especially felt if the database gets lost and must be recreated), a bigger --blocksize may help.
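The arithmetic behind that estimate can be sketched like this (assuming 100 KB means 100 KiB, which is how Duplicati’s default --blocksize is usually stated):

```python
# Rough estimate of how many blocks the local database must track:
# roughly (total backed-up data) / (--blocksize).
def block_count(total_bytes: int, blocksize_bytes: int) -> int:
    return total_bytes // blocksize_bytes

TB = 1000 ** 4   # 2 TB of source data
KB = 1024        # treat "100KB" as 100 KiB

print(block_count(2 * TB, 100 * KB))    # default 100 KB blocksize: ~20 million
print(block_count(2 * TB, 1024 * KB))   # a 1 MB blocksize: ~10x fewer rows
```

A 10x larger --blocksize means roughly 10x fewer rows in the block table, at the cost of coarser deduplication.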

#10

OK, the same picture with a 100 MB block size. :frowning:


And restoring fails with errors. Again.

#11

If it’s similar to last time, I still need the information requested last time; otherwise, there’s not much to say without more information.

People running canary should also know that canary is bleeding-edge and will sometimes get new errors; however, those who run it can help solve them before they reach the larger beta (now 2.0.4.5) population. While this is much appreciated and a real service that I hope we can continue, canary isn’t everyone’s choice.

If you’d rather not use logs, want to stay on canary, and are willing to go back a few releases (not always possible), trying 2.0.4.15 would be interesting, to see whether it avoids those restore errors where it retries and then fails. One of my mysterious problems seemed to get better with that version. Parallel uploading was added in 2.0.4.16, and bugs from it have been shaken out in 2.0.4.17 and 2.0.4.18, but possibly there are more to remove.

releases

Alternatively, have you been on canary for a while? If so, when did you first start seeing that restore failure?

There are numerous other ways to check whether you’re hitting the problem I saw, but some are more complicated. One easy way to check the integrity of the remote files (which I haven’t tried, but it seems like it would work) is to set --backup-test-samples to all, do a backup, and wait while it downloads and integrity-tests all files after the backup. If that works, remove the option to let it revert to its default sample of only 1 set.
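A sketch of that full-verification run, reusing the paths from post #1 (I haven’t tried `all` as a value myself, as noted above, so treat it as an assumption to verify against the option’s help text):

```shell
# Run a backup with full post-backup verification: with
# --backup-test-samples=all, every remote file is downloaded and
# integrity-tested after the backup (slow, and it costs bandwidth).
mono /usr/lib/duplicati/Duplicati.CommandLine.exe backup \
  onedrivev2://backup/doc /mnt/media/doc/ \
  --dbpath=/root/.config/Duplicati/71877079907668816966.sqlite \
  --backup-test-samples=all
```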

#12

Big thanks for the response!
After resizing the blocks, the first problem is gone. And the restore problem was solved by a “delete and repair” of the database.