Mono CPU utilization is absurdly high

I’m noticing on a modern Mac running 10.10.5 that mono-sgen64 (which was only installed for Duplicati) is routinely taking 120-150% of CPU.

That’s not good - can anything be done about this?

What settings are you using for your backup? (Feel free to just paste an “export as command-line” result with personal data like passwords, ids, and hashes changed.)

I am seeing the same thing. I thought it was due to the mono installed on my machine for C# dev with Unity, but I’ve since uninstalled and reinstalled mono and have not opened Unity since a fresh restart.

Mac OS up to date.

Just installed Duplicati yesterday. CPU usage for mono-sgen64 seems to fluctuate between 100% and 170%.

Any thoughts?

Same here on macOS High Sierra. Maybe it’s related to the file compression routines used by Duplicati?

If you don’t mind doing a test you could try adding --zip-compression-level=0 to your job (or a test job) and see how the mono-sgen64 CPU usage looks…

--zip-compression-level
This option controls the compression level used. A setting of zero gives no compression, and a setting of 9 gives maximum compression.
Default value: “BestCompression”
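
If it helps, here’s a rough sketch of what that test could look like from the command line - the destination URL and source path are just placeholders, and depending on your install you may need to call Duplicati.CommandLine.exe through mono instead of duplicati-cli:

  # take the job's "export as command-line" output and append the option, e.g.:
  duplicati-cli backup "file:///Volumes/Backups/test-job" ~/Documents \
    --zip-compression-level=0 \
    --passphrase="..."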

Hi Jon, with a test run with zip compression disabled as per your command line, the load fluctuated between 8% and 115%, with an average of about 60%. This is on a dual-core Core i3. The upload ran at 5 to 8 MB/sec. I have attached a graph from Activity Monitor.

And with the normal job today (a bi-weekly update of 300 GB+ of data, of which ~15 GB may have changed), so with zip and encryption on, the load was not very different:
it fluctuated between 5% and 120%, on average about 50%. So it may be that the 170% load only happens on the initial run?

That could be the case, as the initial run has to back up 100% of the data (lots more database writes), while later runs still process the same amount of source data but most of it comes back as database reads - unless you have a lot of changed files, there’s generally no need to re-block / re-hash files that haven’t changed.

So based on your tests it sounds like compression likely isn’t the issue, which leaves things like these that happen a lot on initial backups and on backups with many file changes:

  • sqlite writes
  • hashing
  • encryption
  • file transfers
  • local source IO (reading complete new / changed files for blocking and hashing purposes vs. just metadata to determine no changes)
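
If you want to narrow it down further, one rough way to take encryption out of the picture too is a test job like this (the destination and source are placeholders; --no-encryption and --zip-compression-level are standard options):

  # compression is already ruled out above; this also removes the encryption cost
  duplicati-cli backup "file:///Volumes/Backups/test-job" ~/Documents \
    --zip-compression-level=0 \
    --no-encryption=true
  # if mono-sgen64 is still pegged, the remaining suspects are hashing, sqlite writes, and source IO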

Same issue here. After a terribly slow initial backup, Duplicati is now going through backend data verification (probably stuck) and the CPU usage of mono-sgen64 on a Mac mini is 800% (per Activity Monitor).
Now I also get an error message that the connection to the server was lost (a locally attached USB drive) and my CPU is overheating. So I have to force-quit those processes, which will probably totally screw up the databases.

What version of Duplicati are you using?

The latest beta, 2.0.3.3_beta_2018-04-02.

After the computer cooled down I restarted. mono is now using less than 1% CPU, but Duplicati has been stuck on “Starting …” for a very long time. I think all this interrupting and restarting broke something.

Duplicati is pretty good at doing things as transaction sets so hopefully it’s just taking a while to clean up the previously unfinished transactions…

So it was stuck on “Starting” for many hours, and now it “seems” to start the backup all over again. It lists the whole backup size as still to go, but the file count does not go down and the upload speed is stuck at 1.08 KB/s. So I think it is in fact stuck.
I stopped Duplicati one more time and restarted the computer, and now it seems to work again, although it does not display the speed anymore (it probably heard me yelling at it too often).
Here are some of the messages from the Duplicati log:

May 4, 2018 9:24 AM: Failed while executing "Backup" with id: 3
May 4, 2018 9:24 AM: Error in worker
May 4, 2018 9:09 AM: Reporting error gave error
May 4, 2018 9:09 AM: Request for http://localhost:8200/api/v1/backup/3/log?pagesize=100 gave error
May 3, 2018 4:58 PM: Error in updater

I’m seeing this same problem: CPU utilization is quite high while the backup is initializing (i.e. while Duplicati is “Counting…”) - once that’s done and the actual backup proceeds, CPU utilization drops.

Duplicati 2.0.3.3_beta_2018-04-02
macOS High Sierra 10.13.4
Source data size: ~120 GB

If you click on the actual line with a date it should expand to show details - for example, clicking on the “Failed while executing…” line should expand to show more info about errors that might have occurred.

Is there any difference if you set the --thread-priority lower?

--thread-priority
Selects another thread priority for the process. Use this to set Duplicati to be more or less CPU intensive.
Default value: “normal”

When you’ve upgraded to 2.0.3.4 or newer you might also want to look at enabling --use-background-io-priority to see if that has any effect:

--use-background-io-priority
This option instructs the operating system to set the current process to use the lowest IO priority level, which can make operations run slower but will interfere less with other operations running at the same time.
Default value: “false”
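
Here’s a rough sketch of both options together on a test run - treat the priority value as an assumption and check the help output for the names your version accepts:

  # "duplicati-cli help thread-priority" lists the accepted priority names
  duplicati-cli backup "file:///Volumes/Backups/test-job" ~/Documents \
    --thread-priority=low \
    --use-background-io-priority=true   # only available in 2.0.3.4 or newer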

I’ve played around with this issue some more. Here’s what I’m seeing:

  • The performance problem shows up when dealing with large numbers of files
    • One of my machines has backup file counts in the four and five digits and doesn’t seem to have any issues (though it’s also Windows and uses features specific to that platform).
    • My problematic machines (Macs) have file counts in the six and seven digit ranges.

For example, my main working laptop has a backup file count of over one million files - backups take about two hours, with most of that time spent “counting” files (that’s when the CPU utilization is at 400+ percent).

However, if I enable --check-filetime-only, CPU utilization drops to less than 200% during the “counting” process. The amount of time it takes to “count” the files is still around two hours, though.
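
In case anyone else wants to try it, this is roughly how the option goes onto the job from the command line (the destination and source below are placeholders for my real ones):

  # rely only on timestamps (rather than size/metadata too) to decide whether a file changed
  duplicati-cli backup "file:///Volumes/Backups/laptop-job" ~/ \
    --check-filetime-only=true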

It sounds like a combination of hashing the files and comparing those hashes to what’s in the database is what’s causing the issue. My guess is it’s mostly the database side of things.

There are some optimizations coming for the database, but I couldn’t say whether or not they’d help in this particular instance…

I am getting similar ~160% CPU utilization from the “mono-sgen” process with “2.0.4.5_beta_2018-11-28”. The average system load (per top) peaks at ~5.2, causing the Debian Linux box to generate an “overload” notification. I used to receive such warnings from Debian 8 only occasionally; now, after upgrading to Debian 9 a week back, I am getting them on every Duplicati backup session.

Does the problem continue with the latest canary versions? The latest is 2.0.4.18_canary_2019-05-12
Personally I’ve found most of the canary versions are quite stable (except certain ones like 2.0.4.13).

It does help to export the current backup configs to a file, install the canary as its own setup in a different folder, and then import the previous backups but point it to a different destination folder. This helps prevent the new version from potentially messing with your current version and backups.
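
A rough sketch of what “its own setup” can look like - the paths and port are examples, and the option names are worth double-checking against the canary’s help output:

  # run the canary server with its own data folder so it never touches the stable install's databases
  mono /opt/duplicati-canary/Duplicati.Server.exe \
    --server-datafolder=/opt/duplicati-canary-data \
    --webservice-port=8300   # keep it off the stable install's default port 8200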

Running the latest canary (105-1) plus the latest mono (6.8.0.105).
A database recreate on a local fileset of 250 GB flies to 90% completion and then slows to 2-5 thousandths of 1% every couple of minutes.

‘top’ shows mono-sgen gobbling 100+% of CPU. Since the last comments on this issue were about a year ago, and both Duplicati and mono have new versions, I think it should be re-opened.
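
One way to see where the recreate stalls is to run it from the command line with verbose console logging - a rough sketch with placeholder paths (repair rebuilds the local database when the file at --dbpath doesn’t exist):

  duplicati-cli repair "file:///mnt/backups/job" \
    --dbpath=/home/user/.config/Duplicati/JOB.sqlite \
    --console-log-level=Verbose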