Backup process stuck and high CPU load

Dear Community,

I’ve just run into some problems (unknown issues) and want to write down my experience so I don’t forget anything before I can gather more details.

  • I’ve installed Duplicati on a new machine and started testing the first backup.
  • The system is Windows 10 Pro 21H1 19043.1052.
  • I have two user accounts on the system: one admin user and one standard user.
  • I did not set a password for securing access because both accounts are used by me (if this may be relevant).
  • Duplicati is set to autostart.
  • I am trying to back up a folder (~4GB, ~3000 files) from a local HDD to a USB3 drive (2TB, HDD).
  • I used the GUI to set up a plan, quite straightforward with no pro options.
  • Manual run, no user filters, just exclude options for system and temporary files, and keep 4 backups.
  • I started the plan also via GUI.
  • The progress went through until it showed “0 files remaining … speed # MB/s”.
  • It kept showing this for a long time (~30 min) while the speed dropped very slowly.
  • After that it switched to the state “removing unnecessary files”.
  • It kept showing this for about an hour before I lost my patience.
  • Backup files are present on the USB3 drive.
  • I tried to end the job after the last file → nothing.
  • I tried to end it outright → nothing.
  • I looked at Windows task manager and found two Duplicati processes.
  • One showed very high power consumption and kept at least one logical processor at 100%.
  • I selected quit from task icon and did a reboot.
  • I remembered the message about multiple-user environments and checked the other account (for the first time since installation).
  • Duplicati was in autostart there too and showed the multiple-user-environment message again.
  • I tried to check the backup and again noticed high CPU usage with nothing apparently going on (as far as I could tell).
  • I tried to stop Duplicati again; no response until I did a reboot.
  • I deactivated autostart and felt quite lost.

Am I doing something totally wrong?
What is the multiple-user thing about, apart from security?
I could not find threads directly addressing my issues; I guess I have to read more.
Maybe I can get a hint in the right direction.
I’ll try to add more details/logs once the weekend arrives.

System specification

  • AMD Ryzen 9
  • 32GB RAM
  • Seagate ST2000DM008-2FR102
  • WD Elements 2621 USB Device


I have the same issue on an irregular basis. Usually it finishes eventually, even though it seems stalled.

It would be nice if the devs could add something like a heartbeat to show whether it is stalled or actually doing something.

Welcome to the forum @sthag

Thanks for the good detail. The main additional diagnostic I can think of, if it happens again, is logging.

About → Show log → Live → Retry is handy. If need be, you can increase it to Profiling to look for activity.
Long-term capture (if needed) is probably easier with log-file=<path> and log-file-log-level, since it runs unattended.
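As a sketch of the unattended route (the destination URL, source path, and log path below are placeholders, not from your post), the two options can be supplied on a Duplicati.CommandLine.exe run like this:

```shell
REM Example only: run a backup with file logging at Profiling level.
REM Destination, source, and log paths are placeholders; adjust to your setup.
Duplicati.CommandLine.exe backup "file://E:\DuplicatiBackups" "C:\Data" ^
  --log-file="C:\Logs\duplicati.log" ^
  --log-file-log-level=Profiling
```

The same two advanced options can instead be added to the job in the GUI, which keeps the scheduled runs logging without any command-line use.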

It sounds almost like it’s having trouble writing to that USB drive, although I don’t know why that would be.
The thread “Stop now doesn’t stop immediately” explains why a truly immediate stop doesn’t exist: it has to finish the file uploads.

There are typically two. The first is the base install in Program Files; it just starts the latest installed update.

This would be the one doing the actual work, but that’s more work than I would expect it to be doing. If it’s somehow doing SQL, you would see that at the Profiling log level. If it’s doing I/O, you could watch it with Sysinternals Process Monitor for details (which might show .sqlite* database files, or destination work).

I think that message just lets you password-protect the UI. The launch-at-login Duplicati instances are per-user.
Duplicati.WindowsService.exe can create a start-at-boot Duplicati for all users if you like that arrangement.
There are some caveats, e.g. Windows version upgrades wipe out the SYSTEM profile, so store Duplicati’s data elsewhere.
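For reference, a minimal sketch of the service route, assuming the default install directory (run from an elevated command prompt):

```shell
REM Install Duplicati as a Windows service so it starts at boot for all users.
REM The install path is the default; adjust if yours differs.
cd "C:\Program Files\Duplicati 2"
Duplicati.WindowsService.exe install
REM It can be removed later with:
REM Duplicati.WindowsService.exe uninstall
```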

I’m not seeing anything like that.

I’m not sure what you’d like, but I made some comments above. Ask questions if you’d like more.

An irregular issue probably requires the log-file route. You don’t want to be watching a live log of every backup.

If it finishes, check your log and see whether it did a compact that actually made some new files. That can take a while for large backups, compared to the time the backup itself takes; it’s repackaging the older files.

This would not occur on the first backup mentioned in the original post; it takes a while to accumulate wasted space.
Alternatively, you can run with no-auto-compact for a while to see whether it avoids the after-backup slowdown.
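As a sketch, that option can be added to a command-line run like this (destination and source are placeholders; in the GUI it goes under the job’s advanced options):

```shell
REM Example only: disable automatic compacting after the backup completes.
Duplicati.CommandLine.exe backup "file://E:\DuplicatiBackups" "C:\Data" --no-auto-compact=true
```

If the slowdown disappears with this set, that points at compact as the culprit rather than the backup itself.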