Performance is really slow with high CPU usage!

There currently isn’t a “here are the slow parts” profiling function, but the Live Log does have a profiling mode that shows pretty much everything being done and how long it takes (it’s not pretty, though). There has been some discussion about adding metrics, but so far development time has been focused on functionality.

Are you getting these 1GB/min times on initial backups, subsequent runs, or both?

As far as what Duplicati is doing, the summary is (more or less in this order; a couple of rough code sketches follow the list):

  1. compare list of current files to what’s already been backed up & store in a sqlite database
  2. chop up changed / new files into blocks & record the block references in the sqlite database (while flagging records associated with deleted files as being deletable)
  3. zip up a chunk of blocks into a dblock (archive) file
  4. upload dblock file
  5. repeat 2-4 as necessary
  6. when enough deletable entries are found, download the associated dblocks, recompress them without the deleted file contents, and upload them (then update the sqlite database)
  7. when enough sparse dblock files are found (such as can result from step 6), download them, recompress them into fewer non-sparse files, and upload them
  8. download some dblock files for validation
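
To make the block/dblock part a bit more concrete, here’s a rough Python sketch of steps 1–5. This is just an illustration of the idea, not Duplicati’s actual code, and the sizes are only roughly in line with the defaults:

```python
# Conceptual sketch only -- not Duplicati's code. It illustrates the
# "chop files into blocks, dedupe against a local index, pack new blocks
# into dblock archives" flow from steps 1-5 above.
import hashlib
import sqlite3
import zipfile

BLOCK_SIZE = 100 * 1024           # roughly Duplicati's long-standing default block size
DBLOCK_TARGET = 50 * 1024 * 1024  # roughly the default remote volume size

def backup(paths, db_path="index.sqlite", out_prefix="dblock"):
    db = sqlite3.connect(db_path)
    db.execute("CREATE TABLE IF NOT EXISTS blocks (hash TEXT PRIMARY KEY)")

    volume_no, volume_bytes = 0, 0
    archive = zipfile.ZipFile(f"{out_prefix}-{volume_no}.zip", "w")

    for path in paths:
        with open(path, "rb") as f:
            while True:
                block = f.read(BLOCK_SIZE)
                if not block:
                    break
                digest = hashlib.sha256(block).hexdigest()
                # Skip blocks we've already stored (deduplication via the local db).
                if db.execute("SELECT 1 FROM blocks WHERE hash=?", (digest,)).fetchone():
                    continue
                archive.writestr(digest, block)
                db.execute("INSERT INTO blocks (hash) VALUES (?)", (digest,))
                volume_bytes += len(block)
                # Once the archive is "full", close it (this is where the real
                # program would upload it) and start a new one.
                if volume_bytes >= DBLOCK_TARGET:
                    archive.close()
                    volume_no, volume_bytes = volume_no + 1, 0
                    archive = zipfile.ZipFile(f"{out_prefix}-{volume_no}.zip", "w")

    archive.close()
    db.commit()
    db.close()
```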
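
And a similarly rough sketch of the decision behind steps 6–7. The numbers and names here are made up for illustration; Duplicati has its own tunable thresholds for when compacting kicks in:

```python
def should_compact(volumes, wasted_threshold=0.25, small_size=10 * 1024 * 1024,
                   max_small_volumes=20):
    """volumes: list of (total_bytes, deletable_bytes) tuples, one per dblock file.
    Thresholds are illustrative, not Duplicati's actual settings."""
    total = sum(size for size, _ in volumes) or 1
    wasted = sum(deletable for _, deletable in volumes)
    small = sum(1 for size, _ in volumes if size < small_size)

    # Step 6: enough of the stored data now belongs only to deleted files.
    if wasted / total >= wasted_threshold:
        return True
    # Step 7: enough small/sparse volumes have piled up to be worth merging.
    return small >= max_small_volumes
```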

More specifics can be found here: