Backup states/phases - information reported to users about what their backup is doing is not helpful

TL;DR: The database query shown was checking block sizes. Other factors may also have been slowing it down; analyzing an apparent performance problem will require some investigation, if you’re willing.

Are you familiar with SQLite database performance? I’ve done very little myself, but your question came in between the start and the timed end of a Structured Query Language (SQL) statement that was running and possibly taking longer than it should (depending on your backup size and other factors). Seeing that is helpful for digging deeper to (try to) identify the source code running it, and possibly answer your question about what it’s doing.

https://github.com/duplicati/duplicati/blob/v2.0.3.3-2.0.3.3_beta_2018-04-02/Duplicati/Library/Main/Database/LocalDatabase.cs#L723 is my guess, in a routine named GetBlocksLargerThan(), which is then called from https://github.com/duplicati/duplicati/blob/3ec3dc649953c3de2d9595428ba4189f6c86aa4d/Duplicati/Library/Main/Utility.cs#L169 as a check to see whether any blocks are too large, and if so, to report:

“You have attempted to change the block-size on an existing backup, which is not supported. Please configure a new clean backup if you want to change the block-size.”
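To make the check more concrete, here is a minimal sketch of what a GetBlocksLargerThan-style query conceptually does, using Python’s built-in sqlite3 module against a toy in-memory database. The table and column names ("Block", "Size") are my assumptions for illustration, not a claim about Duplicati’s exact schema:

```python
import sqlite3

# Toy stand-in for a backup's local database. The "Block"/"Size"
# names are assumptions for illustration only.
con = sqlite3.connect(":memory:")
con.execute('CREATE TABLE "Block" ("ID" INTEGER PRIMARY KEY, "Size" INTEGER)')
con.executemany('INSERT INTO "Block" ("Size") VALUES (?)',
                [(102400,), (102400,), (51200,)])

blocksize = 102400  # the configured block size, in bytes

# The check: count blocks larger than the configured block size.
(count,) = con.execute(
    'SELECT COUNT(*) FROM "Block" WHERE "Size" > ?', (blocksize,)).fetchone()

# Here no block exceeds the limit, so this does not trigger.
if count > 0:
    raise ValueError("You have attempted to change the block-size on an "
                     "existing backup, which is not supported.")
print(count)
```

On a real backup with millions of rows, even a simple count like this has to touch a lot of data, which is part of why it can take a while.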

I guess if one were to assign a few words to report to the user, it might say something like “Checking block sizes”, but that doesn’t do much to explain why it runs for so long. That’s a performance question going well beyond Duplicati to everything else that might be happening on your system at the time of the slow spot. Yours sounds reliably slow, which is good, because mine jumps all around, perhaps based on system load and especially on disk load, because this query “seems” like it’s primarily doing I/O. Task Manager’s “Performance” tab can show all such loads. There’s got to be some reason why it’s slow. How big is this backup? How big is the local database for this backup? You can click “Database” for the job to see the database path.
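If you want to see for yourself whether a query like this has to read the whole table (which would explain I/O-bound slowness on a big database), SQLite’s EXPLAIN QUERY PLAN can tell you. A sketch, again with assumed table/column names; you could point sqlite3.connect() at the actual database path shown on the job’s “Database” screen instead of an in-memory copy:

```python
import sqlite3

# Ask SQLite how it would execute the size check. "Block"/"Size" are
# assumed names for illustration, not Duplicati's exact schema.
con = sqlite3.connect(":memory:")
con.execute('CREATE TABLE "Block" ("ID" INTEGER PRIMARY KEY, "Size" INTEGER)')

plan = con.execute(
    'EXPLAIN QUERY PLAN SELECT COUNT(*) FROM "Block" WHERE "Size" > ?',
    (102400,)).fetchall()
for row in plan:
    # A "SCAN" step means a full pass over the table - on a large
    # database that is a lot of disk I/O, with no index to shortcut it.
    print(row)
```

If the plan shows a full scan rather than an index search, the time taken should grow roughly with the size of the local database, which lines up with the “how big is this backup?” question above.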

Looking up GetBlocksLargerThan in the forum posts did find one other slow case, with some discussion about different ways to write the SQL query. I think optimization is one reason why timing of operations is done.

Stuck on “Starting …” might sound all too familiar to you. That case had slow queries, much work, and no tidy ending. It was on a VM (raising some more questions). Is yours directly on a computer that’s typically plenty fast?

Heading in a completely different direction, your observation on system file exclusion might be a CPU issue.
Filter group has incredibly bad performance
Filter groups code is extremely slow #3395
I’m not familiar with this code (I’m just a user, like most people trying to help on this forum), so I’ll stop there…