Possibility to identify large directories in backup

Welcome to the forum @smo

For a better but slower test, use `no-local-blocks`, otherwise it gets most blocks from the source files, not from the backup:

> Duplicati will attempt to use data from source files to minimize the amount of downloaded data. Use this option to skip this optimization and only use remote data.
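
If the “test” here is a trial restore, a rough sketch would be something like the following (destination URL, passphrase, and paths are placeholders, so adjust to your setup):

```
# Restore everything from the latest version into a scratch folder, with
# --no-local-blocks=true so all data is fetched from the destination
# instead of being reused from matching local source files.
duplicati-cli restore "file:///mnt/backupdrive/duplicati" "*" \
  --restore-path="/tmp/restore-test" \
  --passphrase="your-passphrase" \
  --no-local-blocks=true
```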

If you’re really willing to read the database, Database rebuild has info on the tables. Source file sizes are in the Blockset table. Fitting the wish, the File table is now a view, with a PathPrefix table storing the unique folder prefixes.
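
If you do go that route, here’s a rough sketch of the kind of query I mean, run against a copy of the job database (not the live one). Table and column names (FileLookup, PathPrefix, Blockset) are from memory of a recent schema and may differ in your version, so check them in your database first:

```
# Sum recorded source file sizes per folder prefix, largest first.
# Assumed columns: FileLookup(PrefixID, BlocksetID), PathPrefix(ID, Prefix),
# Blockset(ID, Length). Folders and symlinks have no matching Blockset row,
# so the joins skip them automatically.
sqlite3 /path/to/copy-of-job-database.sqlite "
SELECT p.Prefix, SUM(b.Length) AS TotalBytes
FROM FileLookup f
JOIN PathPrefix p ON p.ID = f.PrefixID
JOIN Blockset   b ON b.ID = f.BlocksetID
GROUP BY p.Prefix
ORDER BY TotalBytes DESC
LIMIT 20;"
```

Note that this counts every file version recorded in the database; to limit it to a single backup version you would (I think) also join FilesetEntry for the fileset you care about.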

But there’s a simpler way to know file sizes: just list the files. See Visualize backup data usage for more.
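
For example, a hedged sketch with the CLI (URL and passphrase are placeholders); as I recall, the file entries in the listing include their sizes:

```
# List everything in the most recent backup version.
duplicati-cli list "file:///mnt/backupdrive/duplicati" "*" \
  --passphrase="your-passphrase"
```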

The TEST-FILTERS command can help test, and (contrary to the documentation) appears to accept multiple folders. You could also pair that up with `stat --format="%s"` or something else that can show the sizes.
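
A rough sketch of that pairing (the paths and filter are placeholders, and the exact test-filters output format depends on version, so treat the second step as an assumption and adapt it to what you actually see):

```
# Dry-run the filters over two source folders to see what would be included.
duplicati-cli test-filters /home/smo/docs /home/smo/photos --include="*.jpg"

# With the included paths saved one per line to files.txt (trim any log
# prefix from the test-filters output first), stat prints sizes in bytes:
xargs -d '\n' stat --format="%s %n" < files.txt | sort -rn | head
```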