Do you run Duplicati as a Windows service? Is it using the default SYSTEM profile, or did you move the Duplicati config?
A profile mismatch typically produces a completely dead Duplicati rather than odd behavior, but I want to be sure it’s not a possible cause.
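For reference, when the service runs under the default SYSTEM account, Duplicati keeps its databases under the SYSTEM profile’s local AppData. This path is from memory, so verify it on your machine:

```
C:\Windows\System32\config\systemprofile\AppData\Local\Duplicati
```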
Beyond that I have no guess why Server 2022 is working so very differently from Server 2019…
The fact that Duplicati reports 20x more data than actually exists is a long shot, but possibly relevant, maybe sparse files or a bug.
You could note the time, run test backups of subsets of the source, and delete those versions when/if the bad area is found.
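If you go that route, versions can also be deleted from the command line. This is only a sketch; the storage URL and database path are placeholders, so check the built-in help for exact usage:

```
Duplicati.CommandLine.exe help delete
Duplicati.CommandLine.exe delete "<storage-url>" --version=0 --dbpath="<path-to-local-db>"
```

Version 0 is the most recent backup.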
Maybe also turn off auto-compact and retention deletions while doing the chopped-down backups.
You don’t want one of them accidentally being picked as the backup-of-the-year that you keep for 10 years.
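As a rough sketch, assuming you drive this with advanced options: clear out any keep-time, keep-versions, or retention-policy settings for the duration of the test, and disable compacting with:

```
--no-auto-compact=true
```

That way nothing gets deleted or repackaged behind your back while you experiment.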
If you don’t want to mess with the real backup, you can do separate test backups instead, but it might be slower.
Cutting down an existing backup probably won’t upload much; it basically just records that some files were deleted.
Actual deletion of file data from the destination won’t occur, as long as the file is still in some other version.
You can watch About → Show log → Live → Verbose. Maybe you can match files to the size jump.
Looping or other bad enumeration patterns might also be easier to spot there. If it helps, you can also log to a file with log-file=<path> and log-file-log-level=verbose.
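For example (the log path here is just an illustration; point it anywhere writable):

```
--log-file=C:\temp\duplicati.log
--log-file-log-level=verbose
```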