Source size jumped up from 8GB to 128TB

Most of the forum reports (except the first) didn't include any user investigation.

Duplicati says I have 20x more data than actual (involved phone photos + NTFS compression)
VPS backup size showing 120TiB
Backup showing wrong source size, huge memory use causing system crash (two reports)

Have you already tried isolating it to a specific file in the folder? A fast way to do this might be:

Configure a backup of the folder in a separate job, and additionally set these job Advanced options (a command-line equivalent is sketched after the list):

dry-run (checkmark). This should keep it from getting stuck actually trying to process a huge file.
log-file=<path>
log-file-log-level=Verbose
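
If you'd rather test from the command line than set up another GUI job, the same options should work there too. This is only a rough sketch; the destination URL, log path, and source folder are placeholders to replace with your own (--no-encryption is just to avoid the passphrase prompt on a throwaway test):

```
Duplicati.CommandLine.exe backup file://D:\throwaway-destination "C:\backup source" --dry-run --no-encryption --log-file="C:\temp\dryrun.log" --log-file-log-level=Verbose
```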

After the run, check the log to see whether any file in the dry run shows a forecast size far beyond what you expect.

Example log output (nothing surprising in this case):

2025-02-03 09:01:04 -05 - [Verbose-Duplicati.Library.Main.Operation.Backup.FileBlockProcessor.FileEntry-WouldAddNewFile]: Would add new file C:\backup source\short.txt, size 118 bytes
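
If the verbose log gets long, a small script can pick out the oversized forecasts. This is only a rough Python sketch: the log path and the 100 GB threshold are placeholders, and it assumes sizes are printed in bytes/KB/MB/GB/TB as in the line above.

```python
import re

# Placeholders (not from the original post) - adjust to your setup.
LOG_PATH = r"C:\temp\dryrun.log"
THRESHOLD = 100 * 1024**3  # flag anything forecast above roughly 100 GB

UNITS = {"bytes": 1, "KB": 1024, "MB": 1024**2, "GB": 1024**3, "TB": 1024**4}
# Matches the verbose "WouldAddNewFile" lines, e.g.
# "... Would add new file C:\backup source\short.txt, size 118 bytes"
pattern = re.compile(r"WouldAddNewFile\]: Would add new file (.+), size ([\d.,]+) (\w+)")

with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        match = pattern.search(line)
        if not match:
            continue
        path, number, unit = match.groups()
        size = float(number.replace(",", "")) * UNITS.get(unit, 1)
        if size > THRESHOLD:
            print(f"{size / 1024**4:8.2f} TB  {path}")
```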

EDIT:

Note: A test with Process Monitor suggests that the dry run still actually reads the file, so be ready in Task Manager to kill the process if it gets stuck trying to read through (for example) a 128 TB file.