I think the encrypted file is downloaded to a temporary file, decrypted to a second temporary file, and then the resulting .zip file is compared against the local job database on disk. There should be activity…
If you were in the GUI, About → Show log → Live → Profiling would show database use. The command line can set a log file and log level too, but the options must be given at the start of a run.
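For example, a rough sketch of a command-line test with file logging (the storage URL, sample count, and database path are placeholders):

Duplicati.CommandLine.exe test "<storage-url>" 100 --dbpath=<path-to-job-database> --log-file=test-run.log --log-file-log-level=Profiling

On non-Windows installs the launcher may be duplicati-cli instead.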
Then you don’t have the new SQLite cache memory enhancement, although it’s possible to turn it on with an environment variable if you really want it.
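If I recall correctly, it looks like this (treat the variable name and value as assumptions to verify), set before Duplicati starts:

export CUSTOMSQLITEOPTIONS_DUPLICATI=cache_size=-200000

A negative cache_size is in KiB, so -200000 asks SQLite for roughly 200 MB of page cache.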
I think test also does a check on database integrity. At least the check run before a backup does.
Backup stores progress; few or no other operations do. Test also raises the question of what progress even means: if you ask for a sample of 100, interrupt at 50, and run it again, what happens? Probably a new random sample of 100 is drawn, not the remainder of the sample picked before.
For a similar warning (and I don’t know if anything has been done there, or can be done here):

I feel I should just throw it all away and restart from scratch
As mentioned, you can restart from scratch without throwing the damaged one away; however, if you’re absolutely sure that history will never be missed, then toss it, I guess.
There’s also another way to test for damage, though one user found test was faster.
--upload-verification-file (Boolean): Determine if verification
files are uploaded
Use this option to upload a verification file after changing
the remote storage. The file is not encrypted and contains
the size and SHA256 hashes of all the remote files and can
be used to verify the integrity of the files.
* default value: false
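To turn that on for a command-line backup, something like this (storage URL and source folder are placeholders):

Duplicati.CommandLine.exe backup "<storage-url>" <source-folder> --upload-verification-file=true

In the GUI it would be the same name added as an advanced option.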
Because it records only the size and hash, it doesn’t check the internal structure of the files. Another limitation is that it assumes you can reach the destination files directly as local files, so downloading from B2 to some local space is needed; that local copy would also give you your history if you ever need it. You can do a fast download with rclone or similar, for example:
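(a sketch assuming an rclone remote named b2 is already configured; bucket and paths are placeholders)

rclone sync b2:mybucket/myfolder /local/duplicati-files --progress

Or try the new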
Duplicati.CommandLine.SyncTool --help
Description:
Remote Synchronization Tool
This tool synchronizes two remote backends. The tool assumes that the intent is
to have the destination match the source.
If the destination has files that are not in the source, they will be deleted
(or renamed if the retention option is set).
If the destination has files that are also present in the source, but the files
differ in size, or if the source files have a newer (more recent) timestamp,
the destination files will be overwritten by the source files. Given that some
backends do not allow for metadata or timestamp modification, and that the tool
is run after backup, the destination files should always have a timestamp that
is newer (or the same if run promptly) compared to the source files.
If the force option is set, the destination will be overwritten by the source,
regardless of the state of the files. It will also skip the initial comparison,
and delete (or rename) all files in the destination.
If the verify option is set, the files will be downloaded and compared after
uploading to ensure that the files are correct. Files that already exist in the
destination will be verified before being overwritten (if they seemingly
match).
Usage:
Duplicati.CommandLine.SyncTool <backend_src> <backend_dst> [options]
Arguments:
<backend_src> The source backend string
<backend_dst> The destination backend string
Options:
-y, --confirm, --yes Automatically confirm the operation
[default: False]
-d, --dry-run Do not actually write or delete files. If
not set here, the global options will be
checked [default: False]
--dst-options <dst-options> Options for the destination backend. Each
option is a key-value pair separated by an
equals sign, e.g. --dst-options key1=value1
key2=value2 [default: empty] []
-f, --force Force the synchronization [default: False]
--global-options <global-options> Global options all backends. May be
overridden by backend specific options
(src-options, dst-options). Each option is
a key-value pair separated by an equals
sign, e.g. --global-options key1=value1
key2=value2 [default: empty] []
--log-file <log-file> The log file to write to. If not set here,
global options will be checked [default:
""] []
--log-level <log-level> The log level to use. If not set here,
global options will be checked [default:
Information]
--parse-arguments-only Only parse the arguments and then exit
[default: False]
--progress Print progress to STDOUT [default: False]
--retention Toggles whether to keep old files. Any
deletes will be renames instead [default:
False]
--retry <retry> Number of times to retry on errors
[default: 3]
--src-options <src-options> Options for the source backend. Each option
is a key-value pair separated by an equals
sign, e.g. --src-options key1=value1
key2=value2 [default: empty] []
--verify-contents Verify the contents of the files to decide
whether the pre-existing destination files
should be overwritten [default: False]
--verify-get-after-put Verify the files after uploading them to
ensure that they were uploaded correctly
[default: False]
--version Show version information
-?, -h, --help Show help and usage information
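For example, to mirror a B2 destination down to a local folder (backend URLs are placeholders in Duplicati’s usual storage-URL format), preview first with a dry run, then run it for real:

Duplicati.CommandLine.SyncTool "<b2-backend-url>" "file:///local/duplicati-copy" --dry-run
Duplicati.CommandLine.SyncTool "<b2-backend-url>" "file:///local/duplicati-copy" --progress --confirm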
If you do this, tools that use the duplicati-verification.json file are in utility-scripts.
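For example, with the Python script from that folder (I believe it’s DuplicatiVerify.py; it takes the folder holding the downloaded backup files and the duplicati-verification.json file, and the path here is a placeholder):

python DuplicatiVerify.py /local/duplicati-copy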