Verify all local files are properly backed up

Duplicati crashed on me several times because it filled all the space on my SSD while backing up, vacuuming, etc. The whole setup has over a year of backup history.
I’ve moved all files, including the databases, elsewhere, and now I would like to verify that all my (existing) files are correctly backed up on remote storage.

Is there a way to tell Duplicati to verify all local files against remote storage, i.e. check that each local file’s size/checksum matches what’s in the remote database?

Do backups work after you made your changes? The reason I ask is that a lot of those checks have already been done. Every time a backup is kicked off, Duplicati scans the local files and compares their metadata (timestamp, file size, etc.) to what was backed up previously. If any local files are new or the metadata differs (the file was changed), it will process those files for backup.
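
If it helps to picture it, that scan is conceptually something like the sketch below (a rough Python illustration of the idea, not Duplicati’s actual code; `previous` stands in for whatever the local job database recorded last time):

```python
import os

def needs_backup(path, previous):
    """Decide whether a file should be reprocessed, using metadata only.

    `previous` is a hypothetical dict of path -> (mtime, size) captured
    during the last backup; this just illustrates the scan described above.
    """
    st = os.stat(path)
    last = previous.get(path)
    if last is None:
        return True  # new file, never backed up before
    last_mtime, last_size = last
    # Metadata differs -> the file counts as changed and gets reprocessed.
    return (st.st_mtime, st.st_size) != (last_mtime, last_size)
```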

You also say you want it to verify “checksums” - Duplicati won’t check file contents if the metadata hasn’t changed; it assumes the file has the same contents. If you really want it to reprocess all data, I’m not sure there’s a way to force that directly, but I could be wrong.
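
For contrast, a forced full verification would have to re-read every file and compare a digest against whatever hash record you trust, roughly like this sketch (where the `expected_sha256` source is hypothetical; Duplicati’s internal block hashing works differently):

```python
import hashlib

def contents_match(path, expected_sha256):
    """Re-hash the entire file and compare it to a stored digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in 1 MiB chunks so large files aren't loaded into memory at once.
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            h.update(chunk)
    return h.hexdigest() == expected_sha256
```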

Indirectly, you can trigger this through a database recreation. After the database is recreated, the stored timestamps no longer have the same high resolution as the filesystem, and Duplicati will detect this as a metadata difference, forcing all files to be reprocessed. In my opinion this behavior is actually a bug, something I discussed in this separate-but-related GitHub issue.
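
To illustrate why the recreation has that side effect, here’s a small sketch (assuming, for the example, that the recreated database only kept whole-second timestamps):

```python
import os

def looks_changed_after_recreate(path):
    """Show how losing sub-second timestamp resolution flags a file as changed."""
    fs_mtime = os.stat(path).st_mtime       # filesystem value, e.g. 1700000000.123456
    recreated_mtime = float(int(fs_mtime))  # resolution assumed lost on recreate
    # The exact comparison fails for any file with a non-zero sub-second part,
    # so the file is treated as modified and reprocessed on the next backup.
    return fs_mtime != recreated_mtime
```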

If it were me, I don’t think I’d go through that exercise if all I did was move the job-specific databases to a different location. It may be more trouble than it’s worth.

Yeah, the backup just continues to back up new/changed files.

I do see some “Duplicati test found error, how to fix?” type of errors in one of the backup tests I launched, but the conclusion seems to be “just ignore it, it goes away eventually”.