Is there any way we could have a test restore process that doesn’t require a full backup’s worth of disk space + a version of destination files + temp usage?
I’m thinking restoring one file, CRC verifying it, deleting it, then moving on to the next file.
Three scenarios come to mind:
Performance / bandwidth frugal:
All destination files for the targeted test version are downloaded, then each file is restored, CRC-checked, and deleted individually. This requires the full destination version size plus space for the largest backed-up file, but minimal transfers (likely the fastest option).
Space frugal:
Only the destination files needed for a single to-be-restored file are downloaded; that file is restored, CRC-checked, and deleted, then the process repeats for the next file. Lots of repeated downloads would be needed, but required space would be minimal - at most the largest backed-up file plus that file's worth of dblocks.
Combination of the two:
Logic tries to group files to be tested based on their proximity in the destination files. Actual performance would vary with the backup layout (such as history length and frequency of file changes).
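To make the idea concrete, here is a minimal Python sketch of the space-frugal loop (restore one file, CRC it, delete it, move on). This is not Duplicati code; `restore_file` is a hypothetical callback standing in for the real restore machinery, and CRC32 stands in for whatever hash the verifier actually uses:

```python
import os
import tempfile
import zlib


def crc32_of(path):
    """Streamed CRC32 so we never hold the whole file in memory."""
    crc = 0
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            crc = zlib.crc32(chunk, crc)
    return crc & 0xFFFFFFFF


def verify_one_at_a_time(restore_file, names, expected_crcs, work_dir):
    """Restore each file into work_dir, CRC-check it, delete it immediately.

    Peak disk usage is bounded by the single largest file, not the
    whole version. Returns the list of files that failed verification.
    """
    failures = []
    for name in names:
        path = restore_file(name, work_dir)  # hypothetical restore callback
        ok = crc32_of(path) == expected_crcs[name]
        os.remove(path)  # free the space before touching the next file
        if not ok:
            failures.append(name)
    return failures


if __name__ == "__main__":
    # Simulated "backup" contents for demonstration only.
    backup = {"a.txt": b"hello", "b.txt": b"world"}
    crcs = {k: zlib.crc32(v) & 0xFFFFFFFF for k, v in backup.items()}

    def restore(name, dest_dir):
        path = os.path.join(dest_dir, name)
        with open(path, "wb") as f:
            f.write(backup[name])
        return path

    with tempfile.TemporaryDirectory() as d:
        print(verify_one_at_a_time(restore, list(backup), crcs, d))
```

The bandwidth-frugal variant would be the same loop, but with all destination volumes fetched once up front; the combined approach would batch `names` by which destination files they share.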
Is such a feature even worth attempting?