I have an UNRAID server that uses multiple HDDs, but the file system looks like one. Periodically I may lose an HDD and have to restore a few TBs from local or cloud backup locations.
Is there a way for Duplicati to restore only the missing files, or files whose sizes differ from the ones in the backup?
I may have a directory with 4 TB in it, but only 800 GB went missing because those files were stored on the HDD that went bad. Is there a way to tell it to restore only the files that are missing?
I can’t think of one that’s both human-simple and resource-light. If you want to brute-force it, restoring everything to a chunk of free space and then overwriting the source files only when the backup copy is newer or the file is missing might do, e.g. with GNU cp’s update option:
-u , --update
copy only when the SOURCE file is newer than the destination file or when the destination file is missing
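For example, a minimal sketch of that brute-force pass, assuming the full restore landed in a scratch share at /mnt/scratch/restore and the array data lives at /mnt/user/data (both paths hypothetical):

```
# Copy back only files that are missing at the destination or newer
# in the restored copy; -r recurses, -u is the update behavior
# quoted above, -p keeps timestamps intact for later comparisons.
cp -r -u -p /mnt/scratch/restore/. /mnt/user/data/
```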
If you want to try scripting to get only the missing source files, you could probably use The FIND command, perhaps running it from the true CLI by adapting an Export As Command-line. Its output doesn’t show file timestamps, and it only shows exact sizes for smaller files; beyond that it switches to KB, MB, etc.
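For reference, a hypothetical invocation (the storage URL below is a placeholder; the real one, plus the job’s --passphrase and other options, would come from your Export As Command-line; duplicati-cli is the Linux wrapper for Duplicati.CommandLine.exe):

```
# List what the latest backup version holds under the share.
duplicati-cli find "file:///mnt/backup/duplicati" "/mnt/user/data/*"
```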
After getting find (or similar) to tell you what you have left, you can sort that list and a cleaned-up version of the Duplicati backup find output, compare them with comm, and run a true CLI restore of exactly the files you need.
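A rough sketch of that comparison, with the caveat that the storage URL is a placeholder and the cleanup of the find output (stripping header lines, directory entries, and the trailing size in parentheses) is an assumption about its exact format:

```
#!/bin/bash
URL="file:///mnt/backup/duplicati"   # placeholder; use your exported URL

# 1. What survived on the array, one absolute path per line.
find /mnt/user/data -type f | sort > local.txt

# 2. What the backup contains: keep only path lines, drop directory
#    entries and the trailing " (size)" decoration.
duplicati-cli find "$URL" "/mnt/user/data/*" \
  | grep '^/' | grep -v '/$' | sed 's/ ([^)]*)$//' | sort > backup.txt

# 3. Paths present in the backup but absent on disk.
comm -13 local.txt backup.txt > missing.txt

# 4. Restore only those paths to their original locations; xargs
#    splits very long lists into several restore runs on its own.
xargs -a missing.txt -d '\n' duplicati-cli restore "$URL"
```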
Or you can let Duplicati work for you, having it keep both copies wherever the restored file differs from the current one. In the GUI that option is called Save different versions with timestamp in file name, and the manual job (scripted, or aided by a merge tool) is then to figure out which file to keep when you end up with two.
Ordinarily I’d expect collisions to be cases where the backup is older than the current file, so another way to reduce them would be to move aside the files newer than the last backup, then move them back after the (wasted) restore of a stale version. You could do some sort of date-based find command for the new files…
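Something like this, assuming the last backup finished on a known date (the date and paths here are placeholders):

```
# Collect paths (relative to the share) modified after the last backup.
cd /mnt/user/data
find . -type f -newermt "2024-01-15" > /tmp/newer.txt

# Park them in a holding area, preserving the directory tree, and
# remove the originals so a stale restore can't collide with them.
rsync -a --files-from=/tmp/newer.txt --remove-source-files \
  /mnt/user/data/ /mnt/scratch/newer-holding/

# ...run the restore, then move everything back:
rsync -a --remove-source-files /mnt/scratch/newer-holding/ /mnt/user/data/
```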
That certainly makes sense, and would help when file recency is a factor.
Most of the files are for long-term storage (e.g. photos, videos, etc.), so I don’t have much of an issue with updated files. It’s more about preventing having to re-download 4 TB when I’m only missing 800 GB (as in the example).
Your response has me thinking of another issue (not Duplicati) I have with something else, and I think this just might help. Thanks!
In principle, yes. When replacing a drive outright, it’s pretty straightforward.
However, every so often my UNRAID system will pop an error that there is no file system found on a drive. Sometimes it will even happen to two at once.
The rub is that the drives are not actually bad. So I end up having to format them and restore the data.
I’ve tried numerous fixes, including reinstalling UNRAID from scratch, but the product is niche enough that I haven’t found many similar cases. Each one seems to be unique, and I just haven’t figured out specifically what is causing mine. I have a hunch it has to do with power, but I’ve even had it happen while on a UPS.
So the wipe/restore is the current fix until I can isolate why this happens.