"Set and Forget" alternative to Time Machine on Win10?

Everything is relative. Some commercial backup products bundle in the cloud storage and other services. Compared to those, using Duplicati with a cloud means setting up the cloud account first, then typing a few values into Duplicati.

At least one commercial backup product’s big feature is unlimited backup of more or less everything there. There are others that give you a few options. In Duplicati, you checkmark what you want. You decide, not someone else, who also typically forces exclusions on you, either by default or with no way to avoid them.

Some commercial backups don’t encrypt, or tie encryption to your login so the vendor can show you your files. Duplicati’s encryption is done on the client, so data stays private, but if you encrypt, don’t lose the passphrase.

I wouldn’t call Duplicati complicated, especially if backing up to an SSD, but it’s more complicated than some products might be. Open source backups in general seem to be more do-it-yourself than the slicker commercial offerings.

One way it can get more complicated is if you have locked files, e.g. open Microsoft Office documents, that need a backup. Windows can deal with open files using VSS snapshots, but doing that requires an elevated administrator. Because this is awkward to arrange (and usually needs user interaction to elevate), you could instead run Duplicati as a Windows service running as SYSTEM, a bit like some systems have root: very powerful, but a bigger risk.
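
For reference, snapshot use is an advanced option on the backup job. A minimal sketch (the option name is Duplicati’s; my understanding of the values is that "auto" falls back to plain file reads while "required" fails the backup if no snapshot can be made, so check against your version):

```
--snapshot-policy=required
```

On the default install running as the ordinary user, snapshot creation is the part that fails, which is what pushes people toward the service approach below.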

The Duplicati GUI install out-of-the-box runs at login time, as the logged-in user. The question is whether that’s enough for the need. If the answer is no, that means running Duplicati.WindowsService.exe instead, and setting up a suitable tray icon, if you want one.
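
A rough sketch of that setup, from an elevated prompt (the install folder and service name are what I’d expect from a default install, so verify against your version; treat this as an outline, not gospel):

```
cd "C:\Program Files\Duplicati 2"

rem Register and start the Windows service (runs as SYSTEM, so VSS works).
Duplicati.WindowsService.exe install
net start Duplicati

rem Optional: give the logged-in user a tray icon that talks to the
rem service's web server instead of starting its own.
Duplicati.GUI.TrayIcon.exe --no-hosted-server
```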

If it happens to break, which doesn’t happen much, but does happen, you might be back here seeking help, because you’re the “IT Support Guy”. You can also monitor backup health remotely in any number of ways.
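
One simple way is to have each job email its results. A hedged sketch using Duplicati’s send-mail advanced options (the addresses and mail server are placeholders, and the exact accepted values are worth checking against your version):

```
--send-mail-to=itsupportguy@example.com
--send-mail-url=smtps://smtp.example.com:465
--send-mail-username=reports@example.com
--send-mail-password=app-password-here
--send-mail-level=Warning,Error,Fatal
```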

The end user might just run along happily, and if the backup goes to a cloud, they don’t even need to remember to plug in an SSD. Cloud (depending on whose) can also avoid the out-of-space warning (if you guessed wrong on SSD size).

Generally, one wants to back up one’s own files, so once you figure out where they are, tell the user NOT to get creative and put their files somewhere that isn’t backed up. I don’t favor whole-drive backup in Duplicati, as too much extraneous stuff comes in, such as the OS, which you could just reinstall. There are some filters available if you like a recommended set of exclusions, e.g. there’s no point in backing up browser cache heavily…
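
As an illustration of the exclusion idea (these particular patterns are just examples I’d consider, not Duplicati’s recommended set; on the command line they go in as options like below, while in the GUI the same patterns go into the Filters section of the source screen):

```
rem Example exclude filters for a job whose source is the user profile.
--exclude=*\AppData\Local\Temp\*
--exclude=*\AppData\Local\Microsoft\Windows\INetCache\*
--exclude=*\AppData\Local\Google\Chrome\User Data\Default\Cache\*
```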

I used to use Macrium Reflect Free until it was discontinued. It was a lot faster than file-based backups. Image backup tends to be grab-everything, and that’s good sometimes, especially for someone like me who chose a tiny, careful selection of things to back up often. The image backup gives me old copies of things I didn’t choose.

That covers much of the original list except the disconnect/connect, which goes away if you go cloud. Most OSes don’t recommend just unplugging drives, as it may corrupt the filesystem, especially if the drive is being written. Windows and macOS both seem to have a safe-eject request, but it probably won’t eject if the drive is currently in use.

Time Machine and Duplicati can both do scheduled backups, and can skip a backup if the drive isn’t present.
Duplicati would likely need to use run-script-before and an exit code to do a “don’t run” if the drive isn’t there.
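
A minimal sketch of such a script, assuming the destination is a folder like X:\DuplicatiBackup and that run-script-before’s documented exit codes apply (as I recall them, 0 means run the backup, 1 means skip it without raising an error):

```
@echo off
rem check-drive.bat - referenced by --run-script-before="C:\scripts\check-drive.bat"
rem Skip the scheduled backup cleanly if the backup drive isn't mounted.

if exist "X:\DuplicatiBackup\" (
    exit /b 0
) else (
    echo Backup drive not present, skipping this run.
    exit /b 1
)
```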

Unplugging was covered as risky at the OS level, but if you want to try, you could maybe set number-of-retries super high to see if the backup can ride through a very long disconnect if somebody just pulls the SSD out.
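
The relevant advanced options would be something like this (the numbers are purely illustrative):

```
--number-of-retries=100
--retry-delay=30s
```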

Windows also has drive-letter issues (the letter may vary each time you plug in a drive), but Duplicati has ways around that:
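
The usual approach is the alternate-target-paths and alternate-destination-marker options on a local-disk destination. A hedged sketch, assuming you create the marker file yourself; my recollection is that the marker has to exist in the destination folder itself:

```
rem One-time step on the backup SSD (whatever letter it has right now):
rem   type nul > X:\DuplicatiBackup\backup-drive.marker
rem Then on the backup job's destination, the wildcard covers whichever letter appears:
--alternate-target-paths=*:\DuplicatiBackup
--alternate-destination-marker=backup-drive.marker
```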

That might depend on things like how long the computer is available to push the initial backup up to the cloud. You can phase it based on urgency if that helps. Adding files a bit at a time has other advantages over one huge push.

There’s a similar situation on restore over a slow link. Again, one might do the most urgent files first, and more later.

One way to try to get a quick local backup/restore plus slower bigger-disaster recovery is to back up to local storage, then sync that to the cloud, e.g. with run-script-after and something like rclone sync, which does fast transfers.
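
A rough sketch of that, assuming a local destination of D:\DuplicatiBackup, an rclone remote you’ve already configured and named remote (both names are placeholders), and run-script-after pointing at this batch file; if I recall correctly, Duplicati exposes the job result to the script as the DUPLICATI__PARSED_RESULT environment variable:

```
@echo off
rem sync-to-cloud.bat - referenced by --run-script-after="C:\scripts\sync-to-cloud.bat"
rem Only push to the cloud if the local backup itself succeeded.

if /i not "%DUPLICATI__PARSED_RESULT%"=="Success" (
    echo Backup result was %DUPLICATI__PARSED_RESULT%, not syncing to cloud.
    exit /b 0
)

rclone sync "D:\DuplicatiBackup" remote:duplicati-backup --transfers 8
```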

This isn’t quite as good as independent backups, e.g. if the SSD gets corrupted without it being detected, the sync destination will have the same problem. I guess best practice isn’t actually “Forget”. For ANY software, testing matters. If you don’t test (e.g. restoring files as if your Duplicati installation were lost), it might surprise you later on…
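
For that kind of test, a hedged example of a direct restore from the command line on a machine that has only Duplicati installed (the storage URL, passphrase, and paths are all placeholders):

```
Duplicati.CommandLine.exe restore "file://X:\DuplicatiBackup" "*\Documents\*" ^
    --passphrase="the-backup-passphrase" ^
    --restore-path="C:\RestoreTest"
```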

EDIT:

Even though there are lots of free monitoring solutions for Duplicati, one with some planned extras (at an additional fee) is the one Duplicati Inc. is doing. I don’t know how much of the due diligence it can remove.

Simple, straightforward pricing is what it costs, but I don’t see all the specifics of the future (?) features shown.