I’m totally inexperienced with backup solutions. I bought a new, reliable HDD for my PC for backups only, because I didn’t want an extra NAS box lying around. My goal was to simply copy specific folders onto this HDD. Initially I tried the built-in Windows Backup using File History, and it worked just as I wanted: it created multiple versions of files that were directly accessible as files (e.g. JPG) in the backup folder. But the problem was that it didn’t update automatically, even though I had set it up that way. TL;DR: I gave up and looked for a new backup solution that provides what I needed, and found Duplicati.
I really like it, but when I ran the backup it created files called duplicati-something. All I wanted was plain, accessible files automatically backed up to the disk. Is this even possible with Duplicati? Are there special settings to set it up this way?
Or am I completely dumb to back up using this method?
Duplicati is a backup tool, not a file synchronization tool. The files it places on the “backup” side are not in native format. There is no way to get Duplicati to work in that fashion.
The main reason is that Duplicati uses deduplication. What this gets you is efficient storage of many, many backup versions (point-in-time snapshots). What you lose is the ability to read the backup files directly: you need to do a restore to convert them back into native format.
As an example of version efficiency, one of my backups protects about 30GB of data. I currently have 220 backup versions from the past 4 years, and the backup storage required is only 90GB. With a file synchronization type backup, this number of versions would have required over 6TB of storage.
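To make the mechanics concrete, here is a toy sketch of block-level deduplication (all names are hypothetical; this simplifies what Duplicati actually does): each version is just a list of chunk hashes, and a chunk that already exists in the store costs nothing extra.

```python
import hashlib

CHUNK_SIZE = 100 * 1024  # roughly Duplicati's historical default block size

def backup_version(data: bytes, store: dict) -> list:
    """Split data into fixed-size chunks and store each unique chunk once,
    keyed by its hash. Returns the chunk-hash list ("manifest") for this version."""
    manifest = []
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)  # identical chunks cost nothing extra
        manifest.append(digest)
    return manifest

def restore_version(manifest: list, store: dict) -> bytes:
    """Reassemble one point-in-time version from its chunk hashes."""
    return b"".join(store[digest] for digest in manifest)

# Two versions of a 300KB file that differ only in the last 100KB:
store = {}
v1 = backup_version(b"A" * 300_000, store)
v2 = backup_version(b"A" * 200_000 + b"B" * 100_000, store)
# The shared leading chunks are stored once, so keeping both versions
# costs far less than two full copies.
```

This is also why you can’t browse the backup as normal files: what sits on the disk is the chunk store, and only a restore reassembles the native files.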
Were you planning on a single-version copy, meaning that if a file gets an unwanted change and you copy it over into the backup, the original unchanged file is gone? Sometimes versions will let you go back…
As soon as you keep versions, space efficiency potentially becomes a concern (or maybe your new drive is huge).
Computer viruses and ransomware can cause unwanted changes to files, and if copying is automatic, the changes may get into the copied-over files. Actually, even just leaving a drive connected exposes it to that risk…
It depends on your priorities. A copy at a remote site will help you if a local disaster causes loss of your local equipment.
I work in the creative industry with various file types: pictures, large video files, and 3D projects/files. I also work Save-As style, so I don’t overwrite previous files. That being said, I don’t know whether deduplication will save a lot of space, but maybe I’m wrong. I often find myself looking for a previous version of a file in the backup folder, mostly when I’ve deleted it by mistake. Currently I have around 1.2TB of files and my new HDD is 8TB, so hopefully space won’t be a problem in the near future.
What caught my eye, though, was the mention of protection from viruses, so maybe it’s worth it in the long run to restore files through the Duplicati interface. Thanks for defining the difference between backup and synchronization, it really did help me a lot.
Deduplication benefits depend on the copying (or backup) scheme. If you used a separate folder for each copy of a folder on the new HDD, you would do lots of copying and waste lots of space, because of the identical files in the various folders. Deduplication avoids this: identical files add no extra blocks to store; the old ones are reused. It sounds, though, like you might not have separate folders, but just one that keeps new versions, overwrites old versions, and doesn’t delete (which is how a copy works, as opposed to a “sync”).
I once tried Windows File History, and I believe it used a single folder structure but kept history by putting the file modification time into an edited filename. If a file is not modified, there’s no need to transfer another copy of it…
File Versioning in FreeFileSync is (I think) a similar design, and you may consider it if you like direct copies. Duplicati adds things like compression (which won’t help if your video files are already compressed) and encryption (which you may or may not care about, as your new drive is local rather than a remote system somewhere). There are a lot of other sophisticated features, but you wanted “very simple” and I’m not certain it’s THAT.
You might still consider remote copying of some sort, which may mean paying either by amount stored or a flat rate. Your file set is right about at the size where flat rate might be cheaper. For simple, you might consider Backblaze Unlimited Backup. Duplicati users would use Backblaze’s B2 option, which is what fancier backup setups use. Internet speed becomes a factor if you handle large files, but you could keep both local and remote copies.
Interesting. As far as I can see on one of my own computers, Windows File History does automatically back up files as they get changed, so it should be working.
Either way, I use that plus at least one other backup solution in combination, as you never want to rely on only one.
Free alternatives that do what you want: Syncthing, Resilio Sync, and BT Sync. I dropped all of those, however.
I dropped Syncthing for Duplicati recently because, configured the way I wanted it to work, it foolishly eats drive space for breakfast by default and writes far too much, which is nuts. You may run into the same problem.
Resilio Sync might be what you’re looking for, as long as you don’t encrypt the backups; since unencrypted is what you want anyway, you should be fine. It has a small learning curve. I don’t recommend encrypting the backups, because that could lead to loss of backups just from the passage of time, which is nuts and made me nervous. I think I finally dropped it for other reasons, but I no longer care, so whatever.
Cobian Backup is a dedicated backup solution that puts changed files into separate backup folders according to when the backup job is run. You may prefer doing it this way to the way FreeFileSync does versioning. Cobian can keep X backup versions/jobs, and it keeps files in their native format, which is what you are after.
I’ve previously used Cobian for years and it was extremely reliable.
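The per-run dated-folder approach can be sketched roughly like this (a simplified illustration of the idea, not how Cobian itself is implemented): copy only files modified since the newest existing run into a new timestamped folder, then prune the oldest runs past the limit.

```python
import shutil
import time
from pathlib import Path

def run_backup(src: Path, backup_root: Path, keep: int = 5) -> Path:
    """Copy files modified since the newest existing run into a new
    per-run folder named by timestamp, then prune runs beyond `keep`."""
    runs = sorted(p for p in backup_root.iterdir() if p.is_dir()) if backup_root.exists() else []
    # Crude "last run" marker: the newest run folder's own mtime.
    last_run = runs[-1].stat().st_mtime if runs else 0.0
    dest = backup_root / time.strftime("%Y-%m-%d_%H-%M-%S")
    dest.mkdir(parents=True, exist_ok=True)
    for f in src.rglob("*"):
        if f.is_file() and f.stat().st_mtime > last_run:
            target = dest / f.relative_to(src)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, target)  # files stay browsable in native format
    runs.append(dest)
    for old in runs[:-keep]:  # drop the oldest runs past the limit
        shutil.rmtree(old, ignore_errors=True)
    return dest
```

Because each run is an ordinary folder of ordinary files, restoring an accidentally deleted file is just a copy back, with no special program needed.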
In terms of protecting against viruses, as already mentioned, you need to disconnect the external hard drive when the backup job has finished running. Paid products such as Acronis, or the free Veeam Agent, can eject the external hard drive for you automatically, but both of these put the backed-up files into their own file format, so you cannot just browse the backup files to restore them yourself; you have to go through the program to do a restore.
I’d like to add that you should probably consider a long-term solution combining the local backup with a separate non-local backup. I’d recommend having Duplicati do what you intend, but also adding another automated backup scheme to a separate off-site location. The best schemes protect you both from your local machine’s OS being corrupted and from a physical failure, while also providing fast restores of lost files. The local backup gives you the fast restores, but it won’t protect you against OS corruption or physical failure of your hardware. Most people don’t set up a robust plan against that type of problem until after they have a serious failure. Maybe someone here could give a good example of how they prevent downtime from local physical hardware failure by using an off-site or non-local backup, while also taking advantage of a fast local backup for quick restores?