Backup strategy for home?

This is not really a Duplicati question… but anyway, Duplicati is what I’m using for my offsite backups now, and I’m just wondering about backup strategy.

I have 3 computers and currently run Duplicati on each of them, backing up to the cloud.

I was thinking it might be better to run Duplicati only on the server and use UrBackup on the other computers to back them up to the server:

Client 1 & 2: UrBackup to Server

Server: Duplicati to the cloud.

What do you think? =)

Thanks.

@NoidPurity, the strategy of copying all local content to a single local location (such as a PC or NAS) and then using Duplicati to back all THAT up to the cloud does have benefits, such as deduplication across multiple machines.

I’m not familiar with UrBackup, though, so the deduplication benefit might be reduced if your individual copies to the centralized local box are compressed or encrypted in some way.

Beyond that, I’ve heard a number of people say they store or back up individual boxes to their local NAS and then run Duplicati on the NAS to get an offsite backup, so it seems a fairly common way to go.

Just make sure you store your individual backups in an easy-to-access-individual-boxes fashion, otherwise you might find yourself having to restore content from ALL machines just to get to the one machine (or file) you really need. :slight_smile:

This topic seems to be somewhat related:

Thanks @tophee, I knew I forgot to do something (search similar topics) when posting my reply. :blush:

@JonMikelV Thank you for your answer, but I don’t really get what you meant by “…individual-boxes fashion…”.

Did you mean one backup job from Duplicati, but with each computer in its own folder, like:
/backups/computer1/
/backups/computer2/

Or should I make three backup jobs on the server, one job for each computer?
Job 1: Computer 1
Job 2: Computer 2
Job 3: Server files (without the other backups).

Thanks. =)

BTW: A good way of showing your appreciation for a post is to like it: just press the :heart: button under the post.

If you asked the original question, you can also mark the reply that solved your problem as the accepted answer using the tick-box button you see under each reply.

All of this also helps the forum software distinguish interesting from less interesting posts when compiling summary emails.

NoidPurity, if you set up three backup jobs you’ll lose any potential deduplication benefits, so if it were me I’d go with your first suggestion of having each computer in its own folder, like:

  • /backups/computer1
  • /backups/computer2

Then having a single Duplicati job backing up (see the sketch after the list):

  • /ServerFiles
  • /backups
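
For example, if you ended up using the command-line version of Duplicati instead of the web UI, that single job could look roughly like the sketch below. The destination URL and passphrase are just placeholders, so swap in the URL format for whichever cloud storage you actually use:

    # Rough sketch only: one Duplicati job covering the server files and the
    # UrBackup staging area, so identical blocks from different machines can
    # be deduplicated within the same backup set.
    duplicati-cli backup "<your-cloud-storage-url>" \
        /ServerFiles \
        /backups \
        --passphrase="<your-encryption-passphrase>"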

But again, this all assumes the contents of /backups/computer1 and /backups/computer2 are “normal” files, not something that is already compressed or encrypted. If each /backups/computerX folder is already compressed or encrypted then you likely won’t see much difference (in terms of deduplication) between the two options.

Of course from my end this is all theoretical since I don’t know much in the way of details. For example, it’s possible the sum of all your /backups/computerX and /ServerFiles backups could eventually become LARGER than your destination storage, in which case using separate jobs for each source would allow you to split the backups up over multiple destinations.
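
If it ever did come to that, the split might look something like this (again just a sketch with placeholder destinations; the point is simply that each source gets its own job and its own storage URL):

    # Rough sketch: three separate jobs, each with its own destination, so no
    # single destination has to hold everything, at the cost of losing
    # deduplication across machines.
    duplicati-cli backup "<storage-url-1>" /backups/computer1 --passphrase="<passphrase>"
    duplicati-cli backup "<storage-url-2>" /backups/computer2 --passphrase="<passphrase>"
    duplicati-cli backup "<storage-url-3>" /ServerFiles --passphrase="<passphrase>"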

Oh, and I suppose I should have said from the start that the benefit of what you’re looking at doing (local computers -> local server -> cloud) is that you’ve then got a local backup from which to do restores if necessary. Depending on how much you need to restore, this would likely be quite a bit faster than pulling everything back down from the cloud.

@JonMikelV, I think I will do it that way and use UrBackup. UrBackup also has deduplication and compression, but I think you can disable those options.

I will have a look at this solution this weekend.

Thank you for your answers. =)

NoidPurity, you’re welcome and good luck!