Hello @Ramiro85, welcome to the forum!
It looks like you’ve got a pretty good setup going right now, other than the manual intervention and downtime.
Let me start by saying that Duplicati is a file-level backup tool, so if you have a hard drive failure you’ll still have to manually install the OS, programs, etc. (or restore from Clonezilla) and then restore your files from Duplicati before you’d be back online.
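For what it’s worth, once the OS and Duplicati are reinstalled, pulling the files back is a single command in the CLI flavor. A rough sketch - the destination URL, passphrase, and restore path below are just placeholders for your own:

```sh
# Restore everything from the backup into /mnt/restore
duplicati-cli restore "file:///mnt/backups/static" "*" \
    --restore-path="/mnt/restore" \
    --passphrase="your-passphrase"
```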
If that’s still adequate for you, then you might want to consider how much data changes in the various parts of your backup source. If it’s all pretty static EXCEPT for one area (such as the database) you might want to consider two backup jobs: one for the static data and one for the more dynamic stuff.
Alternatively, if you’ve got critical and non-critical data (as in needs-to-be-restored-as-fast-as-possible vs. can wait a day or two) then splitting along those lines might make more sense.
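Either way, a two-job split might look something like this (a rough sketch, assuming the command-line flavor of Duplicati 2; the job names, source paths, and destination URLs are placeholders you’d swap for your own):

```sh
# Job 1: the mostly-static data, scheduled less often
duplicati-cli backup "file:///mnt/backups/static" \
    /srv/files /etc \
    --backup-name="Static data"

# Job 2: the frequently-changing stuff (e.g. database dumps),
# scheduled daily or even hourly
duplicati-cli backup "file:///mnt/backups/dynamic" \
    /var/backups/db-dumps \
    --backup-name="Dynamic data"
```

Note that each job should get its own destination folder - pointing two jobs at the same folder is asking for trouble.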
If reducing destination disk space usage is the primary concern, then a single big backup would give the best deduplication, since blocks are only deduplicated within a job, not across jobs. That said, exactly how much benefit it would provide is hard to estimate and would depend on your data.
One other thing to consider is versioning - are you looking to keep multiple versions of files or just a single one?
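If it’s multiple versions, retention is configured per job. A minimal sketch using the standard retention options (the values here are just examples to tune; pick one approach or the other):

```sh
# Either keep a fixed number of versions...
duplicati-cli backup "file:///mnt/backups/static" /srv/files \
    --keep-versions=5

# ...or thin them out over time: dailies for a week,
# weeklies for a month, monthlies for a year
duplicati-cli backup "file:///mnt/backups/static" /srv/files \
    --retention-policy="1W:1D,1M:1W,1Y:1M"
```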