I have a 50 GB backup and it takes about one hour. The source directory, Data, contains 20-30 subdirectories, and one of them, svn_repos, is quite small and changes often. I would like to back up Data once a day and svn_repos 4 times a day.
How do I do that? TIA.
I'm not sure what's wrong with doing 4 backups a day; if your other directories don't change often, they won't be backed up again. Otherwise, create 2 backup jobs: exclude your special directory in the first job, and include only it in the second job.
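As a rough sketch with the command-line client (the destination URLs and paths here are made-up examples, and most people would set this up through the web UI scheduler instead of the CLI):

```shell
REM Job 1, once a day: back up Data, excluding svn_repos (example paths).
Duplicati.CommandLine.exe backup "file://E:\Backups\Data" "E:\Data" --exclude="E:\Data\svn_repos\"

REM Job 2, 4 times a day: back up only svn_repos.
Duplicati.CommandLine.exe backup "file://E:\Backups\svn_repos" "E:\Data\svn_repos"
```

Each job needs its own destination folder; Duplicati jobs must not share one.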
svn_repos is less than 1% of the total size, so I would assume that 4 backups of it could be done in under 10 minutes. On the other hand, 4 full scans would take between 4 and 8 hours. Keeping a drive at 100% utilization for that long, day after day, will kill it quickly.
If the rest of your 50 GB (minus your special directory) doesn't change, it will just be scanned and nothing new will be uploaded. Duplicati is a deduplicating backup tool.
On the home screen, you can watch which files are actually being read and where in the source they are. You can also see "Counting" on the status bar, which is basically a walk through the source areas.
If you're on Windows and running elevated as an administrator, the usn-policy option can sometimes make scanning much faster.
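For reference, USN is enabled per job with the --usn-policy advanced option (values off / auto / on / required, if I recall correctly); for example:

```shell
REM "on" tries the NTFS change journal and falls back to a full scan
REM on failure; "required" fails the backup instead of falling back.
Duplicati.CommandLine.exe backup "file://E:\Backups\Data" "E:\Data" --usn-policy=on
```

In the web UI the same option is set under the job's Advanced options.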
Duplicati first identifies potentially changed files (e.g. timestamp changed), then reads those to
identify which blocks of the file actually changed and aren’t already in backup, then packages
those blocks into a series of temporary files which eventually get uploaded to your destination.
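The block step above can be sketched in a few lines of shell. This is a toy illustration, not Duplicati's actual code: split a file into fixed-size blocks, hash each one, and "upload" only hashes not already known at the destination.

```shell
# Toy sketch of block-level deduplication (not Duplicati's real pipeline).
set -e
workdir=$(mktemp -d)
printf 'AAAA BBBB AAAA CCCC' > "$workdir/source.dat"

# Split into 5-byte "blocks" (Duplicati's default block size is far larger).
split -b 5 "$workdir/source.dat" "$workdir/block."

: > "$workdir/known_hashes"   # hashes already present at the destination
count=0
uploaded=0
for block in "$workdir"/block.*; do
    count=$((count + 1))
    h=$(sha256sum "$block" | cut -d' ' -f1)
    if ! grep -q "$h" "$workdir/known_hashes"; then
        echo "$h" >> "$workdir/known_hashes"
        uploaded=$((uploaded + 1))   # a real tool would upload this block
    fi
done
echo "blocks: $count, uploaded: $uploaded"   # prints "blocks: 4, uploaded: 3"
rm -r "$workdir"
```

The two identical "AAAA " blocks are stored once, which is why only 3 of the 4 blocks get uploaded; the same mechanism is what makes a rescan of unchanged data upload nothing.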
Typically when people worry about drive wear, it’s from SSD users worrying about write cycles.
If USN doesn't fit, you can certainly split the backup into two jobs using excludes, as suggested.
Optionally, you can try to study where your time is going, monitor actual drive loads, and so on.
You are 100% correct. Unfortunately, scanning the entire Data directory 4 times a day can take up to 8 hours and keeps the disk at 100% utilisation, and this is what I'm trying to avoid.
Unless you have millions of files, this seems unlikely. If you do, are you on Windows? USN can help in that case. And if not, create another job as already advised.
I did a cleanup, and from about 2000 historical backups spanning 6 years I'm down to 70. I'm on Windows and I used USN as recommended here. It is much faster now.
I’m getting a warning though:
* 2024-01-02 15:32:12 -05 - [Warning-Duplicati.Library.Main.Operation.BackupHandler-FailedToUseChangeJournal]: Failed to use change journal for volume "E:\": Access is denied
Do I need to run the backup as administrator for USN to work?
Yes, or it will work with Duplicati installed as a service (since the service runs as SYSTEM). It's documented in the link you were given earlier.
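If I remember right, the service can be installed with the helper that ships with Duplicati, run from an elevated command prompt in the install directory:

```shell
REM Installs and starts Duplicati as a Windows service running as SYSTEM,
REM which gives USN the access it needs without an elevated user session.
Duplicati.WindowsService.exe install
```

After that, the tray icon should be started with --no-hosted-server so it talks to the service instead of running its own server.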