I have been grappling with the best way to run a backup across 4 different drives that we rotate daily. The backup script uses a Bash if-then-else check to look for a marker file on the destination backup drive. If the corresponding file doesn’t exist, the script simply exits without running the backup. If the file exists, the Duplicati backup runs and an email report is generated.
For example, I have 4 backup disks. The root of each disk contains a file named “backup1” through “backup4”, and each disk is labeled to match. Every day, the next backup disk (1, 2, 3, or 4) is inserted. The script checks for the marker file and, if it exists, runs the correct backup. That way, we don’t get missing-database errors.
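A minimal sketch of that marker check, assuming the rotated disk always mounts at one fixed path and that each job is launched by its own script (both assumptions about the setup; the function name is mine, not Duplicati’s):

```shell
#!/bin/bash
# Sketch of the marker-file check described above. The mount point and
# the one-script-per-job idea are assumptions, not the actual setup.

# Echo the number (1-4) of the marker file found at the given mount
# point; return non-zero if none of the four markers is present.
find_marker_disk() {
    local mount="$1" n
    for n in 1 2 3 4; do
        if [ -f "$mount/backup$n" ]; then
            echo "$n"
            return 0
        fi
    done
    return 1
}

# Usage sketch: run the matching job, or exit quietly so no error
# report is generated for the three absent disks.
if disk=$(find_marker_disk "/mnt/backupdrive"); then
    echo "Disk $disk present; run the Duplicati job for disk $disk here"
else
    echo "No marker file found; exiting without running a backup"
fi
```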
I WAS doing this through the Duplicati web interface, but decided to start sending myself log reports. The problem is that it sends log reports for all the backups, including the 3 that were not run because their media was not present. Those reports are error reports and don’t need to be sent.
The only issue is that if I run the backup from the command line that the web interface provides, it doesn’t update the web interface to show when the last successful backup took place. Is there a way to make it update the web interface as well?
I must be misunderstanding something. It sounds like you’re using the bash file because you wanted to get log reports, but now you’re getting them for every backup job, not just the applicable one for the current day (in which case it sounds like a bash error).
As far as a command line backup not updating the web interface, which of these are you seeing:
- you open the web GUI, run a command line backup, and the web GUI is not updated
- you open the web GUI, run a command line backup, then close and re-open the web GUI, and it’s still not updated
- something else is going on
Did you try using the Duplicati scheduler along with the
--alternate-destination-marker parameter (to identify the correct drive) and possibly the
--send-http-level parameter, so you aren’t notified about “failed” backups when the drive isn’t attached?
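As a rough illustration of that combination (destination, source, and level values here are assumptions, not your setup), a command line might look something like this:

```shell
# Sketch only (config fragment): verify option names and accepted
# values with "duplicati-cli help advanced" before relying on this.
# --alternate-destination-marker: only use a destination whose root
#   contains this file (it may need to be combined with
#   --alternate-target-paths to list the candidate drives).
# --send-http-level: limit which job results trigger a report.
duplicati-cli backup \
  "file:///mnt/backupdrive/duplicati" \
  /home/data \
  --alternate-destination-marker="backup1" \
  --send-http-level="Warning,Error"
```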
Similarly, you should be able to use the Duplicati scheduler with a
--run-script-before-required call, so your existing bash drive checker decides whether or not a particular job should execute.
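A hedged sketch of such a pre-run checker is below. The marker path is an assumption, and how Duplicati classifies each non-zero exit (clean skip versus error) is worth verifying against the run-script documentation:

```shell
#!/bin/bash
# Sketch for --run-script-before-required: Duplicati runs this script
# first and only proceeds with the backup if it exits 0. The default
# marker path is an assumption about where the rotated disk mounts.

# Return 0 if the marker file for this job exists, non-zero otherwise.
marker_present() {
    [ -f "$1" ]
}

main() {
    local marker="${1:-/mnt/backupdrive/backup1}"
    if marker_present "$marker"; then
        return 0   # correct disk inserted: let the backup run
    else
        return 1   # wrong disk: stop this job
    fi
}

# When installed as the actual run-script, finish with:  main "$@"
# (left commented here so the functions can be exercised standalone)
```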
Thanks for the reply. All of those sound like excellent ideas and I have tried them, lol.
We don’t go by days, because the people who actually insert the drives into the USB reader aren’t tech savvy enough to figure that out. They CAN count, however, and they know that 2 comes after 1 and that when they get to 4, they put 1 back in. I agree that going by days would be the best thing to do, but, it never fails, they put Monday’s backup in on Thursday, then wonder why it didn’t back up.
The situation is #1. The backup happens just fine, from the command line, but it doesn’t update the GUI and those non-tech-savvy people really wanna see the line that says, “Last successful run:”
This topic originally came up over on GitHub, and I was told to use
--alternate-destination-marker to run all 4 backups at once. Then I would put the corresponding marker file in the corresponding backup drive’s root directory. That works fine, except it records an error for the other 3 backups, saying the destination marker could not be found.
--run-script-before-required is the one that looks for a zero exit value, right? If so, I tried that as well. If my bash script returns a non-zero (error) value, Duplicati also terminates in error and sends me 3 emails saying so, plus 1 email saying the backup was successful.
As for --send-http-level: I do want it to report when a backup fails. I would rather the job not run at all if the responsible backup disk is not in place, which I can accomplish, but then it doesn’t update the GUI.
I think what you want is something like this:
Judging by that, the command line and GUI are separate, and the command line does not update the status shown in the GUI?
Yes, correct. They each live their independent lives