I am attempting to use Duplicati to back up data to a pair of locally attached external USB drives, so that one copy can be stored offsite. My plan is to have an “A” and a “B” copy which I would rotate: one would be offsite while the other was onsite, actively receiving backups. I plan to create two sets of backup jobs in Duplicati: an “A” set and a “B” set. I need some help with the behavior I am observing, and with how to accomplish a few things…
Handling a missing target: When a drive is not mounted (probably because it is the offsite copy), I would expect a Duplicati backup job to fail quickly. Instead, it appears not to check whether the target directory is accessible; it creates the missing directories, then fails several minutes later with an error message along the lines of “found xxx missing files”.
I tried using the --run-script-before-required option with a script that checks the mount point, but the script is not being run and I get the same result as above. I would like the job to fail quickly when the target is not available, and I don’t want Duplicati to create any directories. A rough sketch of the kind of check script I have in mind is below.
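For reference, here is a minimal sketch of that pre-check (the mount point path is a placeholder, not my real path). My understanding is that with --run-script-before-required, a non-zero exit code from the script should abort the backup before Duplicati touches the target:

#!/usr/bin/perl -w
use strict;

# Placeholder mount point for the "A" drive.
my $mount = '/mnt/backup_a';

# Scan /proc/mounts to see if the target is actually mounted.
my $mounted = 0;
if (open(my $fh, '<', '/proc/mounts')) {
    while (my $line = <$fh>) {
        my (undef, $point) = split ' ', $line;
        $mounted = 1 if defined $point && $point eq $mount;
    }
    close $fh;
}

# Non-zero exit should tell Duplicati to abort the job.
exit($mounted ? 0 : 1);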
Is the script not running at all? Does it have the execute bit set, and does it have the perl interpreter on the top line of the script? e.g., #!/usr/bin/perl -w
Yes to all, and the script also logs its own start and end. It is not being triggered at all, though it runs fine when launched separately. Since Duplicati is currently running as root, it shouldn’t be having any permissions issues.
root@gilsvr:/etc/duplicati
$ ls -l
total 24
-rwxr-xr-x 1 root root 22536 Aug 20 10:48 check_a_mounted.pl
Is the access time shown by “ls -lu” going up, indicating the script file was accessed by Duplicati? That might narrow the issue down.
I see in another post that you know strace, but that might be a somewhat more complicated technique.
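If checking by hand, something like this prints the atime via Perl’s stat() (field 8); the path is just the one from your listing above. One caveat: on filesystems mounted with noatime or relatime, the access time may not update on every read, so this check isn’t always conclusive:

#!/usr/bin/perl -w
use strict;

# Print the last-access time of the script file.
my @st = stat('/etc/duplicati/check_a_mounted.pl')
    or die "stat failed: $!";
print "atime: ", scalar(localtime($st[8])), "\n";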
OK, something weird is going on with the script. I am seeing the script’s entries in the logfile out of order, as if the writes to the log were delayed. This is probably down to not turning on perl autoflush, plus the script not completing and closing the logfile for some reason - perhaps related to the backup job sometimes not completing and hanging on “verifying files”. Hopefully it’ll get figured out soon.
The stream stderr is unbuffered. The stream stdout is line-buffered when it points to a terminal. Partial lines will not appear until fflush(3) or exit(3) is called, or a newline is printed. This can produce unexpected results, especially with debugging output.
When opened, the standard error stream is not fully buffered; the standard input and output streams are fully buffered if and only if the streams do not refer to an interactive device.
Thanks. I agree that what you are describing is likely related to the issue I am seeing. Right now, backup jobs are succeeding and failing exactly as the script/backup logic intends, so: all good. The problem may only exist when the backup job fails to complete, which is strange because the script should still complete. I am going to leave things unchanged to validate that theory. If it happens again (script doesn’t output to the log, or outputs out of order), I will add some intentional buffer flushing to the perl script and try again.
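For the record, the flushing change I have in mind is just enabling autoflush on the log filehandle, along these lines (the logfile path is a placeholder):

#!/usr/bin/perl -w
use strict;
use IO::Handle;

# Placeholder log path; the real script logs elsewhere.
open(my $log, '>>', '/var/log/check_a_mounted.log')
    or die "cannot open log: $!";

# Flush after every print so entries land in the log immediately
# and in order, even if the script is later killed mid-run.
$log->autoflush(1);

print $log scalar(localtime), " script start\n";
# ... mount check goes here ...
print $log scalar(localtime), " script end\n";
close($log);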
Here is what is happening in the script. Not much to it…