Can I get a list of what's changed after sync and call a custom script?

My use case: I’d like to have Duplicati run and then log how it went to Prometheus, so I can have a Grafana chart tell me how my backups are doing. I can then use Grafana alerting to tell me when things are going south.

My thinking is that if I were a Duplicati engineer I wouldn’t necessarily want to tie it directly to Prometheus/Grafana. But if Duplicati could emit a .json file (or some such) detailing what changed (files uploaded, files deleted, and the timestamp), and run a custom shell command on completion (ideally with data on whether the backup was fully successful or not), then I could push that to a Prometheus gateway and render it, and other people with different infrastructures could do something different with the same information.

Does such a thing exist somewhere?

Hrmm, looking through the logs I’ve found --run-script-after which seems to be half my answer. Still sleuthing if there’s a way of sending along what’s changed.

Oh, interesting, just found out about “DUPLICATI__RESULTFILE”, which seems to be what I want. Digging now for the format to see if it’s useful to parse into Prometheus metrics.
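
In case it helps the next person, here is roughly what I’m picturing for the hook script. This is a minimal sketch only, assuming Duplicati passes the result file path in that environment variable; the file format is exactly what I’m still digging into, so this just dumps it:

```python
#!/usr/bin/env python3
# Minimal sketch of a --run-script-after hook, assuming Duplicati passes the
# result file path in the DUPLICATI__RESULTFILE environment variable.
# The file format is still unknown at this point, so this just dumps it.
import os
import sys

result_path = os.environ.get("DUPLICATI__RESULTFILE")
if not result_path:
    sys.exit("DUPLICATI__RESULTFILE is not set; was this script run by Duplicati?")

with open(result_path, "r", encoding="utf-8") as fh:
    print(fh.read())
```

You would then point --run-script-after at this script (or at a wrapper shell script that calls it).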

Apologies for the stream of thought, but I’m documenting it in case it’s useful for future google searchers as I had to carve this path myself.

Actually, there is a monitoring site for Duplicati. I don’t use it, but you should not have any difficulty sleuthing it out in the forum archive.

Thanks gpatel-fr. I’d prefer not to rely on another third-party solution - having everything centralized in my current stack would be great. That said, I appreciate you offering another option, particularly in case a future reader finds this post useful. I assume you’re referring to https://www.duplicati-monitoring.com/ ?

I’ve done more digging. It appears that there’s an undocumented (or at least not easily discoverable by me) command-line option, run-script-result-output-format, which you can set to json. With that set, the file created at the filesystem location given by $DUPLICATI__RESULTFILE is JSON when the --run-script-after script is run.

I’m going to see if I can use that to shove the results into my Prometheus instance and then graph based on that.
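
For anyone following along, this is the kind of flattening I have in mind. It’s a rough sketch only; the field names are whatever your Duplicati version writes into the result file, so no particular schema is assumed here:

```python
#!/usr/bin/env python3
# Rough sketch: load the JSON result file and flatten nested objects into
# metric-style name/value pairs. No particular schema is assumed; it just
# keeps whatever numeric values the file happens to contain.
import json
import numbers
import os


def flatten(obj, prefix=""):
    """Yield (flat_name, numeric_value) pairs from nested dicts/lists."""
    if isinstance(obj, dict):
        for key, value in obj.items():
            yield from flatten(value, f"{prefix}{key}_")
    elif isinstance(obj, list):
        for index, value in enumerate(obj):
            yield from flatten(value, f"{prefix}{index}_")
    elif isinstance(obj, bool):
        yield prefix.rstrip("_"), int(obj)
    elif isinstance(obj, numbers.Number):
        yield prefix.rstrip("_"), obj


with open(os.environ["DUPLICATI__RESULTFILE"], encoding="utf-8") as fh:
    for name, value in flatten(json.load(fh)):
        print(name, value)
```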

I assumed that the site had linked to a repo with the source, but I was wrong. For my sins I searched GitHub, and there is a repo that uses Grafana:

Reporting options can feed reporting tools via HTTP, SMTP, or XMPP. Or look at the result file in a script.
None of these give individual path names, only statistical information, but it sounds like that’s enough.

Duplicati Dashboard - Monitoring solution is the forum topic about the fabien-github tool. Another tool:
Duplicati grafana dashboard

Thank you gpatel-fr and ts678!

I ended up digging through the source and implementing it myself. In case people are curious, there’s a gist here that details it: upload_results_to_prometheus.py · GitHub

But TL;DR version:

  • Additional options let you run a script after the backup completes and have Duplicati generate a .json file with statistical information (--run-script-after=/scripts/upload_results_to_prometheus.sh --run-script-result-output-format=json)
  • The script I wrote above flattens out the .json structure and pushes the values to Prometheus (a rough sketch of that step follows this list).
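
This is not the gist verbatim, just the rough shape of the push step, assuming the prometheus_client library and a pushgateway; the gateway address, job name, and metric prefix are placeholders for your own setup:

```python
#!/usr/bin/env python3
# Sketch of the push step (not the gist verbatim): read the JSON result file
# and send its top-level numeric fields to a Prometheus pushgateway.
# PUSHGATEWAY and JOB_NAME are placeholders; substitute your own values.
import json
import numbers
import os

from prometheus_client import CollectorRegistry, Gauge, push_to_gateway

PUSHGATEWAY = "localhost:9091"  # placeholder pushgateway host:port
JOB_NAME = "duplicati_backup"   # placeholder job label for grouping

with open(os.environ["DUPLICATI__RESULTFILE"], encoding="utf-8") as fh:
    results = json.load(fh)

registry = CollectorRegistry()

# Only top-level numeric fields here, to keep the sketch short; the gist
# linked above walks the nested structure as well.
for key, value in results.items():
    if isinstance(value, bool):
        value = int(value)
    if isinstance(value, numbers.Number):
        gauge = Gauge(f"duplicati_{key.lower()}",
                      f"Duplicati result field {key}",
                      registry=registry)
        gauge.set(value)

push_to_gateway(PUSHGATEWAY, job=JOB_NAME, registry=registry)
```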

I intentionally kept the post-processing to a minimum. My thinking is that this is versatile enough for people to build Prometheus alerts or Grafana charts on top of it, which is in fact my next step. :slight_smile:

Thanks for the help everyone!
