Easier-to-read backup reports or weekly summaries

Many thanks for all the feedback and comments on my query thus far. I’ve just set myself up on www.duplicati-monitoring.com, so fingers crossed! :grinning:

1 Like

Well, crap, I didn’t know there was already a monitoring service out there.

Using the “send-http-url” feature, I have Duplicati “report” to a server after every backup. The server parses the “message” POST sent by Duplicati and saves the data and a timestamp as a MySQL record.

A cronjob then queries the MySQL database daily and builds HTML output that can be viewed on a server or sent as email.
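
Roughly, the receiving end looks something like this (a simplified sketch, not my exact code — I’m using sqlite3 here as a stand-in for the MySQL table, and the port and database path are made up):

```python
# Simplified sketch (not my exact code): accept Duplicati's --send-http-url
# POST, pull the "message" form field out, and store it with a timestamp.
# sqlite3 stands in for the MySQL table; the port and DB path are made up.
import sqlite3
from datetime import datetime, timezone
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs

DB_PATH = "duplicati_reports.db"

def init_db():
    with sqlite3.connect(DB_PATH) as db:
        db.execute("CREATE TABLE IF NOT EXISTS reports "
                   "(received_at TEXT, message TEXT)")

class ReportHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length).decode("utf-8", errors="replace")
        # Duplicati sends a form-encoded body; the report text is in "message"
        message = parse_qs(body).get("message", [body])[0]
        with sqlite3.connect(DB_PATH) as db:
            db.execute("INSERT INTO reports VALUES (?, ?)",
                       (datetime.now(timezone.utc).isoformat(), message))
        self.send_response(200)
        self.end_headers()

if __name__ == "__main__":
    init_db()
    HTTPServer(("", 8080), ReportHandler).serve_forever()
```

The daily cronjob is then little more than a SELECT over the rows received in the last 24 hours, rendered into an HTML table.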

1 Like

@BitingChaos would you be willing to share your code so I can try it myself?

I’m working on getting it cleaned up and straightened out. Part of it is on a local server at home, and another part is on a local server at my place of work. I haven’t had a lot of time to work on it, so each location just has some parts working!
At home I have the script recording to the MySQL database, and at work (where I did the most recent screenshot) I have the script reading from text files with the backup report output (what you see as part of %result% in the email reports).

1 Like

www.duplicati-monitoring.com is interesting, but it would be nice to have a self-hosted solution as well :slight_smile:

1 Like

Hi all,

Being in a similar situation and missing my daily CrashPlan summary email, I have recently put together dupReport, a utility for gathering email reports from Duplicati and creating a summary email similar to the one CrashPlan used to provide.

The description and code can be found here.

It’s only one step removed from a prototype and it hasn’t been tested anywhere but on my personal system, but I’d like people to take a look, try it out on their own systems (Linux for now, other OSs in the future), and let me know what you think. If there is sufficient interest I will continue developing it into something more stable and portable.

I haven’t set up a GitHub project yet, so please comment here. If it gets too unwieldy we can move the discussion to a separate thread.

Enjoy!

1 Like

The linked-to stuff looks pretty good!

I’ll try to poke around with it in the coming days and see what I can make of it.

Before I spend too much time on it, do you know if it will work on the Windows 10 Linux Subsystem?

I have only been able to test on a vanilla Debian 8 system, so that’s all I can say will definitely work. The main code is all bash scripting & awk, so you should have no problem there. Getmail, ssmtp, and sqlite will be your dependency issues, if anything. If you can get those running you should be OK.
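
If you want to sanity-check a system first, something like this (illustrative only, and the binary names may differ slightly by distribution) will tell you whether the external tools are on the PATH:

```python
# Illustrative only: check whether the tools the bash/awk version depends on
# are installed and on the PATH (package/binary names may differ slightly).
import shutil

for tool in ("getmail", "ssmtp", "sqlite3", "awk", "bash"):
    print(f"{tool:8} -> {shutil.which(tool) or 'NOT FOUND'}")
```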

Hey guys!
Cool that you already found our monitoring service: https://www.duplicati-monitoring.com/
We are continuously improving it. Just go ahead and try it out, and let me know what you miss the most. Nice, readable e-mail reports are definitely already at the top of our to-do list.

Nice to see that @handyguy has already published a self-hosted solution. We love Duplicati and think that its main disadvantage is that there is/was no central monitoring. It’s nice that multiple options are now emerging for users to solve this.

Greetings,
Chris

4 Likes

Hi folks,

I’ve spent the past couple of weeks completely rewriting dupReport to be better/stronger/faster. Here are some of the highlights:

  • Rewritten entirely in Python into a single self-contained executable. No more bash/awk madness and no more need to install supporting programs like getmail and ssmtp

  • Based on that, it should be able to run on OSs other than Linux (still testing that)

  • Built-in support for multiple email transports (IMAP/POP3/SMTP) (still testing that also; see the sketch after this list)

  • Automatically discovers source/destination pairs for reporting. No more need to manually specify them in the .rc file.

  • Lots of other configuration and reporting options
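
For the curious, the email retrieval side is nothing exotic — roughly this kind of thing, using only the standard library (a simplified illustration, not the actual dupReport code; the host, credentials, and subject filter below are placeholders):

```python
# Simplified illustration, not the actual dupReport code: fetch Duplicati
# report mails over IMAP with only the standard library. The host, the
# credentials, and the subject filter below are placeholders.
import email
import imaplib

IMAP_HOST = "imap.example.com"
USER, PASSWORD = "backups@example.com", "app-password"

with imaplib.IMAP4_SSL(IMAP_HOST) as imap:
    imap.login(USER, PASSWORD)
    imap.select("INBOX")
    # Search for report mails by subject, then parse each message
    status, data = imap.search(None, '(SUBJECT "Duplicati Backup report")')
    for num in data[0].split():
        status, msg_data = imap.fetch(num, "(RFC822)")
        msg = email.message_from_bytes(msg_data[0][1])
        print(msg["Subject"], msg["Date"])
```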

Here’s a sample report output. It’s still a bit messy because (1) it’s running against a lot of mail for testing and (2) I’m still working on the formatting:

I’ll need another week or two to let it burn in, clean up the code, and complete testing. As soon as it’s ready for public consumption I’ll update the website and post an announcement here. Hopefully people will find it useful.

5 Likes

That looks awesome!

Based on the 9/21 B2 row having fewer files but a 0 in the deleted column, is it safe to assume “Deleted” refers to historical versions cleaned out of the archive files?

Begin greedy mode… :slight_smile:

Is it possible for the report to include links such as to the web GUI command line for a specific job with parameters pre-populated to generate a file list for the specific job?

I realize it would only work if clicked while on the machine from which the backup ran, but it might still be useful…

1 Like

That looks awesome!

Thanks!

Based on the 9/21 B2 row having fewer files but a 0 in the deleted column, is it safe to assume “Deleted” refers to historical versions cleaned out of the archive files?

Interesting question. The 'Deleted" column comes directly from the “Deleted:” line in the report email. Your assumption sounds correct, but I’ll need to do some more research to verify. The “+/-” column shows the actual calculated difference (size and file count) between the previous run and the current run. The fact that the “+/-” and “Deleted” columns are different would indicate that “Deleted” refers to a different type of calculation. Perhaps kenkendk can clarify what that means.

Is it possible for the report to include links such as to the web GUI command line for a specific job with parameters pre-populated to generate a file list for the specific job?

Not greedy at all! I’m looking for feedback and suggestions for future updates. I’m currently trying to finish up the basic functionality and get the program “released.” Once that is done I can turn to wish list items. I already have a few, but I will add yours to the list to see how it might be done.

Thanks for the feedback! :grinning:

Doesn’t it make more sense to analyse the LogData table in the database? It has all the data you need, and this way you don’t need to check your email client. The data in LogData looks exactly the same as in the email report.

I quickly checked, and it seems LogData only keeps the last 30 days by default, but there is an option to extend this, so you could go back further.
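
If you want to poke at it yourself, something along these lines will show what’s in there (a rough sketch — the path is whatever your job’s local database is, and I’m not assuming any particular column layout since that may differ between versions, so the columns are discovered at runtime):

```python
# Rough sketch: read recent entries straight from the LogData table of a
# Duplicati job database instead of parsing report emails. The path is
# hypothetical; column names are discovered at runtime rather than assumed.
import sqlite3

DB_PATH = "/path/to/duplicati-job.sqlite"

with sqlite3.connect(DB_PATH) as db:
    cols = [row[1] for row in db.execute("PRAGMA table_info(LogData)")]
    print("LogData columns:", cols)
    for row in db.execute("SELECT * FROM LogData ORDER BY rowid DESC LIMIT 5"):
        print(row)
```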

Doesn’t it make more sense to analyse the LogData table in the database?

Unfortunately, in my setup I have 9 different systems in multiple locations backing up to 2 back-end storage locations (one local, one cloud). “The database” in this case is distributed across all those systems. The only thing they have in common (and the only way to easily correlate all their data) is to parse through the result emails.

If all your backups are controlled from one system, I agree that searching through the database is probably a lot easier. I wasn’t so lucky.

1 Like

Hey, folks. I just released version 2 of dupReport and it’s a lot nicer and cleaner than the old bash/awk version. Check it out on this thread, try it out, and let me know what you think.

2 Likes

@crazy4chrissi : Love the project. Any roadmap on future updates?

@Beshiros We just had an update today (see my comment here). Planned features include:

  • Improve lists of backup sets and backup reports (sorting, filtering, pagination)
  • better handling of backup set groups (e.g. toggle)
  • generate config-files that can be imported in Duplicati
  • “reseller” account to manage multiple sub-accounts
  • optionally send backup report emails for each backup run (not only daily)
  • improve daily email reports (e.g. add graphs)
  • add an API for monitoring systems like icinga/nagios/zabbix so they can check the current state of the backups
  • [your ideas here :slight_smile: ]

I can’t promise that we will do all of this, or when. This project is donation-based, so we can’t put too much work into it as long as we don’t receive many donations. So you know what you can do to speed it up :wink:

@crazy4chrissi
I like your reporting site. A reset function for a backup set would be nice: delete all reports, new bytes, modified bytes, and so on. Keep the configuration but just start again with “never”.

@thommyX Thanks, that seems useful. I just put it on our todo list and will let you know once it’s done.

3 posts were split to a new topic: Log Duplicati runs in Windows Events