I upgraded to the latest canary, and now I am getting large, verbose notifications with stuff like:
[Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event:
[Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Duplicati.Library.Main.BasicResults-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent]
I have send-mail-level set to “all”, and I never got this much in the notifications before. Do I have to change the send-mail-level setting?
Thanks!
Hi @Folsdup, welcome to the forum! I moved your message to its own topic so we can make sure it doesn’t get overlooked.
Unless you correct me, I’m going to assume “the latest canary” means version 2.0.3.2-2.0.3.2_canary_2018-03-16.
I’m a little unclear on the actual issue. Is the problem you’re having one (or more) of these?
getting too many emails
getting too much detail in emails
getting too much detail in log files
getting too much detail in web GUI logs
Sorry, I should have been more descriptive. I am getting much more detail in the emails. I used to only get this:
Messages: [
Deleting 1 remote fileset(s) …,
Deleted 1 remote fileset(s),
Compacting not required
]
Warnings: []
Errors: []
Thanks!
I am seeing this too… not every day, and only on one of the two computers being backed up (2.0.3.2_canary_2018-03-16).
I agree that there is way too much verbosity in the ‘header’ information with the latest builds… but when it blows up, it’s really bad. What the heck happens here at 3:33?
DeletedFiles: 32
DeletedFolders: 4
ModifiedFiles: 177
ExaminedFiles: 15514
OpenedFiles: 289
AddedFiles: 112
SizeOfModifiedFiles: 731602975
SizeOfAddedFiles: 54057247
SizeOfExaminedFiles: 24761724839
SizeOfOpenedFiles: 789114402
NotProcessedFiles: 0
AddedFolders: 13
TooLargeFiles: 0
FilesWithError: 0
ModifiedFolders: 0
ModifiedSymlinks: 0
AddedSymlinks: 0
DeletedSymlinks: 0
PartialBackup: False
Dryrun: False
MainOperation: Backup
ParsedResult: Success
EndTime: 2018-04-01 3:41:20 am (1522575680)
BeginTime: 2018-04-01 3:30:00 am (1522575000)
Duration: 00:11:20.7600521
Messages: [
2018-04-01 03:30:00 -06 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started,
2018-04-01 03:30:07 -06 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started: (),
2018-04-01 03:30:34 -06 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed: (3.52 KB),
2018-04-01 03:30:57 -06 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b951e1391291545828611353075fd3a67.dblock.zip (49.93 MB),
2018-04-01 03:32:47 -06 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Retrying: duplicati-b951e1391291545828611353075fd3a67.dblock.zip (49.93 MB),
2018-04-01 03:32:58 -06 - [Information-Information-Duplicati.Library.Main.BasicResults-BackendEvent-BackendEvent]: …,
2018-04-01 03:33:08 -06 - [Information-Information-Information-Duplicati.Library.Main.BasicResults-BackendEvent-BackendEvent-BackendEvent]: …,
2018-04-01 03:33:18 -06 - [Information-Information-Information-Information-Duplicati.Library.Main.BasicResults-BackendEvent-BackendEvent-BackendEvent-BackendEvent]: …,
2018-04-01 03:33:28 -06 - [Information-Information-Information-Information-Information-Duplicati.Library.Main.BasicResults-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent]: …,
2018-04-01 03:33:38 -06 - [Information-Information-Information-Information-Information-Information-Duplicati.Library.Main.BasicResults-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent]: …,
2018-04-01 03:33:48 -06 - [Information-Information-Information-Information-Information-Information-Information-Duplicati.Library.Main.BasicResults-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent]: …,
2018-04-01 03:33:58 -06 - [Information-Information-Information-Information-Information-Information-Information-Information-Duplicati.Library.Main.BasicResults-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent]: …,
2018-04-01 03:34:08 -06 - [Information-Information-Information-Information-Information-Information-Information-Information-Information-Duplicati.Library.Main.BasicResults-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent]: …,
2018-04-01 03:34:18 -06 - [Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Duplicati.Library.Main.BasicResults-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent]: …,
2018-04-01 03:34:28 -06 - [Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Duplicati.Library.Main.BasicResults-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent]: …,
2018-04-01 03:34:38 -06 - [Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Duplicati.Library.Main.BasicResults-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent]: …,
2018-04-01 03:34:48 -06 - [Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Duplicati.Library.Main.BasicResults-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent]: …,
2018-04-01 03:34:58 -06 - [Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Duplicati.Library.Main.BasicResults-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent]: …,
This goes on for a while and then all of a sudden Duplicati stops imitating Max Headroom and goes back to normal (overly verbose, but not repeating) output…
2018-04-01 03:40:28 -06 - [Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Information-Duplicati.Library.Main.BasicResults-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent-BackendEvent]: …,
2018-04-01 03:40:35 -06 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Rename: duplicati-b951e1391291545828611353075fd3a67.dblock.zip (49.93 MB),
2018-04-01 03:40:35 -06 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Rename: duplicati-b9537f292de504b538ed957294f8620ee.dblock.zip (49.93 MB),
2018-04-01 03:40:35 -06 - [Information-Duplicati.Library.Main.BackendManager-RenameRemoteTargetFile]: Renaming “duplicati-b951e1391291545828611353075fd3a67.dblock.zip” to “duplicati-b9537f292de504b538ed957294f8620ee.dblock.zip”,
Yes, you and @Folsdup are both running into the additional info now available with the logging rewrite done by kenkendk.
It is still pretty new, so it’s possible there are some bugs; I haven’t seen the issue to your extent on any of my systems. Are you using any --log-file, --log-level, --log-file-log-filter, or --log-file-log-level parameters?
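For reference, those options would sit on the backup command line roughly like this (the path and filter value here are made-up placeholders, not a recommendation):
Duplicati.CommandLine.exe backup <target-url> <source-path> --log-file="C:\Duplicati\backup-log.txt" --log-file-log-level=Information --log-file-log-filter=<filter-expression>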
Edit:
Hmmm… I wonder if this is the same cause as discussed here:
Hello @kenkendk, thank you for the filter option!
One question: can I filter out the email and send-http-url output?
If I back up MS SQL, the log in the email message is extra long because I have many databases.
The email message is 150 KB, and if I want to use https://www.duplicati-monitoring.com for monitoring, I get an error because of the message size.
Failed to send http message: System.UriFormatException: Invalid URI: The Uri string is too long.
I tried using console-log-filter and log-file-log-filter but it does not seem to…
I don’t have any --log parameters set; only the send-mail parameters are set.
Folsdup
Nothing overly fancy for the backup… this is everything {sanitized}…
"C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe" backup "ssh://{hostname}//usr/home/backups/Duplicati/Gekko?auth-username=jamie&auth-password=jamie&ssh-keyfile=Duplicati.openssh&ssh-fingerprint={blah}" "C:\Users\jamie" --snapshot-policy=Required --send-mail-to="jamie@{email.addr}" --send-mail-from="gekko-duplicati@{email.addr}" --send-mail-url=smtps://smtp.{domain.name}:465 --send-mail-username="{secretstuff}" --send-mail-password={moresecretstuff} --backup-name=Gekko --dbpath="C:\Users\jamie\AppData\Local\Duplicati\HAOYARWNAL.sqlite" --encryption-module= --compression-module=zip --dblock-size=50mb --retention-policy="1W:1D,4W:1W,12M:1M" --no-encryption=true --exclude-files-attributes=temporary --disable-module=console-password-input --exclude="C:\Users\jamie\AppData\Local\Temp\" --exclude="C:\Users\jamie\AppData\Local\Google\Chrome\User Data\Default\Cache\"
Yep, nothing odd there. I’ll try to do some tests with different log levels; otherwise, perhaps @kenkendk knows whether the new default log level is more verbose than the old one and, if so, can suggest a setting that better matches the old verbosity.
Edit:
This issue may also be actively being discussed in the topic I quoted above.
JonMikelV:
Yep, nothing odd there. I’ll try to do some tests with different log levels; otherwise, perhaps @kenkendk knows whether the new default log level is more verbose than the old one and, if so, can suggest a setting that better matches the old verbosity.
I want to add specific log filtering to the http and email modules, so we can control the log messages in those as well. I have most of the code written, and once I complete it, this should be fixed.
Looking forward to it
Thank you, as always @kenkendk & Team!
jhermans:
2018-04-01 03:30:57 -06 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b951e1391291545828611353075fd3a67.dblock.zip (49.93 MB),
2018-04-01 03:32:47 -06 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Retrying: duplicati-b951e1391291545828611353075fd3a67.dblock.zip (49.93 MB),
2018-04-01 03:32:58 -06 - [Information-Information-Duplicati.Library.Main.BasicResults-BackendEvent-BackendEvent]: …,
This seems to indicate that the extreme logging contents are only created on retry events. I will investigate.
Turns out the problem is that I added a “repeating log”.
The idea is that every 10 seconds with no log activity, the last log message is repeated. Due to some issues with how the repeat was made, the log-tag field kept growing for each repeat.
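Very roughly, and only as an illustrative sketch (this is not the actual Duplicati code; the names are taken from the example output above), the repeat ended up wrapping the previous repeated entry instead of the original one:

# Sketch only - not Duplicati's implementation.
# A log entry is modelled as (level, source, tag, text); the report header is
# printed as "[level(s)-source-tag(s)]".
def repeat(entry):
    level, source, tag, text = entry
    # Bug: the repeat is built from the already-repeated entry, so another
    # level prefix and another tag suffix get stacked on every idle interval.
    return ("Information-" + level, source, tag + "-BackendEvent", text)

entry = ("Information", "Duplicati.Library.Main.BasicResults",
         "BackendEvent", "Backend event: Put - Retrying: ...")
for _ in range(2):   # one pass per 10 seconds with no new log activity
    entry = repeat(entry)
print("[%s-%s-%s]" % (entry[0], entry[1], entry[2]))
# [Information-Information-Information-Duplicati.Library.Main.BasicResults-BackendEvent-BackendEvent-BackendEvent]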
I have fixed that, but I was wondering if we should disable this feature in the core, and maybe just add the repeat on the outputs (console and webui).
I’ve added a commit now that removes the repeating log scope. It needs to be done differently if we want to support repeated output, because it needs to not end up in the logs, but only show in “interactive” contexts (i.e. a human looking at it).
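To sketch what that could look like (hypothetical Python, not an actual patch): the core writes each entry exactly once, and only an interactive sink such as the console re-prints the last line while nothing new arrives, so log files and mail reports never see the repeats.

class ConsoleSink:                        # interactive: may repeat
    def __init__(self):
        self.last = None
    def write(self, line):
        self.last = line
        print(line)
    def idle_tick(self):                  # called e.g. every 10 s with no new entries
        if self.last:
            print(self.last)              # repeated only for the human watching

class FileSink:                           # non-interactive: never repeats
    def __init__(self, path):
        self.path = path
    def write(self, line):
        with open(self.path, "a") as f:
            f.write(line + "\n")
    def idle_tick(self):
        pass                              # repeats never reach the stored log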
Thanks, looking forward to the next canary!
Hello @kenkendk, today I tried the v2.0.3.5 canary.
I am having a little trouble getting the http filters working.
I wrote a regex for filtering out all the text information (with my backup log, here: https://regex101.com/r/BhVCUR/1/).
But I can’t get it working.
I tried adding this to the config:
--send-http-log-filter=[[.\s\S ]
or
--send-http-log-filter=[.\s\S
but nothing changed; the whole log is sent to https://www.duplicati-monitoring.com
I also tried the option --send-http-max-log-lines=1, but again the whole “Messages: []” section is sent to http.
Am I doing something wrong? Thanks!
The log filters apply to the stuff at the end: it starts with a line that says “Log data:”, and then each line is a log data line. If you have data in Messages, then that is the problem, and we need to filter that out.
Looking at the code, it seems I missed that part when I merged in the new serialization code…
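To make that concrete, the report body looks roughly like this (condensed from the examples earlier in the thread); the --send-http-log-filter and --send-http-max-log-lines options currently only act on the part after “Log data:”:

DeletedFiles: 32
...
ParsedResult: Success
Messages: [ ... ]        <- not filtered yet; this is the part that blows up
Warnings: []
Errors: []
Log data:
2018-04-01 03:30:00 -06 - [Information-...]: ...   <- log filters and max-log-lines apply here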
I want to add that the latest canary fixed the problem. I now get much less info in the emails.
Folsdup
I agree with @Folsdup. The long logs in emails/http are fixed. Thanks!