I did not realize until half an hour ago that there was such an option. Then, looking at SQLite commands to compact a database, I learned about vacuum, and with this hint I realized that this option was available in Duplicati as well.
I have started the vacuum command (running Duplicati.CommandLine.exe vacuum URL with the --dbpath option specified; I hope this is the correct procedure) exactly 11 minutes ago. Given the db size I expect this will take some 3 hours to complete.
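For anyone else trying this, the invocation looks roughly like the following (the storage URL and database path below are placeholders, not my actual values):

~~~
Duplicati.CommandLine.exe vacuum <storage-url> --dbpath="<path-to-local-database>.sqlite"
~~~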
I’ll let you know how this goes…
Thanks for the pointer!
Hope it all goes well, keep us posted!
Surprisingly it took just 21 minutes to complete. The db size didn’t change much: from 24 GB to 21 GB.
I’ll let you know tomorrow how this affects the next backup, scheduled to start at 20:00 this evening.
Usually it takes about 5 hours (it used to be 16 hours before I moved the db to an SSD and set --check-filetime-only).
The backup process tonight took 4 hours 20 minutes, not a sizeable improvement compared to the 5 hours it took before.
In any case thanks for helping with the vacuum command. I guess I’ll have to look elsewhere to find the reason for the long backup times.
What I don’t understand is that the backup starts at 20:00 and I don’t know what it’s doing for the first 2 hours:
~~~
2019-01-08 20:00:00 +01 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started,
2019-01-08 22:04:47 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started: (),
2019-01-08 22:05:04 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed: (12.55 KB),
2019-01-08 23:48:02 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-ba1332b4229164271a20b379c1dd4a401.dblock.zip (49.95 MB),
2019-01-08 23:48:22 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-ba1332b4229164271a20b379c1dd4a401.dblock.zip (49.95 MB),
~~~
Sorry I didn’t get back to you sooner, but it wouldn’t have mattered since @mikaelmello already covered what I would have suggested.
The question of what’s going on for those first two hours is valid - but my guess is it’s doing something like backend validation without logging it at the normal log level.
I’d recommend either watching the ‘lastPgEvent’ block (bottom of the About -> “System info” tab) during that period, or adding --log-level=profiling to the job just long enough to see what’s going on.
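If you go the logging route, it’s handy to also send the output to a file so you can review it afterwards - something along these lines (the log file path is just an example, use whatever location suits you):

~~~
--log-file="<some-path>\duplicati-profiling.log" --log-level=profiling
~~~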
(BTW - I edited your previous post by putting “~~~” around the logs to make them easier to read.)
@JonMikelV Sorry for the delay in my reply, I was on a trip, and thanks for editing my previous post: it actually was unreadable!
I can’t watch the log while it’s running, so I’ll set up the logging and report back.
So I had the log running overnight and here are the results:
- From 20:00 to 21:30 it basically listed all the files in the folders to be backed up, saying for each one of them:
[Verbose-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess-IncludingPath]: Including path as no filters matched: /path/filename
- Then from 21:30 to 22:03 there were many SQL queries like the following (just one example):
[Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteScalarInt64]: ExecuteScalarInt64: SELECT COUNT(*) FROM (SELECT DISTINCT "Path" FROM (
- And then the “real” backup started, skipping files that had not changed, and so on
All the rest, I believe, is pretty much straightforward…
I just don’t know if there’s anything I’m doing wrong that causes the first hour and a half to be spent on something I’m not totally sure is necessary.
In any case thanks for your help.