Do you want to know every file that would be recovered if you restored from a particular version, or are you looking for a list of which files were added / updated / deleted in one version compared to another (such as the previous) version?
Correct. Actually, I’d like the backup process to be more transparent. Right now Duplicati works as a black box. It would be a great way to check that things are going well if the list of added/updated/deleted files were somehow accessible.
It is indeed available already - it’s just hiding in the compare command.
Well, not really
When I open that “commandline” URL for a particular backup job, there is no way to see the database path, while it’s compulsory for the “compare” command. Could you make it smarter and automatically include “--dbpath=/Users/haron/.config/Duplicati/XXX.sqlite” as the first line in the “Commandline arguments” box? Maybe for all commands (easier to implement), maybe just for “compare” (easier for users).
After just a couple of seconds I can see A HUGE list of files, which ends with some statistics:
However, I don’t see anything except roughly 2,000 lines of deleted files. It seems all the other lines are lost, or maybe there is no vertical scrollbar. Moreover, the path of each file is so long that it can’t be seen without horizontal scrolling, which is hardly possible given the height of the list.
To sum up, what I’m trying to say is that the output window is too tall and too narrow at the same time, and has no vertical scrollbar. Maybe it’s a good idea to provide an option to export the output as a text file. In that case it would be easy to open in any viewer and analyse.
Why not just put the list of changes into the task log (when --verbose is on, of course)?
Yeah, in large delta jobs the change list can be longer than the web buffer.
I’ll see if I can provide something for you in a few hours that can output a text file or email for you. Otherwise a direct Duplicati.CommandLine.exe call may be the only way to go.
Welcome to the forum! I edited your post to improve the formatting. (I just added ~~~ before and after the output you pasted, please see here for details.)
Can you include the command that produced those results?
For me, this command works: "Duplicati.CommandLine.exe" compare "file://E:\_Duplicati-Block -Test" --dbpath="E:\_Duplicati-SQLite\NJQGAWWNCA.sqlite" 0 1 --full-result
Including the --dbpath parameter speeds things up as it doesn’t have to pull data from the destination, plus I believe it avoids having to include a password (or the disable password parameter).
Too many changes to fit in the browser scroller? You might want to make a shell / CLI call as shown in this post, with a redirect to an output file (add >>C:\DuplicatiOut.txt to the end of the line).
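For example, a sketch reusing the paths from the command above (substitute your own target URL, database path, and version numbers), capturing the whole comparison to a file:

~~~
"Duplicati.CommandLine.exe" compare "file://E:\_Duplicati-Block -Test" --dbpath="E:\_Duplicati-SQLite\NJQGAWWNCA.sqlite" 0 1 --full-result >>C:\DuplicatiOut.txt
~~~

Then C:\DuplicatiOut.txt can be opened in any text editor and scrolled or searched freely.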
While I hope to see output routing added to the GUI-based command line, I don’t believe it’s on the short list for the next release.
Not in a simple way, no. You could probably dig into the .sqlite file and get file sizes and the like from that, but you’d basically be writing your own SQL to do it.
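For example, a rough sketch using the sqlite3 command-line tool (the File and Blockset table and column names below are my reading of the local database schema and may differ between Duplicati versions; always work on a copy of the .sqlite file):

~~~
# Work on a copy so the live database is never touched
cp ~/.config/Duplicati/XXX.sqlite /tmp/dupl-copy.sqlite

# List the 20 largest files recorded in the local database
sqlite3 /tmp/dupl-copy.sqlite "SELECT f.Path, b.Length
  FROM File f JOIN Blockset b ON f.BlocksetID = b.ID
  ORDER BY b.Length DESC LIMIT 20;"
~~~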
Reducing backup size pretty much comes down to some combination of:
• reduce source size
• increase compression
• reduce versions / retention
I’d suggest you start by looking at what’s actually being backed up and make sure you’re not grabbing junk you don’t need. (I’m not sure what all Macs keep around, but on Windows it’s easy to accidentally be backing up temp, paged-memory, hibernation, and recycle-bin files.)
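On Windows, a sketch of what excluding that sort of junk could look like on the command line (the target URL, source path, and exclude patterns here are illustrative, not a tested exclusion list):

~~~
Duplicati.CommandLine.exe backup "file://E:\Backup-Target" "C:\Users\me" --exclude="*\pagefile.sys" --exclude="*\hiberfil.sys" --exclude="*\$RECYCLE.BIN\*" --exclude="*\Temp\*"
~~~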
You could set the “Advanced options” --zip-compression-level higher (such as to 9 for maximum compression). Note that this is found in the “Zip compression” section of the “Add advanced option” select list.
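On the command line the same setting would look something like this (a sketch with placeholder paths; 9 is the maximum level for zip):

~~~
Duplicati.CommandLine.exe backup "file://E:\Backup-Target" "C:\Users\me" --zip-compression-level=9
~~~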
You might also want to look at the --retention-policy feature (although it was added to a canary version after your beta, so you’d have to update to that version or wait for the feature to come to your upgrade path).
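To illustrate the syntax (the values are only an example; each timeframe:interval pair means “for backups within this timeframe, keep one per interval”):

~~~
--retention-policy="1W:1D,4W:1W,12M:1M"
~~~

That example keeps one backup per day for the last week, one per week for the last four weeks, and one per month for the last year.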
> Not in a simple way, no. You could probably dig into the .sqlite file and get file sizes and the like from that, but you’d basically be writing your own SQL to do it.
Could you create a feature request to include file size info in the file log?
> Reducing backup size pretty much comes down to some combination of:
> • reduce source size
> • increase compression
> • reduce versions / retention
> I’d suggest you start by looking at what’s actually being backed up and make sure you’re not grabbing junk you don’t need.
Right, this is exactly what I’m trying to do. There are around 50,000 files in the profile, so I’m trying to find the biggest ones and check whether I really need them.
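On macOS, a quick sketch for surfacing the largest files (the home folder and the 50 MB threshold are just example values):

~~~
# List the 20 largest files under the home folder, biggest first (sizes in KB)
find ~ -type f -size +50M -exec du -k {} + 2>/dev/null | sort -rn | head -20
~~~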
> You could set the “Advanced options” --zip-compression-level higher (such as to 9 for maximum compression). Note that this is found in the “Zip compression” section of the “Add advanced option” select list.
As far as I remember, I did it in general settings as a common rule for all backups.
> You might also want to look at the --retention-policy feature (although it was added to a canary version after your beta, so you’d have to update to that version or wait for the feature to come to your upgrade path).
Could you drop me a link to read more about this feature?
The --verbose option does not affect the email output. You would need some logic to store the log file and then attach it to the email for this to work.
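A rough sketch of that kind of logic on a Unix-like system, assuming a working mail command (the target URL, database path, and address are placeholders, and on macOS the .exe is typically run through mono):

~~~
# Capture the full change list to a file, then send it as the mail body
Duplicati.CommandLine.exe compare "file://backup-target" --dbpath="/Users/haron/.config/Duplicati/XXX.sqlite" 0 1 --full-result > /tmp/duplicati-changes.txt
mail -s "Duplicati change list" you@example.com < /tmp/duplicati-changes.txt
~~~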