Hi, before executing a backup I usually run a dry run and review which files are going to be changed.
I’m looking for a way to do this from the User Interface.
I haven’t found anything except the possibility of enabling/disabling the flag inside the profile configuration.
I also tried cloning the profile to keep a second one with the dry-run flag enabled, but during the execution I get a database error; maybe it is not possible to have two profiles on the same database.
Any help with implementing a “dry run” execution? Thanks
There will be lots of output, but the lines you want have a format like the one I showed. Use a tool to extract those lines.
Windows can use findstr. Linux and similar can use grep or similar tools.
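For example (the log path is whatever you set; the [Dryrun] tag is my assumption about how the verbose dry-run lines are labeled, so check your own output for the exact marker):

```
rem Windows
findstr /C:"[Dryrun]" "C:\temp\duplicati-dryrun.log"
```

```
# Linux and similar
grep -F '[Dryrun]' /tmp/duplicati-dryrun.log
```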
That’s not the log text that the log-file option puts out. It looks like you just grabbed the regular job log.
You have to look at the file at the path you chose and entered on screen 5 Advanced options, per the directions.
For a sample of what that looks like, watch About → Show log → Live → Verbose during your dry run.
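If you haven’t configured that yet, the two Advanced options look roughly like this (the path is just an example; use any location you can write to):

```
--log-file=C:\temp\duplicati-dryrun.log
--log-file-log-level=Verbose
```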
It looks like you added an empty file, which should show in the log file you set. I’m not sure whether deletes do; however, you can see them with the compare command, e.g. run it in Commandline after doing a backup.
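As a sketch from the command line (the storage URL is a placeholder for your destination, and in the web UI you would pick compare in Commandline instead):

```
rem Compare the two most recent versions; 0 is the latest, 1 is the one before it
Duplicati.CommandLine.exe compare "file://D:/backup" 1 0
```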
In addition, is there a way to set the “compact now” amount of data in the UI? If you use the “compact” command you can set --threshold=0, but is there a way to do that in the UI?
The COMPACT command documentation lists the adjustments you can add on the backup job’s Options screen.
I think these also affected Compact now the last time I tried, but if they don’t, there may be no way.
Be careful about huge changes from the default. I think I’ve seen Duplicati compact constantly at --threshold=0.
The calculation appears to look at the amount of needed data relative to the volume capacity (default is 50 MB). The catch is that it’s nearly impossible to fill any volume exactly; they’re underfilled to avoid overfilling them. For example, a 50 MB volume that still holds 30 MB of needed data has 20 MB (40%) of wasted space, which exceeds the default 25% threshold and makes it a candidate for compacting.
Compacting heavily is also quite wasteful of network, CPU, and time, though it might save some space. You may also be reducing the reliability of the backup, as compact adds more processing that can go wrong.
I strongly recommend you not set the threshold to 0 unless you want to risk problems. Why do you want it?
--threshold = 25
As files are changed, some data stored at the remote destination may not be required. This option controls how much wasted space the destination can contain before being reclaimed. This value is a percentage used on each volume and the total storage.
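If you decide to experiment anyway, here is a Commandline sketch (the storage URL and database path are placeholders for your own values):

```
Duplicati.CommandLine.exe compact "file://D:/backup" --threshold=10 --dbpath="C:\Users\me\AppData\Local\Duplicati\ABCDEFGHIJ.sqlite"
```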
The Database page can Repair the loss of the database if your system is destroyed, and direct restore from the backup files also exists.
These are usually also reasonably fast, although in a damaged-remote-backup situation they can be very slow.
You should definitely save an Export of the job configuration somewhere safe. That’s tiny; the DB might not be.
If you do save the DB, one question is how many copies to keep, e.g. to protect against a damaged database. Also note that it’s not encrypted (even if your backup is), but the main harm is that it may reveal file names.
If you absolutely have to save a copy to Google Drive, and you have a part of Google Drive set to sync from local disk, you can probably just make a batch file with a copy command and run it with run-script-after (a sketch is below). Sync will take a while for a large database, and I suspect that for a while you’ll have only part of the database there.
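A minimal sketch of that batch file (every path here is hypothetical; point them at your actual database and sync folder):

```
@echo off
rem copydb.bat - copy the job database into a folder that Google Drive syncs
copy /Y "C:\Users\me\AppData\Local\Duplicati\ABCDEFGHIJ.sqlite" "C:\GoogleDrive\duplicati-db\"
```

Then set --run-script-after=C:\scripts\copydb.bat in the job’s Advanced options.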
If you have to save an encrypted copy, rclone can do it from the script. Some people put a second job in Duplicati to back up the main job’s database after it finishes, but then how do you back up the second job’s DB?
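A sketch of the rclone variant, assuming you have already configured an rclone crypt remote (called gcrypt here; the local path is again hypothetical):

```
rem copydb-encrypted.bat - push an encrypted copy of the database after the backup
rclone copy "C:\Users\me\AppData\Local\Duplicati\ABCDEFGHIJ.sqlite" gcrypt:duplicati-db
```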
OK, thank you. In addition, I want to ask whether Duplicati is secure and reliable software. I need to back up important data, and I need to know that the backed-up data is safe. Around the forum I have read about some cases of partially lost backups.
Duplicati is still in Beta (although a long one; progress is limited largely by lack of volunteer resources).
Tracked Issues are sometimes stated clearly, sometimes less so. The forum (here) can’t do issue tracking.
As far as I know, security on remote backup is good. Security against attackers on your computer is not.
Usage statistics show about 3 million backups per month; the restore count is unknown. Some have problems, but all software has problems. Best practice for important data is to back up to several places using different methods. That way you’re protected if one of your methods suffers an attack or a breakage.