Dry run from UI

Hi, before executing a backup I usually run a dry run to see which files are going to be changed.
I’m looking for this possibility in the User Interface.
I haven’t found anything except the option to enable/disable the flag inside the profile configuration.
I also tried cloning the profile to have a second one with the dry-run flag enabled, but during execution I get a database error; maybe it’s not possible to have two profiles on the same database.

Any help with running a “dry run” from the UI? Thanks.

Welcome to the forum @wallbroken

Can you clarify? A backup changes no source files, and the destination files have random names after processing.

Are you looking to see what source files WILL be backed up? To see what changed afterwards, try the COMPARE command.

If you want a view beforehand, it’s possible, but it requires getting a dry-run log at verbose level and looking through it:

Nov 21, 2020 6:07 PM: Checking file for changes C:\backup source\B.txt, new: False, timestamp changed: True, size changed: True, metadatachanged: True, 11/21/2020 11:06:51 PM vs 6/19/2019 11:06:17 AM

The file above has changed and will be backed up. Unless you set otherwise, the original version is still retained.

You never want two jobs sharing the same database, but it’s unlikely you did that unless you changed the path on the Database screen.

Hi, can you please tell me the complete command to show that file-changes log? Thanks.

Still unclear, but since you mention a log, maybe you want a dry run.

Advanced options:
--dry-run=true
--log-file=<wherever you want the log file>
--log-file-log-level=verbose

There will be lots of output, but the lines you want have the format I showed above. Use a tool to extract those lines:
Windows can use findstr; Linux and similar can use grep.
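
As a sketch of that extraction step (the log path is a placeholder; use whatever you set in --log-file, and the sample lines are created only so the snippet runs standalone):

```shell
# Placeholder log path; substitute the path you set in --log-file.
LOG=/tmp/duplicati-dryrun.log

# (demo only) create a two-line sample log so this sketch runs standalone;
# with a real dry run, skip this step.
printf '%s\n' \
  'Nov 21, 2020 6:07 PM: Checking file for changes C:\backup source\B.txt, new: False, timestamp changed: True' \
  'Nov 21, 2020 6:07 PM: Backend event: List - Started' > "$LOG"

# Keep only the per-file change lines:
grep 'Checking file for changes' "$LOG"

# Windows equivalent (cmd.exe):
#   findstr /C:"Checking file for changes" C:\logs\duplicati-dryrun.log
```

Each matching line names one source file that the backup would process.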

Here is the log, and I don’t see any lines like those shown by you:

That’s not the log text that the log-file option puts out; it looks like you just grabbed the regular job log.
You have to look at the path that you chose and entered on screen 5 (Advanced options), per the directions.
For a sample of what that output looks like, run About --> Show log --> Live --> Verbose during your dry run.

EDIT:

The regular log does give some statistics:

  "DeletedFiles": 1,
  "DeletedFolders": 0,
  "ModifiedFiles": 0,
  "ExaminedFiles": 4,
  "OpenedFiles": 1,
  "AddedFiles": 1,
  "SizeOfModifiedFiles": 0,
  "SizeOfAddedFiles": 0,
  "SizeOfExaminedFiles": 5254,
  "SizeOfOpenedFiles": 0,

It looks like you added an empty file, which should show up in the log file you set. I’m not sure deletes do; however, you can see them with the compare command, e.g. run in Commandline after doing a backup.
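
For reference, here is a rough template for running compare from a shell rather than the Commandline screen. The destination URL and database path are placeholders, so check `Duplicati.CommandLine.exe help compare` for the exact syntax on your install:

```shell
# Template only: substitute your real destination URL and job database path.
# Compares version 1 (the previous backup) against version 0 (the latest);
# --full-result lists all changes instead of a shortened summary.
Duplicati.CommandLine.exe compare "file://X:/backup-destination" 1 0 \
  --dbpath="C:/path/to/job-database.sqlite" --full-result
```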

In addition, is there a way to set the “compact now” amount of data in the UI? With the “compact” command you can set --threshold=0, but is there an equivalent in the UI?

The COMPACT command documents the adjustments you can add on the backup job’s Options screen.
I think these also affected “Compact now” the last time I tried, but if they don’t, there may be no way.

Be careful about huge changes from the default; I think I’ve seen Duplicati compact constantly at that value.
The calculation appears to look at the amount of needed data relative to the volume capacity (the default volume size is 50 MB). The catch is that it’s nearly impossible to fill any volume exactly: volumes are underfilled to avoid overfilling them.

Compacting heavily is also quite wasteful of network, CPU, and time, though it might save some space. You may also be reducing the reliability of the backup, as compact adds more processing that can go wrong.

Wait, will --threshold=0 used with “backup” compact data? So this operation does not need to be done with “compact”?

I strongly recommend you not set threshold to 0 unless you want to risk problems. Why do you want it?

--threshold = 25
As files are changed, some data stored at the remote destination may not be required. This option controls how much wasted space the destination can contain before being reclaimed. This value is a percentage used on each volume and the total storage.
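
As a rough illustration of that percentage check (my reading of the docs above, not Duplicati’s actual code; the numbers are made up):

```shell
# Sketch of the threshold idea: a volume becomes a compact candidate when
# its share of wasted space reaches --threshold. All figures are examples.
threshold=25        # percent; the default
volume_size=50      # MB; the default remote volume size
wasted=15           # MB of data no longer referenced by any backup version

waste_pct=$(( wasted * 100 / volume_size ))
if [ "$waste_pct" -ge "$threshold" ]; then
  echo "volume qualifies for compacting (${waste_pct}% wasted)"
fi
```

At --threshold=0 this check passes for nearly every volume, since volumes are rarely filled exactly, which is why compacting can end up running constantly.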

Compacting files at the backend explains how it’s automatic.

The compacting procedure is triggered after each backup, but can be disabled with an advanced option.

Ok, thanks. I also need to know how to save the database into a remote folder on Google Drive.

Are you sure you want that, and for what situations?

Duplicati Backup and Backing up Duplicati is one experienced person’s take on what to save and what not.

Database page can Repair loss of database if system is destroyed. Direct restore from backup files exists.
These will usually also be far faster, although in a damaged remote backup situation they can be very slow.

You should definitely save an Export of the job configuration somewhere safe. That’s tiny. DB might not be.
If you have to save it, one question is how many copies to save, e.g. to protect against a damaged database. Also note that the database is not encrypted (even if your backup is); the main harm is that it may reveal file names.

If you absolutely have to save a copy to Google Drive, and you have a part of Google Drive set to sync from local disk, you can probably just make a batch file with a copy command and run it with run-script-after. Sync will take a while for a large database, and I suspect that for a while you’ll have only part of a database there.
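
A minimal sketch of such a script to point run-script-after at, assuming a Google Drive client that syncs a local folder. Both paths are placeholders (Duplicati assigns its databases random names), and the touch line is only a stand-in so the sketch runs:

```shell
#!/bin/sh
# Placeholder paths: substitute your real job database and synced folder.
DB="/tmp/demo-duplicati.sqlite"
DEST="/tmp/GoogleDrive/duplicati-db-copy"

touch "$DB"          # (demo only) stand-in for the real database file

# Copy the database into the folder the Google Drive client syncs:
mkdir -p "$DEST"
cp "$DB" "$DEST/"
```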

If you have to save an encrypted copy, rclone can do it from the script. Some people may add a second Duplicati job to back up the main job’s database after it finishes, but then how do you back up the second job’s database?

Ok, thank you. In addition, I want to ask whether Duplicati is secure and reliable software. I need to back up important data and I need to know that the backed-up data will be safe. Around the forum I’ve read about some cases of partially lost backups.

Duplicati is still in Beta (although a long one – progress is limited largely by lack of volunteer resources).

Tracked Issues are sometimes more clearly stated, sometimes less so. Forum (here) can’t do tracking.

As far as I know, security on remote backup is good. Security against attackers on your computer is not.

Usage statistics show about 3 million backups per month, with an unknown restore count. Some have problems, but all software has problems. Best practice for important data is to back up to several places using different methods; that way you’re protected if one of your methods suffers an attack or a breakage.

Oh, is the command line also in beta, or just the UI?

They are together, at the same level. You possibly downloaded it from www.duplicati.com, from a page saying:

which gave you a file (name may vary) like this:

duplicati-2.0.5.1_beta_2020-01-18-x64.msi

But even if the Stable channel finally ships a release, that does not mean it will be perfect. Nothing is.