Moving from GUI to command line use


#1

If I want to switch from using the GUI to command line, is it as simple as exporting my jobs to command line, and then in my case using the Windows task scheduler?

What about errors/warnings? The log parsing via the GUI is now very good IMO (latest Canary), so is there a way to obtain similar output?


#2

In terms of running backups, yes - export as command line and run should do it.

There is a disconnect between command line runs and the GUI, but I’m not sure exactly what’s affected - I THINK it’s mostly “last run time” and maybe the system info lastPg stuff.
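To make that concrete, the workflow can be sketched as a batch wrapper plus a scheduled task. This is only a rough illustration: the executable path, source/target paths, dbpath, and task name below are placeholders from an assumed setup, and the options in your own exported command line will differ.

```shell
@echo off
REM duplicati-documents.bat - hypothetical wrapper around the command that
REM "Export -> As Command-line" produces for a job (all paths are placeholders).
"C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe" backup ^
  "file://D:\Backups\Documents" "C:\Users\Me\Documents" ^
  --backup-name="Documents" ^
  --dbpath="C:\Users\Me\AppData\Local\Duplicati\Documents.sqlite" ^
  --encryption-module=aes --compression-module=zip

REM Register the wrapper with Task Scheduler to run daily at 03:00.
REM /RL HIGHEST runs it elevated, which matters if you use snapshots.
schtasks /Create /TN "Duplicati Documents" /SC DAILY /ST 03:00 ^
  /TR "C:\Scripts\duplicati-documents.bat" /RL HIGHEST
```

Run the `schtasks` line itself from an elevated prompt if the task needs admin rights.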


#3

Hi Jon
I exported to command line and gave it a try and all seemed OK, although there were no errors, so I couldn’t see what the logs would look like. But I like the convenience of the GUI and its error reporting.

My reason to change is to avoid the GUI memory overhead, which is 127 MB as I write this with no backup in progress, although it has been around 49 MB IIRC; I feel that is very high. I ran a backup to see what would happen and it DROPPED to 96 MB. I wonder if there is a memory leak or some other issue, as this seems a bit weird.

Apart from Firefox when in use, the Duplicati GUI (2.0.4.13, although I’m waiting to update to 2.0.4.14) consumes the most memory. That said, I have 8 GB total, so it is not really so significant.


#4

I’ve been running command-line for some months now with only a few ‘issues’. I switched because some of my backups are sourced from both local drives and network drives, and snapshots require admin rights (Win10) but admin doesn’t automatically have visibility of network drives. I created scripts to handle all this but, unfortunately, it’s rather complicated and I wish a better solution were available. Anyway, I digress…

The only real problem I’ve noticed running via the command line is that the logs can get pretty big (I use --console-log-level=verbose redirected to a file; I can’t remember why I reverted to console output instead of a log file, I think the output is different?).

Also, the web GUI doesn’t indicate the size of the backup source data, regardless of whether the source is completely local or not, nor when a backup was last run. However, backup stats are available under “Reporting->Show logs …” (huge fan of the new logging format if anyone responsible reads this: Love! Your! Work!) and all backups are available from Restore.
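For what it’s worth, one way to tame the log size is to let Duplicati write the log file itself rather than redirecting console output, so you can keep the file at verbose while the console stays quiet. A sketch, assuming placeholder paths; --log-file, --log-file-log-level, and --console-log-level are standard Duplicati options, the rest is made up for illustration:

```shell
@echo off
REM Hypothetical example: verbose logging to a file, warnings-only on console.
REM Paths and the target URL are placeholders; take the real command from
REM your job's "Export -> As Command-line" output.
"C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe" backup ^
  "file://D:\Backups\Documents" "C:\Users\Me\Documents" ^
  --log-file="C:\Logs\duplicati-documents.log" ^
  --log-file-log-level=verbose ^
  --console-log-level=warning
```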

The other problem is that there’s no good way to store/load credentials. Currently, I’ve got those sitting pretty in my scripts. Being able to load credentials from a local file, or perhaps even some sort of key management native to Duplicati, would be nice.
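As a stopgap, one way to keep the secrets out of the script body is to read them from a separate file with restricted NTFS permissions at run time. A minimal sketch; the file path, variable name, and target URL are my own placeholders, while --passphrase is a standard Duplicati option:

```shell
@echo off
REM Load the passphrase from a locked-down file instead of hardcoding it
REM (set NTFS permissions on C:\Secrets so only the backup account can read it).
set /p BK_PASSPHRASE=<"C:\Secrets\documents.key"

REM Placeholder backup command; substitute your exported command line here.
"C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe" backup ^
  "file://D:\Backups\Documents" "C:\Users\Me\Documents" ^
  --passphrase="%BK_PASSPHRASE%"

REM Clear the variable so it doesn't linger in the environment.
set "BK_PASSPHRASE="
```

This still leaves the passphrase readable to anyone who can read the secrets file, so it’s a mitigation rather than real key management.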