The tasks run in the background of the computer

Hello
I was running backups at http://localhost:8300/ngax/index.html#/
but I often forgot to launch Duplicati.
I put these in the Ubuntu launcher:
sudo systemctl enable duplicati
sudo systemctl restart duplicati
then I realized that it had to be accessed at http://localhost:8200/ngax/index.html#/
Can I configure the settings in Firefox but have the tasks run in the background while the computer is on? What is the simplest way, the Ubuntu launcher?
Regards

Hello
In fact, after restarting the computer and clearing the cache, I no longer see any backup configuration. Apparently Duplicati did launch, because it is accessible via localhost:8200.

If I run it manually, I have access to all my backup settings again via localhost:8300.

How do I get Duplicati to automatically launch the backups that I temporarily set up in a browser?

The ideal would be to have my backup settings in 2 different paths on my data disk:
/mydata/param-config-save1
/mydata/param-config-save2
Regards

Sounds like you're running two instances of Duplicati and it's causing some confusion. Each instance will have its own configuration. The first instance is accessible at localhost:8200, the second instance at localhost:8300, etc.

You probably don't want two instances. If you want Duplicati to run all the time regardless of any user being logged in, then systemd is the right way. Don't run the Duplicati icon after login, as that will start a second instance. Instead, just use a web browser to access the main instance at localhost:8200.
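If you want to check how many instances are listening, you can look at the web UI ports from a terminal. This is a sketch: 8200/8300 are the ports from this thread, and `ss` comes with iproute2 on Ubuntu.

```shell
# List listening TCP sockets and keep only the Duplicati web UI ports
ss -tln | grep -E ':(8200|8300)\b' || echo "no Duplicati web UI found on 8200/8300"
```

Two matching lines would mean two instances are running.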

Hello
Thank you for your interest.
I don't mind having 2 instances; it might also help if you understood my wish.

a) Current behavior:
When Duplicati launches at Ubuntu startup, I get either an error or an empty display at both localhost URLs in Firefox.
If I then launch Duplicati manually, I find the localhost instance that shows my backup parameters, but that requires me to remember to launch Duplicati!!

b) For the future:
If possible, I want to launch each of the localhost1 and localhost2 instances at Ubuntu startup with its own parameters (each with different backup configurations).
I don't really know the command-line mode, including systemd!
It would be something like:
launch localhost1 with path /mydata/param-config-save1
launch localhost2 with path /mydata/param-config-save2

Is it more understandable?

I think I understand, but I'm not sure why you'd need two instances (especially if they both run from systemd). A single instance can have multiple backup jobs defined with different targets and options.

Re
Because I would like to have a hidden directory on the disk (so there is no point in making the task visible). I don't understand why this would be a problem?
Could you at least tell me the command line to launch a Duplicati instance with a custom path?

launch localhost1 with path /mydata/param-config-save

It would already be great, because for the moment I have to start Duplicati manually!

It's not a problem; you can run more than one instance. I just don't know if it will achieve your goal.

In any case, try setting the DUPLICATI_HOME environment variable to the folder where your Duplicati-server.sqlite is stored for each instance. It will override the default Duplicati location, which is ~/.config/Duplicati.
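With systemd, one way to set this variable is a drop-in override, e.g. created with `sudo systemctl edit duplicati`. This is an untested sketch; the unit name `duplicati` and the path are assumptions:

```
[Service]
Environment=DUPLICATI_HOME=/mydata/param-config-save1
```

followed by `sudo systemctl restart duplicati` so the variable takes effect.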

Good luck!

Hi

I still have not solved my problem after reading the Duplicati command-line documentation or looking on the internet.
Can someone help me ?

at the moment

  • I manually run Duplicati every day :frowning:
  • 1 or 2 times a year I change the parameters of a backup in the browser, and I check every day that there is no blocking error message :frowning:

I would like:
a) Duplicati to be launched when Ubuntu 20.04 starts
b) the backups to run by themselves * :slight_smile:
c) ideally, to store my backup settings in
/media/…/mydata/save-duplicate
rather than ~/.config/Duplicati
Regards

  • if possible, to receive notifications by email, or see notifications in Ubuntu, without needing to monitor every day!

You don't list two instances as a requirement. Did you change your mind on that one?

re
Yes, I prefer to start simple before trying something more complex for a less vital wish.
First I want to improve the current situation, which I do not find optimal.

Ok, I agree - keep it simple. :slight_smile:

All of your goals are achievable. Per your previous posts, you already have Duplicati running as a systemd service. I recommend keeping it that way. Make sure you stop launching Duplicati automatically at logon to the GUI desktop.

When you want to access the Duplicati service, just bring up a web UI to http://localhost:8200. Is everything configured there? Set up your backup job and enable the scheduler and it should run automatically.

You can also set up email alerts using the several --send-mail-* options. The exact settings depend on your email service, but you need to configure several options. My system has these:

--send-mail-from=alerts@domain.com
--send-mail-level=Warning,Error,Fatal
--send-mail-password=xxx
--send-mail-subject=Duplicati %OPERATIONNAME% report for %backup-name%: %PARSEDRESULT%
--send-mail-to=myemail@domain.com
--send-mail-url=smtps://mailserver.domain.com
--send-mail-username=xxx

Hope this helps.

Hi again
Thank you for your support. I may be complicating things unnecessarily because I don't understand everything.

All of your goals are achievable. Per your previous posts, you already have Duplicati running as a systemd service. I recommend keeping it that way. Make sure you stop launching Duplicati automatically at logon to the GUI desktop.
=> I don't understand; is that the Ubuntu launcher?

When you want to access the Duplicati service, just bring up a web UI to http://localhost:8200. Is everything configured there?
=> No, I don't have any backup configuration there. It is on localhost:8300 (once I run Duplicati via the Ubuntu launcher)!!
If I understand correctly, I have to enter my backup parameters again on localhost:8300 to move them to localhost:8200!?
You were talking earlier about the DUPLICATI_HOME environment variable; how do I move my Duplicati configuration files to my data disk?

Set up your backup job and enable the scheduler and it should run automatically.
=> If I understand correctly, is the scheduler in my Firefox profile?
But do I have to load it every time? That makes one more tab; is there no way it can run without my Firefox profile? That was one of my questions.

You can also set up email alerts using the several --send-mail-* options. The exact settings depend on your email service, but you need to configure several options. My system has these:

--send-mail-from=alerts@domain.com
--send-mail-level=Warning,Error,Fatal
--send-mail-password=xxx
--send-mail-subject=Duplicati %OPERATIONNAME% report for %backup-name%: %PARSEDRESULT%
--send-mail-to=myemail@domain.com
--send-mail-url=smtps://mailserver.domain.com
--send-mail-username=xxx

=> ok great

I don't understand; is that the Ubuntu launcher?

Yes, you need to stop launching it with the Ubuntu launcher. This is starting a second instance which runs at localhost:8300.

You can either recreate your backup jobs at localhost:8200, or you can try moving the Duplicati-server.sqlite database. I'm a bit hesitant to try to walk you through that as it could cause other issues. It's something an advanced user could try though. Are you willing to recreate your backup job(s) at localhost:8200?

If I understand correctly, is the scheduler in my Firefox profile?

No, it's the scheduler built into the Duplicati engine running at localhost:8200. You configure the scheduler when you are setting up the backup job.

Hello
I am advancing slowly but surely. I did various operations:
a backup test under localhost:8200: ok
transfer of a configuration parameter from localhost:8300 (activated occasionally) to another tab with localhost:8200.
I get an error message if I copy by doing Export > Command line and then storing it in an "export" file:

Unexpected character encountered while parsing value: m. Path '', line 0, position 0.

I can't right-click to save as…
Well, I'm not going to spend a lot of time on this if it's not a simple mistake on my part.
At worst, I will manually recreate my 5 backup parameters under localhost:8200!

More interesting: I may have found where my backup settings were hiding: /root/.config/Duplicati/VOFAJEJBUV.sqlite!
And I could possibly have specified a path for each backup setting. I saw that you answered in some threads:
--server-datafolder=C:\ProgramData\Duplicati (for me it's Ubuntu)
but I couldn't get it to work.
When I change the dbpath in command-line mode of a backup setting, I try everything to repair, reset… nothing works!

I would be interested in storing this SQLite backup-configuration file in different directories: personal / work / association!?
@+

Yes, as mentioned above the default location is ~/.config/Duplicati which for the systemd service (running as root) is /root/.config/Duplicati

I would let it stay at that location unless you have a good reason to change it.

Instead of doing export config and trying to import in the 8200 instance, would it be too much trouble to just set it up from scratch? You can avoid some potential issues this way.

Hi again
OK, I will recreate them manually, but since there are already backups on the remote target, is that okay?

Yes, I have good reasons for wanting to change the path of the configuration setting.
I tested it and it didn't work; I imagine I must have done it in the wrong place or at the wrong time (after creating a new backup setting).
To recreate my backups, can you tell me where (and whether to do it just before making the first new backup)?
Is it in command-line mode of the SQLite database?
Advanced > Command line > dbpath=/mondossierduplicati1/

Yes, that's fine. After you recreate the job, you can go into Database → Repair to regenerate the local database. (Alternatively, you could copy the job-specific database from the old location.)
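If you do try the copy route instead of Repair, it is just a file copy done while the service is stopped. This is an untested sketch; the database name is the one you found earlier, and the destination folder is an example from earlier in the thread:

```shell
# Stop the service so the job database is not written during the copy
sudo systemctl stop duplicati
# Copy the job-specific database to its new home (name and path are examples)
sudo cp /root/.config/Duplicati/VOFAJEJBUV.sqlite /mydata/param-config-save1/
sudo systemctl start duplicati
# Finally, point "Local database path" on the job's Database screen at the new file
```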

This is also where you can specify the path to the job-specific database. It will default to ~/.config/Duplicati but you can change it.

To add some background and comments:

From manual section Duplicati.Server.exe

--server-datafolder
Duplicati needs to store a small database with all settings. Use this option to choose where the settings are stored. This option can also be set with the environment variable DUPLICATI_HOME.

Duplicati.Server.exe is what one typically runs in background. It has no TrayIcon and just a web UI.
The Linux systemd system (administered by systemctl) can run one. It will have trouble doing two.
Run multiple instances of the same systemd unit gives the concept but you might be on your own.
Even more advanced is to see if you can design a start/stop script system to run in your systemd.
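To give the flavor of the template-unit idea (untested; the unit name, paths, binary location, and ports are all assumptions): saved as /etc/systemd/system/duplicati@.service, each instance %i would get its own config folder and port.

```
[Unit]
Description=Duplicati instance %i
After=network.target

[Service]
Environment=DUPLICATI_HOME=/mydata/param-config-save%i
ExecStart=/usr/bin/duplicati-server --webservice-port=820%i
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

Instances would then start with something like `sudo systemctl enable --now duplicati@1 duplicati@2`, listening on 8201 and 8202.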

Duplicati is designed to somewhat gracefully support start at login for different users on a system,
storing configs in home directories. If started at boot, the usual usage is one shared configuration.
Configuration information is in Duplicati-server.sqlite. Per job databases have randomized names.
Typically for GUI users the DB path is on the Database screen in the Local database path field.

Exporting a backup job configuration is something you should do anyway, and store that file safely.
The export will be useful if your disk drive is lost, and can also help you set up your moved configs.

I am not set up to test it, but I see an /etc/default/duplicati that looks meant for options via systemd:

# Defaults for duplicati initscript
# sourced by /etc/init.d/duplicati
# installed at /etc/default/duplicati by the maintainer scripts

#
# This is a POSIX shell fragment
#

# Additional options that are passed to the Daemon.
DAEMON_OPTS=""

You could try putting the --server-datafolder=<path-to-your-chosen-config-folder> there, then restarting.
Duplicati should make a very small Duplicati-server.sqlite there, then you can do the import and Repair.
Though Duplicati-server.sqlite has paths to the per-job databases, the job databases are easily moved.
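Concretely, that edit might look like this (untested sketch; the path is an example from earlier in the thread):

```
# /etc/default/duplicati
# Extra options passed to the daemon: store Duplicati-server.sqlite
# on the data disk instead of /root/.config/Duplicati
DAEMON_OPTS="--server-datafolder=/mydata/param-config-save1"
```

followed by `sudo systemctl restart duplicati`.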

Hi
Thank you @ts678 for this additional information.
For the moment I am abandoning the idea of the 2nd instance. I thought it was possible quite easily with localhost:2000/3000/4000. I do not have the competence, and I do not understand English very well.

I am currently migrating backup settings from localhost:8300 -> 8200

  • manual creation on localhost:8200 by copying the information from the other instance
  • in Advanced / Database, give the right path (e.g. /media/…/Appli-param/duplicati/atout.sqlite) then press the Save and repair button

It worked very well for the 1st :slight_smile:

On the 2nd, it has been blocked for 1 hour (see image). I restarted Duplicati but the same problem remains. I see in my home directory the files trump.sqlite and trump.sqlite-journal. The latter disappeared after a few minutes, and the size of trump.sqlite is no longer increasing.
What should I do?
Regards

The path-to-your-chosen-folder is unrelated to the 2nd instance. You seem even now to want a special path to the backup job's database, but if you are happy with only that moving (it is not the configuration), that's fine.

The image (measured in cm) shows about 77% on the progress bar. More commonly, when a database Recreate is having trouble, the 90%-100% range is the slower portion. Maybe you had a file download get stuck earlier.

Does the Hubic name in the job name mean hubiC is the backup destination? I recalled a reliability issue:

New backup failure

and it was your backup. After being warned off, did you wind up staying? See the other note for "What to do?"

What you can do to watch your downloads is About → Show log → Live → Verbose during a Recreate. During the first 70% of the progress, you should see files with their names containing dlist and dindex. Around the 70% mark it will either progress quickly to the finish, or begin downloading larger dblock files.
The message texts will speak of filelist (dlist), indexlist (dindex), and blocklist (dblock) volumes while providing progress counts. If you did in fact get stuck in mid-sequence, there's a little more logging available at Profiling (rather than Verbose) level, but not enough to easily diagnose any low-level problem.