I have 4 backup jobs running in Duplicati. One runs daily, two weekly, and one on demand. One of the weekly jobs (the fourth in the list) loses Host Name, Port, Path and User very often (I suppose every second time it runs).
Any idea what I can do to avoid losing this data?
Version: 2.2.0.2 - 2.2.0.2_beta_2025-11-26 on current Win 11 (10.0.26200.7623)
I re-enter the lost data, but when I save the job and re-edit it, the fields are empty again.
My workaround: remove the job and the local database, restore the saved job from file, and recreate the database (delete and repair). After that the job runs once without losing the NAS data.
I forgot to mention that I use WebDAV.
This sounds like an issue with the UI, where it trips over something and fails to render the fields.
If you click “next” or “submit”, it will save this with a broken URL.
Any ideas as to what could have been “odd” so we can reproduce this?
I would think running the backup would not cause the URL to get lost, only editing it?
I only use the UI, never the command line. Browser: current Firefox. I got an error message like ‘host not found’ or similar. When I edited the job, the fields for host name, … were empty. At first I could re-enter the host name, … and the job ran. After some weeks (the job runs weekly) I got the error again. Two or three times it helped to edit the job. But when I wrote this post I could not start the job anymore: after editing the job, saving it, and editing it again, the data was gone. Only removing the job and importing it from file solves the problem.
I have no idea how you can reproduce it. In which file does Duplicati store the job (Duplicati-server.sqlite?)? When the problem occurs again I can send it.
Now I have the same error message: Error while running Bilder. Dem Uri fehlt ein Hostname: webdav:// (German: "The URI is missing a hostname").
Error in the log: "2026-02-22 09:01:58 +01 - [Error-Duplicati.Library.Main.Controller-FailedOperation]: The operation Backup has failed\r\nArgumentException: Dem Uri fehlt ein Hostname: webdav://"
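For what it's worth, that error is exactly what a URI parser reports when the target URL has been reduced to its bare scheme. A minimal sketch with Python's urllib.parse (the `webdav://` string is from the log above; the example host/user/path and the parsing library are my illustration, not what Duplicati actually uses internally):

```python
from urllib.parse import urlparse

# A complete WebDAV target URL, with hypothetical host/port/path/user.
good = urlparse("webdav://user@nas.local:5006/backup/Bilder")
# The truncated URL the failed job apparently tried to use.
bad = urlparse("webdav://")

print(good.hostname, good.port, good.path, good.username)
# -> nas.local 5006 /backup/Bilder user

print(bad.hostname)
# -> None, i.e. "the URI is missing a hostname"
```

So the backend seems to receive a URL whose host/port/path/user parts have been stripped, which matches the empty fields in the edit page.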
URL in Browser: http://127.0.0.1:8200/ngclient/backup/11/destination
I added a screenshot of the job:
The job ran twice: on 12.2., after I imported the job, and on 15.2. (it is scheduled for every Sunday). Today it failed.
I would like to know where the job configuration is stored. Then I could check it before the job runs.
I forgot to mention that I set: DUPLICATI_HOME=D:\Data-NoBackup\Duplicati\
There is about 1.6 TB of free space on drive D. The backup data is about 1.65 TB.
What I will do: import the job and check Duplicati-server.sqlite daily to see whether the TargetURL is empty or not.
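That daily check could be scripted. A sketch, assuming the jobs live in a table named "Backup" with columns "Name" and "TargetURL" (the column name TargetURL is from this thread; the table and Name column names are my guess, so adjust after inspecting the schema):

```python
import sqlite3

def check_target_urls(db_path):
    """Return (job name, target URL) pairs and warn about broken URLs.

    Opens the database read-only so the live Duplicati server
    database is never modified by the check.
    """
    con = sqlite3.connect(f"file:{db_path}?mode=ro", uri=True)
    try:
        rows = con.execute('SELECT "Name", "TargetURL" FROM "Backup"').fetchall()
    finally:
        con.close()
    for name, url in rows:
        # Empty string, NULL, or a bare scheme all mean the host data is gone.
        if not url or url.strip() == "webdav://":
            print(f"WARNING: job {name!r} has a broken TargetURL: {url!r}")
    return rows
```

Run via a daily scheduled task, e.g. `check_target_urls(r"D:\Data-NoBackup\Duplicati\Duplicati-server.sqlite")`, and you have a timestamped record of when the URL disappears.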
Something interesting happened: this morning I edited the job and re-entered the server data.
Then the job ran without error. I checked Duplicati-server.sqlite and saw the server data in TargetURL.
After that I edited the job again, and on the edit page there was no server data. When I open Duplicati-server.sqlite, I still see the server data, as long as I do not leave the destination page of the edited job.
My explanation: the UI does not read the server data from Duplicati-server.sqlite, but it does write it back.
This does not explain why the job loses the server data after the first run. Also, the first 3 jobs do not lose the server data. I will continue to monitor this.