I have a problem repairing my Duplicati database on my Synology NAS:
A single backup job with ~80GB of photo data; the backup is stored on OneDrive.
I had two successful runs/versions, but then the backup process crashed because there was no more space available in my temp folder. This was my mistake, as I had specified “TMP” instead of “TEMP” in the environment variable. But even after this was fixed, the database was broken and could not be repaired.
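For anyone hitting the same thing: on Linux, the temp folder is normally resolved from the standard TMPDIR environment variable (Duplicati's Synology package may read its own variable instead, so treat this as a sketch with standard tools). A quick sanity check before starting a long run:

```shell
# Print the temp directory that will be used and its free space.
# TMPDIR is the standard variable on Linux; falls back to /tmp.
TDIR="${TMPDIR:-/tmp}"
echo "temp dir in effect: $TDIR"
df -h "$TDIR"   # confirm there is enough free space before the run
```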
So I decided to delete it and run a recreation. This runs for several hours, and whenever I want to check the progress the web server is effectively dead (running, but not serving the site; pages load indefinitely).
If I kill and restart the Duplicati process, I see an error message saying that the process was not successful, but I can’t find any details there (the local and remote logs are both empty).
Message: “Object reference not set to an instance of an object”
And this time there is definitely enough space left in the temp dir and in the home dir (where Duplicati places the SQLite DB).
I also tried exporting my backup config and reinstalling Duplicati (I also deleted the folder under .config).
But after importing the backup config, I have the same problem again.
I have the same setup running on another NAS (same model) with even more images (~180GB) without any problems, but there I didn’t make the mistake with the temp variable name.
I also tracked the memory consumption over the first few hours of the repair: always a bit under 80%.
I also saw that the progress bar moves slowly until a point near the end, where it stops for a longer time before the server crashes.
Thanks for your answer!
But the first two runs completed without problems, and the other device also runs without any problems on its limited RAM of 512MB.
For the first few hours of the repair I also monitored the RAM usage level: it was always under 80%. (The NAS had no other tasks running during the repair.)
However, should I look for another backup solution? I don’t want to re-upload everything and end up in the same situation again. I really liked Duplicati because everything is encrypted transparently for the user.
One idea from my side: does it make sense to copy the whole config to my desktop, change the path to the share on the NAS, and start the repair there? Afterwards I would copy the restored database back to the NAS. Or are the databases somehow machine-dependent? (And should I use a Linux system on my desktop?)
It could be that 512MB of RAM isn’t an issue for normal operation and only causes problems for database recreations.
Your idea MIGHT work. I don’t know offhand whether Duplicati validates the local user data files when doing a repair. If it doesn’t even look, then maybe it would work. You’d definitely want to run it on Linux, though, to most closely match the NAS.
I had issues with memory on Synology that were due to the location of the DB. I fixed it by moving the temp folder and the DB off the basic volume, which is typically very small on a Synology. The setup I usually see is a root volume of a couple of GB (/dev/md0, mounted on “/”), which is used for the OS, and then the actual storage (e.g. /dev/md2, mounted on “/volume1”).
What unblocked the situation was moving the database to “/volume1” and informing Duplicati about this using --dbpath, and also specifying a --tempdir, because SQLite can sometimes create big intermediary files.
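To make that concrete, a minimal sketch (--dbpath and --tempdir are the real Duplicati options named above; the paths and the backend URL are examples, not from the original setup):

```shell
# Keep the database and SQLite scratch files on the large data volume
# instead of the small root volume. BASE is a hypothetical location;
# on a Synology you would typically point it at /volume1 (e.g.
# BASE=/volume1/@duplicati). It defaults to a scratch dir here so the
# sketch runs anywhere.
BASE="${BASE:-/tmp/duplicati-dbmove-demo}"
DBPATH="$BASE/photos-backup.sqlite"
DTEMP="$BASE/tmp"
mkdir -p "$BASE" "$DTEMP"

# The database recreate would then be started with both options set
# (backend URL is a placeholder):
echo "duplicati-cli repair <backend-url> --dbpath=$DBPATH --tempdir=$DTEMP"
```

Once the recreate succeeds, the same two options need to stay on the job (or in the web UI settings) so later runs keep using the moved locations.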