Unable to Create Bug Report ("Disk full" even though the disk has space)

Hi, my latest backup on 2.0.4.5 stopped suddenly with a lot of errors that I would like to create a bug report for. Unfortunately, I then got this error message, even though the SSD I installed my OS and Duplicati on has not run out of space at all:

Dec 14, 2018 11:46 PM: The operation CreateLogDb has failed with error: Disk full. Path "/root/.config/Duplicati/TSLABTWQLC.sqlite" or "/tmp/dup-bf9bf5b4-0e13-4825-953d-5ebe42e8b944"

{"ClassName":"System.IO.IOException","Message":"Disk full. Path \"/root/.config/Duplicati/TSLABTWQLC.sqlite\" or \"/tmp/dup-bf9bf5b4-0e13-4825-953d-5ebe42e8b944\"","Data":null,"InnerException":null,"HelpURL":null,"StackTraceString":" at System.IO.File.Copy (System.String sourceFileName, System.String destFileName, System.Boolean overwrite) [0x001c4] in <8f2c484307284b51944a1a13a14c0266>:0 \n at Duplicati.Library.Main.Operation.CreateBugReportHandler.Run () [0x00100] in <c6c6871f516b48f59d88f9d731c3ea4d>:0 \n at Duplicati.Library.Main.Controller+<>c__DisplayClass27_0.<CreateLogDatabase>b__0 (Duplicati.Library.Main.CreateLogDatabaseResults result) [0x00019] in <c6c6871f516b48f59d88f9d731c3ea4d>:0 \n at Duplicati.Library.Main.Controller.RunAction[T] (T result, System.String& paths, Duplicati.Library.Utility.IFilter& filter, System.Action`1[T] method) [0x0011d] in <c6c6871f516b48f59d88f9d731c3ea4d>:0 ","RemoteStackTraceString":null,"RemoteStackIndex":0,"ExceptionMethod":null,"HResult":-2147024857,"Source":"mscorlib"}

May I know what I should consider next?

Thanks!

Turns out that my database is indeed too big for /tmp! I had 16 GB of RAM, which translates to 8 GB of /tmp space (since tmpfs is sized to half of available RAM to ensure the server's stability). My database itself was at least 8 GB, and Duplicati needs at least another 8 GB to do its scrubbing of names etc., so naturally I need much more space to work with.
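For anyone who wants to check this on their own system, a quick sketch (assuming /tmp is a tmpfs mount, as it was on my server, and that your database lives in the default location like mine does):

```
# Check whether /tmp is a RAM-backed tmpfs, and how big it is
findmnt /tmp
# Free space on /tmp; on a default tmpfs this reads as roughly half of RAM
df -h /tmp
# Compare against the size of the local Duplicati database(s)
du -h ~/.config/Duplicati/*.sqlite
```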

To resolve this, I created a dedicated folder for Duplicati on the OS SSD. Within Duplicati, I pointed the backup job at that folder as its temporary folder instead of /tmp/ (which other programs on the server, and SQLite, can still use).
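For reference, this is roughly how I set it up. The tempdir advanced option is real (it comes up again further down this thread); the folder path is just my own choice:

```
# Create a dedicated temp folder on the OS SSD (path specific to my setup)
mkdir -p /mnt/os-ssd/duplicati-tmp
# In the GUI, the same thing is done per job via the advanced option "tempdir".
# From the command line it would look something like this:
duplicati-cli backup <target-url> /path/to/source --tempdir=/mnt/os-ssd/duplicati-tmp
```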

Right now I am monitoring the disk usage, and I saw that the bug report process used at least another 12 GB of space. Hopefully the SSD will be a reasonably fast substitute for /tmp/.
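In case it is useful to anyone, this is the sort of one-liner I am watching it with (folder path as above):

```
# Refresh the free-space figures every 10 seconds while the bug report runs
watch -n 10 df -h /mnt/os-ssd/duplicati-tmp
```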


To report back: yes, an SSD (and even an HDD) is a sufficient substitute for /tmp/.

The elegant solution is to split your backup jobs so that each job is small enough for its database to fit within the RAM-backed /tmp. Splitting your backup jobs has additional benefits, but scheduling them can be a challenge.

I'm glad you got it working :)

I think we have some work to do on scheduling and on making configuration easier.

There are many issues with very large datasets, but these user-experience challenges encourage users to create one giant job instead of several smaller ones.

What constitutes a very large dataset? Mine is 2.7 TB; is that too large, and should it be split up into smaller jobs?
How would one best do that?
My database is currently 6.23 GB; with 12 GB of RAM on this computer, that exceeds 50% of the available RAM.

When Duplicati is idle, I would like to move the database off the SSD that contains C:.

The 50% thing is for Linux users who keep /tmp in RAM for various reasons. It doesn’t affect Windows.
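For those Linux users, the cap is set at mount time; a sketch of an /etc/fstab entry (size=50% just makes the kernel's default of half of RAM explicit, so adjust to taste):

```
# /etc/fstab: mount /tmp as tmpfs with an explicit size cap
tmpfs  /tmp  tmpfs  size=50%,mode=1777  0  0
```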

tmpfs is one page that discusses that. --tempdir shows where Duplicati's temporary files might be.

Database management can do that, but keep the database on persistent storage that's preferably fast. Why move it?
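If you do relocate it, the GUI's Database management screen handles the move; a command-line sketch of the same idea, with hypothetical paths, would be:

```
# Move the job's local database, then point the job at the new location
# via Duplicati's --dbpath option (both paths here are hypothetical)
mv ~/.config/Duplicati/TSLABTWQLC.sqlite /mnt/fast-disk/TSLABTWQLC.sqlite
duplicati-cli backup <target-url> /path/to/source --dbpath=/mnt/fast-disk/TSLABTWQLC.sqlite
```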