-rw-------. 1 root root 233472 2021-07-29 10:20:00.149070786 -0400 Duplicati-server.sqlite
-rw-------. 1 root root 18098118656 2021-07-29 02:34:19.850664721 -0400 MYFGVUULA.sqlite
This is quite a large database. You might want to try a VACUUM operation to see if that helps. Make sure that you have at least 2x the database size in free space in your temporary directory before doing so.
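For example, a quick pre-flight check before vacuuming could look like the sketch below. The database path is only an illustration (Duplicati's local job databases typically live under ~/.config/Duplicati/ on Linux), and SQLITE_TMPDIR is the environment variable SQLite's Unix builds honor for their temporary files:

# Check the current size of the local job database (path is an example)
ls -lh ~/.config/Duplicati/MYFGVUULA.sqlite

# Check free space where SQLite will build its temporary copy
df -h /tmp

# If /tmp is too small, point SQLite at a larger disk before vacuuming
export SQLITE_TMPDIR=/mnt/bigdisk/tmp
sqlite3 ~/.config/Duplicati/MYFGVUULA.sqlite "VACUUM;"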
Friends, searching Google about Duplicati's VACUUM, I found this article: SQLite VACUUM
After reading the article, I decided to test it on a client that has a 163 MB database. I ran the following command:
sqlite3 72667174897076837983.sqlite "VACUUM;"
After running this command, the database shrank to 145 MB.
With the vacuum done, I tried running the Backup Job again and it completed 100% without failing. \o/
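As an extra sanity check (not in the original post), you can have SQLite verify the file before handing it back to Duplicati; PRAGMA integrity_check is standard SQLite:

# Verify the vacuumed database before re-running the backup job
sqlite3 72667174897076837983.sqlite "PRAGMA integrity_check;"
# A healthy database prints: ok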
In this way, I believe that, at least for me, it is more feasible to…
SQLite VACUUM - The VACUUM command cleans the main database by copying its contents to a temporary database file and reloading the original database file from the copy. This eliminates free pages, aligns table data to be contiguous, and otherwise cleans...
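Because VACUUM works by writing a compacted copy and then swapping it in, recent SQLite releases (3.27 and later) also offer VACUUM INTO, which writes the compacted copy to a separate file you name and leaves the original untouched. A minimal sketch, with example file names, if you prefer to keep the old database around until you have checked the new one:

# Write a compacted copy alongside the original instead of replacing it in place
sqlite3 MYFGVUULA.sqlite "VACUUM INTO 'MYFGVUULA-compact.sqlite';"

# Compare sizes, then swap the files manually once satisfied
ls -lh MYFGVUULA.sqlite MYFGVUULA-compact.sqlite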