VACUUM - performance improvement is huge

@JonMikelV It does! At least Duplicati.CommandLine.exe does.

I was not able to start vacuum from the command line when I had low free space on drive C:\, even though the database was on drive D:\ and temp was on drive Y:\.
But it works from the web GUI, so I probably did something wrong.

Another question is whether the low-free-space check requires 1x or 2x the database size in free space.

Is there a way to run vacuum on commandline for all my databases or do I have to run it on each individually?

Only individually, but it’s a pretty easy process via the web GUI: click Commandline …, select Vacuum from the dropdown menu, delete the text in “Commandline arguments” and click Run.

You could also create a script or batch file to run Duplicati.CommandLine.exe for each backup, if that helps.

C:\Program Files\Duplicati 2>Duplicati.CommandLine.exe help vacuum

Usage: vacuum <storage-URL> [<options>]

  Rebuilds the local database, repacking it into a minimal amount of disk
  space.

Exporting the job as a command line and then editing it (similar to @mr-flibble GUI approach) should work.
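
If you script it, a rough (untested) batch-file sketch along those lines could look like this; the storage URLs and --dbpath values are placeholders to be replaced with the real ones from each job’s exported command line:

@echo off
rem Run vacuum once per backup job. Replace the storage URLs and --dbpath
rem values with the real ones from each job's "Export as command line".
cd /d "C:\Program Files\Duplicati 2"
Duplicati.CommandLine.exe vacuum "ftp://example.com/backup1?auth-username=user" --dbpath="%LOCALAPPDATA%\Duplicati\JOB1DB.sqlite"
Duplicati.CommandLine.exe vacuum "ssh://example.com/backup2?auth-username=user" --dbpath="%LOCALAPPDATA%\Duplicati\JOB2DB.sqlite"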


What command did you use to do the database vacuum? I’m very interested in that!

I have a client with very large databases where this option could be the lifeline for a problem I am facing … After 80 or 90 versions, crashes occur during backup execution and I had to delete the job and start from scratch. Even though it is a Xeon server, it is a very old machine; it works perfectly for the job, but during the backup it sits at 100% CPU.

I tried to test it: I went through the web interface, selected job Test1, chose the “Command line” option and ran vacuum; it returned the following error:

Found 2 commands but expected 1, commands:
“mega://Teste?auth-username=mail@gmail.com” “E:\cssg”
Return code: 200

Version: 2.0.4.4_canary_2018-11-14

Hello, you’re doing it right. In the web interface select Commandline, then choose Vacuum.
But after that, remove the text from the “Commandline arguments” field.
It should be empty.

I’d usually link to the command’s entry in Using Duplicati from the Command Line, but there isn’t an entry, so:

C:\Program Files\Duplicati 2>Duplicati.CommandLine.exe help vacuum

Usage: vacuum <storage-URL> [<options>]

  Rebuilds the local database, repacking it into a minimal amount of disk
  space.



C:\Program Files\Duplicati 2>

Above is what @mr-flibble was pointing you at. Keeping options is generally fine, and sometimes necessary.
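
Concretely, the “Found 2 commands” error above came from the extra “E:\cssg” source-folder argument; an edited export would keep only the storage URL plus options, roughly like this (the --dbpath value is a placeholder for the job’s real database path):

Duplicati.CommandLine.exe vacuum "mega://Teste?auth-username=mail@gmail.com" --dbpath=<path-to-job-database>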

I ran the command through the web interface to see if I could copy the complete command to run via Task Scheduler or via crontab.

To test the way you said, I created a backup job that backs up 20 exe files totaling 200 KB to mega.nz, and the task ran for 10 minutes without completing.

In the meantime I tried to stop the job and could not; it was as if Duplicati were locked during the execution of the service, and not even closing and reopening it unlocked it.

Friends, searching Google for Duplicati’s vacuum, I found this: SQLite VACUUM

After reading the article, I decided to test it on a client that has a 163 MB database; I ran the following command:

sqlite3 72667174897076837983.sqlite "VACUUM;"

After running this command, the database shrank to 145 MB.

With the vacuum done, I tried running the backup job again and it ran 100% without failing. \o/

This way, I believe that, at least for me, it is more practical to run the vacuum outside of Duplicati via a script (PowerShell or shell script) than with Duplicati.CommandLine.exe.
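
For instance, here is a hedged, untested batch sketch of that idea; %LOCALAPPDATA%\Duplicati is the default per-user database folder on Windows, and the caveats in the replies below about access conflicts and database encryption still apply:

@echo off
rem Vacuum every Duplicati job database with a standalone sqlite3.exe.
rem Make sure Duplicati is fully stopped first (tray icon closed, service
rem stopped) so the vacuum cannot collide with a running backup.
for %%F in ("%LOCALAPPDATA%\Duplicati\*.sqlite") do (
    rem Skip the server database, which is encrypted by default on Windows.
    if /i not "%%~nxF"=="Duplicati-server.sqlite" (
        echo Vacuuming %%F
        sqlite3 "%%F" "VACUUM;"
    )
)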


Hello
Tacio, excellent.
On the Windows operating system I could not find a way to execute this command.
It looks like SQLite is not installed.
Anderson

Are you saying it was failing before? This is supposed to help performance (did it?), not fix things, though it might as a side effect. The 200kb job that ran 10 minutes without completing was probably a different issue. Please also run a connection test to make sure you’re not affected by the authentication changes at Mega, and open a new support topic if it looks like the problem there has no relationship to VACUUM performance.

It’s not part of Windows, and Duplicati doesn’t ship the command, only the parts it needs for its own access, including the weak database encryption that Windows users get by default. Getting at that might require a build with encryption support, or use of --unencrypted-database with a normal sqlite3.exe; however, my general thinking is:

To anyone who plans to vacuum Duplicati databases from anything besides the main Duplicati: please try to avoid access conflicts. Turning the main Duplicati off while working behind it would probably reduce the risks, but even better would be not bypassing it at all (the same thinking applies to manually changing destination files). Looking at the “SQLite VACUUM” article above, it sounds like (unsurprisingly) the results of the VACUUM are built from a temporary copy of the database, meaning any backup in progress at the time possibly got corrupted…

For Windows, download https://www.sqlite.org/2018/sqlite-dll-win32-x86-3250300.zip, place the three files in C:\Windows\system32 and run the command via CMD.

As for the delay, I think it was running some other command that did not finish … But with the sqlite executable itself I was able to do it.

My question now is whether Duplicati has any way to purge data that is no longer needed, such as data from 91 days ago when my retention is 90 days, for example.

Compacting files at the backend explains the usual way this happens. The manual command is here; I point to it mainly because that page gives the options you can put in the backup’s web UI to control how aggressively compact runs. This is an example of really forcing compact, for a special purpose. Compaction requires a lot of downloads and uploads as backed-up data gets repackaged, so storage provider charges may be a factor.
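
As a hedged illustration (the storage URL and database path are placeholders; --threshold is the real option that sets the percentage of wasted space that triggers compacting, default 25, so lowering it compacts more aggressively):

Duplicati.CommandLine.exe compact <storage-URL> --dbpath=<path-to-job-database> --threshold=5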

Very good! I did not know that it would be necessary to download and upload the data again, but I believe that for servers with local backups over FTP, SSH, etc., it may be a good option.

Thank you very much for the feedback. =D

Good afternoon, guys. Yesterday, after 2 days of the backup running normally, the following error occurred:

Failed: Unexpected difference in fileset version 0: 22/11/2018 21:00:00 (database id: 27), found 67153 entries, but expected 67252
Details: Duplicati.Library.Interface.UserInformationException: Unexpected difference in fileset version 0: 22/11/2018 21:00:00 (database id: 27), found 67153 entries, but expected 67252
at Duplicati.Library.Main.Database.LocalDatabase.VerifyConsistency (System.Int64 blocksize, System.Int64 hashsize, System.Boolean verifyfilelists, System.Data.IDbTransaction transaction) [0x00370] in <a699962d1b954fd09198884685231873>:0
at Duplicati.Library.Main.Operation.Backup.BackupDatabase+<>c__DisplayClass32_0.<VerifyConsistencyAsync>b__0 () [0x00000] in <a699962d1b954fd09198884685231873>:0
at Duplicati.Library.Main.Operation.Common.SingleRunner+<>c__DisplayClass3_0.<RunOnMain>b__0 () [0x00000] in <a699962d1b954fd09198884685231873>:0
at Duplicati.Library.Main.Operation.Common.SingleRunner+<DoRunOnMain>d__2`1[T].MoveNext () [0x000b0] in <a699962d1b954fd09198884685231873>:0
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw () [0x0000c] in <4ffb8394f71c471ab65d43c04283a838>:0
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Threading.Tasks.Task task) [0x0004e] in <4ffb8394f71c471ab65d43c04283a838>:0
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Threading.Tasks.Task task) [0x0002e] in <4ffb8394f71c471ab65d43c04283a838>:0
at System.Runtime.CompilerServices.TaskAwaiter.ValidateEnd (System.Threading.Tasks.Task task) [0x0000b] in <4ffb8394f71c471ab65d43c04283a838>:0
at System.Runtime.CompilerServices.TaskAwaiter.GetResult () [0x00000] in <4ffb8394f71c471ab65d43c04283a838>:0
at Duplicati.Library.Main.Operation.BackupHandler+<RunAsync>d__19.MoveNext () [0x003d6] in <a699962d1b954fd09198884685231873>:0
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw () [0x0000c] in <4ffb8394f71c471ab65d43c04283a838>:0
at CoCoL.ChannelExtensions.WaitForTaskOrThrow (System.Threading.Tasks.Task task) [0x00050] in <6973ce2780de4b28aaa2c5ffc59993b1>:0
at Duplicati.Library.Main.Operation.BackupHandler.Run (System.String[] sources, Duplicati.Library.Utility.IFilter filter) [0x00008] in <a699962d1b954fd09198884685231873>:0
at Duplicati.Library.Main.Controller+<>c__DisplayClass13_0.<Backup>b__0 (Duplicati.Library.Main.BackupResults result) [0x00035] in <a699962d1b954fd09198884685231873>:0
at Duplicati.Library.Main.Controller.RunAction[T] (T result, System.String[]& paths, Duplicati.Library.Utility.IFilter& filter, System.Action`1[T] method) [0x0011d] in <a699962d1b954fd09198884685231873>:0

Log data:
2018-11-23 21:01:51 -03 - [Error-Duplicati.Library.Main.Operation.BackupHandler-FatalError]: Fatal error
Duplicati.Library.Interface.UserInformationException: Unexpected difference in fileset version 0: 22/11/2018 21:00:00 (database id: 27), found 67153 entries, but expected 67252
at Duplicati.Library.Main.Database.LocalDatabase.VerifyConsistency (System.Int64 blocksize, System.Int64 hashsize, System.Boolean verifyfilelists, System.Data.IDbTransaction transaction) [0x00370] in <a699962d1b954fd09198884685231873>:0
at Duplicati.Library.Main.Operation.Backup.BackupDatabase+<>c__DisplayClass32_0.<VerifyConsistencyAsync>b__0 () [0x00000] in <a699962d1b954fd09198884685231873>:0
at Duplicati.Library.Main.Operation.Common.SingleRunner+<>c__DisplayClass3_0.<RunOnMain>b__0 () [0x00000] in <a699962d1b954fd09198884685231873>:0
at Duplicati.Library.Main.Operation.Common.SingleRunner+<DoRunOnMain>d__2`1[T].MoveNext () [0x000b0] in <a699962d1b954fd09198884685231873>:0
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw () [0x0000c] in <4ffb8394f71c471ab65d43c04283a838>:0
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Threading.Tasks.Task task) [0x0004e] in <4ffb8394f71c471ab65d43c04283a838>:0
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Threading.Tasks.Task task) [0x0002e] in <4ffb8394f71c471ab65d43c04283a838>:0
at System.Runtime.CompilerServices.TaskAwaiter.ValidateEnd (System.Threading.Tasks.Task task) [0x0000b] in <4ffb8394f71c471ab65d43c04283a838>:0
at System.Runtime.CompilerServices.TaskAwaiter.GetResult () [0x00000] in <4ffb8394f71c471ab65d43c04283a838>:0
at Duplicati.Library.Main.Operation.BackupHandler+<RunAsync>d__19.MoveNext () [0x003d6] in <a699962d1b954fd09198884685231873>:0

Could it have been caused by running VACUUM directly on the database without using Duplicati?

I think not. When I ran Vacuum via DB Browser for SQLite Portable and then killed it (because that program’s RAM usage was 18 GB), I got this error from Duplicati:

System.Data.SQLite.SQLiteException (0x80004005): database disk image is malformed
database disk image is malformed

So your error doesn’t look vacuum-related.

If you’re saying there’s a chance you ran it outside of Duplicati while Duplicati might have been at work, such as doing a backup, then possibly you got into the access conflict I’d worried about, with stale data put back. Maybe there was even a problem in the data copy to a temporary file, if it could change while it was being copied. This is a deeper SQLite question than I can answer: exactly how SQLite vacuum is implemented.

The test from @mr-flibble seems different from what I think you described, but it does answer how one can get into that unfortunate state with the database. I had hoped SQLite would be able to resist such problems. There’s currently such an error here, so if anyone is good at fixing these, please stop by to help with that…

When I did the vacuum, Duplicati was not running the job; that was the first thing I checked before executing the command … The strange thing is that the error only occurred 2 days after executing the command; in the meantime, it worked normally.

After rebuilding the database and using the purge-broken-files command, it returned to normal operation.
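
For reference, a rough sketch of that recovery sequence from the command line (the storage URL and --dbpath value are placeholders; repair recreates the local database when it is missing, and purge-broken-files removes file entries that can no longer be restored):

Duplicati.CommandLine.exe repair <storage-URL> --dbpath=<path-to-job-database>
Duplicati.CommandLine.exe purge-broken-files <storage-URL> --dbpath=<path-to-job-database>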