Vacuum command is taking 24+ hours

I started mono /usr/lib/duplicati/Duplicati.CommandLine.exe vacuum yesterday at around 12pm, and it is still running today at 12:30pm. There has been no output from the command to stdout or stderr.

How long should I expect the command to take?

The command has been keeping one of my CPUs spiked at 100% most of the time.

I’m on an SSD, so I’d think file I/O wouldn’t be much of an issue.

The .sqlite file is over 2G in size, so I'm not surprised it is taking a long time. I'm just running out of time I can leave my laptop running, and the lack of output from the command gives me no way to tell whether it's stalled or still working…
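The only progress check I've come up with is watching the temporary file SQLite writes alongside the database. My understanding (an assumption about SQLite internals, not anything from the Duplicati docs) is that VACUUM rebuilds the database into a temporary copy, so if that file keeps growing the command is probably still working. Adjust the path to wherever your .sqlite file lives:

watch -n 60 'ls -lh ~/.config/Duplicati/'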

$ duplicati-cli help   

[cut for brevity]

http://www.duplicati.com/              Version:  - 2.0.3.3_beta_2018-04-02

$ cat /etc/lsb-release 
DISTRIB_ID=Ubuntu
DISTRIB_RELEASE=18.04
DISTRIB_CODENAME=bionic
DISTRIB_DESCRIPTION="Ubuntu 18.04 LTS"

Did you abort the vacuum yet or is it still running?

I know part of the 2.0.3.3 update was to not run vacuum after every backup (Release 2.0.3.3 (beta) 2018-04-02)

Was the command you included the actual command you ran? I would have thought you’d need at least --dbpath to indicate which database was to be swept…

I aborted it a couple days ago.

Yes, it was a manual run. I did set the --dbpath and other needed options.

I hadn’t read the description of the command, and was thinking it might clean up whatever file is causing me issues over here: “Unexpected number of remote volumes marked as deleted” error

In any case, it does seem like a good command to run once in a while, so I would like to be able to run it.

Edit:
Wait, no, I didn’t set dbpath. I copied the backup command, then deleted stuff until it worked… so I ended up with something like duplicati-cli vacuum "ftp://blah". Yeah… I don’t think that would work right… coughs

I certainly could be wrong (maybe if you’ve only got one job it figures it out) but, yeah… including --dbpath would be safest. :grin:
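Something like this, I’d guess, with the paths swapped in for your own setup (untested on my end):

duplicati-cli vacuum "ftp://blah" --dbpath=/path/to/BLAH.sqlite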

How are you supposed to run it manually?
I checked the help and it says to use:

Duplicati.CommandLine.exe vacuum storage-url

It doesn’t say anything about dbpath. And from what I can tell, storage-url means the destination of your backup data. This seems weird.

I tried using --dbpath and providing the path to the local db, but it didn’t work.

I actually hadn’t tried, until just now, using the --dbpath option. But it did work.

mono /usr/lib/duplicati/Duplicati.CommandLine.exe vacuum "ftp://..." --dbpath=/root/.config/Duplicati/BLAH.sqlite --passphrase=BLAH

Before running command:

-rw-r--r-- 1 root root 3.7G May 15 14:35 BLAH.sqlite

After:

-rw-r--r-- 1 root root 3.4G May 15 14:38 BLAH.sqlite

Does that help any?

Thanks… I will try it! But it seems strange that it would need to know the remote path and encryption key. Is the local database also protected with the same encryption key as the remote files?

Seems to work just fine if you leave off the storage url and just put in the path to the sqlite file.

duplicati-cli vacuum /path/to/db.sqlite

It still needs the passphrase, either via the flag or at the prompt when it asks.
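For example (path and passphrase are placeholders):

duplicati-cli vacuum /root/.config/Duplicati/BLAH.sqlite --passphrase=BLAH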

Hmm, still doesn’t work for me. What is duplicati-cli?

Same thing as:

mono /usr/lib/duplicati/Duplicati.CommandLine.exe

I’m on Linux, so I can call it via mono or via duplicati-cli. I usually use the mono version when I’m copying the command from the “Export as command-line” option.
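As far as I can tell, duplicati-cli is just a small wrapper script. Simplified, it amounts to something like this (the packaged script may do a bit more environment setup):

#!/bin/sh
exec mono /usr/lib/duplicati/Duplicati.CommandLine.exe "$@"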

That would make sense as vacuum is really just doing database cleanup - it shouldn’t need the destination location at all (as far as I know).
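If I understand right, the command boils down to SQLite’s own VACUUM statement, so in principle you could run it directly with the sqlite3 tool while Duplicati is stopped (untested here; take a copy of the database first):

sqlite3 /root/.config/Duplicati/BLAH.sqlite "VACUUM;"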

Yeah, I think running vacuum on every backup is excessive. There are interesting views on this topic: some prefer running vacuum all the time, and some run it very rarely. Running vacuum on compact would be a pretty sane choice. Unless there’s something like compact to trigger vacuum, running it monthly sounds fine to me. It’s like defrag: it’s good to run “sometimes,” but running it all the time is a pointless waste of resources.
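If you wanted the monthly run to happen automatically, a root cron entry would do it. Something like this, with path and passphrase as placeholders (runs at 3am on the first of each month):

0 3 1 * * mono /usr/lib/duplicati/Duplicati.CommandLine.exe vacuum /root/.config/Duplicati/BLAH.sqlite --passphrase=BLAH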

I ended up just turning on the “auto-vacuum” option, then turned it back off after one of the backups completed. My database dropped from 5GB to 2.6GB.

Not sure what the Duplicati help text means when it says this option will allow Duplicati to run vacuum automatically “at its discretion.” From looking at the source, it seems to run vacuum automatically after each backup.

It’d be cool to have an option where it did a vacuum once every couple weeks or something.
