Moving Duplicati Docker volume without restarting an ongoing backup

I currently have a 20 TB backup running, and unfortunately I hadn't considered that the SQLite database would also grow to a considerable size. It looks like within the next few days the database will become larger than the free space on the disk that holds the Duplicati Docker volume. What can I do to move the Docker volume without having to restart the current backup? I have already tried pausing the backup, then moving the volume, restarting the container, and continuing the backup. Unfortunately, pausing the backup did not work: files are still being processed and transactions are still being carried out on the database.
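In case it helps anyone checking the same thing, something like this shows how the database size compares to the remaining space (a rough sketch; the volume name and paths below are placeholders, not known defaults):

```bash
# Placeholder location of the Docker volume's data on the host;
# `docker volume inspect <name>` prints the real mountpoint.
VOL=/var/lib/docker/volumes/duplicati-config/_data

# Size of the Duplicati job database(s) inside the volume
du -h "$VOL"/*.sqlite

# Free space left on the filesystem holding the volume
df -h "$VOL"
```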

I have only been using Duplicati for a short time and am not very experienced with Linux, so I would be very grateful for any further ideas.

I have no way to expand or free up disk space for the Docker volume.

Thank you

If your backup is a single 20 TB file, there is nothing you can do. Forget Duplicati and Docker for a moment: what you have is a process holding a database open. Nothing known to humanity lets you pull a database out from under a running process and stuff it back in with more room. Advanced databases can grow online onto volumes separate from the initial one, but that is not the kind of database Duplicati uses.

So if your backup consists of several files, just instruct it to stop after the current file. Then do whatever you need to do to make room for a bigger database, and start the job again. It will skip the files already backed up (unless they have changed, of course).
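As a rough sketch of the move itself once the job has stopped (container name, volume name, host paths, and image below are all placeholders, and the container-side path depends on which Duplicati image you run):

```bash
# 1. Stop the container so nothing writes to the database during the move
docker stop duplicati

# 2. Copy the volume's contents to the bigger drive, preserving
#    permissions and timestamps
mkdir -p /mnt/bigdrive/duplicati-config
cp -a /var/lib/docker/volumes/duplicati-config/_data/. /mnt/bigdrive/duplicati-config/

# 3. Recreate the container with a bind mount at the new location,
#    keeping whatever other options the original `docker run` used
docker rm duplicati
docker run -d --name duplicati \
  -v /mnt/bigdrive/duplicati-config:/config \
  duplicati/duplicati
```

The same idea works with docker-compose by pointing the volume entry at the new host path.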

No, it is not one file; it is one backup with multiple files, and the process keeps backing up individual files instead of pausing.

So instead of pausing, I can abort the backup and it will continue where it stopped once it is started again (as long as the files have not changed)?

Thank you, I will try this then.

With all the database talk, is this a backup of a database? If so, read its documentation carefully for best results.

If it’s just 20 TB of other files being slowly backed up without any consistency worry, size still matters.

I'm worried you didn't increase the blocksize before starting. The default of 100 KB is about 200 times too small for this amount of data, resulting in (probably linear) growth in database size as processing occurs, with far greater slowdowns over time.

Increasing the blocksize helps keep the database small, although a large number of files can still cause issues. Sometimes difficult situations with too much data and too many files can be solved by splitting the backup into several jobs.
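As a back-of-the-envelope illustration (decimal units for simplicity; Duplicati's actual sizes are binary, so the exact numbers differ slightly): each block needs at least one row in the local database, so the blocksize roughly divides the row count.

```bash
# 20 TB of source data, in bytes
SOURCE_BYTES=$((20 * 1000**4))

# Block counts the database must track at each blocksize
echo "blocks at 100 KB: $((SOURCE_BYTES / (100 * 1000)))"    # ~200 million
echo "blocks at 20 MB:  $((SOURCE_BYTES / (20 * 1000**2)))"  # ~1 million
```

In Duplicati this is the `--blocksize` advanced option on the job, and as far as I know it has to be chosen before the first backup runs; changing it afterwards means starting the backup over.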

If you mean the pause button on the right, the stop button is to its left inside the status bar during a backup:

(screenshot: status bar during a backup, showing the stop button to the left of pause)

That's a cleaner way to stop a backup (after some delay to finish work in progress). A hard kill is riskier. However, based on my previous post, I'm wondering whether a fresh start with a larger blocksize might be the better move.

So I am backing up 20 TB of files, and the Duplicati SQLite database is currently around 30 TB, with 10 TB already backed up.

And yes, I paused the backup but did not cancel it.

With all your information, I would now cancel it, then move the Duplicati config Docker volume to a bigger drive, and then start it again.

You said that in future I should increase the blocksize? So for sizes like these, to something like 20 MB?

(It is just a one-time backup: I am planning bigger maintenance on my server and want a full backup of all my data first.)

That size looks a little suspicious. Are you sure it's not 30 GB?
Limits In SQLite says a database is limited to 1073741823 pages.
The default page size is 4096 bytes, so at most about a 4 TB database.
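If you want to see where a database actually stands against those limits, something like this works with the sqlite3 command-line tool (the path is a placeholder, and it's best run while the backup is stopped so the file isn't locked):

```bash
# Placeholder path; point this at the job's actual .sqlite file
DB=/path/to/duplicati-job.sqlite

# Pages in use and page size in bytes; their product is the database size
sqlite3 "$DB" "PRAGMA page_count; PRAGMA page_size;"

# Cross-check straight from the filesystem
ls -lh "$DB"
```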

You might not need a bigger drive if you increase the blocksize to 20 MB, but we're still only discussing the database size.
I don't think I heard how much space remains, what else uses the drive, or the file counts, but running out gets messy.
If you've got a larger space handy, you might as well play it safe.

Test the backup well. I hope this is a just-in-case and not a plan for Duplicati to have the only data copy.

That loses a lot of Duplicati's benefit of compactly holding multiple versions. Maybe you gain some savings from compression and deduplication, but there are other ways depending on anticipated usage.

Oh god, yes, it's 30 GB, sorry. And yes, it's a just-in-case backup.

After the maintenance I will delete the backup and use Duplicati only for specific data.

Thanks for your help
