Pause also blocks other operations you might need. For example, a Commandline purge shows:
Running commandline entry
Server is currently paused, resume now
Commandline vacuum gets the same message. GUI Compact now seems to just queue.
Another reason why disabling the schedule is safer: I don’t think anything but Canary remembers the pause across Duplicati restarts, which might be wanted.
I assume “delete” means carefully using the purge command. Deleting from the source configuration means future backups will be smaller, but historical versions don’t change.
“delete” means carefully deleting individual files and directory trees from the filesets. I was thinking I could use the “delete” command for this instead of using “purge” and then having to run a “compact”, but perhaps I am misremembering. “purge” is probably better in this case anyway, since I intend to run a “compact” at the end.
I am primarily Linux-based, so this process is Linux-centric.
My backups run as system services, not user-level.
This process assumes you know how to run Duplicati commands, either using the GUI or with Duplicati-cli - if you don’t know how to do this, you probably should not attempt this process!
Create a “rclone” remote for the relevant b2 bucket.
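A sketch of the remote setup, assuming a remote name of “b2remote” and B2 application-key credentials (both placeholders):

```shell
# Create an rclone remote of type "b2" named "b2remote".
# YOUR_KEY_ID and YOUR_APPLICATION_KEY are placeholders for your
# B2 application key credentials.
rclone config create b2remote b2 account=YOUR_KEY_ID key=YOUR_APPLICATION_KEY

# Verify the remote works by listing the buckets it can see.
rclone lsd b2remote:
```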
Use “rclone” to copy the desired backup from the current folder to a new folder. Note that this assumes the backup uses a folder within the bucket rather than just using “/”. If you are using “/”, you probably want to just create a new bucket.
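The copy step might look like the following, where “mybucket”, “old-backup”, and “new-backup” are placeholder names; B2 supports server-side copies, so the data should not need to round-trip through your machine:

```shell
# Copy the existing backup folder to a new folder in the same bucket.
rclone copy b2remote:mybucket/old-backup b2remote:mybucket/new-backup

# Compare source and destination to confirm the copy is complete.
rclone check b2remote:mybucket/old-backup b2remote:mybucket/new-backup
```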
Create a new backup. This can be done by using the GUI and filling out the individual fields, or by exporting the current backup to a file and then importing it and updating the fields as needed. Set the new backup to only back up the directory/ies of interest.
Note the local database filenames for the original backup and the new backup, then copy the original database to the new name.
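For a Linux service install this might look like the sketch below; the config directory and the random database filenames are placeholders (the service here is assumed to run as root), and Duplicati should be stopped so the database is not in use:

```shell
# Stop the service so the database files are not open.
sudo systemctl stop duplicati

# Copy the original backup's database to the filename Duplicati
# assigned to the new backup (both names are placeholders).
sudo cp /root/.config/Duplicati/ABCDEFGHIJ.sqlite \
        /root/.config/Duplicati/KLMNOPQRST.sqlite

sudo systemctl start duplicati
```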
With the new backup use “list” and “list-broken-files” to confirm everything looks OK. Then use “purge” to eliminate everything except the desired directory/ies. Note that you need to add the “*” wildcard to directory paths.
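A hedged sketch of the purge step, with the storage URL, database path, and directory all placeholders; note the trailing “*” wildcard, and use --dry-run first to preview what would be removed:

```shell
# Purge an unwanted directory tree from the new backup.
# Repeat for each directory that should not remain.
duplicati-cli purge "b2://mybucket/new-backup" "/data/unwanted-dir/*" \
  --dbpath=/root/.config/Duplicati/KLMNOPQRST.sqlite \
  --dry-run

# Re-run without --dry-run once the preview looks right.
```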
If everything is still OK, run the new backup.
If the new backup is OK, update the original backup to exclude the directory/ies that are now being backed up by the new backup.
With the original backup, use the “purge” command to remove the directory/ies that are now being backed up by the new backup.
Run the original backup to confirm it is OK.
I also did a “vacuum” against the databases after the “purge” commands, but I don’t know how helpful this was.
For Windows, there are 2 additional issues:
you need the Windows version of rclone (it exists) or some other method to copy the data on the cloud service. In my case I just used rclone from a Linux system.
to copy the local database you need to run commands as the local system user, and this requires a tool such as “psexec”. You could also just let Duplicati recreate the local database if you can tolerate the downloads and the time it takes.
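A sketch of the psexec approach, assuming the Duplicati service runs as SYSTEM so its data lives under the SYSTEM profile; the folder and database filenames are placeholders:

```shell
# Run from an elevated prompt; psexec -s runs the command as SYSTEM.
psexec -s cmd /c copy ^
  "C:\Windows\System32\config\systemprofile\AppData\Local\Duplicati\ABCDEFGHIJ.sqlite" ^
  "C:\Windows\System32\config\systemprofile\AppData\Local\Duplicati\KLMNOPQRST.sqlite"
```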
If you are using the canary builds, there is now a Duplicati.CommandLine.SyncTool.exe / duplicati-sync-tool bundled that is designed to copy Duplicati files from one destination to another.
It is by no means as flexible as rclone, but it does accept the same connection strings as Duplicati uses, so you do not need to set up rclone.