Possible to clone an existing backup?

After some recent issues, I would like to take an existing backup and split it into 2 or more smaller backups while retaining the history.

Is it possible to do the following?

  1. disable schedule so we don’t run any backups
  2. clone/copy b2 bucket
  3. export backup config
  4. import backup config
  • change name
  • change destination to cloned b2 bucket
  5. copy local db for exported backup to name used by imported backup
  6. delete files as desired from the split backups
  7. compact the split backups
  8. vacuum the local dbs?
  9. enable schedules

“pause” and selecting “until resumed” should do this

Pause also pauses other needed steps. For example, a Commandline purge gives:

    Running commandline entry

    Server is currently paused, resume now

Commandline vacuum gets the same. GUI Compact now seems to just queue.

Another reason disabling the schedule is safer: I don’t think anything but Canary remembers the pause across Duplicati restarts, which might be wanted.

I assume “delete” means carefully using the purge command. Deleting from the source configuration means future backups will be smaller, but historical ones don’t change.

“delete” means carefully deleting individual files and directory trees from the filesets. I was thinking I could use the “delete” command to do this as opposed to using “purge” and then having to do a “compact”, but perhaps I am misremembering. “purge” is probably better in this case anyway, since I intend to do a “compact” at the end.

The DELETE command deletes versions.

The PURGE command does files/folders (with some control on which versions).
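
To make the difference concrete, here is a hedged sketch (the storage URL, paths, and database path are placeholders, and B2 credential options are omitted - check “duplicati-cli help delete” and “duplicati-cli help purge” for exact options):

    # Remove an entire backup version (fileset), e.g. version 3:
    duplicati-cli delete "b2://my-bucket/my-folder" --version=3 --dbpath=/path/to/local.sqlite

    # Remove matching files/folders across versions instead (test with --dry-run):
    duplicati-cli purge "b2://my-bucket/my-folder" "/home/user/unwanted-dir/*" --dbpath=/path/to/local.sqlite --dry-run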


Thank you for the clarification! I have rarely had need of the “purge” command, so obviously I was misremembering.

Here is the process I ended up using.

Notes

  • I am primarily Linux-based, so this process is Linux-centric.
  • My backups run as system services, not user-level.

This process assumes you know how to run Duplicati commands, either using the GUI or with duplicati-cli - if you don’t know how to do this, you probably should not attempt this process!

  1. Create an “rclone” remote for the relevant b2 bucket.
  2. Use “rclone” to copy the desired backup from the current folder to a new folder (see the command sketch after this list). Note that this assumes the backup uses a folder within the bucket rather than just using “/”. If you are using “/”, you probably want to just create a new bucket.
  3. Create a new backup. This can be done by using the GUI and filling out the individual fields, or by exporting the current backup to a file and then importing it and updating the fields as needed. Set the new backup to only back up the directory/ies of interest.
  4. Note the local database names from the original backup and the new backup - copy the database to the new name.
  5. With the new backup, use “list” and “list-broken-files” to confirm everything looks OK. Then use “purge” to eliminate everything except the desired directory/ies. Note that you need to add the “*” wildcard to directory paths.
  6. If everything is still OK, run the new backup.
  7. If the new backup is OK, update the original backup to exclude the directory/ies that are now being backed up by the new backup.
  8. With the original backup, use the “purge” command to remove the directory/ies that are now being backed up by the new backup.
  9. Run the original backup to confirm it is OK.
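
Here is a hedged command-line sketch of steps 2, 4, and 5, assuming a systemd service install; the bucket, folder, directory, and database names are all placeholders, and B2 credential options are omitted:

    # Step 2: copy the backup data to a new folder (remote "b2" created via "rclone config"):
    rclone copy b2:my-bucket/original-folder b2:my-bucket/split-folder

    # Step 4: copy the local database to the name assigned to the new backup
    # (both names are shown on each backup's Database page in the GUI):
    sudo systemctl stop duplicati
    sudo cp /root/.config/Duplicati/ABCDEFGHIJ.sqlite /root/.config/Duplicati/KLMNOPQRST.sqlite
    sudo systemctl start duplicati

    # Step 5: purge everything the new backup should not keep - one path per
    # unwanted tree, each with the trailing "*" wildcard. Test with --dry-run first:
    duplicati-cli purge "b2://my-bucket/split-folder" "/home/user/unwanted-dir/*" \
        --dbpath=/root/.config/Duplicati/KLMNOPQRST.sqlite --dry-run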

I also did a “vacuum” against the databases after the “purge” commands, but I don’t know how helpful this was.
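
For reference, a sketch of that step (same placeholder paths as above; check “duplicati-cli help vacuum” for the exact syntax):

    # Vacuum the local database to reclaim space freed by the purge:
    duplicati-cli vacuum "b2://my-bucket/split-folder" --dbpath=/root/.config/Duplicati/KLMNOPQRST.sqlite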

For Windows, there are 2 additional issues:

  • you need a Windows version of rclone (if it exists) or some other method to copy the data on the cloud service. In my case I just used rclone from a Linux system.
  • to copy the local database you need to run commands as the local system user, and this requires a tool such as “psexec”. You could also just let Duplicati recreate the local database if you can tolerate the downloads and the time it takes. (See the sketch after this list.)
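
A hedged sketch of the psexec approach, from an elevated prompt (the database names are placeholders):

    :: Copy the service's database as NT AUTHORITY\SYSTEM via Sysinternals psexec:
    psexec -s cmd /c copy "C:\Windows\System32\config\systemprofile\AppData\Local\Duplicati\ABCDEFGHIJ.sqlite" "C:\Windows\System32\config\systemprofile\AppData\Local\Duplicati\KLMNOPQRST.sqlite"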

Thanks for the note! One Windows point I’m not sure I agree with is the claim that copying the local database requires a tool such as “psexec”:

For Duplicati run as a regular user, the user themselves would have access.
For a Windows service run as SYSTEM, ordinary elevated admin should do.

C:\Windows\System32\config\systemprofile\AppData\Local\Duplicati NT AUTHORITY\SYSTEM:(I)(OI)(CI)(F)
                                                                 BUILTIN\Administrators:(I)(OI)(CI)(F)
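
That listing is “icacls” output; assuming the default service profile path, something like this reproduces it:

    icacls "C:\Windows\System32\config\systemprofile\AppData\Local\Duplicati"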

If you are using the canary builds, there is now a Duplicati.CommandLine.SyncTool.exe / duplicati-sync-tool bundled that is designed to copy Duplicati files from one destination to another.

It is by no means as flexible as rclone, but it does accept the same connection strings as Duplicati uses, so you do not need to set up rclone.
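
A minimal sketch of the intended use (the argument order and URLs here are assumptions - check “duplicati-sync-tool help” for the exact syntax):

    # Copy one Duplicati destination to another using Duplicati connection strings:
    duplicati-sync-tool "b2://my-bucket/original-folder" "b2://my-bucket/split-folder"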