Partial Restore After Crash

Background info: a server failed completely and I was unable to access anything on the file system. We have multiple backups of the important files, but not of the entire server.

I wanted to see what was in the Duplicati backup but couldn't figure out whether that was actually possible. I am currently running a direct restore, but wanted to know if there is a way to do just a partial restore rather than the entire thing. It took almost 2 days just to start restoring, as it had to recreate the database.

I see now that I should have exported a configuration file and kept it somewhere safe. If I had it, would it let me restore select items much quicker? Should I back up the Duplicati database file every day?

Now that the restore has recreated a local database and is restoring, is there a way to tell what it was actually backing up without restoring the entire thing? It's 6 TB.

Thank you for any assistance you can provide.

Welcome to the forum @gouda272

  --no-local-db (Boolean): Disable the local database
    When listing contents or when restoring files, the local
    database can be skipped. This is usually slower, but can be
    used to verify the actual contents of the remote store.
    * default value: false

I’m not sure about restoring, but for a file list that you can browse, you can try:

Duplicati.CommandLine list "<URL>" "*" --no-local-db --version=0 --passphrase="<passphrase>"

Ordinarily the URL would come from Export As Commandline, but you can use CommandLine help for your storage type, or check new or old docs for format.

Or, since Duplicati is installed, you can fill out your Destination in the GUI and find its URL there.

Partial restore is what it does: you pick a date, browse the file tree, and pick the desired parts. Browsing the tree after picking a date is also another way to view what's in the backup.
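The same partial restore can also be done from the command line. This is only a sketch with placeholders (URL, file filter, destination path, passphrase all need your real values), using the same `--no-local-db` approach as the list command above; `--restore-path` redirects the restored files instead of writing them back to their original locations:

```shell
# Restore only a subset of the latest backup (version 0) to F:\restored.
# <URL> and <passphrase> are placeholders, as in the list example.
Duplicati.CommandLine restore "<URL>" "C:\Users\me\Documents\*" --no-local-db --version=0 --restore-path="F:\restored" --passphrase="<passphrase>"
```

Running the list command first is a good way to work out what filter pattern to pass.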

It speeds up setup of things like destination info. It's just configuration, not current data, meaning it doesn't know what the latest backup found. For that you need to look in the backup itself.

Your choice, but unless restore time is critical, it's usually fine to recreate the DB. The database tends to be big, and even a small incremental backup can trigger a large upload of the DB. Don't back up the DB with its own job (as its copy will be instantly stale), but some people use a run-script-after, or a job that runs after the main backups, for the DB.
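As a sketch of that run-script-after idea: the script just needs to copy the job database somewhere that a later backup job will pick up. Paths here are examples only (the real .sqlite name is shown on the job's Database page in the GUI), and this is a Linux/macOS shape; a Windows .bat would do the equivalent:

```shell
#!/bin/sh
# Hypothetical run-script-after: copy the job's local database aside
# so a later backup job can include it. Paths are examples only.
DB="$HOME/.config/Duplicati/EXAMPLE.sqlite"   # real name is on the job's Database page
DEST="$HOME/duplicati-db-copies"
mkdir -p "$DEST"
# Guarded copy so the script succeeds even before the DB exists.
[ -f "$DB" ] && cp "$DB" "$DEST/" || echo "no database at $DB yet"
```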

That's one reason the recreate is slow: too many blocks, and way too many if the backup started on a 2.0 version with blocksize=100KB. Backups begun on 2.1 default to blocksize=1MB.
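To put rough numbers on that, 6 TB at the two default blocksizes works out like this (shell arithmetic, integer division):

```shell
# Approximate block counts for a 6 TB backup at each default blocksize.
echo $(( 6 * 1024 * 1024 * 1024 / 100 ))  # at 100KB blocksize: 64424509
echo $(( 6 * 1024 * 1024 ))               # at 1MB blocksize: 6291456
```

Recreate has to track every one of those blocks, which is why the old 100KB default hurts so much at this scale.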

So I'm trying to restore a smaller set of files but struggling with the command line. (The GUI is failing half the files for some reason with a permissions error.) I am trying to run this command but get an invalid argument count (6, expected 3):

Duplicati.CommandLine.RecoveryTool.exe download “computername” “F:\ZZZ” --s3-server=s3.us-east-2.wasabisys.com --wasabi-storage-class=STANDARD --aws-access-key-id=11111111111111 --aws-secret-access-key=22222222222222222222passphrase=33333333333333

I cannot find an actual example of how to restore from Wasabi using the command line on a Windows PC. Any ideas are greatly appreciated!

Can't comment on the GUI failures without more information on where it failed and what the message was.

What you’re posting are not double quotes, which Windows may need, e.g. to prevent breaking a string into its words at spaces. Use regular double quotes.

Typographic quotes can be produced by text formatters. Avoid them on the command line.

The three arguments it’s looking for are in the command’s built-in help, saying:

Duplicati.CommandLine.RecoveryTool.exe download <remoteurl> <localfolder> [options]

The URL format for s3 can be seen in CommandLine help s3, but the manual has the format with the options attached. The main problem is that the manual mis-spells s3-server-name.

After fixing that typo, the URL format for Commandline suggested there is:

s3://<bucket name>/<prefix>
  ?aws-access-key-id=<account id or username>
  &aws-secret-access-key=<account key or password>
  &s3-server-name=<server ip or hostname>
  &use-ssl=true
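Filled in with your (redacted) values, the whole command would look something like this. The bucket name and prefix are placeholders you'd fill in, and I'm keeping your `--passphrase` on the end on the assumption the tool will want to decrypt:

```shell
# The whole URL is quoted so the & separators don't break the command
# line; credentials and bucket/prefix below are placeholders.
Duplicati.CommandLine.RecoveryTool.exe download "s3://<bucket name>/<prefix>?aws-access-key-id=11111111111111&aws-secret-access-key=22222222222222222222&s3-server-name=s3.us-east-2.wasabisys.com&use-ssl=true" "F:\ZZZ" --passphrase="33333333333333"
```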

An easier way to build the URL would have been to have the GUI do it, as I suggested.
If I try it that way, it uses auth-username and auth-password (synonymous option names).

GUI also knows how to percent-encode if needed. It’s an odd URL requirement. Possibly you’ll be lucky and not have special characters, but there’s no promise.
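If you do turn out to have special characters (say, in the secret key), percent-encoding by hand looks like this. This is a small bash sketch of the general technique, not anything Duplicati ships:

```shell
#!/bin/bash
# Percent-encode a string for use inside a storage URL.
# Unreserved characters pass through; everything else becomes %XX.
pct_encode() {
  local s="$1" out="" c i
  for (( i = 0; i < ${#s}; i++ )); do
    c="${s:i:1}"
    case "$c" in
      [a-zA-Z0-9.~_-]) out+="$c" ;;
      *) printf -v c '%%%02X' "'$c"; out+="$c" ;;
    esac
  done
  printf '%s\n' "$out"
}

pct_encode 'p@ss word'   # prints p%40ss%20word
```

Anything the function changes (like @ or a space in a password) is a character that would otherwise confuse the URL parsing.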

I think I may have figured out my issue with the GUI not restoring properly: the C drive had run out of space and I did not notice it until now. I actually had to switch to the old interface, where the problem stuck out like a sore thumb. I don't see those warnings in the new interface.

Look for anything red or yellow in the lower right corner. They’re easy to miss, whereas the old UI “sore thumb” might have sometimes been a bit too noisy…

I’m not sure what revisions will happen, but it’s been recognized as a problem.

Another change being worked on is the temporary space used by the new restore code.
The new design has several cache issues, so keep an eye on your Temp folder.

  --restore-legacy (Boolean): Use legacy restore method
    Use this option to use the legacy restore method. The legacy
    restore method is slower than the new method, but may be more
    reliable in some cases.
    * default value: false

This can switch to the old restore method, which uses less Temp space. It's only a little slower for me; YMMV a lot.
If you configured a job, this option is under Options. For a Direct restore, it's on the second screen.