How can I instantly remove deleted files? -> I can't

Hi,

Let’s say I messed up a backup: I accidentally added some unneeded files, adjusted the filters wrong, whatever. Now I know there’s a lot of unneeded stuff on the remote side that is no longer part of the local files/backup definition.

How can I do a “one-time clean-up”, a “delete all deleted files NOW”, for files that no longer exist on the client side?

Is this a stupid question? I am not seeing it.

Thx

You can delete the backup version containing the unwanted files using the delete command.

First you have to find the oldest and newest backup that contains the unwanted files. Click the name of your backup job and click “Restore files…”. Find out which backup versions (in the “Restore from” menu) contain the unwanted files. Make a note of the oldest and newest version. Let’s assume that backup version 8 is the first backup that contains the unwanted files and version 3 is the most recent backup you want to delete.

Click your backup name and click “Commandline”.
Select the delete command.
Remove everything from the “Commandline arguments” text box. Replace it with this text:

--version=3-8
--dry-run

(Of course, use the version numbers that apply to your backup.)
Click the “Run the Delete command now” button.
Verify that this is what you want. If it is, repeat the procedure without the --dry-run option.
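For reference, the same dry run can also be done with the standalone command-line client instead of the GUI. This is only a sketch: the storage URL and passphrase below are placeholders, not values from this thread; only the --version range and --dry-run flag come from the steps above.

```
REM Dry run: show which backup versions (filesets) would be removed.
Duplicati.CommandLine.exe delete "file://E:\BackupTarget" --version=3-8 --passphrase=<passphrase> --dry-run

REM If the output looks right, run the same command again without --dry-run.
```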

If there are just a few files that you want to delete from all backup versions, you can do this by using the purge command.


I am still wondering why it is so clunky. A conservative approach? I mean, it’s something very natural to say: “Duplicati, please create an exact mirror of the local backup.” Yes, a simple purge. It should be supported by the UI, more prominently.

Yep - it sure should. Of course, the UI is relatively new to Duplicati, and while what’s there is so much nicer than command line calls, it still doesn’t handle everything.

There’s only so much time people can donate to working on Duplicati and a purge GUI isn’t one that anybody’s taken on…yet. :wink:


I see. I wonder how the --dry-run is helpful when it just says “[Dryrun]: Would delete remote fileset: duplicati-20180205T132244Z.dlist.zip.aes” - I do not know what is inside that file…

It’s just informing you that the file will go away. It might be because everything in that archive is going to be deleted anyway, it might be because the archive is “too small” and its contents got merged into another archive, or it might be a combination of things.

But the end result is that you have to trust that Duplicati correctly knows everything in that file can go away.

Still I don’t get it.

  • I accidentally included M:\Android\ and now want to get rid of it on target side.

So, I remove M:\Android\ from the backup definition.

Then I start the command line with the delete command.
But when I delete backup versions 1-n (i.e. version 0 has to stay), the above folder is still in version 0.

Or do I have to delete ALL newer backups on the remote side, back to the point where I first included the above path? That would be … a problem.

Are you actually using the delete command (for removing an entire backup run) or the purge command (to remove specific files from a backup)?

Usage: delete <storage-URL> [<options>]
Marks old data deleted and removes outdated dlist files. A backup is deleted when it is older than <time> or when there are more than <int> newer versions. Data is considered old when it is not required by any existing backup anymore.
--keep-time=<time>
Marks data outdated that is older than <time>.
--keep-versions=<int>
Marks data outdated that is older than <int> versions.
--version=<version>
Deletes all files that belong to the specified version(s).
--allow-full-removal
Disables the protection against removing the final fileset.


Usage: purge <storage-URL> <filenames> [<options>]
Purges (removes) files from remote backup data. This command can either take a list of filenames or use filters to choose which files to purge. The purge process creates new filesets on the remote destination with the purged files removed, and will start the compacting process after a purge. By default, the matching files are purged in all versions, but this can be limited by choosing one or more versions. To test what will happen, use the --dry-run flag.
--dry-run
Performs the operation, but does not write changes to the local database or the remote storage.
--version=<version>
Selects specific versions to purge from; multiple versions can be specified with commas.
--time=<time>
Selects a specific version to purge from.
--no-auto-compact
Disables the compacting process that normally runs after purging files.
--include=<filter>
Selects files to purge, using filter syntax.
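As a sketch of how that looks in practice, purging the accidentally included folder from the earlier post could be done like this (the storage URL and passphrase are placeholders; only the folder path and the flags come from this thread):

```
REM Remove everything under M:\Android\ from all backup versions.
REM --dry-run only reports what would happen; drop it to apply the change.
Duplicati.CommandLine.exe purge "file://E:\BackupTarget" --include="M:\Android\*" --passphrase=<passphrase> --dry-run
```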

Thanks. First I tried delete, then I tried purge, with “--include=M:\Android* --dry-run” as the command line argument. But I had to delete all the extra options that are listed there as well (retention, compression module, password, etc.). It kept complaining: it said I was trying to change the password, it said that mixing this with paths is not possible, so I deleted those options one by one, options which usually belong to a backup. But still, now I end up with:

“System.InvalidOperationException: Cannot read keys when either application does not have a console or when console input has been redirected from a file. Try Console.Read.”

I will upload the maybe 200 gigs again after my holiday; that only takes about 5 days, but all in all it wastes less time for everybody. I cannot afford more time. Thanks for trying to help. Lesson learned: if I accidentally included stuff, the best way is to redo the whole backup. And there lies the problem: you don’t notice that you included tons of unneeded stuff, because you don’t see what Duplicati is doing.

For me it works without issues. In the folder C:\Scripts\DuplicatiTestFiles there are a bunch of text files:

(screenshot: the folder contents, a number of text files)

Say I would like to delete all text files starting with di. Then I would open the command prompt window for my backup job and enter this (without modifying the Advanced options list):

(screenshot: the Commandline page with the purge command and its arguments)

The generated output is:

Finished!

            
Starting purge operation
  Listing remote folder ...
Not writing a new fileset for duplicati-20180206T101451Z.dlist.zip.aes as it was not changed
Not writing a new fileset for duplicati-20180206T120000Z.dlist.zip.aes as it was not changed
Not writing a new fileset for duplicati-20180206T200000Z.dlist.zip.aes as it was not changed
Not writing a new fileset for duplicati-20180207T040000Z.dlist.zip.aes as it was not changed
Not writing a new fileset for duplicati-20180207T120000Z.dlist.zip.aes as it was not changed
Not writing a new fileset for duplicati-20180207T200000Z.dlist.zip.aes as it was not changed
Not writing a new fileset for duplicati-20180208T040000Z.dlist.zip.aes as it was not changed
Not writing a new fileset for duplicati-20180208T120000Z.dlist.zip.aes as it was not changed
Not writing a new fileset for duplicati-20180208T200000Z.dlist.zip.aes as it was not changed
Not writing a new fileset for duplicati-20180209T040000Z.dlist.zip.aes as it was not changed
Not writing a new fileset for duplicati-20180209T120000Z.dlist.zip.aes as it was not changed
Not writing a new fileset for duplicati-20180209T200000Z.dlist.zip.aes as it was not changed
Not writing a new fileset for duplicati-20180210T040000Z.dlist.zip.aes as it was not changed
Not writing a new fileset for duplicati-20180210T120000Z.dlist.zip.aes as it was not changed
Not writing a new fileset for duplicati-20180210T200000Z.dlist.zip.aes as it was not changed
Not writing a new fileset for duplicati-20180211T040000Z.dlist.zip.aes as it was not changed
Not writing a new fileset for duplicati-20180211T120000Z.dlist.zip.aes as it was not changed
Not writing a new fileset for duplicati-20180211T200000Z.dlist.zip.aes as it was not changed
Not writing a new fileset for duplicati-20180212T040000Z.dlist.zip.aes as it was not changed
Not writing a new fileset for duplicati-20180212T120000Z.dlist.zip.aes as it was not changed
Not writing a new fileset for duplicati-20180212T200000Z.dlist.zip.aes as it was not changed
Replacing fileset duplicati-20180213T040000Z.dlist.zip.aes with duplicati-20180213T040001Z.dlist.zip.aes which has with 1 fewer file(s) (200 bytes reduction)
[Dryrun]:   Purging file C:\Scripts\DuplicatiTestFiles\di-13-02-2018--5-00-00;07.txt (200 bytes)
[Dryrun]: Writing files to remote storage
[Dryrun]: Would upload file duplicati-20180213T040001Z.dlist.zip.aes (7,58 KB) and delete file duplicati-20180213T040000Z.dlist.zip.aes, removing 1 files
Replacing fileset duplicati-20180213T120000Z.dlist.zip.aes with duplicati-20180213T120001Z.dlist.zip.aes which has with 13 fewer file(s) (2,52 KB reduction)
[Dryrun]:   Purging file C:\Scripts\DuplicatiTestFiles\di-13-02-2018--5-00-00;07.txt (200 bytes)
[Dryrun]:   Purging file C:\Scripts\DuplicatiTestFiles\di-13-02-2018-11-45-16;73.txt (198 bytes)
[Dryrun]:   Purging file C:\Scripts\DuplicatiTestFiles\di-13-02-2018-11-45-16;88.txt (198 bytes)
[Dryrun]:   Purging file C:\Scripts\DuplicatiTestFiles\di-13-02-2018-11-45-17;01.txt (198 bytes)
[Dryrun]:   Purging file C:\Scripts\DuplicatiTestFiles\di-13-02-2018-11-54-45;17.txt (198 bytes)
[Dryrun]:   Purging file C:\Scripts\DuplicatiTestFiles\di-13-02-2018-11-54-45;27.txt (198 bytes)
[Dryrun]:   Purging file C:\Scripts\DuplicatiTestFiles\di-13-02-2018-11-54-45;39.txt (198 bytes)
[Dryrun]:   Purging file C:\Scripts\DuplicatiTestFiles\di-13-02-2018-11-55-16;51.txt (198 bytes)
[Dryrun]:   Purging file C:\Scripts\DuplicatiTestFiles\di-13-02-2018-11-55-16;62.txt (198 bytes)
[Dryrun]:   Purging file C:\Scripts\DuplicatiTestFiles\di-13-02-2018-11-55-16;76.txt (198 bytes)
[Dryrun]:   Purging file C:\Scripts\DuplicatiTestFiles\di-13-02-2018-11-55-20;10.txt (198 bytes)
[Dryrun]:   Purging file C:\Scripts\DuplicatiTestFiles\di-13-02-2018-11-55-20;22.txt (198 bytes)
[Dryrun]:   Purging file C:\Scripts\DuplicatiTestFiles\di-13-02-2018-13-00-00;17.txt (200 bytes)
[Dryrun]: Writing files to remote storage
[Dryrun]: Would upload file duplicati-20180213T120001Z.dlist.zip.aes (7,59 KB) and delete file duplicati-20180213T120000Z.dlist.zip.aes, removing 13 files
Compacting not required
Return code: 0

First, all DLIST files are listed. DLIST files contain a list of all files included in that backup set. As you can see, most DLIST files will not be modified, because all files we want to purge were created today and are only included in the 2 most recent backup versions.

Then all filenames that are going to be purged from the backup set are listed. Data in DBLOCK files that is used exclusively by these files will be marked for deletion. After this process, a compacting operation will probably be performed to reorganize and clean up the DBLOCK files, freeing up space at the remote location.

Of course, the files will only actually be purged when the --dry-run option is not supplied.
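If the GUI “Commandline” page gives you trouble, the same dry run should also work from a real terminal with the standalone client. This is only a sketch: the storage URL, database path and passphrase are placeholders, and only the di* filter comes from the example above.

```
REM Equivalent standalone CLI call for the test above.
Duplicati.CommandLine.exe purge "file://E:\BackupTarget" --include="di*" --dbpath="C:\path\to\job-database.sqlite" --passphrase=<passphrase> --dry-run
```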

No. With your command line arguments, and without modifying the advanced options list, I get:

You cannot combine filters and paths on the commandline
Return code: 200

Then, when I remove all filters from the advanced options, it happily puts a random name or a password into the threadlevel argument, but in the GUI it does not display any value for threadlevel. I had to remove this as well, because I got another error message.

Now, the best result is: “Unable to start the purge process as there are 2162 orphan file(s)
Return code: 100”

Funny, Duplicati. That is exactly what I want: to get rid of accidentally included files.

I still have not found out how to purge. From this Commandline page, I think it is impossible. There are so many advanced options, which Duplicati also seems to take into account.

I leave out the unnecessary advanced options; see what’s left below. No matter what I try, I always get error messages, e.g.:

“You have attempted to change a passphrase to an existing backup, which is not supported. Please configure a new clean backup if you want to change the passphrase.”

That is what I get with:
Commandline Args:

*.* --dry-run

Advanced options:

--thread-priority=normal --backup-name=[Local] Musik --dbpath=C:\Users\Michel\AppData\Local\Duplicati\NGBBVHLMOY.sqlite --encryption-module=aes --compression-module=zip --dblock-size=500MB --passphrase=thepassphrase --check-filetime-only=true --zip-compression-level=0 --disable-module=console-password-input

“Purge the backup to match the currently existing files” should not be as impossible as it is now, IMO…