With the --full-result option, I get a long list of files, all located in the same subdirectory, but with --console-log-level=verbose I get a completely different result, without any of the files from the --full-result list! Where is the error? And what should I do now?
Should I run the purge-broken-files command with the --full-result option?
That GUI convenience option is converted into the second form below (--retention-policy).
The second place to set the retention policy is on that same screen (Step 5, Options) under "Advanced Options": in that long list there's a "retention-policy" entry (--retention-policy on the command line). The default value is empty/unused, which means keep all backups. It looks like this:
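Per Duplicati's documentation, the option's value is a comma-separated list of timeframe:interval pairs; for example, the smart-retention preset 1W:1D,4W:1W,12M:1M means: within one week keep one backup per day, within four weeks one per week, within twelve months one per month. A minimal sketch of that format (illustrative only, not Duplicati's own parser):

```python
# Illustrative sketch (not Duplicati's code): parse a --retention-policy
# value of the form "timeframe:interval,timeframe:interval,...".
# An empty value means the option is unused, i.e. keep all backups.

def parse_retention_policy(value):
    """Return a list of (timeframe, interval) string pairs."""
    if not value:
        return []
    rules = []
    for rule in value.split(","):
        timeframe, interval = rule.split(":")
        rules.append((timeframe.strip(), interval.strip()))
    return rules

print(parse_retention_policy("1W:1D,4W:1W,12M:1M"))
# [('1W', '1D'), ('4W', '1W'), ('12M', '1M')]
```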
@johnvk Thank you for this information; actually, the backup retention was set to "smart backup retention".
I changed it to "Keep all backups" as suggested by @JonMikelV, but with or without the --full-result option, the result of the purge-broken-files command remains the same!
Can you locate and provide any log or other information from (or maybe leading up to) the original error? Agreed, recovery has been messy, but knowing what recovery is having to fix might help guide the efforts.
Errors could be in email to you, or in Job → Show log → General, or About → Show log → General. Any line with roughly the right timestamp can be clicked to see whether it opens to reveal details of what went wrong.
Version-count confusion might explain the error "Unexpected number of remote volumes marked as deleted", which says the database records had extra versions at the destination. This shows up in the restore dropdown too, where smart retention allowed more than one version per week. Possibly something went wrong during a delete?
You could look directly at the destination to see how many dated files you have with "dlist" in their names.
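On a local-folder destination this can be as simple as counting filenames; a sketch, assuming the usual duplicati-*.dlist.zip naming (encrypted backups typically add a .aes suffix):

```python
# Sketch: count version ("dlist") files in a local destination folder.
# Each dlist file corresponds to one backup version, so this count should
# match the number of versions Duplicati believes it has.
import os

def count_dlist_files(dest: str) -> int:
    return sum(1 for name in os.listdir(dest) if ".dlist." in name)
```

If this count disagrees with the number of versions shown in the restore dropdown, that mismatch would be consistent with the "remote volumes marked as deleted" error.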
Ideally we'd have a --log-file with --log-file-log-level=Information or higher. Job → Show log → Remote is close, and could be helpful if we have to sort out further discrepancies between the records and the destination.
Things could possibly be made consistent by using the delete command on the excess versions, but I'd prefer not to try that without saving off copies of the database (the job's Database menu) and the destination files. Recovery (especially from an unknown initial failure, which is why I ask) does not always work the first time around.
Sometimes people who are in a hurry to get backups running just export the job, import, and backup to a different destination. This gives them the option of either trying to fix the old one (under less pressure), or simply keeping it around as-is (no repair attempt) in case a restore of an old version is needed someday.
The latter is what I did once, but I rely on Duplicati mainly for short-term recovery not long-term retrievals.
What would your recovery priorities be? Of course, it would have been even better if things hadn't died… That's another reason I'm looking for what went wrong. Ideally this should be fixed, not recovered from…
The last successful backup was made on 31/01/2019 and the first failure was on 01/02/2019. Here are the error messages from the backup attempts between the first failure and my attempt at a database repair:
And here is the error message of the attempt to repair the database:
Feb 7, 2019 3:39 AM: Failed while executing "Repair" with id: 3
Duplicati.Library.Interface.UserInformationException: Recreated database has missing blocks and 3 broken filelists. Consider using "list-broken-files" and "purge-broken-files" to purge broken data from the remote store and the database.
at Duplicati.Library.Main.Operation.RecreateDatabaseHandler.DoRun(LocalDatabase dbparent, Boolean updating, IFilter filter, NumberedFilterFilelistDelegate filelistfilter, BlockVolumePostProcessor blockprocessor)
at Duplicati.Library.Main.Operation.RecreateDatabaseHandler.Run(String path, IFilter filter, NumberedFilterFilelistDelegate filelistfilter, BlockVolumePostProcessor blockprocessor)
at Duplicati.Library.Main.Operation.RepairHandler.RunRepairLocal(IFilter filter)
at Duplicati.Library.Main.Operation.RepairHandler.Run(IFilter filter)
at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)
at Duplicati.Library.Main.Controller.Repair(IFilter filter)
at Duplicati.Server.Runner.Run(IRunnerData data, Boolean fromQueue)
Looking at the remote destination, I found 9 "dlist" files:
That's exactly what I did the last time the backup crashed… But is it reasonable to have to waste so much time trying to get things right? And to have to start all over again when it crashes, only to finally give up and recreate a backup from scratch?
I find it extremely damaging not to be able to rely on a backup system. Duplicati clearly does not live up to its ambitions, and it can in no way be described as a reliable backup system.
Thank you to all three of you, @Pectojin, @johnvk, and @ts678, but I realized during our discussions that once this crashes, there is not much left to do… It's time for me to find a solution other than Duplicati.
Sorry we couldn't get things going for you. Part of the reason Duplicati is still in beta is that, while it works well for many users, when it doesn't work well it's a pain to deal with.
Good luck with your future backups! If you find something you think is comparable to Duplicati, feel free to let us know.
And of course, please consider Duplicati again when we (eventually) get to a stable release.
I hear the concerns about the instability. At this point, they are valid.
However, I am sticking with Duplicati for the time being for the following reasons. These are just my opinions (subject to change):
Acronis is heavy on resources
Paragon is hard for (non-tech) users to understand (so if I'm going to be helping them / setting it up anyway…)
Carbonite slowed down 1 system I investigated, and does not include large files nor video files by default
BackBlaze Backup doesn't keep enough versions
Duplicati is open source
and standards-based, meaning you could (with difficulty) extract your backups from the .dblock files by hand, without Duplicati, if you had to. I.e., no lock-in.
Duplicati is free
(Perhaps since Duplicati is free) it is focused on giving the user maximum choice
Duplicati, via hashing, ensures the integrity of backup files, which is especially important with online backups.
This hashing complicates Duplicati and is responsible for a lot of the bugs.
File-sync programs that compare only dates and file sizes and store files as plain files in the backup are not as complicated, have fewer bugs, and the backup files are directly viewable/usable/recoverable. Those are significant advantages. However, if there's corruption, you're out of luck. I can't depend on that for long-term, widespread usage.
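The hash-verification idea above can be sketched in a few lines (illustrative only; Duplicati's actual scheme hashes fixed-size blocks rather than whole payloads):

```python
# Illustrative sketch of hash-based integrity checking: record a SHA-256
# digest when data is backed up, recompute it later, and compare.
# Any corruption, even a single changed byte, changes the digest.
import hashlib

def sha256_of(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

stored = sha256_of(b"backup payload")            # recorded at backup time
assert sha256_of(b"backup payload") == stored    # intact copy verifies
assert sha256_of(b"backup payl0ad") != stored    # corrupted copy is detected
```

A date-and-size comparison, as in the sync tools above, would miss corruption that leaves both unchanged.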
Duplicati is compatible with an enormous range of platforms. It's quite remarkable.
Duplicati had 204 votes on alternatives.to - highest of the free options (that's what brought me here in the first place)
Duplicati has a very active forum and a vigorous user community for help and workarounds
Duplicati is in active development with a capable team
The lead developer @kenkendk posts on the forum regularly.
Duplicati looks to me to be on track to be an excellent software program, and I'm excited to be in on the ground floor.
I agree with your arguments… But the balance still tips in favor of other solutions, because Duplicati is not sufficiently dependable in operation, and especially because I cannot rely on it for a safe backup…