Hi there,
I need help with an issue that's preventing me from restoring my Duplicati backup. To make a long story short, I was using Duplicati to back up all my files to Backblaze B2 on a regular basis until the start of October. At that point all of my drives failed at the same time, and I lost the local database for my backup server, which wasn't itself backed up, as I had assumed only the dataset itself was needed for a quick restore.
The dataset in question is around 4 TB and contains around 1,651,300 individual files across 18,673 dblock volumes of 150 MB each. I have yet to recover these files, and I'm at a loss as I keep running into issue after issue.
When this happened, I first tried to restore the files through the Duplicati web UI on a fresh installation with new drives. Without the database, and with the blocks encrypted (and perhaps because I was pulling them directly from B2), Duplicati crashed before ever completing the database rebuild, after two full weeks of running. I believe a canary update was released during that time, but I saw no improvement after trying again on that specific version for another full week.
Since the web UI route wasn't getting me anywhere, I set out to follow the Disaster Recovery guide and attempt a command-line restore: https://duplicati.readthedocs.io/en/stable/08-disaster-recovery
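For context, this is roughly the Recovery tool sequence the guide describes, as I understood it. The bucket, credentials, passphrase, and paths below are placeholders, not my actual values:

```
# Download (and decrypt) all remote volumes, then build an index and restore.
# Run via mono on Linux; all URLs, secrets, and paths here are placeholders.
mono Duplicati.CommandLine.RecoveryTool.exe download \
  "b2://my-bucket/my-folder?b2-accountid=ACCOUNT&b2-applicationkey=KEY" \
  /mnt/data/backup --passphrase="PASSPHRASE"

mono Duplicati.CommandLine.RecoveryTool.exe index /mnt/data/backup

mono Duplicati.CommandLine.RecoveryTool.exe restore /mnt/data/backup 0 \
  --targetpath=/mnt/data/Restored
```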
Out of frustration, I downloaded all the files from Backblaze onto my rebuilt host, which is now running FreeNAS. That download alone cost me around $100 CAD in B2 fees.
One issue I hadn't factored in is that I was unfortunately unable to install Duplicati on the FreeNAS host directly, or in a jail. I can't afford to pull this dataset down from the cloud again for a few more months, nor do I have enough space to move it elsewhere, so everything needs to be done directly from where the files are now stored.
As a workaround, I created an NFS share on the FreeNAS host where the dataset is stored, and I have an Ubuntu VM on the same machine, connected to that share, running the Recovery tool. The VM runs Ubuntu 18.04.3 LTS with 4 cores and 8 GB of RAM assigned.
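In case it matters, the share is mounted inside the VM along these lines (the hostname and export/mount paths are examples, not my exact ones):

```
# Mount the FreeNAS NFS export inside the Ubuntu VM.
# freenas.local and both paths are examples.
sudo mkdir -p /mnt/data
sudo mount -t nfs -o vers=3,rsize=1048576,wsize=1048576 \
  freenas.local:/mnt/tank/backup /mnt/data
```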
I first had to decrypt each file. The process failed three or four times with out-of-bounds exceptions, but I eventually managed to decrypt every archive.
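For reference, I decrypted the volumes with a loop along these lines, using the SharpAESCrypt tool that ships with Duplicati (the install path and passphrase are placeholders; yours may differ):

```
# Decrypt every *.zip.aes volume next to itself, then drop the encrypted copy.
# SharpAESCrypt.exe usage: e|d <password> <input> <output>.
for f in /mnt/data/backup/*.zip.aes; do
  mono /usr/lib/duplicati/SharpAESCrypt.exe d "PASSPHRASE" "$f" "${f%.aes}" \
    && rm "$f"
done
```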
The same thing happened while indexing: if the indexing process stopped for any reason, the Recovery tool could not resume the indexing session; it failed with an out-of-memory error instead.
After all this trouble, I now have every dblock.zip file decrypted and indexed. I was able to start a partial restore and can see all the file names, so the whole dataset appears to be intact.
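(For anyone retracing this, the file names come from the Recovery tool's list command; if I understand the guide correctly, version 0 is the most recent backup:)

```
# Show the available backup versions, then the files in the most recent one.
mono Duplicati.CommandLine.RecoveryTool.exe list /mnt/data/backup
mono Duplicati.CommandLine.RecoveryTool.exe list /mnt/data/backup 0
```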
However, I have been running into the following problem mid-restore. Here's one example:
```
331239: /mnt/data/Restored/srv/shared/Backups/BungeePC/BungeePC-Mar2019/Documents/Native Instruments/Massive/Sounds/!Dubstep/Dubstep lead 1/lead 3 (4).nmsv (2.74 KB) error: System.IO.IOException: Too many open files
  at System.IO.FileStream..ctor (System.String path, System.IO.FileMode mode, System.IO.FileAccess access, System.IO.FileShare share, System.Int32 bufferSize, System.Boolean anonymous, System.IO.FileOptions options) [0x0025f] in <8f2c484307284b51944a1a13a14c0266>:0
  at System.IO.FileStream..ctor (System.String path, System.IO.FileMode mode, System.IO.FileAccess access, System.IO.FileShare share) [0x00000] in <8f2c484307284b51944a1a13a14c0266>:0
  at (wrapper remoting-invoke-with-check) System.IO.FileStream:.ctor (string,System.IO.FileMode,System.IO.FileAccess,System.IO.FileShare)
  at Duplicati.CommandLine.RecoveryTool.Restore+CompressedFileMRUCache.ReadBlock (System.String filename, System.String hash) [0x00020] in <3f063307a1b340f3ac9162f9c12d9a98>:0
  at Duplicati.CommandLine.RecoveryTool.Restore+HashLookupHelper.ReadHash (System.String hash) [0x00068] in <3f063307a1b340f3ac9162f9c12d9a98>:0
  at Duplicati.CommandLine.RecoveryTool.Restore+HashLookupHelper.WriteHash (System.IO.Stream sw, System.String hash) [0x00000] in <3f063307a1b340f3ac9162f9c12d9a98>:0
  at Duplicati.CommandLine.RecoveryTool.Restore.Run (System.Collections.Generic.List`1[T] args, System.Collections.Generic.Dictionary`2[TKey,TValue] options, Duplicati.Library.Utility.IFilter filter) [0x0041e] in <3f063307a1b340f3ac9162f9c12d9a98>:0
```
After a certain number of files, Duplicati starts throwing System.IO.IOException: Too many open files for every remaining file; the rest of the restore fails and this error spams the console.
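A quick way to watch the descriptor usage while the restore runs, in case that helps diagnose this (assuming a single mono process for the Recovery tool):

```
# Count the open file descriptors held by the Recovery tool's mono process.
lsof -p "$(pgrep -f RecoveryTool.exe | head -n 1)" | wc -l
# Compare against the limit that process actually sees:
grep 'open files' /proc/"$(pgrep -f RecoveryTool.exe | head -n 1)"/limits
```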
Since this is all running under Linux, I tried raising the file descriptor limit as high as I could on both the VM and the FreeNAS host, but that did not fix the problem.
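Roughly what I changed inside the Ubuntu VM (the values are examples; I pushed them as high as the system allowed, and the FreeNAS side has its own sysctls):

```
# Per-session soft limit in the shell that launches the Recovery tool:
ulimit -n 1048576

# System-wide ceiling, made persistent:
echo 'fs.file-max = 2097152' | sudo tee -a /etc/sysctl.conf
sudo sysctl -p

# Per-user limits in /etc/security/limits.conf:
#   *  soft  nofile  1048576
#   *  hard  nofile  1048576
```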
I am so close to getting my files back, but I'm unsure where to go from here. Does anyone have an idea how I can get past this error?
Any help is appreciated.