Failed ... "Backup" with id: 3 (Mac, Backblaze B2)

I’m new to Duplicati (and please forgive me if my Google-fu has failed me) and I’m having trouble completing a backup job. The system is a Mac mini with 1 internal and 1 external drive running 10.10.5, and I’m running 2.0.2.1_beta_2017-08-01. I’m backing up to Backblaze B2. The backup failed (again) after running for 4 days. I’ve successfully run Duplicati on 4 other Macs for other family members (with much smaller backup volumes).

I originally tried backing up everything with a single backup job (>800 GB), but after that failed a couple of times I read that the number of files being backed up can be an issue, so I split the job by disk. The backup of the internal drive succeeded (source 29.20 GB). I started the backup of the external drive with a subset of what I need backed up, and at first it worked (275.83 GB). When I added to the external drive backup job (>400 GB), it failed with <14 GB remaining, reporting a file-not-found error (the complete log message is pasted below).

Am I hitting the file-count limit again? I’m curious as to why a file in /var isn’t found. Any suggestions are appreciated.

David


Feb 5, 2018 1:00 PM: Failed while executing "Backup" with id: 3
System.IO.FileNotFoundException: Could not find file "/var/folders/9s/5yc8ww1s3wd1htnr2bjm5ms40000gn/T/dup-eb01483a-bb7f-46e4-9533-309f15c60a0f"
File name: '/var/folders/9s/5yc8ww1s3wd1htnr2bjm5ms40000gn/T/dup-eb01483a-bb7f-46e4-9533-309f15c60a0f'
  at System.IO.FileStream..ctor (System.String path, System.IO.FileMode mode, System.IO.FileAccess access, System.IO.FileShare share, System.Int32 bufferSize, System.Boolean anonymous, System.IO.FileOptions options) [0x0019e] in <ac812cae460544af83bd6cf54c5eee87>:0 
  at System.IO.FileStream..ctor (System.String path, System.IO.FileMode mode, System.IO.FileAccess access, System.IO.FileShare share) [0x00000] in <ac812cae460544af83bd6cf54c5eee87>:0 
  at (wrapper remoting-invoke-with-check) System.IO.FileStream:.ctor (string,System.IO.FileMode,System.IO.FileAccess,System.IO.FileShare)
  at System.IO.File.OpenRead (System.String path) [0x00000] in <ac812cae460544af83bd6cf54c5eee87>:0 
  at Duplicati.Library.Encryption.EncryptionBase.Encrypt (System.String inputfile, System.String outputfile) [0x00000] in <6ee1ff1d948e4a8ba81cc818b61813f0>:0 
  at Duplicati.Library.Main.BackendManager+FileEntryItem.Encrypt (Duplicati.Library.Interface.IEncryption encryption, Duplicati.Library.Main.IBackendWriter stat) [0x0001e] in <118ad25945a24a3991f7b65e7a45ea1e>:0 
  at Duplicati.Library.Main.BackendManager.Put (Duplicati.Library.Main.Volumes.VolumeWriterBase item, Duplicati.Library.Main.Volumes.IndexVolumeWriter indexfile, System.Boolean synchronous) [0x00061] in <118ad25945a24a3991f7b65e7a45ea1e>:0 
  at Duplicati.Library.Main.Operation.BackupHandler.UploadRealFileList (Duplicati.Library.Main.BackendManager backend, Duplicati.Library.Main.Volumes.FilesetVolumeWriter filesetvolume) [0x00192] in <118ad25945a24a3991f7b65e7a45ea1e>:0 
  at Duplicati.Library.Main.Operation.BackupHandler.Run (System.String[] sources, Duplicati.Library.Utility.IFilter filter) [0x00860] in <118ad25945a24a3991f7b65e7a45ea1e>:0 
  at Duplicati.Library.Main.Controller+<>c__DisplayClass16_0.<Backup>b__0 (Duplicati.Library.Main.BackupResults result) [0x0030f] in <118ad25945a24a3991f7b65e7a45ea1e>:0 
  at Duplicati.Library.Main.Controller.RunAction[T] (T result, System.String[]& paths, Duplicati.Library.Utility.IFilter& filter, System.Action`1[T] method) [0x0014b] in <118ad25945a24a3991f7b65e7a45ea1e>:0 
  at Duplicati.Library.Main.Controller.Backup (System.String[] inputsources, Duplicati.Library.Utility.IFilter filter) [0x00068] in <118ad25945a24a3991f7b65e7a45ea1e>:0 
  at Duplicati.Server.Runner.Run (Duplicati.Server.Runner+IRunnerData data, System.Boolean fromQueue) [0x00454] in <a6c0c2089b9a44ec9be5057a44f12116>:0

Hi @davidy123, welcome to the forum!

I’m not a Mac expert so please correct me if I’m wrong, but I believe /var/folders/ (or /private/var/folders/) is used for per-user temp files. Apparently it showed up in Mac OS X 10.5 as a security measure, replacing the less secure /tmp/ and /Library/Caches/ folders for per-user scratch data.
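If you want to see where that per-user temp directory lives on your own machine, something like this works (the `getconf` key is macOS-specific; on other systems only `TMPDIR`/`/tmp` applies):

```shell
# Show the temp directory Duplicati will likely use for scratch files.
# On macOS, TMPDIR usually points into /var/folders/...
echo "${TMPDIR:-/tmp}"
# macOS-only: ask directly for the per-user temp dir (harmless no-op elsewhere)
getconf DARWIN_USER_TEMP_DIR 2>/dev/null || true
```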

What it looks like to me is that Duplicati is encrypting a dblock (archive) file in preparation for upload (that’s the Duplicati.Library.Encryption.EncryptionBase.Encrypt part of the error), and the file (I’m not sure whether it’s the unencrypted source or the in-progress encrypted destination) suddenly disappears while it’s in use.

Questions I have include:

  • How’s your free disk space on the drive containing /var/folders/?
  • If you run the job again does it still fail at the same point (<14GB remaining)?
  • How big is your “Upload volume size” on step 5 (Options) of editing a job?
  • If you scale back to the 275.83 GB version of the backup, does the error go away? If so, does a smaller scale-up than 400 GB (say, to 300 GB) work?

While there is a technical limit to the number of files Duplicati can back up, I believe it’s REALLY REALLY big (like more than 100 trillion files) so you’re probably not running into that issue. :slight_smile:

(By the way, I edited your post by putting “~~~” before and after the error message to make it easier to read.)

@JonMikelV - Thanks for the welcome and your response. You have questions, I have (some) answers…

  • How’s your free disk space on the drive containing /var/folders/?
    52.9 GB available out of 319.21 GB.

  • If you run the job again does it still fail at the same point (<14GB remaining)?
    I’m running it again now.

  • How big is your “Upload volume size” on step 5 (Options) of editing a job?
    “Upload volume size” is set to 50 GB.

  • If you scale back to the 275.83 GB version of the backup does the error go away? If so, does a scale-up smaller than to 400GB (say, to 300GB) work?
    I’ll try this after the current job completes.

Thanks for the “~~~” tip, that is so much easier to read.

BTW - I’m a former CrashPlan user but I have already uninstalled the CrashPlan agent to avoid any potential file lock issues. That shouldn’t be an issue but I thought I would mention it anyway.

I think that’s your problem: the “Upload volume size” setting determines the target maximum size of each archive file to be uploaded.

If you really have it set to 50GB (not the 50MB default) then you’re likely using up almost your whole drive creating a single 50GB zip (or 7z) file to then be uploaded.
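One quick way to check whether temp space is the bottleneck (assuming Duplicati stages the archive in the default temp directory, which your error message suggests it does):

```shell
# Free space on the volume holding the temp directory; the "Avail" column
# needs to comfortably exceed the "Upload volume size" setting.
df -h "${TMPDIR:-/tmp}"
```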

You can safely change that setting on an existing backup without any issues, though only archives made after the setting change will have the new size.

I’m a former CrashPlan user as well, but I don’t recall having issues with both programs running as long as I used the default filters on the Options page.

My fault, it is actually set to 50 MB. Sorry about that.

The good news is that it just completed successfully. During its cleanup I did get a few messages about files it was looking to delete not being found, followed by messages that the deletes happened, except for one warning included in the result below. Despite completing successfully, I have a feeling something is still off. I’ll continue to slowly add to this backup set over time.

Also, as an FYI, I’ve pasted the “ls -l” of the /var/folders temp directory before, during, and after the backup run.

Thanks for your help with this!

-David

Before and most of the time during:

Mac_mini_2010:T davidy$ ls -l dup*
-rw-r--r--  1 davidy  staff   156 Feb  5 13:35 dup-d15b0d01-c667-4652-bdba-aecda46b226b
-rw-r--r--  1 davidy  staff   154 Feb  6 04:20 dup-f0fac0c9-7cc9-4fcf-a414-c6edd4819db4
-rw-r--r--  1 davidy  staff   154 Feb  6 04:20 dup-fc4b8d14-f4d6-4cd7-950b-063cdce9fe11
-rw-r--r--  1 davidy  staff  3343 Feb  5 13:00 dupl-usagereport-13062-20180201170246.json
-rw-r--r--  1 davidy  staff   902 Feb  5 13:32 dupl-usagereport-13062-20180205180551.json

I caught this once during:

Mac_mini_2010:T davidy$ ls -l dup*
-rw-r--r--  1 davidy  staff      156 Feb  5 13:35 dup-d15b0d01-c667-4652-bdba-aecda46b226b
-rw-r--r--  1 davidy  staff      154 Feb  6 04:20 dup-f0fac0c9-7cc9-4fcf-a414-c6edd4819db4
-rw-r--r--  1 davidy  staff  3555344 Feb  7 01:07 dup-fb8850cc-65fd-4dc4-bdff-f47a541b9814
-rw-r--r--  1 davidy  staff      154 Feb  6 04:20 dup-fc4b8d14-f4d6-4cd7-950b-063cdce9fe11
-rw-r--r--  1 davidy  staff     3343 Feb  5 13:00 dupl-usagereport-13062-20180201170246.json
-rw-r--r--  1 davidy  staff      902 Feb  5 13:32 dupl-usagereport-13062-20180205180551.json

After:

Mac_mini_2010:T davidy$ ls -l dup*
-rw-r--r--  1 davidy  staff   154 Feb  6 04:20 dup-f0fac0c9-7cc9-4fcf-a414-c6edd4819db4
-rw-r--r--  1 davidy  staff   154 Feb  6 04:20 dup-fc4b8d14-f4d6-4cd7-950b-063cdce9fe11
-rw-r--r--  1 davidy  staff  3343 Feb  5 13:00 dupl-usagereport-13062-20180201170246.json
-rw-r--r--  1 davidy  staff   902 Feb  5 13:32 dupl-usagereport-13062-20180205180551.json

Backup job Result message:

Feb 7, 2018 7:07 PM: Result
DeletedFiles: 0
DeletedFolders: 0
ModifiedFiles: 0
ExaminedFiles: 70432
OpenedFiles: 68837
AddedFiles: 68837
SizeOfModifiedFiles: 0
SizeOfAddedFiles: 221329907288
SizeOfExaminedFiles: 517502054447
SizeOfOpenedFiles: 221329907288
NotProcessedFiles: 0
AddedFolders: 4257
TooLargeFiles: 0
FilesWithError: 0
ModifiedFolders: 0
ModifiedSymlinks: 0
AddedSymlinks: 80
DeletedSymlinks: 0
PartialBackup: False
Dryrun: False
MainOperation: Backup
CompactResults: null
DeleteResults:
    DeletedSets: []
    Dryrun: False
    MainOperation: Delete
    CompactResults: null
    ParsedResult: Success
    EndTime: 2/8/2018 12:02:20 AM
    BeginTime: 2/7/2018 8:20:07 PM
    Duration: 03:42:12.6650320
    BackendStatistics:
        RemoteCalls: 15
        BytesUploaded: 5703588
        BytesDownloaded: 57921879
        FilesUploaded: 4
        FilesDownloaded: 3
        FilesDeleted: 4
        FoldersCreated: 0
        RetryAttempts: 1
        UnknownFileSize: 0
        UnknownFileCount: 0
        KnownFileCount: 25177
        KnownFileSize: 660038254053
        LastBackupDate: 2/5/2018 1:32:31 PM
        BackupListCount: 3
        TotalQuotaSpace: 0
        FreeQuotaSpace: 0
        AssignedQuotaSpace: -1
        ParsedResult: Success
RepairResults: null
TestResults:
    MainOperation: Test
    Verifications: [
        Key: duplicati-20180205T183231Z.dlist.zip.aes
        Value: [],
        Key: duplicati-i47925af206fb4d84a07146fab626bd2c.dindex.zip.aes
        Value: [],
        Key: duplicati-b9b33283b3c5e4236ae11d8501735479e.dblock.zip.aes
        Value: []
    ]
    ParsedResult: Success
    EndTime: 2/8/2018 12:07:17 AM
    BeginTime: 2/8/2018 12:05:08 AM
    Duration: 00:02:09.0684660
ParsedResult: Warning
EndTime: 2/8/2018 12:07:17 AM
BeginTime: 2/5/2018 6:32:22 PM
Duration: 2.05:34:54.6415600
Messages: [
    scheduling missing file for deletion, currently listed as Uploading: duplicati-20180201T165745Z.dlist.zip.aes,
    removing file listed as Deleting: duplicati-i6355a87598d94076bf0062f3987e1bde.dindex.zip.aes,
    removing file listed as Deleting: duplicati-b8091eac7533f4de19ae6b21c697da862.dblock.zip.aes,
    removing file listed as Deleting: duplicati-i76ccff7a0af045daacf0b91e77de3ff9.dindex.zip.aes,
    removing file listed as Deleting: duplicati-iaa9cc8c8d2a6472087645004c38bb8a5.dindex.zip.aes,
...
]
Warnings: [
    Expected there to be a temporary fileset for synthetic filelist (4, duplicati-b1cd428bb53b34d5a92b29580b179e126.dblock.zip.aes), but none was found?,
    Delete operation failed for duplicati-20180205T183230Z.dlist.zip.aes with FileNotFound, listing contents => The requested file does not exist
]
Errors: []

Thanks for the details - and good idea on the ls of /tmp!

The backup completing (yay!) suggests your initial File Not Found error was probably a transient problem with the temp folder. Hopefully future jobs will continue to run to completion.

The messages mostly look normal and are logging the synchronization of backup details between what’s logged in the local database and what’s at the destination.

I believe the “Scheduling missing file…” message relates to your File Not Found error: Duplicati recorded in the database “I’m going to upload this file”, but that file can’t be found at the destination, so it’s simply removing the attempted upload from the status logs.

The “removing file listed as Deleting” messages are likely normal cleanup from version retention or compacting. On each run, Duplicati looks at the age of backup versions as well as the “sparseness” of the archive files at the destination.

Expired versions get flagged as “deletable”. Once enough of an archive file’s contents are deletable, the file is either deleted outright or downloaded and re-compressed together with other archive files. In both cases, the end result is that a no-longer-needed archive file gets deleted from the destination, after which it is removed from the local database.
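A toy sketch of that compact decision (the 25% figure mirrors the default of Duplicati’s --threshold advanced option; the byte counts here are made up for illustration):

```shell
# If the deletable ("wasted") fraction of an archive meets the threshold,
# it becomes a candidate for download-and-repack or outright deletion.
total=50000000       # archive size in bytes (illustrative)
deletable=15000000   # bytes belonging only to expired versions (illustrative)
threshold=25         # percent, mirroring Duplicati's --threshold default
pct=$(( deletable * 100 / total ))
if [ "$pct" -ge "$threshold" ]; then
  echo "compact candidate (${pct}% wasted)"
else
  echo "leave as-is (${pct}% wasted)"
fi
# → compact candidate (30% wasted)
```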

Please let us know if future backups work for you or keep giving you troubles.