Backing Up Server 2012 R2: Access Denied

I backed up the (Win Svr 2012 R2) server to Dropbox and got no errors (it’s not that big, about 150 GB max). It’s a Google Compute Instance. The initial backup took about 20+ hours. A second backup (I assume these are incremental) finished much quicker, but left about 350 messages essentially saying “access denied” for particular files. I don’t think this is an issue, because some files will be in use and some are system-type files. Here’s a typical couple of messages:

Failed to process path: C:\Users\admin\AppData\Local\Google\Chrome\User Data\Default\Local Extension Settings\eiimnmioipafcokbfikbljfdeojpcgbh\LOCK
System.IO.IOException: The process cannot access the file because another process has locked a portion of the file.

at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
at System.IO.FileStream.ReadCore(Byte[] buffer, Int32 offset, Int32 count)
at System.IO.FileStream.Read(Byte[] array, Int32 offset, Int32 count)
at Duplicati.Library.Main.Blockprocessor.Readblock()
at Duplicati.Library.Main.Operation.BackupHandler.ProcessStream(Stream stream, CompressionHint hint, BackendManager backend, FileBackedStringList blocklisthashes, FileBackedStringList hashcollector, Boolean skipfilehash)
at Duplicati.Library.Main.Operation.BackupHandler.HandleFilesystemEntry(ISnapshotService snapshot, BackendManager backend, String path, FileAttributes attributes)


Failed to process metadata for "C:\Windows\System32\winevt\Logs\Setup.evtx", storing empty metadata
System.UnauthorizedAccessException: Attempted to perform an unauthorized operation.
at System.Security.AccessControl.Win32.GetSecurityInfo(ResourceType resourceType, String name, SafeHandle handle, AccessControlSections accessControlSections, RawSecurityDescriptor& resultSd)
at System.Security.AccessControl.NativeObjectSecurity.CreateInternal(ResourceType resourceType, Boolean isContainer, String name, SafeHandle handle, AccessControlSections includeSections, Boolean createByName, ExceptionFromErrorCode exceptionFromErrorCode, Object exceptionContext)
at System.Security.AccessControl.NativeObjectSecurity..ctor(Boolean isContainer, ResourceType resourceType, String name, AccessControlSections includeSections, ExceptionFromErrorCode exceptionFromErrorCode, Object exceptionContext)
at System.Security.AccessControl.FileSecurity..ctor(String fileName, AccessControlSections includeSections)
at Duplicati.Library.Snapshots.SystemIOWindows.GetAccessControlFile(String path)
at Duplicati.Library.Snapshots.SystemIOWindows.GetMetadata(String path, Boolean isSymlink, Boolean followSymlink)
at Duplicati.Library.Main.Operation.BackupHandler.GenerateMetadata(ISnapshotService snapshot, String path, FileAttributes attributes)

My question: Is this normal? I.e., should I expect these sorts of errors when backing up a complete system?

The follow-up: if NOT, then what am I doing wrong?

OR, if they are to be expected, is there some IGNORE switch when setting up the backup, i.e. an “ignore XXXXX-type system files” option?

Thanks in Advance…

OK, I did the trawl through many posts and this may be my problem (and solution): simply not running as Administrator. I am running it from the admin account. I’m happy to run it as a service with admin rights, and I’ll look at how to do that as well. (I have since installed the service as Administrator, deleted the old backups, and re-run the complete initial backup; I’ll see how that turns out. If it’s OK, then the issue is resolved.) Here’s the advice I found:

In your case it looks like you are running Duplicati under an account that doesn’t have permissions to certain folders (such as “D:\$RECYCLE.BIN\S-1-5-18” in your screenshot).

This isn’t an error with Duplicati, it’s Windows not letting Duplicati see the files so it can’t back them up. You can get around this a few ways including:

  • run Duplicati under an administrator account (either as a service or with “Run as administrator”)
  • give the user account running Duplicati access to those folders (not recommended)
  • exclude those folders (such as $RECYCLE.BIN) from your backup

I’d suggest you try manually running Duplicati with a right click and choosing “Run as administrator” (it might be under a “More…” submenu) and see if the warnings go away (or decrease).

  • If they go away, then we’ve confirmed that’s the issue.
  • If they only decrease then there’s probably another issue going on (perhaps in-use files not allowing Duplicati to read them).
  • If they stay the same then I’m wrong about the cause and we can try looking at something else.
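As a sketch of the exclude option mentioned above: Duplicati’s command-line interface accepts `--exclude` filters alongside the backup command. This is a hypothetical Windows command-prompt example, not a tested configuration; the Dropbox URL, authid, and the specific exclude patterns are all placeholders you’d adapt to your own setup.

```shell
:: Hypothetical sketch: exclude folders Windows won't let Duplicati read.
:: The storage URL and authid are placeholders, not real values.
Duplicati.CommandLine.exe backup ^
  "dropbox://Backups/Server?authid=YOUR_AUTHID" ^
  "C:\" ^
  --exclude="*\$RECYCLE.BIN\*" ^
  --exclude="C:\pagefile.sys" ^
  --exclude="*\System Volume Information\*"
```

The same patterns can be entered as exclude expressions on the filter screen of the web UI’s “Source data” page.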

Duplicati has recently extended its filtering system to include OS-specific filters, but you have to turn them on.

OS Specific Default Filters?

Where are the “Default Filters” documented

These filters might help with some of the complained-about files. I suspect that tuning will happen as more experience is obtained.

Other ways to reduce complaints are to run with more privilege, such as in the Administrators group or as a service.

--snapshot-policy can use VSS to read locked files, but VSS doesn’t guarantee a file is consistent at that time.
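To illustrate that option, here’s a hedged command-prompt sketch. My understanding is that the policy values include `auto` (use a snapshot if possible, otherwise read files directly) and `required` (fail the backup if no snapshot can be made); the storage URL is a placeholder.

```shell
:: Sketch: ask Duplicati to take a VSS snapshot so locked files can be
:: read. "required" aborts if a snapshot can't be created; "auto" falls
:: back to reading files directly. The storage URL is a placeholder.
Duplicati.CommandLine.exe backup "dropbox://Backups/Server?authid=YOUR_AUTHID" "C:\" ^
  --snapshot-policy=required
```

Note that VSS itself needs administrator privileges, which is another argument for running Duplicati as a service.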

@ts678, thanks for the advice. So I have set the backup as a service according to this post, ie. the official way:

Migrating from User to Service install on Windows.

It appears to be running now without so many objections. Obviously it objects to pagefile.sys, and I’m sure there are a number of other OS-specific files I should not back up (I selected the complete C:\ drive, since I did not want to miss important files by inadvertently not selecting them).
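For anyone following along, my understanding is that the service install described in that post boils down to something like the following, run from an elevated command prompt (the install path shown is the default and may differ on your system):

```shell
:: From an elevated (administrator) command prompt. The install path is
:: the default one and may differ on your system.
cd "C:\Program Files\Duplicati 2"
Duplicati.WindowsService.exe install
:: The service then runs as SYSTEM, which avoids most permission errors.
```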

The last incremental backup saw 117 issues; as noted, I suspect they are file-in-use errors. I am getting errors like this:

Failed to process path: C:\Program Files\Microsoft SQL Server\MSSQL10_50.SQLEXPRESS\MSSQL\DATA\modellog.ldf
System.IO.IOException: The process cannot access the file ‘C:\Program Files\Microsoft SQL Server\MSSQL10_50.SQLEXPRESS\MSSQL\DATA\modellog.ldf’ because it is being used by another process.
at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
at System.IO.FileStream.Init(String path, FileMode mode, FileAccess access, Int32 rights, Boolean useRights, FileShare share, Int32 bufferSize, FileOptions options, SECURITY_ATTRIBUTES secAttrs, String msgPath, Boolean bFromProxy, Boolean useLongPath, Boolean checkHost)
at System.IO.FileStream..ctor(String path, FileMode mode, FileAccess access, FileShare share, Int32 bufferSize, FileOptions options, String msgPath, Boolean bFromProxy)
at System.IO.FileStream..ctor(String path, FileMode mode, FileAccess access, FileShare share)
at Duplicati.Library.Snapshots.SystemIOWindows.FileOpenRead(String path)
at Duplicati.Library.Main.Operation.BackupHandler.HandleFilesystemEntry(ISnapshotService snapshot, BackendManager backend, String path, FileAttributes attributes)

This is a TEMP file (AFAIK) for MySQL. For a novice like me, I guess the obvious question is: does the actual MySQL database get backed up? I assume it does, but if it doesn’t, then that’s a major issue. Do you have any thoughts / advice?

Again, thanks for your advice to-date.


Probably just an accident, but MySQL is not Microsoft SQL Server. I can’t help with details for either. :disappointed:

.LDF File Extension (it’s not a TEMP file)

Google finds articles on the model database and its use. Other system databases may be master and msdb, then presumably there can be other databases that are not system ones. I can’t tell you what you might find, however if you’re only getting an error on modellog.ldf, maybe it’s possible that SQL Server isn’t much used.


Again, thank you…

Duplicati is not really suitable for full system backups (which would include operating system files, application files, registry, etc). It works great for general user data files but I would not use it to back up OS, program files, or even MS SQL data.

At best Duplicati will make a general snapshot so that it can back up “open” files which may include SQL MDF and LDF files. But these would be crash-consistent backups, not application-consistent backups. Duplicati (as far as I know) has no intelligence when it comes to MS SQL and can’t ask the database engine to create an application-consistent snapshot.

It looks like you may be trying to do this so I just wanted to give a word of warning.

We may be able to provide some advice depending on exactly what you are trying to protect and what types of recoveries you may need (bare metal, complete system recovery, etc.).


Thanks for getting back. I do not pretend to be expert in the vagaries of system backup. Here’s what we have and what we are trying to do:

We have a Google Compute Instance of 150 GB, and we do periodically take snapshots of it.
We have essentially three systems running, so we are not after a bare-metal, full-system backup, but rather to back up these three systems and be confident that we could recover them (the system itself can be recovered from a GCI snapshot):

  1. JIRA (server and Service Desk) that uses MySQL
  2. Project in a Box (a project management document manager and repository) that uses MySQL
  3. Our own (in-development) server (the mothership) that is basically Active Directory using Apache DS (very low utilisation, but we need to back it up as utilisation will ramp up eventually)

Since I have full backup/restore with snapshots on a weekly basis (or whatever we decide), having a full backup with Duplicati is not really essential. What is essential is that we back up those three systems. I have enough space in a Dropbox account, and since I did not want to miss anything, I simply backed up the whole of the C:\ drive (of a Server 2012 R2 install).

I would be happy to NOT backup the whole drive if I could be confident of backing up those three systems. Then a (disaster) recovery would be:

  1. Roll back to the most recent available snapshot
  2. Restore the latest Duplicati backups of those three systems mentioned above

The other options possibly include:

  1. Take a snapshot every, say, 48 hours, delete older ones, and keep weekly and/or monthly snapshots
  2. Move to a commercial backup system (would they be much / any better) like say CrashPlan?
  3. Just snapshot on a weekly basis and accept the risk (on average 3.5 days since last snapshot)

We do not have a huge amount of traffic, or data for that matter (we’re a very small, 4-person company), but I’d like to use the snapshot/Duplicati option, backing up only those three systems, IF I could be confident I had the necessary data being backed up.

Does that make sense?


OK, if you are using GCI snapshots, that should work great for DR situations. The frequency of the snaps and the retention would depend on your needs.

If you are trying to use Duplicati to augment the image level snapshots or to make it easier to do file level restores then it should work out pretty well. I like having both file level backups and image level backups myself.

For MS SQL data you may want to also set up a maintenance plan (In SQL Management Studio) that will take regular database backups (dumps). You could protect these dumps with Duplicati along with your other files. This is better than trying to back up the MDF/LDF files directly with Duplicati because as I mentioned above they won’t be application-aware backups. (Database dumps from SQL Mgmt Studio are application-aware backups.)
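A minimal sketch of such a dump, runnable outside Management Studio with sqlcmd. The instance name matches the SQLEXPRESS path from the earlier error message, but the database name and output path are placeholders, and you’d normally schedule this via a maintenance plan or Task Scheduler rather than run it by hand:

```shell
:: Hypothetical example: an application-consistent dump of a SQL Express
:: database using Windows authentication (-E). The database name and
:: output path are placeholders.
sqlcmd -S .\SQLEXPRESS -E -Q "BACKUP DATABASE [YourDatabase] TO DISK='C:\SQLBackups\YourDatabase.bak' WITH INIT"
```

The resulting .bak files in C:\SQLBackups can then be included in the Duplicati source set like any other files.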

Good luck!

We need to clear up the SQL Server versus MySQL confusion. They’re completely different databases. SQL Server seems to have good VSS support; the articles below talk about SQL Server but not MySQL.

CrashPlan is a file-oriented product that can use VSS (Duplicati is also), and such a plan has inherent limits:
Back up open files and databases
Understand and troubleshoot backing up open files with Windows VSS

The above has links to explanations. Here’s another:
Crash-Consistent vs. Application-Consistent Backup

SQL Writer Service describes the component of SQL Server that lets it better support backups that use VSS.

MSSQL backup how’s it work exactly? is some Duplicati-specific discussion, with a specific SQL Server plan. That discussion describes how Duplicati asks the database engine to create an application-consistent snapshot; however, it’s probably basically the same thing that any backup using VSS would do. See the above articles. Basically, it looks like getting an application-consistent backup needs VSS, plus some help from the application.

MS SQL does not backup modified files is likely why --disable-filetime-check is used in detailed steps above.

Successfully piecing together a system (or even an application) from a collection of files seems challenging. Don’t you want to carefully reinstall pieces out of the file backup, or does it really work to just install all of it?

I have little personal experience with live image or high-cost backups, but some image backups do use VSS, possibly giving a backup that’s not quite as “best practice” as database dumps, but offers other advantages.

Backup Internals: What is VSS, how does it work and why do we use it?

You mentioned LDF files in an earlier post, which is the default extension for MS SQL Database Log files. I have to admit I’m not familiar with MySQL so don’t know what the default extensions are.

But you should note that CrashPlan and Duplicati only support generic VSS snapshots. Those snaps will make sure the filesystem is in a consistent state, but cannot guarantee application level consistency. That’s an important distinction. Without application level consistency the best you will achieve is a crash-consistent backup.
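For the MySQL side (JIRA), since there is no VSS writer to provide application-level consistency, the usual alternative is a logical dump that the database engine itself makes consistent. A hedged sketch, with the database name, user, and output path as placeholders:

```shell
:: Hypothetical example: a consistent logical dump of an InnoDB database.
:: --single-transaction takes a consistent read snapshot without locking
:: tables. Database name, user, and output path are placeholders.
mysqldump --single-transaction --user=backup_user --password jiradb > C:\SQLBackups\jiradb.sql
```

As with the SQL Server .bak files, the dump then becomes an ordinary file that Duplicati can back up safely.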

Oops sorry @ts678, I thought you were the OP.


Thank you, and my bad. PIAB does, in fact, use its own instance of SQL Express, which I assume is the embedded/free version of SQL Server. JIRA, OTOH, uses MySQL. So I have both running. Apologies for the confusion; I thought both applications used MySQL, but they do not.

Thank you for your advice and the many links…

Yes, that’s exactly what I’d like to do, as in a snapshot a week and backups nightly. This would allow, in the worst case, recovering the complete system from no more than a week ago, and then recovering those three systems I described to within a day.


Again thanks. I know it’s difficult to comment on a system you know little about and I appreciate your advice. See my comment above WRT the database confusion, that was me and I apologise.

The issue still remains that my ideal situation would be to take a weekly snapshot (provided by the GCI infrastructure) and nightly incremental backups of those three important systems (so two database backups), and as ts678 points out: “Successfully piecing together a system (or even an application) from a collection of files seems challenging.” And I’m finding out it is.

Thanks to both of you for your advice and comments.