Backup job stuck but duplicati is very bad at communicating what's going on


#1

I recently set up a backup job to Amazon Drive (several hundred GB). For the first 300 or so GB the upload went fine. But since this morning, the progress bar has not moved at all. For now, I am assuming that this is a problem on Amazon’s side (perhaps some kind of rate limiting or whatever). My issue here is that even if I wanted to report this problem as a Duplicati issue, the Duplicati UI is not helping me with that at all. Here is what I did:

  1. I checked the logs: nothing (last entry is from two days ago, and unrelated).
  2. I tried the live log at the “information” level: nothing.
  3. I tried to stop the job by clicking the X in the progress bar. It asked me whether I wanted to wait until the current file was uploaded or stop immediately. I chose to wait for the upload to finish, but since nothing was being uploaded, I changed my mind, chose “Stop immediately”, and got this:
    [screenshot]
  4. After I clicked OK, it either started stopping the process anyway, or I tried stopping it again, this time without an error message (I think it was the latter).
  5. It then gave me this “Thread was being aborted” error:
    [screenshot]
    I have no idea what that means: whether “was being aborted” means that it crashed/aborted itself, or whether I aborted it (which should not produce an error message, because I requested the abort — and if there has to be an error message, it should at least make clear that the abort was requested by the user).
  6. When I clicked Show, it showed me this:
    [screenshot]
  7. When I clicked on General, I got some information for the first time:

    This seems to indicate that the upload stopped working this morning at 7:45. That is more than 12 hours ago, but Duplicati did not seem to notice any problem. At least it did not provide any warning. And I still have no idea why the upload stopped.
  8. Now I can also find some new entries in the general log (the ones I checked in point 1 above):

    Here are the details:
29 Nov 2017 23:00: Failed while executing "Backup" with id: 2
System.Threading.ThreadAbortException: Thread was being aborted.
   at Duplicati.Library.Main.Operation.BackupHandler.HandleFilesystemEntry(ISnapshotService snapshot, BackendManager backend, String path, FileAttributes attributes)
   at Duplicati.Library.Main.Operation.BackupHandler.RunMainOperation(ISnapshotService snapshot, BackendManager backend)
   at Duplicati.Library.Main.Operation.BackupHandler.Run(String[] sources, IFilter filter)
   at Duplicati.Library.Main.Controller.<>c__DisplayClass16_0.<Backup>b__0(BackupResults result)
   at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)
   at Duplicati.Library.Main.Controller.Backup(String[] inputsources, IFilter filter)
   at Duplicati.Server.Runner.Run(IRunnerData data, Boolean fromQueue)
29 Nov 2017 23:00: Error in worker
System.Threading.ThreadAbortException: Thread was being aborted.
   at Duplicati.Server.Runner.Run(IRunnerData data, Boolean fromQueue)
   at Duplicati.Library.Utility.WorkerThread`1.Runner()
29 Nov 2017 23:00: Request for http://localhost:8200/api/v1/serverstate/pause?duration=5m gave error
System.InvalidOperationException: Cannot add stuff to HttpInput.Empty.
   at HttpServer.HttpInput.Add(String name, String value)
   at Duplicati.Server.WebServer.RESTHandler.DoProcess(RequestInfo info, String method, String module, String key)
29 Nov 2017 23:00: Reporting error gave error
System.ObjectDisposedException: Cannot write to a closed TextWriter.
   at System.IO.__Error.WriterClosed()
   at System.IO.StreamWriter.Flush(Boolean flushStream, Boolean flushEncoder)
   at Duplicati.Server.WebServer.RESTHandler.DoProcess(RequestInfo info, String method, String module, String key)

It would be great if duplicati could become a bit more talkative regarding failing backups.

BTW: the next backup job in the queue automatically started as scheduled and is uploading fine at the moment.


#2

Today I had another backup job to Amazon Cloud Drive fail. This time, at least, the reason was logged (I double-checked the logs for the failure from 30 Nov in the screenshot above, but there is definitely nothing logged at 15:40). In fact, I suppose this failure went quite smoothly, since the backup job quit with an error message and the next backup started as scheduled. I suppose this is how it’s supposed to work.

Nevertheless, I think Duplicati should be able to handle this situation better, i.e. without asking me to run repair. I’m not sure what might have led to a missing file on the backend, but since this was the first and only time this backup ran, it must have happened during that run, so I think Duplicati should be able to handle it without user intervention.

So here are the logs:

4 Dec 2017 03:36: Message
Fatal error
Duplicati.Library.Interface.UserInformationException: Found 1 files that are missing from the remote storage, please run repair
   at Duplicati.Library.Main.Operation.FilelistProcessor.VerifyRemoteList(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, String protectedfile)
   at Duplicati.Library.Main.Operation.BackupHandler.PostBackupVerification()
   at Duplicati.Library.Main.Operation.BackupHandler.Run(String[] sources, IFilter filter)
4 Dec 2017 03:36: Message
Found 1 files that are missing from the remote storage, please run repair
4 Dec 2017 03:36: Message
Missing file: duplicati-20171203T131457Z.dlist.zip.aes
4 Dec 2017 03:36: Message
removing file listed as Deleting: duplicati-b7511cf4d558d4349bf7787c0a1df19f7.dblock.zip.aes
4 Dec 2017 03:36: Message
removing file listed as Deleting: duplicati-b6f493bcf388149e1aca3e988c6fc4cbf.dblock.zip.aes
4 Dec 2017 03:36: Message
removing file listed as Deleting: duplicati-ieff8d69a2e5e4ff08bf97eb9b2d91816.dindex.zip.aes
4 Dec 2017 03:36: Message
removing file listed as Deleting: duplicati-b7fd0cb0f23e741ebb56fbacb15e3c674.dblock.zip.aes
4 Dec 2017 03:36: Message
Renaming "duplicati-b7511cf4d558d4349bf7787c0a1df19f7.dblock.zip.aes" to "duplicati-b54d9f447feed4a9990082df177a100b2.dblock.zip.aes"
4 Dec 2017 03:36: Retry
Operation Put with file duplicati-b7511cf4d558d4349bf7787c0a1df19f7.dblock.zip.aes attempt 1 of 5 failed with message: The remote server returned an error: (500) Internal Server Error.
System.Net.WebException: The remote server returned an error: (500) Internal Server Error.
   at Duplicati.Library.Utility.AsyncHttpRequest.AsyncWrapper.GetResponseOrStream()
   at Duplicati.Library.Utility.AsyncHttpRequest.GetResponse()
   at Duplicati.Library.JSONWebHelper.GetResponse(AsyncHttpRequest req, Object requestdata)
   at Duplicati.Library.JSONWebHelper.PostMultipart(String url, Action`1 setup, MultipartItem[] parts)
   at Duplicati.Library.JSONWebHelper.PostMultipartAndGetJSONData[T](String url, Action`1 setup, MultipartItem[] parts)
   at Duplicati.Library.Backend.AmazonCloudDrive.AmzCD.Put(String remotename, Stream stream)
   at Duplicati.Library.Main.BackendManager.DoPut(FileEntryItem item)
   at Duplicati.Library.Main.BackendManager.ThreadRun()
4 Dec 2017 03:36: Message
Renaming "duplicati-b6f493bcf388149e1aca3e988c6fc4cbf.dblock.zip.aes" to "duplicati-ba709046aacb647d3873989852428df99.dblock.zip.aes"
4 Dec 2017 03:36: Retry
Operation Put with file duplicati-b6f493bcf388149e1aca3e988c6fc4cbf.dblock.zip.aes attempt 1 of 5 failed with message: Unable to write data to the transport connection: An existing connection was forcibly closed by the remote host.
System.IO.IOException: Unable to write data to the transport connection: An existing connection was forcibly closed by the remote host. ---> System.Net.Sockets.SocketException: An existing connection was forcibly closed by the remote host
   at System.Net.Sockets.Socket.Send(Byte[] buffer, Int32 offset, Int32 size, SocketFlags socketFlags)
   at System.Net.Sockets.NetworkStream.Write(Byte[] buffer, Int32 offset, Int32 size)
   --- End of inner exception stack trace ---
   at System.Net.Sockets.NetworkStream.Write(Byte[] buffer, Int32 offset, Int32 size)
   at System.Net.Security._SslStream.StartWriting(Byte[] buffer, Int32 offset, Int32 count, AsyncProtocolRequest asyncRequest)
   at System.Net.Security._SslStream.ProcessWrite(Byte[] buffer, Int32 offset, Int32 count, AsyncProtocolRequest asyncRequest)
   at System.Net.TlsStream.Write(Byte[] buffer, Int32 offset, Int32 size)
   at System.Net.ConnectStream.InternalWrite(Boolean async, Byte[] buffer, Int32 offset, Int32 size, AsyncCallback callback, Object state)
   at System.Net.ConnectStream.Write(Byte[] buffer, Int32 offset, Int32 size)
   at Duplicati.Library.Utility.Utility.CopyStream(Stream source, Stream target, Boolean tryRewindSource, Byte[] buf)
   at Duplicati.Library.JSONWebHelper.PostMultipart(String url, Action`1 setup, MultipartItem[] parts)
   at Duplicati.Library.JSONWebHelper.PostMultipartAndGetJSONData[T](String url, Action`1 setup, MultipartItem[] parts)
   at Duplicati.Library.Backend.AmazonCloudDrive.AmzCD.Put(String remotename, Stream stream)
   at Duplicati.Library.Main.BackendManager.DoPut(FileEntryItem item)
   at Duplicati.Library.Main.BackendManager.ThreadRun()
4 Dec 2017 03:36: Message
Renaming "duplicati-ieff8d69a2e5e4ff08bf97eb9b2d91816.dindex.zip.aes" to "duplicati-iec7e0b0aa3da4850afc82ea6034f74a9.dindex.zip.aes"
4 Dec 2017 03:36: Retry
Operation Put with file duplicati-ieff8d69a2e5e4ff08bf97eb9b2d91816.dindex.zip.aes attempt 1 of 5 failed with message: The remote server returned an error: (429) Too Many Requests.
System.Net.WebException: The remote server returned an error: (429) Too Many Requests.
   at Duplicati.Library.Utility.AsyncHttpRequest.AsyncWrapper.GetResponseOrStream()
   at Duplicati.Library.Utility.AsyncHttpRequest.GetResponse()
   at Duplicati.Library.JSONWebHelper.GetResponse(AsyncHttpRequest req, Object requestdata)
   at Duplicati.Library.JSONWebHelper.PostMultipart(String url, Action`1 setup, MultipartItem[] parts)
   at Duplicati.Library.JSONWebHelper.PostMultipartAndGetJSONData[T](String url, Action`1 setup, MultipartItem[] parts)
   at Duplicati.Library.Backend.AmazonCloudDrive.AmzCD.Put(String remotename, Stream stream)
   at Duplicati.Library.Main.BackendManager.DoPut(FileEntryItem item)
   at Duplicati.Library.Main.BackendManager.ThreadRun()
4 Dec 2017 03:36: Message
Renaming "duplicati-b7fd0cb0f23e741ebb56fbacb15e3c674.dblock.zip.aes" to "duplicati-b8ee2c189153349b98f9de2eed5900ab5.dblock.zip.aes"
4 Dec 2017 03:36: Retry
Operation Put with file duplicati-b7fd0cb0f23e741ebb56fbacb15e3c674.dblock.zip.aes attempt 1 of 5 failed with message: The stream does not support concurrent IO read or write operations.
System.NotSupportedException: The stream does not support concurrent IO read or write operations.
   at System.Net.ConnectStream.InternalWrite(Boolean async, Byte[] buffer, Int32 offset, Int32 size, AsyncCallback callback, Object state)
   at System.Net.ConnectStream.Write(Byte[] buffer, Int32 offset, Int32 size)
   at Duplicati.Library.Utility.Utility.CopyStream(Stream source, Stream target, Boolean tryRewindSource, Byte[] buf)
   at Duplicati.Library.JSONWebHelper.PostMultipart(String url, Action`1 setup, MultipartItem[] parts)
   at Duplicati.Library.JSONWebHelper.PostMultipartAndGetJSONData[T](String url, Action`1 setup, MultipartItem[] parts)
   at Duplicati.Library.Backend.AmazonCloudDrive.AmzCD.Put(String remotename, Stream stream)
   at Duplicati.Library.Main.BackendManager.DoPut(FileEntryItem item)
   at Duplicati.Library.Main.BackendManager.ThreadRun()
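The 429 and 500 responses in the retry entries above are classic transient errors, and the “attempt 1 of 5” wording suggests Duplicati retries them. A minimal, language-agnostic sketch of that pattern in Python — exponential backoff with jitter; all names here are hypothetical, not Duplicati’s actual API:

```python
import random
import time

# HTTP statuses usually treated as transient (worth retrying).
TRANSIENT_STATUS = {429, 500, 502, 503}

class TransientHttpError(Exception):
    """Hypothetical stand-in for a WebException carrying a status code."""
    def __init__(self, status):
        super().__init__(f"HTTP {status}")
        self.status = status

def put_with_retries(upload, attempts=5, base_delay=1.0):
    """Call `upload` up to `attempts` times, backing off exponentially
    (with jitter) between transient failures; re-raise anything else."""
    for attempt in range(1, attempts + 1):
        try:
            return upload()
        except TransientHttpError as exc:
            if attempt == attempts or exc.status not in TRANSIENT_STATUS:
                raise
            # 1s, 2s, 4s, ... plus up to one base_delay of jitter
            time.sleep(base_delay * (2 ** (attempt - 1) + random.random()))
```

The jitter spreads retries out so many clients hammered by the same 429 rate limit don’t all retry in lockstep.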

#3

It’s hard to say, but possible causes include:

  • it really is there, just not reported
  • file was manually deleted by mistake (it happens, I know) :wink:
  • file was automatically moved to cold storage

I seem to recall there having been issues with other providers where the file lists reported by the provider were “delayed” relative to what was uploaded. So, for example, Duplicati would upload the final backup file, run a verification that included making sure that file exists, and conclude “whoops - file does NOT exist, something went wrong”. But upon doing NOTHING and re-running the backup, the verification looks for the file again and it’s magically there.

I think there was another provider with a bug in their API that returned incomplete (or duplicate) file lists in the “list directory contents” requests, which again caused Duplicati to complain that files were missing or extra — and again the root cause was a provider bug.
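If the “delayed listing” theory applies, the usual workaround is to poll the listing a few times before declaring a freshly uploaded file missing. A rough Python sketch of that idea — `list_remote` is a hypothetical callable standing in for the backend’s directory listing, not Duplicati’s real verification code:

```python
import time

def wait_for_listing(list_remote, filename, attempts=5, delay=10.0):
    """Poll the backend's file listing a few times before declaring a
    freshly uploaded file missing, to tolerate eventually consistent
    providers. Returns True as soon as the file shows up."""
    for attempt in range(attempts):
        if filename in list_remote():
            return True
        time.sleep(delay * (2 ** attempt))  # wait longer each round
    return False
```

With something like this, a listing that lags the upload by a minute would produce a delayed success instead of a hard “please run repair” error.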

So since this is a fresh error for you, are you able to connect to ACD through some other tool and manually verify the file is missing?


This is a standard .NET error reported when an in-process thread is abandoned. You can expect this error pretty much any time you tell Duplicati to “Stop now” instead of “Stop after upload”.

Since a manual stop means the user is most likely the cause of the “thread abort” error, you should also check a few of the errors BEFORE the thread abort to see what was going on before you told it to quit.


#4

Yes, it was clearly me who caused that “error”. But that’s my point: if the user aborted the process, don’t make such a fuss about it with a big red error message. Instead of giving me an error message when there is no error (technically it might be one, but from the user’s perspective there is no error when I told it to stop and it stopped), I want one when the backup fails.

So just to be clear: I’m posting this stuff not to have the errors fixed (see this topic for that) but to point out some spots where the UX could be much improved. The only thing worse than a failing backup is a failing backup with cryptic or no error messages.
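To illustrate the point (this is not Duplicati’s actual code), a user-requested stop can be modelled as a distinct outcome rather than as an exception that bubbles up as a red error. A minimal Python sketch with hypothetical names:

```python
import threading

class UserCancelled(Exception):
    """Raised when the user requested a stop; not a genuine failure."""

def run_backup(steps, cancel_event):
    """Run backup steps, checking a cancellation flag between files.
    A user-requested stop yields an informational status string
    instead of surfacing as an error."""
    try:
        for step in steps:
            if cancel_event.is_set():
                raise UserCancelled()
            step()
        return "completed"
    except UserCancelled:
        return "stopped by user"  # informational, not an error
```

The key design choice is that cancellation is checked cooperatively and reported as a status, so the UI can say “stopped by user” in neutral colours and reserve red error banners for actual failures.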


#5

Fair point - if I had paid attention to the topic subject I should have been able to figure that out. (Can I chalk it up to habit?)


#6

That is a bug in the webserver. I want to change it but I have so many other things that I think are more important.

I agree, and can add that @renestach has bugged me many times asking me to change it to something better.


#7

Is there an easy way to change the text of the error message, to add, for example, something like: “This is a known bug. Thank you for your patience”. If so, this might even be something that I could do and submit a PR.


#8

Yes, I think you can catch the error here and emit a custom message:


#9

Any idea why this happens? I’d like to fix it.

While debugging, I found that this happened on a pause POST request, while the exception is thrown whenever something tries to add to an empty HttpInput, which can only happen when handling a GET request.

Edit: HttpForm inherits from HttpInput, so never mind what I said before.
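For what it’s worth, the “Cannot add stuff to HttpInput.Empty” message reads like a shared null-object sentinel rejecting mutation. A rough Python model of that pattern — a simplification for illustration, not the HttpServer library’s actual code:

```python
class HttpInput:
    """Simplified model of a webserver's form/query parameter container.
    A shared, immutable EMPTY sentinel is used for requests that carry
    no parameters; trying to add to it raises, which is roughly the
    failure mode described above."""
    def __init__(self, mutable=True):
        self._items = {}
        self._mutable = mutable

    def add(self, name, value):
        if not self._mutable:
            raise RuntimeError("Cannot add stuff to HttpInput.Empty.")
        self._items[name] = value

    def get(self, name):
        return self._items.get(name)

# Shared sentinel: one frozen instance used for all parameter-less requests.
HttpInput.EMPTY = HttpInput(mutable=False)
```

The fix in such designs is usually to give each request that needs parameters its own mutable container instead of handing out the shared sentinel.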