Continuous Backup failures

I like the idea of this and am very familiar with enterprise dedup backup software (e.g. Data Domain, Avamar), but I can’t get it to work reliably. I am trying to protect my laptop (about 60 GB in use) to my local NAS device. I’ve tried both a mapped drive letter and going direct to the NAS via FTP. The mapped drive letter works for a few days and then fails (with various errors). I’ve made sure the drive is accessible and repaired/deleted/rebuilt the DB, but no matter what I do it fails after a few days.
So, now I tried backing up using FTP and it won’t even complete the first backup.
I have tried changing the block size from 50 MB to 10 MB with no luck. It writes a few files (4) and then fails. I’ve attached the bug report from the last FTP attempt.

It would be helpful if it were easier to look at the failure messages in the logs. It looks like they are stored in the SQLite DB in YAML format. Maybe I should write an extractor.

bugreport.zip (232 KB)
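For what it’s worth, the extractor doesn’t need to be much: a few lines of Python against the job’s SQLite file will pull the stored messages out. This is only a sketch — the LogData table and its Timestamp/Message/Exception columns are my assumption about the schema, so verify against your DB with .schema first:

```python
import sqlite3

def dump_errors(db_path, limit=20):
    """Return the most recent (Timestamp, Message, Exception) rows from a
    Duplicati job database.  Table/column names are an assumption -- check
    yours first with:  sqlite3 <db> .schema"""
    con = sqlite3.connect(f"file:{db_path}?mode=ro", uri=True)  # open read-only
    try:
        return con.execute(
            "SELECT Timestamp, Message, Exception FROM LogData "
            "ORDER BY Timestamp DESC LIMIT ?", (limit,)).fetchall()
    finally:
        con.close()

# Usage (the DB path varies by install; this one is made up):
#   for ts, msg, exc in dump_errors(r"C:\Users\me\AppData\Local\Duplicati\XXXXXXXX.sqlite"):
#       print(ts, msg)
```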

Welcome to the forum @fptmark

Getting enough log detail to debug takes work. The job’s “Show log” is rather skimpy: the one-line summaries often show that there is an issue, but not the detail behind it. For that you need something like Viewing the Duplicati Server Logs (e.g. About --> Show log --> Live --> Retry) or setting up –log-file, whose default –log-file-log-level should be enough to see some details. Setting the level to Retry is useful for more information on your retry history, which is configured by –number-of-retries (slightly misnamed). From your DB bug report it looks like FTP chewed through all 5. You can see the five tries in the DB (below), but you need more log detail to get the cause. Does the destination’s “Test connection” button work?

Note that the same file (identified as the same because its size and hash match) is renamed for the retry. You can see the same display in your job’s Show log --> Remote if you click on the lines to see details.

Please get some more log information on the FTP problem, and also try the Test connection button.

Thanks for your reply.

Test – I always make sure the test connection works. I have tried uploading files via FileZilla and the ftp command line. Most files work, but some (ntuser.dat.LOG1) fail from the command line. The file does get created but is empty. This seemed to roughly track what I got from the log. I have increased the log level and set the retries to 10. Obviously I have a local config problem, or the command-line ftp would work.

The current log is attached. Your help is appreciated.
DupLog.zip (4.6 KB)

Could my problem be related to an older NAS (Zyxel 320)? I was checking its CPU utilization and saw it hit 100% just before the backup failed. I’ve shut off all services on the NAS (Twonky) and tried rebooting it, but it still fails.
Any recommendations for a home NAS? I’m looking at a Synology 2-bay.

The more detailed error message (which is .NET Framework complaining vaguely about your NAS) is:

2019-10-13 18:57:56 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-bc9f08acddca942cab61febfbc041f54d.dblock.zip.aes (49.93 MB)
2019-10-13 18:57:56 -07 - [Retry-Duplicati.Library.Main.Operation.Common.BackendHandler-RetryPut]: Operation Put with file duplicati-bc9f08acddca942cab61febfbc041f54d.dblock.zip.aes attempt 1 of 10 failed with message: The underlying connection was closed: The server committed a protocol violation.
System.Net.WebException: The underlying connection was closed: The server committed a protocol violation.
   at Duplicati.Library.Utility.AsyncHttpRequest.AsyncWrapper.GetResponseOrStream()
   at Duplicati.Library.Utility.AsyncHttpRequest.GetRequestStream(Int64 contentlength)
   at Duplicati.Library.Backend.FTP.Put(String remotename, Stream input)
   at Duplicati.Library.Main.Operation.Common.BackendHandler.<DoPut>d__23.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Duplicati.Library.Main.Operation.Common.BackendHandler.<>c__DisplayClass13_0.<<UploadFileAsync>b__2>d.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at Duplicati.Library.Main.Operation.Common.BackendHandler.<DoWithRetry>d__21`1.MoveNext()
2019-10-13 18:57:56 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Retrying: duplicati-bc9f08acddca942cab61febfbc041f54d.dblock.zip.aes (49.93 MB)
2019-10-13 18:58:06 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Rename: duplicati-bc9f08acddca942cab61febfbc041f54d.dblock.zip.aes (49.93 MB)
2019-10-13 18:58:06 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Rename: duplicati-bea1275da8c9e45cf837b41e9ac4f890b.dblock.zip.aes (49.93 MB)
2019-10-13 18:58:06 -07 - [Information-Duplicati.Library.Main.Operation.Common.BackendHandler-RenameRemoteTargetFile]: Renaming "duplicati-bc9f08acddca942cab61febfbc041f54d.dblock.zip.aes" to "duplicati-bea1275da8c9e45cf837b41e9ac4f890b.dblock.zip.aes"
2019-10-13 18:58:06 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-bea1275da8c9e45cf837b41e9ac4f890b.dblock.zip.aes (49.93 MB)
2019-10-13 18:58:12 -07 - [Retry-Duplicati.Library.Main.Operation.Common.BackendHandler-RetryPut]: Operation Put with file duplicati-bea1275da8c9e45cf837b41e9ac4f890b.dblock.zip.aes attempt 2 of 10 failed with message: The underlying connection was closed: The server committed a protocol violation.

which doesn’t really help much. FTP is a very old, rather simple protocol that commonly encrypts nothing, so getting a network trace would be an option if you’re up to it. Other than that, I’m unsure how to get a better view of the problem unless your NAS has logs. You could certainly try the workaround –ftp-passive=true, or see if FTP (Alternative) works better with your NAS. I can’t comment on your NAS’s FTP.

If you decide you want to try a network trace, it would probably be with Wireshark, unless the NAS has tcpdump (some seem to); alternatively, .NET Framework can possibly write a trace to a file on the PC.

The Best NAS for Most Home Users likes the Synology DS218+; however, a user recently found it (and maybe Duplicati) not capable enough to back up the NAS to SharePoint, especially in Docker. There’s typically not a lot of CPU power or memory in a NAS. I have no personal recommendations.

I tried ftp-passive=true and it didn’t help. I also noticed interesting behavior with number-of-retries. It was 10 before, and I noticed that 9 files were written to my target before it failed. After wiping everything out and changing number-of-retries to 5, 5 files were written to the target before it failed.
It makes me wonder whether the system is correctly detecting that a file is complete. The size of each .aes file is correct (50 MB).

50 MB is only the rough target size, so sizes that look right at that granularity don’t prove much. If you look at the files directly on the destination, you can get the actual size down to the byte, and possibly also a sha256sum or similar, to see whether the many visible files are actually the same file, renamed by Duplicati and retried because the upload seemed to fail. That is basically your theory, except your phrasing sounds like different files were written (i.e. actual progress), whereas the log you sent with 10 retries looks like your FTP server sent an error every time.
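One way to settle the renamed-vs-distinct question is to hash the destination files and group by digest: identical digests mean the several names are one payload renamed per retry, not actual progress. A sketch (the folder and filename pattern below are assumptions about your setup):

```python
import hashlib
import pathlib
from collections import defaultdict

def group_by_digest(folder, pattern="*.dblock.zip.aes"):
    """Map sha256 digest -> list of file names.  Several names under one
    digest means the 'different' files are really one upload, renamed
    for each retry."""
    groups = defaultdict(list)
    for path in sorted(pathlib.Path(folder).glob(pattern)):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            # Hash in 1 MB chunks so large volumes don't load into memory
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        groups[h.hexdigest()].append(path.name)
    return dict(groups)
```

Pointed at the mounted NAS share, a result with one digest owning all the dblock names would confirm the rename-and-retry theory.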

The only way to know what your FTP server is actually doing is to check its logs, if it has any, or watch the network activity. FTP comes in lots of varieties, and compatibility can be an issue. Duplicati has two client options: one in Microsoft .NET Framework, and “FTP (Alternative)”, which is a third-party library. Please be sure to try both. If neither works with your FTP server, then it’s down to diagnosing or abandoning FTP.

Duplicati.CommandLine.BackendTester.exe passing would be a good sign, and even if it fails, it might give some more idea of what’s going on. You would point it at an empty folder on the NAS, using the URL that Export from the job gives, adjusted to an unused folder and maybe with other syntax tweaks.

Duplicati.CommandLine.BackendTool.exe is another command-line tool if you want to run do-it-yourself tests. These command-line tools sometimes dump more information to the screen than shows up in the log data.
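If you want a reproduction completely outside both Duplicati and FileZilla, Python’s standard ftplib can do a store/retrieve/compare round trip against the NAS. This is only a sketch — the host, credentials, and folder are placeholders, and it’s no substitute for BackendTester:

```python
import ftplib
import io

def make_payload(size=1024 * 1024):
    """Deterministic test blob, padded up to a multiple of 256 bytes."""
    return bytes(range(256)) * max(1, size // 256)

def ftp_roundtrip(host, user, password, folder, size=1024 * 1024):
    """STOR a blob, RETR it back, delete it, and compare byte-for-byte.
    Returns True only when upload and download match exactly."""
    payload = make_payload(size)
    with ftplib.FTP(host) as ftp:
        ftp.login(user, password)
        ftp.cwd(folder)
        ftp.set_pasv(True)  # flip to False to compare with active mode
        ftp.storbinary("STOR duplicati-selftest.bin", io.BytesIO(payload))
        buf = io.BytesIO()
        ftp.retrbinary("RETR duplicati-selftest.bin", buf.write)
        ftp.delete("duplicati-selftest.bin")
    return buf.getvalue() == payload

# Usage (placeholders):  ftp_roundtrip("mynas.local", "mark", "secret", "/backup")
```

If this fails at roughly the 50 MB size while small sizes pass, that would point at the NAS’s FTP daemon rather than at Duplicati.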