Found 1 commands but expected 2 (trying simple backup test)

what’s wrong with this picture?

0# duplicati-cli backup /home/greg
Found 1 commands but expected 2, commands:
"/home/greg"
200# dpkg-query -W duplicati mono-devel
duplicati 2.0.5.1-1
mono-devel 6.8.0.105-0xamarin3+ubuntu1604b1

tia,
greg
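It looks like the backup verb is only being given a source. `duplicati-cli backup` wants the storage URL first, then one or more source paths, so a single path argument leaves it one "command" short. A sketch of the expected shape (the destination here is a hypothetical local folder; substitute your real URL and passphrase):

```shell
# duplicati-cli backup wants the storage URL first, then source path(s);
# "duplicati-cli backup /home/greg" supplies only the source, hence
# "Found 1 commands but expected 2".
DEST=file:///home/dup    # hypothetical local-folder destination
SRC=/home/greg
echo "PASSPHRASE=... duplicati-cli backup $DEST $SRC"
# actually run it only where duplicati-cli is installed
command -v duplicati-cli >/dev/null 2>&1 \
  && PASSPHRASE=example duplicati-cli backup "$DEST" "$SRC" || true
```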

ok i think i got it. while i’m here, should i worry over any of these messages:
AUTOUPDATER_Duplicati_POLICY=Never PASSPHRASE=… duplicati-cli backup file:///home/dup /home/greg --exclude=.profile --no-auto-compact=true --auto-vacuum=false --concurrency-max-threads=1 --debug-retry-errors=true --disable-filepath-cache=true --disable-piped-streaming=true --keep-versions=-1 --no-auto-compact=true --restore-permissions=true --throttle-upload=3kb
Backup started at 02/07/2020 17:10:04
Failed to load process type Duplicati.Library.Common.IO.VssBackupComponents assembly /usr/lib/duplicati/Duplicati.Library.IO.dll, error message: Could not load type of field 'Duplicati.Library.Common.IO.VssBackupComponents:_vssBackupComponents' (1) due to: Could not load file or assembly 'AlphaVSS.Common, Version=1.4.0.0, Culture=neutral, PublicKeyToken=959d3993561034e3' or one of its dependencies. => Could not load type of field 'Duplicati.Library.Common.IO.VssBackupComponents:_vssBackupComponents' (1) due to: Could not load file or assembly 'AlphaVSS.Common, Version=1.4.0.0, Culture=neutral, PublicKeyToken=959d3993561034e3' or one of its dependencies.
The option disable-filepath-cache is deprecated: The "disable-filepath-cache" option is no longer used and has been deprecated.
Checking remote backup …
Listing remote folder …
Listing remote folder …
Scanning local files …
19 files need to be examined (59.00 KB)
Uploading file (16.54 KB) …
Uploading file (2.20 KB) …
0 files need to be examined (0 bytes)
Uploading file (1.89 KB) …
Checking remote backup …
Listing remote folder …
Verifying remote backup …
Remote backup verification completed
Downloading file (2.20 KB) …
Downloading file (2.20 KB) …
Downloading file (2.20 KB) …
Downloading file (2.20 KB) …
Downloading file (2.20 KB) …
Failed to process file duplicati-20200207T171006Z.dlist.zip.aes => Offset and length were out of bounds for the array or count is greater than the number of elements from index to the end of the source collection.
Downloading file (1.89 KB) …
Downloading file (1.89 KB) …
Downloading file (1.89 KB) …
Downloading file (1.89 KB) …
Downloading file (1.89 KB) …
Failed to process file duplicati-ie719965207ad4f55b490dd48dcbf6616.dindex.zip.aes => Offset and length were out of bounds for the array or count is greater than the number of elements from index to the end of the source collection.
Downloading file (16.54 KB) …
Downloading file (16.54 KB) …
Downloading file (16.54 KB) …
Downloading file (16.54 KB) …
Downloading file (16.54 KB) …
Failed to process file duplicati-bcc5b311b17f94e0a8e8e083bb1248eb5.dblock.zip.aes => Offset and length were out of bounds for the array or count is greater than the number of elements from index to the end of the source collection.
Duration of backup: 00:02:11
Remote files: 3
Remote size: 20.63 KB
Total remote quota: 906.38 GB
Available remote quota: 819.84 GB
Files added: 11
Files deleted: 0
Files changed: 0
Data uploaded: 20.63 KB
Data downloaded: 0 bytes
Backup completed successfully!
3#

It looks like the backup completed but then the verification part failed. I would worry about it - verification should succeed. Restores may also fail for the same reason verification failed. Have you tried to restore?

i got the impression from the manual that --disable-piped-streaming=true would reduce pointless overhead since i intend to use --throttle-upload=50kb anyway to keep traffic low. however it appears --disable-piped-streaming=true breaks restore!

it’s a bit confusing what happens upon completion of the backup. first i see “Remote backup verification completed”, then “Downloading…”, and, with --disable-piped-streaming=true, “Failed to process file duplicati-20200208T025245Z.dlist.zip.aes => Offset and length were out of bounds for the array or count is greater than the number of elements from index to the end of the source collection”.

while we’re looking, at the beginning of the run, it always complains “Failed to load process type Duplicati.Library.Common.IO.VssBackupComponents assembly /usr/lib/duplicati/Duplicati.Library.IO.dll, error message: Could not load type of field 'Duplicati.Library.Common.IO.VssBackupComponents:_vssBackupComponents' (1) due to: Could not load file or assembly 'AlphaVSS.Common, Version=1.4.0.0, Culture=neutral, PublicKeyToken=959d3993561034e3' or one of its dependencies. => Could not load type of field 'Duplicati.Library.Common.IO.VssBackupComponents:_vssBackupComponents' (1) due to: Could not load file or assembly 'AlphaVSS.Common, Version=1.4.0.0, Culture=neutral, PublicKeyToken=959d3993561034e3' or one of its dependencies.” but goes ahead and seems to work.

dpkg-query -W duplicati mono-devel

duplicati 2.0.5.1-1
mono-devel 6.8.0.105-0xamarin3+ubuntu1604b1

cat /etc/lsb-release

DISTRIB_ID=Ubuntu
DISTRIB_RELEASE=16.04
DISTRIB_CODENAME=xenial
DISTRIB_DESCRIPTION="Ubuntu 16.04.6 LTS"

uname -a

Linux … 3.10.0-957.12.2.vz7.96.21 #1 SMP Thu Jun 27 15:10:55 MSK 2019 x86_64 x86_64 x86_64 GNU/Linux

continuing same topic as i’m still just getting started, but if you’d rather i split this into a different thread just say so.

it looks like duplicati fails to fetch files with funny filenames. linux filenames can legally contain all manner of odd and annoying characters. yup, all those control characters and all. still, if it’s a legal file, i’d prefer it to get backed up. try creating some files with icky names such that you try the entire character set, i think nearly everything but /, not sure about null and maybe one or two others. and see how duplicati falls over.
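that stress test can be scripted. a sketch in bash (/tmp/icky is an arbitrary scratch path; the only bytes a linux filename can’t contain are NUL and '/'):

```shell
# Create one file per legal filename byte: values 1..255 except '/' (47).
# NUL (0) cannot occur in a Linux filename at all, so the loop starts at 1.
dir=/tmp/icky                               # arbitrary scratch location
rm -rf "$dir" && mkdir -p "$dir"
for i in $(seq 1 255); do
  [ "$i" -eq 47 ] && continue               # 47 = '/', the one other illegal byte
  oct=$(printf '%03o' "$i")                 # e.g. 231 for byte 0x99
  name=$(printf "f\\${oct}x")               # byte kept mid-string so a trailing
  touch "$dir/$name"                        # newline isn't stripped by $(...)
done
set -- "$dir"/*                             # glob is byte-safe, unlike ls | wc -l
echo "created $# files"                     # 254 = bytes 1..255 minus '/'
```

Backing that directory up and restoring it should exercise every oddball name, control characters and all.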

I’m not familiar with this option at all, but that’s a good discovery. Can you please post an issue on GitHub: Issues · duplicati/duplicati · GitHub

Interesting… I’m not seeing that on my Debian system.

Can you give an example?

Error reported while accessing file: /mnt/gb/192.168.0.4/146/f%2f/fsb/fh/fg/fgsharp.g/faero/fWINDOWS/fZUBEH?R.GRP => Could not find file '/mnt/gb/192.168.0.4/146/f%2f/fsb/fh/fg/fgsharp.g/faero/fWINDOWS/fZUBEH?R.GRP'.
Failed to process path: /mnt/gb/192.168.0.4/146/f%2f/fsb/fh/fg/fgsharp.g/faero/fWINDOWS/fZUBEH?R.GRP => Path doesn't exist!
Error reported while accessing file: /mnt/gb/192.168.0.4/146/f%2f/fsb/fh/fg/fmydata/feudora/fEmbedded/f?_?R???__.filestit.gif => Could not find file '/mnt/gb/192.168.0.4/146/f%2f/fsb/fh/fg/fmydata/feudora/fEmbedded/f?_?R???__.filestit.gif'.
Failed to process path: /mnt/gb/192.168.0.4/146/f%2f/fsb/fh/fg/fmydata/feudora/fEmbedded/f?_?R???__.filestit.gif => Path doesn't exist!
Error reported while accessing file: /mnt/gb/192.168.0.4/146/f%2f/fsb/fh/fg/fmydata/feudora/fEmbedded/f?_?R???__.filesboon.gif => Could not find file '/mnt/gb/192.168.0.4/146/f%2f/fsb/fh/fg/fmydata/feudora/fEmbedded/f?_?R???__.filesboon.gif'.
Failed to process path: /mnt/gb/192.168.0.4/146/f%2f/fsb/fh/fg/fmydata/feudora/fEmbedded/f?_?R???__.filesboon.gif => Path doesn't exist!

What do those filenames look like on the actual machine where they are hosted? I wonder if those “?” characters are really unicode, and it is getting mistranslated somehow when mounted on your Duplicati machine.

i’d say that misses the point. they’re on linux, and tho both the content, and the filenames, may contain all manner of unprintable characters, they’re legal files and filenames, and can be handled by standard gnu/linux utilities (tho they likely require special quoting in the shell). and i’d prefer to see all my legal files fully backed up.

iiuc filenames are kind of like system time. you may interpret system time into any time zone, but system time isn’t beholden to any time zone. as for filenames, perhaps you may make a reasonable guess as to the language of their creator, but the filenames themselves aren’t beholden to any language or locale or character set.

as for what are the actual characters in my examples, well, i’d say, pick any, try all, duplicati should work for all legal filename characters.

but since you asked:
0000000 f Z U B E H 231 R . G R P \n >fZUBEH.R.GRP.<

0000000 f 250 _ 372 R 246 247 247 223 _ _ . f i l e >f._.R…__.file<
0000020 s t i t . g i f \n >stit.gif.<

0000000 f 250 _ 372 R 246 247 247 223 _ _ . f i l e >f._.R…__.file<
0000020 s b o o n . g i f \n >sboon.gif.<

I’m asking so I can try to reproduce your issue so I can help debug it. My goal isn’t to try and tell you to use different filenames.

How did you generate this output? I’m guessing the number values are for when the characters are non-printable ASCII? I’m thrown off by the ‘372’ value since that is >255.

My guess is that 372 is octal. This looks like what I sometimes see in od -c except my part on the right of each line isn’t there, so it might be od -tcz or something. Format is nicer without forum reformatting:

0000000   f   Z   U   B   E   H 231   R   .   G   R   P  \n              >fZUBEH.R.GRP.<

0000000   f 250   _ 372   R 246 247 247 223   _   _   .   f   i   l   e  >f._.R....__.file<
0000020   s   t   i   t   .   g   i   f  \n                              >stit.gif.<

0000000   f 250   _ 372   R 246 247 247 223   _   _   .   f   i   l   e  >f._.R....__.file<
0000020   s   b   o   o   n   .   g   i   f  \n                          >sboon.gif.<

(comparison)

$ /bin/echo -ne 'fZUBEH\0231R.GRP\n' | od -tcz
0000000   f   Z   U   B   E   H 231   R   .   G   R   P  \n              >fZUBEH.R.GRP.<
0000015
$ 

I was originally wondering if this was UTF-8, but octal 250 (assuming it’s octal…) is 10101000, a continuation-byte pattern that can’t start a UTF-8 sequence, to the extent I’m familiar with the encoding (not very). If high-bit characters genuinely weren’t handled by Duplicati, I’d expect Unicode printables to fail too, but they don’t…
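To double-check that guess, the exact bytes from the od dumps can be fed through iconv, which rejects malformed UTF-8 (a sketch, assuming the numbers really are octal; \231 and \250 etc. are the bytes quoted above):

```shell
# 0x99 (\231) and 0xA8 (\250) are 10xxxxxx continuation bytes, and 0xFA
# (\372) is not a legal lead byte either, so iconv should reject both names.
for name in 'fZUBEH\231R.GRP' 'f\250_\372R\246\247\247\223__.filestit.gif'; do
  if printf "$name" | iconv -f UTF-8 -t UTF-8 >/dev/null 2>&1; then
    echo "valid UTF-8: $name"
  else
    echo "not valid UTF-8: $name"
  fi
done
```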

Are other files under /mnt/gb/192.168.0.4/146/f%2f/fsb/fh/fg OK? Any pattern to failure?
Earlier there was mention of “funny filenames”. Are other sorts of names in the same folder working OK?

glad you ask. now that i look, the files that were backed up are staggeringly sparse. none at all anywhere near the files that elicited the ‘Failed to process’ messages. a possible explanation is the backup didn’t come to completion. currently i’m running another. but with severely throttled communications so even a smallish backup takes a rather long time.

how do you escape forum formatting?

it appears --disable-piped-streaming=true breaks restore!

I’m not familiar with this option at all, but that’s a good discovery. Can you please post an issue on GitHub

done: --disable-piped-streaming=true breaks restore · Issue #4094 · duplicati/duplicati · GitHub

This report contains two different “Failed to process” message types:

Failed to process file refers to a destination file and probably happens on verification after backup. Seeing your Downloading file lines show the same sized file five times before failing makes me suspect retries. --number-of-retries defaults to 5, but error detail isn’t shown until the final failure unless the log level is retry, e.g. About --> Show log --> Live --> Retry
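If I recall the options correctly, the same Retry-level detail can also be captured from duplicati-cli with a log file, so you don’t need the live log (a sketch; the log path is arbitrary):

```shell
# --log-file-log-level=Retry records each retry's error as it happens,
# instead of only the final failure after retries are exhausted.
CMD="duplicati-cli backup file:///home/dup /home/greg --log-file=/tmp/duplicati.log --log-file-log-level=Retry"
echo "$CMD"
# run it only where duplicati-cli is installed
command -v duplicati-cli >/dev/null 2>&1 && eval "PASSPHRASE=example $CMD" || true
```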

Failed to process path happens while processing source files, and names a source path. I don’t think it retries. These are the ones where the characters in use were a concern, and I’m not sure where that stands now…

Can you clarify “sparse” (which I can think of as Sparse Files) and “anywhere near” (space, time, etc.)?

Put lines with three backticks (```) above and below. I’m not sure it’s perfect but it can help quite a bit…

ie for both of 2 backups, find shows mostly empty directories and precious few files, with those precious few files not being in the same directory nor directories even nearly adjacent to the files that “Failed to process”. these 2 backups are both to the same destination directory, and the only ones to that directory. however prior backups were sent to a different destination directory. to which i stopped sending because of “Unexpected difference in fileset”. however i didn’t think to ditch the local database when starting to use the new destination directory.

Guessing these are “Failed to process path”, meaning they’re source files. Do the characters in the names look relevant to the failures? What do the working files and empty directories have in their filenames?

What does “2 backups” mean? Is this two runs from one source area, or from different source areas? The main reason I wonder if they’re different is the earlier “for both of 2 backups, find shows”. Why re-run find?

I’m not that familiar with it, but I’d thought dbconfig.json would assign a new DB for a new destination.
You can view your ~/.config/Duplicati/dbconfig.json to see if the way it looks leads you the same way.
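A quick way to read it (a sketch; `python3 -m json.tool` is just a generic JSON pretty-printer, and the file only exists on a machine that has run Duplicati):

```shell
# pretty-print dbconfig.json if it's present, otherwise say so
f="$HOME/.config/Duplicati/dbconfig.json"
[ -r "$f" ] && python3 -m json.tool "$f" || echo "no $f here"
```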

i really appreciate your persistence in seeking solutions and understanding! i’m sorry to say i’m moving on at this point, due primarily to lack of dedup across hosts, to have a look at borg primarily, and perhaps duplicacy. many thanks for your efforts!

As I understand it (I don’t use it), multiple computers to same storage is the norm for Duplicacy (to allow dedup across hosts). On Duplicati, it’s a definite don’t-do, as different computers will step on each other. Varying source folders to the same destination from one computer would be safe, but would be unusual.

Good luck on your search!