Some unit tests seem broken on Windows

I’m seeing some unit tests failing on Windows for me using the current master branch. The “RunCommand” test fails when copying the data files because it encounters long paths.

Can anyone test RunCommand and see if they also see it fail?

This may be related:

I modified the tests so that they automatically download the required data instead of relying on an external process to do so. Prior to this, the tests would not run at all, or we would have to download the test data separately. Unfortunately, I guess this broke the tests on Windows.

I think it is, but I’d like confirmation that someone else is seeing the same thing.

In TestUtils.cs CopyDirectoryRecursive(), I tested replacing File.Copy(n, tf, true); with SystemIO.IO_OS.FileCopy(n, tf, true);, which should allow it to handle the long paths. But it still fails.
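For reference, the change I tested looks roughly like this. This is only a sketch of the recursive copy: apart from FileCopy, the SystemIO.IO_OS member names used here (DirectoryExists, DirectoryCreate, GetFiles, GetDirectories) are assumptions and may not match the actual interface exactly.

```csharp
using System.IO;

// Sketch of TestUtils.cs CopyDirectoryRecursive() routed through the
// SystemIO abstraction, which on Windows should wrap long paths with the
// \\?\ prefix before calling into the OS.
public static void CopyDirectoryRecursive(string sourceFolder, string targetFolder)
{
    if (!SystemIO.IO_OS.DirectoryExists(targetFolder))
        SystemIO.IO_OS.DirectoryCreate(targetFolder);

    foreach (var n in SystemIO.IO_OS.GetFiles(sourceFolder))
    {
        var tf = Path.Combine(targetFolder, Path.GetFileName(n));
        // Previously: File.Copy(n, tf, true);
        // System.IO cannot handle paths over MAX_PATH on older frameworks.
        SystemIO.IO_OS.FileCopy(n, tf, true);
    }

    foreach (var d in SystemIO.IO_OS.GetDirectories(sourceFolder))
        CopyDirectoryRecursive(d, Path.Combine(targetFolder, Path.GetFileName(d)));
}
```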

So if we look at SystemIOWindows.cs and IsPathTooLong(), it treats long paths as those over 260 characters. The file path that fails for me is 255 characters. Shouldn’t long paths be anything over, say, 254? Edit: It looks like >260 is correct.

I’m not sure why the copy is failing.

Did you manually download the test files before? Or were these tests not running at all?

When I briefly inspected the zip file, it seemed to contain insanely long paths.

I don’t think I downloaded the file before; maybe this is the first time I’m seeing the full test try to run. So this data file wasn’t changed? Does the unit test pass for you?

How were you running the tests before? I think the BulkData and SVNData tests should have failed for you if the external test data was not present. Given this pull request, I’m guessing that you were able to run some of these tests locally at some point?

For me (on Linux), these tests used to always fail since I never downloaded the test data. Now, the tests are responsible for obtaining the data and cleaning up after themselves.

I was able to run all the tests before. I think I might have just created my own file after seeing what file it wanted to use… not knowing there was an existing one to pull from S3.

You’re on Linux and not Windows at all?

It seems like AlphaFS should be handling this, but I’m not sure it is. This code below should work for the AlphaFS line and fail as expected for System.IO.File.
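The snippet itself wasn’t included in the post; a minimal repro along these lines (hypothetical paths, and assuming the AlphaFS package is referenced) illustrates the expected difference:

```csharp
using System.Linq;

// Hypothetical repro: build a path longer than MAX_PATH (260 chars) out of
// components short enough for NTFS (each component under 255 chars).
var longDir = @"C:\temp\" + string.Join(@"\",
    Enumerable.Repeat(new string('a', 100), 3));   // roughly 310 chars total

// AlphaFS should create and copy into the long path, since it uses the
// \\?\ long-path prefix internally:
Alphaleonis.Win32.Filesystem.Directory.CreateDirectory(longDir);
System.IO.File.WriteAllText(@"C:\temp\source.txt", "test");
Alphaleonis.Win32.Filesystem.File.Copy(
    @"C:\temp\source.txt", longDir + @"\copy.txt", true);

// On .NET Framework before 4.6.2 this throws System.IO.PathTooLongException:
System.IO.File.Copy(@"C:\temp\source.txt", longDir + @"\copy2.txt", true);
```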

Yes, Linux only. I think an alternative would be for the setup methods to generate the test data instead of it being hosted somewhere else.

That would be nice.

So I’m not sure AlphaFS is working as it should. It seems widely used, so it’s odd for this not to work.

I don’t see how the TestUtils.cs CopyDirectoryRecursive() could be working at all on Windows with the test data because of the long paths. CopyDirectoryRecursive() is using all System.IO calls which can’t handle those paths.

I think I need to at least modify CopyDirectoryRecursive() to use the SystemIO class.

There are a lot more cases of System.IO being used throughout, though. Moving to .NET 462 would completely resolve this issue, since 462 handles long paths. Then we just use AlphaFS for the symlink handling and possibly permissions.

The move to 462 really needs to happen. Otherwise there is a lot of duplicate work to get everything onto AlphaFS, and then later have to remove AlphaFS again for almost everything. Both 462 and .NET Core handle long paths.

Linux handles all the long paths fine. On the Windows side, the Duplicati implementation for handling long paths seems to be pretty broken.

It sure looks to me that in SystemIOWindows.cs, PathCombine() can’t possibly be handling long paths correctly. The second if statement, if (!IsPathTooLong(combinedPath + "\\" + paths[i])), only applies when the path is not too long. Once the path is too long, nothing is done and the path is effectively truncated.
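For illustration, here is a sketch of how PathCombine() could handle the overflow case instead of silently dropping the remaining components. This is not the actual Duplicati code: the helper names (IsPathTooLong, PrefixWithUNC) are assumed to match what SystemIOWindows.cs exposes.

```csharp
using System.IO;

// Sketch: once the combined path would exceed the limit, switch to the
// \\?\ prefixed form and keep appending, instead of skipping components.
public static string PathCombine(params string[] paths)
{
    var combinedPath = paths[0];
    for (var i = 1; i < paths.Length; i++)
    {
        if (!IsPathTooLong(combinedPath + "\\" + paths[i]))
            combinedPath = Path.Combine(combinedPath, paths[i]);
        else
            // Previously nothing happened here, truncating the result.
            combinedPath = PrefixWithUNC(combinedPath) + "\\" + paths[i];
    }
    return combinedPath;
}
```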

Then there is IsPathTooLong(), which is also broken. File paths must be less than 260 characters, but the method currently treats long paths as those over 260; it should be >=260, not >260. Secondly, there is no check of just the directory path, which must be less than 248.

System.IO.PathTooLongException : The specified path, file name, or both are too long. The fully qualified file name must be less than 260 characters, and the directory name must be less than 248 characters
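A corrected check, following the limits stated in that exception message, might look like this. This is a sketch, not the existing Duplicati method; the constant names and the UNC-prefix exemption are assumptions.

```csharp
using System;
using System.IO;

// Sketch of a corrected IsPathTooLong(): the fully qualified file name
// must be < 260 chars and the directory name < 248. Paths already using
// the \\?\ prefix are exempt from both limits.
private const int MAX_PATH = 260;      // full path must be strictly less
private const int MAX_DIR_PATH = 248;  // directory part must be strictly less

private static bool IsPathTooLong(string path)
{
    if (path.StartsWith(@"\\?\", StringComparison.Ordinal))
        return false;

    if (path.Length >= MAX_PATH)   // was: > MAX_PATH, an off-by-one
        return true;

    // Also check the directory portion, which the current code skips.
    var dir = Path.GetDirectoryName(path);
    return dir != null && dir.Length >= MAX_DIR_PATH;
}
```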


Also, for the unit tests that fail on the Windows side, the data.zip contents seem to be very odd and cause exceptions for me; it seems like there is something wrong with the “Icons” folder data.

My tests run fine if I replace the entire Icons directory with something similar like 3000 files from fontawesome.

I’d really like to simplify and strip the unneeded AlphaFS code in SystemIOWindows in the .net 462 branch since AlphaFS is then only needed for some file handling.

This is my modified data.zip

I’d like to add some logic to the use of data.zip so a developer can use an alternative zip if desired.

I’m not familiar with the issues in Windows. My understanding is that the current data.zip file contains prohibitively long paths. Are you able to unzip the current data.zip without issues? Is it possible in Windows to create files with paths that are too long, but the APIs provided by Windows (or .NET) for interacting with them have limitations?

I guess that’s the feature that I broke with my changes. Perhaps the OneTimeTeardown method can simply check if the file exists before downloading it, or make use of an environment variable to use an alternative source of test data.

I’m not sure what the problem is with the files in data.zip. Odd exceptions get thrown even if I try to do a simple File.Copy in a console app. I now recall this is why I didn’t use data.zip and had created my own.

I have this added and will do a PR: 3871

Currently data.zip gets downloaded for each unit test that uses it. I’ve made it so data.zip is not removed, but it will be re-downloaded if the S3 file gets updated.
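The caching idea can be sketched like this: keep data.zip on disk and only re-download when the object in S3 has changed, comparing the ETag from a HEAD request against a locally stored copy. The URL and file names here are placeholders, and the actual PR may implement this differently.

```csharp
using System.IO;
using System.Net.Http;

// Sketch: skip the download unless the S3 object's ETag has changed.
var url = "https://s3.amazonaws.com/bucket/data.zip"; // placeholder URL
var zipPath = "data.zip";
var etagPath = zipPath + ".etag";

using (var client = new HttpClient())
{
    // HEAD request: fetch only the headers, including the ETag.
    var head = new HttpRequestMessage(HttpMethod.Head, url);
    var remoteEtag = (await client.SendAsync(head)).Headers.ETag?.Tag;
    var localEtag = File.Exists(etagPath) ? File.ReadAllText(etagPath) : null;

    if (!File.Exists(zipPath) || remoteEtag == null || remoteEtag != localEtag)
    {
        File.WriteAllBytes(zipPath, await client.GetByteArrayAsync(url));
        if (remoteEtag != null)
            File.WriteAllText(etagPath, remoteEtag);
    }
}
```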