Tags: java, windows, apache-spark, hadoop

Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z unresolved


I have looked through answers to similar issues and none of them resolved mine. Some hadoop commands seem to work (for example, `hadoop fs -cat`) while others do not (`hadoop fs -ls`, which threw this error).

I have my Path variable set up:

[screenshot: Path environment variable]

and winutils even appears to be found:

[screenshot: native libs check]

I also have hadoop.dll in that same directory and in my Windows/System32 folder, and it still doesn't work after restarting my machine.

[screenshot: bin directory with winutils.exe and hadoop.dll]
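For anyone debugging the same setup: `System.loadLibrary` only searches `java.library.path` (derived from PATH on Windows), so it's worth checking that the JVM can actually see hadoop.dll from there. A stdlib-only sketch (the helper name is mine, not a Hadoop API):

```java
import java.io.File;
import java.util.ArrayList;
import java.util.List;

public class FindHadoopDll {
    // Return every directory on the given search path that contains libName.
    static List<String> dirsContaining(String searchPath, String libName) {
        List<String> hits = new ArrayList<>();
        if (searchPath == null) return hits;
        for (String dir : searchPath.split(File.pathSeparator)) {
            if (!dir.isEmpty() && new File(dir, libName).isFile()) {
                hits.add(dir);
            }
        }
        return hits;
    }

    public static void main(String[] args) {
        // java.library.path is where System.loadLibrary looks for hadoop.dll.
        String libPath = System.getProperty("java.library.path");
        List<String> hits = dirsContaining(libPath, "hadoop.dll");
        System.out.println(hits.isEmpty()
                ? "hadoop.dll NOT found on java.library.path"
                : "hadoop.dll found in: " + hits);
    }
}
```

If this prints "NOT found", the JVM would never load the DLL no matter how correct its contents are.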

Truly a head-scratcher, because I would not have expected partial functionality if something were broken.

Hadoop version is 3.3.6

Java is version "1.8.0_202"

Spark is 3.5.3
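Another quick isolation step is to load hadoop.dll directly, before Spark or Hadoop ever touches it; `System.load` takes an absolute path and fails immediately if the file is missing, the wrong architecture, or has unresolvable dependencies. The path below is an assumption, adjust it to your own bin directory:

```java
public class LoadDllCheck {
    // Try to load a native library from an absolute path and report the outcome.
    static String tryLoad(String absolutePath) {
        try {
            System.load(absolutePath);
            return "loaded OK";
        } catch (UnsatisfiedLinkError e) {
            // Thrown if the file is missing, is the wrong architecture
            // (32- vs 64-bit), or a dependent DLL cannot be found.
            return "failed: " + e.getMessage();
        }
    }

    public static void main(String[] args) {
        // Hypothetical path: point this at your HADOOP_HOME\bin
        System.out.println(tryLoad("C:\\hadoop\\bin\\hadoop.dll"));
    }
}
```

Note that "loaded OK" still doesn't prove the DLL exports every symbol Hadoop will later ask for, which matters for the error in this question.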


Solution

  • I changed my version of Hadoop to 3.2.2 and it now works, though to be honest I have no idea why

    A lot of Hadoop seems poorly documented, and I'm not sure where I could have read up on a versioning issue (assuming that's what it was) other than finding it through trial and error like I did
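For what it's worth, the partial functionality is consistent with how JNI linking works: a native method only fails at the moment it is first called, so a hadoop.dll built against a different Hadoop version can load without complaint and still lack the `NativeIO$Windows.access0` symbol that `hadoop fs -ls` happens to hit, while commands that never call it (like `fs -cat` here) keep working. A minimal illustration of that lazy failure (no Hadoop involved, just an unbound native method):

```java
public class LazyLinkDemo {
    // Declared native, but no library providing it is ever loaded,
    // mirroring a hadoop.dll that is missing the access0 symbol.
    private static native boolean access0(String path, int mode);

    static String callNative() {
        try {
            access0("C:\\tmp", 1);
            return "resolved";
        } catch (UnsatisfiedLinkError e) {
            // The class loaded fine; the error only appears at call time.
            return "UnsatisfiedLinkError";
        }
    }

    public static void main(String[] args) {
        System.out.println("class loaded without error");
        System.out.println(callNative());  // fails only on this line
    }
}
```

That would explain why matching the hadoop.dll/winutils build to the Hadoop version (here, dropping to 3.2.2) fixed it, though I can't confirm which exact symbol mismatch was involved.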