I have looked through answers for similar issues and none of them resolved the problem I'm having. Some hadoop commands seem to work (for example, hadoop fs -cat) while others do not (hadoop fs -ls, which threw this error), and winutils even appears to be found.
I also have hadoop.dll in that same directory and in my Windows\System32 folder, and it still doesn't work after restarting my machine.
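For anyone sanity-checking a similar setup, these commands show which binaries the shell actually resolves and whether Hadoop can load its native library (hadoop checknative is a standard Hadoop subcommand; C:\hadoop below is a placeholder, adjust to your install):

    :: Show which HADOOP_HOME the shell sees (placeholder: C:\hadoop)
    echo %HADOOP_HOME%

    :: List every winutils.exe and hadoop.dll on the PATH, in search order.
    :: A stale copy in Windows\System32 will shadow the one in %HADOOP_HOME%\bin.
    where winutils.exe
    where hadoop.dll

    :: Ask Hadoop itself whether it can load its native libraries
    hadoop checknative -a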
It's truly a head-scratcher, because I would not have expected partial functionality if something were broken.
Hadoop version: 3.3.6
Java version: 1.8.0_202
Spark version: 3.5.3
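One compatibility check worth doing with a mix like this: winutils.exe and hadoop.dll are built per Hadoop release, and Spark distributions that bundle Hadoop ship their own hadoop-client jars, so the native binaries should match that bundled version rather than whatever is unpacked in HADOOP_HOME. Assuming SPARK_HOME points at the Spark install (and that it's a with-Hadoop build), the bundled version can be read from the jar names:

    :: The Hadoop version embedded in these jar names is the line the
    :: native binaries (winutils.exe, hadoop.dll) should match.
    dir %SPARK_HOME%\jars | findstr hadoop-client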
I changed my version of Hadoop to 3.2.2 and it now works, though to be honest I have no idea why.
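If it really was a version mismatch, my guess at what the downgrade effectively did: hadoop.dll and winutils.exe are version-specific native binaries, so pointing the environment at an install that matches them (and removing any stray hadoop.dll from System32) lines the native code back up with the Java side. A sketch of that, with C:\hadoop-3.2.2 as a placeholder path:

    :: Point the environment at the matching install (placeholder path)
    set HADOOP_HOME=C:\hadoop-3.2.2
    set PATH=%HADOOP_HOME%\bin;%PATH%

    :: Re-run the command that previously failed
    hadoop fs -ls /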
A lot of Hadoop seems poorly documented, and I'm not sure where I could have read up on a versioning issue (assuming that's what it was), other than finding it through trial and error like I did.