I have recently started learning Scala. I am trying to execute the jar tf command at the spark-shell prompt:
jar tf C:/spark/lib/spark-examples-1.6.0-hadoop2.6.0.jar
It throws the following error:
error: ';' expected but double literal found.
Could someone please help me find the issue?
The Spark shell is a slightly customized Scala REPL; it is not a "shell" in that sense. It knows nothing about the usual bash/cmd-specific commands, and it assumes that you input only valid Scala definitions and expressions to be evaluated. C:/spark/lib/spark-examples-1.6.0-hadoop2.6.0.jar is not a valid Scala expression.
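To see why the REPL even gets partway through the path before failing: symbolic operators like / and - are ordinary method calls in Scala, so the parser tries to read the path as a chain of calls. A minimal sketch you can paste into any Scala REPL (the value n is purely illustrative):

```scala
// Symbolic operators in Scala are ordinary method calls, which is why
// the REPL attempts to read the path as a chain of calls on a value named C.
val n = 20
val viaOperators = n / 2 - 1     // familiar infix syntax
val viaMethods   = n./(2).-(1)   // the very same calls, written out explicitly
assert(viaOperators == viaMethods)
println(viaOperators)            // prints 9
```

Both forms compile to identical bytecode; infix notation is just syntax for invoking a method on the left-hand operand.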
Note the exact position where it fails: right after 1.6. Everything up to and including the 1.6 can be interpreted as the Scala expression C.:/(spark)./(lib)./(spark).-(examples).-(1.6), but the parser then hits the trailing .0, which is lexed as another double literal where a ';' was expected, and exits with the error message above.
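If the goal is simply to list the jar's contents without leaving spark-shell, one workaround is to shell out via scala.sys.process. This is only a sketch: it assumes jar is on your PATH, and the jar path is the one from the question, so adjust it to your machine:

```scala
import scala.sys.process._

// Run an external command from Scala code (spark-shell included).
// `.!!` executes the command and returns its stdout as a String.
// The path below is the one from the question; adjust it to your setup.
val cmd = Seq("jar", "tf", "C:/spark/lib/spark-examples-1.6.0-hadoop2.6.0.jar")
// cmd.!!   // uncomment to run; requires `jar` on the PATH and the file to exist

// A portable demonstration of the same API:
val out = Seq("echo", "hello from sys.process").!!
println(out)
```

Alternatively, leave the REPL entirely and run jar tf from a regular bash or cmd prompt, which is what the command was written for.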