When running code in the Spark shell, it is often convenient to have small outputs or a sample printed directly in the shell rather than written to a file.
By default, the shell truncates such output after a fairly small number of characters. Is there a way to increase this character limit? I'm running Spark 1.2.
What do you mean by "output"? If you want to print n lines of an RDD, use take():

myRDD.take(n).foreach(println)
According to the Spark Programming Guide 1.2.0, this action will "Return an array with the first n elements of the dataset. Note that this is currently not executed in parallel. Instead, the driver program computes all the elements."
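A minimal sketch of this in the Spark shell, assuming the shell's predefined SparkContext `sc` and a small illustrative RDD (the RDD contents here are made up for the example):

```scala
// In spark-shell, `sc` is the predefined SparkContext.
// Build a small example RDD of 100 integers.
val myRDD = sc.parallelize(1 to 100)

// take(n) pulls the first n elements back to the driver as a local Array,
// so println prints them in full, with no shell truncation.
myRDD.take(5).foreach(println)

// Since take() materializes the elements on the driver, keep n small;
// for the whole (small) RDD you could use collect() instead, but that
// brings every element to the driver and can exhaust its memory.
```

Because take() returns a plain local Array, anything you do with it afterwards (printing, formatting, writing) is ordinary Scala and is not subject to the shell's result-truncation behavior.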