Spark Scala Command-Line Arguments


It's pretty straightforward to pass command-line arguments to the Spark (Scala) shell.

$ ./spark-2.0.0-bin-hadoop2.6/bin/spark-shell -i ~/scalaparam.scala --conf spark.driver.args="param1value  param2value  param3value"

Parameter values are separated by spaces (param1value param2value param3value).

Contents of scalaparam.scala:

// Read the single string passed via --conf spark.driver.args
// and split it on whitespace (the argument to split is a regex).
val args = sc.getConf.get("spark.driver.args").split("\\s+")
val param1 = args(0)
val param2 = args(1)
val param3 = args(2)
println("param1 passed from shell : " + param1)
println("param2 passed from shell : " + param2)
println("param3 passed from shell : " + param3)
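Indexing args(0), args(1), args(2) directly will throw an ArrayIndexOutOfBoundsException if fewer values are passed than expected, and sc.getConf.get throws if the key is absent. A more defensive sketch (the fallback values "default1" etc. are illustrative, not part of the original script):

```scala
// Defensive variant: use the two-argument get so a missing conf key
// yields "" instead of throwing, and lift so a missing position
// falls back to a default instead of throwing.
val raw = sc.getConf.get("spark.driver.args", "")
val args = raw.split("\\s+").filter(_.nonEmpty) // drop empty tokens from stray whitespace
val param1 = args.lift(0).getOrElse("default1") // illustrative fallback
val param2 = args.lift(1).getOrElse("default2")
val param3 = args.lift(2).getOrElse("default3")
```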

The trick is sc.getConf.get("spark.driver.args").split("\\s+"), which splits the single conf value on whitespace. Remember that split takes a regular expression, so "\\s+" matches one or more whitespace characters and runs of spaces collapse into a single separator.
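The splitting itself needs no Spark at all; a standalone Scala sketch of the same call shows how the double-spaced string from the example breaks into three values:

```scala
// Plain-Scala demonstration of the regex split used above.
// "\\s+" matches one or more whitespace characters, so the
// double spaces between values do not produce empty tokens.
val driverArgs = "param1value  param2value  param3value"
val parts = driverArgs.split("\\s+")
parts.foreach(println)
```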

