Spark Scala Command-Line Arguments

It's pretty straightforward to pass command-line arguments to the Spark (Scala) shell.

$ ./spark-2.0.0-bin-hadoop2.6/bin/spark-shell -i ~/scalaparam.scala --conf spark.driver.args="param1value  param2value  param3value"

Parameter values are separated by spaces (param1value  param2value  param3value).
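As a quick sanity check (a sketch, not part of Spark itself), you can confirm from the shell that the double quotes keep all three values inside a single spark.driver.args string; without them the shell would split the line and spark-shell would see extra arguments:

```shell
# Hypothetical argument string as it appears in the --conf flag.
ARGS="param1value  param2value  param3value"

# The quotes preserve the whole string as one value;
# wc -w counts the whitespace-separated parameters inside it.
echo "$ARGS" | wc -w   # prints 3
```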

Contents of scalaparam.scala:

// Read the string passed via --conf spark.driver.args and split it on whitespace
val args = sc.getConf.get("spark.driver.args").split("\\s+")
val param1 = args(0)
val param2 = args(1)
val param3 = args(2)
println("param1 passed from shell : " + param1)
println("param2 passed from shell : " + param2)
println("param3 passed from shell : " + param3)
System.exit(0)

The trick is sc.getConf.get("spark.driver.args").split("\\s+"), which splits the value on whitespace. (Remember: "\\s+" is a regular expression, so it matches runs of one or more whitespace characters.)
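The same splitting logic can be sketched in plain Scala, outside a Spark shell; the raw string below stands in for the hypothetical value that sc.getConf.get("spark.driver.args") would return, and the length check is an added safeguard the original script does not include:

```scala
object ArgsDemo {
  def main(cmdline: Array[String]): Unit = {
    // Hypothetical value, as it would arrive from spark.driver.args.
    val raw = "param1value  param2value  param3value"

    // "\\s+" is a regex matching one or more whitespace characters,
    // so repeated spaces (or tabs) collapse into a single separator.
    val args = raw.split("\\s+")

    // Guard against missing parameters before indexing into the array.
    require(args.length >= 3, s"expected 3 parameters, got ${args.length}")

    val param1 = args(0)
    val param2 = args(1)
    val param3 = args(2)
    println("param1 passed from shell : " + param1)
    println("param2 passed from shell : " + param2)
    println("param3 passed from shell : " + param3)
  }
}
```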

 
