Problem running SparkPi from IntelliJ

Environment:
jdk1.8.0_40
scala-2.10.4
hadoop-2.6.0
spark-1.1.1-bin-hadoop2.4
Key lines from the log:
15/03/31 19:31:59 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/03/31 19:32:08 WARN TaskSetManager: Lost task 1.0 in stage 0.0 (TID 1, slave3): java.lang.ClassNotFoundException: SparkPi$$anonfun$1
15/03/31 19:32:08 ERROR TaskSetManager: Task 1 in stage 0.0 failed 4 times; aborting job
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in stage 0.0 failed 4 times, most recent failure: Lost task 1.3 in stage 0.0 (TID 6, slave3): java.lang.ClassNotFoundException: SparkPi$$anonfun$1

A similar error also occurs with a user application:

WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, worker1): java.lang.ClassNotFoundException: com.spark.firstApp.HelloSpark$$anonfun$2

Make the following change in the driver code to resolve the error:

val conf = new SparkConf().setAppName("helloSpark").setMaster("spark://master:7077").set("spark.executor.memory", "2g")
val sc = new SparkContext(conf)
// Note: add the following line
sc.addJar("/home/spark/IdeaProjects/FirstApp/out/artifacts/FirstAppjar1/FirstApp.jar")
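Hard-coding the artifact path is fragile when the IntelliJ artifact output directory changes. A small helper (a sketch; the name `JarLocator` is my own and not from the original post) can resolve a class's code-source location at runtime, which can then be passed to `sc.addJar`:

```scala
// Sketch of a hypothetical helper: resolve the jar or classes directory a
// given class was loaded from, so the path need not be hard-coded.
object JarLocator {
  def locationOf(clazz: Class[_]): String =
    clazz.getProtectionDomain.getCodeSource.getLocation.getPath

  def main(args: Array[String]): Unit = {
    // Prints the directory or jar this object was loaded from.
    println(locationOf(JarLocator.getClass))
  }
}
```

Caveat: when the application is launched directly from the IDE, the location resolved this way is typically a `classes/` output directory rather than a jar, while `sc.addJar` expects a packaged jar file; so you still need to build the artifact jar first, then the helper returns its path when the app runs from that jar.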

This addresses the following error:

16/02/23 16:39:53 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, worker1): java.lang.ClassNotFoundException: com.spark.firstApp.HelloSpark$$anonfun$2

In short, the error means the application's jar (here, the Spark example jar) was missing from the executors' classpath.