Spark error when reading a local file

In a Spark program written in Scala I used sc.textFile("file:///home/hadoop/2.txt"),

and to my surprise it threw java.io.FileNotFoundException: File file:/home/hadoop/2.txt does not exist. I then tested in spark-shell and got the same error:

 scala> val rdd = sc.textFile("file:///home/hadoop/2.txt")
 rdd: org.apache.spark.rdd.RDD[String] = file:///home/hadoop/2.txt MapPartitionsRDD[5] at textFile at <console>:24

 scala> rdd.take(1)
 17/08/29 20:27:28 WARN scheduler.TaskSetManager: Lost task 0.0 in stage 1.0 (TID 4, slaves3, executor 2): java.io.FileNotFoundException: File file:/home/hadoop/2.txt does not exist
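
The standalone Scala program does essentially the same thing; a minimal sketch of it is below (the object name and app name are placeholders, and the master/deploy settings come from spark-submit, not from the code):

 import org.apache.spark.{SparkConf, SparkContext}

 object ReadLocalFile {
   def main(args: Array[String]): Unit = {
     // placeholder app name; master and deploy mode are supplied by spark-submit
     val conf = new SparkConf().setAppName("ReadLocalFile")
     val sc = new SparkContext(conf)

     // the same call that fails in spark-shell: read a local file via the file:// scheme
     val rdd = sc.textFile("file:///home/hadoop/2.txt")
     rdd.take(1).foreach(println)

     sc.stop()
   }
 }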

Running cat on the file does produce output:

 [hadoop@master ~]$  cat /home/hadoop/2.txt
chen    001     {"phone":"187***","sex":"m","card":"123"}
zhou    002     {"phone":"187***","sex":"f","educetion":"1"}
qian    003     {"phone":"187***","sex":"f","book":"2"}
li      004     {"phone":"187***","sex":"f"}
wu      005     {"phone":"187***","sex":"f"}
zhang   006     {"phone":"187***","sex":"f"}
xia     007     {"phone":"187***","sex":"f"}
wang    008     {"phone":"187***","sex":"f"}
lv      009     {"phone":"187***","sex":"m"}

Afterwards I put the file on HDFS, and then it could be read just fine (roughly what I did is sketched below). Why does this happen?
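
A sketch of the HDFS steps, assuming a target path of /user/hadoop/2.txt (the exact path is just an example; the point is that the file lives on HDFS):

 [hadoop@master ~]$ hdfs dfs -put /home/hadoop/2.txt /user/hadoop/2.txt

 scala> val rdd = sc.textFile("hdfs:///user/hadoop/2.txt")
 scala> rdd.take(1)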

http://blog.csdn.net/zy_zhengyang/article/details/46853441