A question about running WordCount with Spark on YARN

When I run the WordCount program, it keeps printing the report below, and yarnAppState never changes to RUNNING:
14/05/13 15:05:25 INFO yarn.Client: Application report from ASM:
application identifier: application_1399949387820_0008
appId: 8
clientToAMToken: null
appDiagnostics:
appMasterHost: N/A
appQueue: default
appMasterRpcPort: 0
appStartTime: 1399964104011
yarnAppState: ACCEPTED
distributedFinalState: UNDEFINED
appTrackingUrl: master:8088/proxy/application_1399949387820_0008/
appUser: hadoop

I have 3 virtual machines: the master has 1 GB of RAM, and each slave has 512 MB. My launch script is as follows:
export YARN_CONF_DIR=/home/hadoop/hadoop-2.2.0/etc/hadoop
SPARK_JAR=/home/hadoop/spark-0.9.0-incubating-bin-hadoop2/assembly/target/scala-2.10/spark-assembly_2.10-0.9.0-incubating-hadoop2.2.0.jar \
./spark-class org.apache.spark.deploy.yarn.Client \
--jar spark-wordcount-in-scala.jar \
--class WordCount \
--args yarn-standalone \
--args hdfs://master:6000/input \
--args hdfs://master:6000/output \
--num-workers 1 \
--master-memory 512m \
--worker-memory 512m \
--worker-cores 1
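For context, a guess rather than a confirmed diagnosis: an application stuck in ACCEPTED usually means the YARN scheduler cannot find a node with enough capacity for the requested containers. In Spark 0.9 on YARN, each container request is padded with a fixed per-container memory overhead (384 MB, to my understanding of the 0.9 YARN client; treat that number as an assumption). A quick arithmetic check of what the script above actually asks for:

```python
# Sanity check: how much memory does YARN get asked for per container,
# versus the 512 MB of physical RAM on each slave VM?
# The 384 MB overhead is an assumed value for Spark 0.9 on YARN.
MEMORY_OVERHEAD_MB = 384

def container_request_mb(requested_mb, overhead_mb=MEMORY_OVERHEAD_MB):
    """Memory YARN is actually asked to allocate for one container."""
    return requested_mb + overhead_mb

master_request = container_request_mb(512)  # --master-memory 512m
worker_request = container_request_mb(512)  # --worker-memory 512m
slave_ram_mb = 512                          # physical RAM per slave VM

print(master_request)                 # 896
print(worker_request)                 # 896
print(worker_request > slave_ram_mb)  # True: request exceeds slave RAM
```

If this arithmetic is right, each 512 MB container request becomes roughly 896 MB, more than a slave physically has, so the scheduler may simply wait forever for a node that can host it. Reducing `--master-memory`/`--worker-memory` or giving the slaves more RAM would be the usual things to try.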

Any help would be greatly appreciated!!!

Did you ever solve this? I'm hitting the same problem on CDH5: I ran the bundled SparkPi example, and it also stays stuck with distributedFinalState: UNDEFINED.
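Not a confirmed fix, but one thing worth checking in both setups is how much memory the NodeManagers advertise to YARN and the scheduler's minimum container size in yarn-site.xml. On small VMs these often need explicit tuning before the ApplicationMaster container can be placed at all. The property names below are standard Hadoop 2.x settings; the values are only illustrative for small test VMs:

```
<!-- yarn-site.xml on each slave; illustrative values for small test VMs -->
<property>
  <name>yarn.nodemanager.resource.memory-mb</name>
  <value>1024</value> <!-- memory this NodeManager offers to YARN -->
</property>
<property>
  <name>yarn.scheduler.minimum-allocation-mb</name>
  <value>256</value> <!-- smallest container the scheduler will grant -->
</property>
```

After changing these, restart the NodeManagers and check the "Memory Total" column on the ResourceManager web UI (master:8088) to confirm the cluster actually has room for the requested containers.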