Problem submitting a Spark job via the YARN REST API?

I uploaded the following jar packages under the HDFS root path; the spark_conf.zip file is one I downloaded from the .sparkStaging directory after submitting a job with spark-submit.

[screenshot: listing of the files uploaded to the HDFS root directory]
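Roughly, the upload step looks like the following sketch (Python wrapping the standard hdfs dfs CLI; the local file names are assumptions taken from the resource paths in the JSON body further down, not the exact commands used):

import subprocess

# Sketch only: copy the application jar and the Spark libs archive to the
# HDFS root, matching the hdfs://hadoop203:8020/... paths in the JSON below.
for local_file in ["spark-examples_2.11-2.4.0-cdh6.3.2.jar", "__spark_libs__.zip"]:
    subprocess.run(["hdfs", "dfs", "-put", "-f", local_file, "/"], check=True)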

I then create a new application through the YARN REST API endpoint http://rm-http-address:port/ws/v1/cluster/apps/new-application, but no corresponding staging directory gets created on HDFS. The submission body I POST afterwards is the JSON shown below.
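A minimal sketch of this first step, assuming the requests library and the ResourceManager web address hadoop203:8088 (an assumption: 8088 is the default RM web port; only the HDFS address hadoop203:8020 appears in this post):

import requests

RM = "http://hadoop203:8088"  # assumption: default ResourceManager web UI port

# Step 1: ask the ResourceManager for a fresh application id
resp = requests.post(f"{RM}/ws/v1/cluster/apps/new-application")
resp.raise_for_status()
app_id = resp.json()["application-id"]
print("new application id:", app_id)  # e.g. application_1628490789401_0017

The application-id returned here is the one referenced in the submission body below.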


{
  "application-id": "application_1628490789401_0017", 
  "application-name": "SparkPi", 
  "application-type": "spark", 
  "keep-containers-across-application-attempts": false, 
  "max-app-attempts": 1, 
  "resource": {
    "memory": 1024, 
    "vCores": 1
  }, 
  "unmanaged-AM": false, 
  "am-container-spec": {
    "commands": {
      "command": "{{JAVA_HOME}}/bin/java -server -Xmx1024m -Dspark.yarn.app.container.log.dir=<LOG_DIR> -Dspark.master=yarn -Dspark.submit.deployMode=cluster -Dspark.executor.cores=1 -Dspark.executor.memory=1g -Dspark.app.name=SparkPi org.apache.spark.deploy.yarn.ApplicationMaster --class org.apache.spark.examples.SparkPi --jar __app__.jar 1><LOG_DIR>/stdout 2><LOG_DIR>/stderr"
    }, 
    "environment": {
      "entry": [
        {
          "key": "SPARK_USER", 
          "value": "root"
        }, 
        {
          "key": "SPARK_YARN_MODE", 
          "value": true
        }, 
        {
          "key": "SPARK_YARN_STAGING_DIR", 
          "value": "hdfs://hadoop203:8020/user/root/.sparkStaging/application_1628490789401_0017"
        }, 
        {
          "key": "CLASSPATH", 
          "value": "{{PWD}}<CPS>{{PWD}}/__app__.jar<CPS>{{PWD}}/__spark_libs__/*<CPS>$HADOOP_CONF_DIR<CPS>$HADOOP_COMMON_HOME/share/hadoop/common/*<CPS>$HADOOP_COMMON_HOME/share/hadoop/common/lib/*<CPS>$HADOOP_HDFS_HOME/share/hadoop/hdfs/*<CPS>$HADOOP_HDFS_HOME/share/hadoop/hdfs/lib/*<CPS>$HADOOP_YARN_HOME/share/hadoop/yarn/*<CPS>$HADOOP_YARN_HOME/share/hadoop/yarn/lib/*<CPS>$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/*<CPS>$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/lib/*<CPS>{{PWD}}/__spark_conf__/__hadoop_conf__"
        }, 
        {
          "key": "SPARK_DIST_CLASSPATH", 
          "value": "{{PWD}}<CPS>{{PWD}}/__app__.jar<CPS>{{PWD}}/__spark_libs__/*<CPS>$HADOOP_CONF_DIR<CPS>$HADOOP_COMMON_HOME/share/hadoop/common/*<CPS>$HADOOP_COMMON_HOME/share/hadoop/common/lib/*<CPS>$HADOOP_HDFS_HOME/share/hadoop/hdfs/*<CPS>$HADOOP_HDFS_HOME/share/hadoop/hdfs/lib/*<CPS>$HADOOP_YARN_HOME/share/hadoop/yarn/*<CPS>$HADOOP_YARN_HOME/share/hadoop/yarn/lib/*<CPS>$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/*<CPS>$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/lib/*<CPS>{{PWD}}/__spark_conf__/__hadoop_conf__"
        }
      ]
    }, 
    "local-resources": {
      "entry": [
        {
          "key": "__app__.jar", 
          "value": {
            "resource": "hdfs://hadoop203:8020/spark-examples_2.11-2.4.0-cdh6.3.2.jar", 
            "size": 1716923, 
            "timestamp": 1628491231981, 
            "type": "FILE", 
            "visibility": "APPLICATION"
          }
        },
        {
          "key": "__spark_libs__", 
          "value": {
            "resource": "hdfs://hadoop203:8020/__spark_libs__.zip", 
            "size": 273473072, 
            "timestamp": 1628572632550, 
            "type": "ARCHIVE", 
            "visibility": "APPLICATION"
          }
        }
      ]
    }
  }
}
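That body is then POSTed to the apps endpoint as a second step. A minimal sketch, assuming the JSON above is saved locally as submission.json and the same ResourceManager address as in the sketch further up:

import json
import requests

RM = "http://hadoop203:8088"  # assumption: default ResourceManager web UI port

# Step 2: submit the application using the JSON body shown above
with open("submission.json") as f:
    body = json.load(f)

resp = requests.post(f"{RM}/ws/v1/cluster/apps",
                     json=body,
                     headers={"Content-Type": "application/json"})
resp.raise_for_status()  # the RM answers 202 Accepted and a Location header on success
print(resp.status_code, resp.headers.get("Location"))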

The logs after the job runs are as follows:

[screenshots: container logs from the application submitted via the REST API]

I don't understand why the resources: section here contains no information; a job submitted normally through spark-submit does show this information, as in the screenshot below.

[screenshot: log of a spark-submit job, where the resources: section is populated]

Can anyone tell me what I have done wrong here?
