Error connecting to the Spark master when reading a parquet file on HDFS from IDEA

1. The test code:

package com.sparksql

import org.apache.spark.sql.{DataFrame, SparkSession}

object DataFrameFromParquet {
  def main(args: Array[String]): Unit = {
    // build the SparkSession
    val spark: SparkSession = SparkSession
      .builder()
//      .master("local")
      // the IP is a placeholder; this master URL is the line that fails
      .master("spark://192.168.118.121:8080")
      .appName("DataFrame")
      .getOrCreate()
    // the IP is a placeholder
    val userDF: DataFrame = spark.read.parquet("hdfs://192.168.118.121:9000/usr/users.parquet")

    // reading the same file from the local filesystem works fine
//    val userDF: DataFrame = spark.read.parquet("D:\\a\\users.parquet")
    userDF.createOrReplaceTempView("user")

    spark.sql("select * from user").show(10)

    spark.stop()
  }

}

2. The error message:


ERROR TransportResponseHandler: Still have 1 requests outstanding when connection from /192.168.118.121:8080 is closed
22/01/13 02:01:08 WARN StandaloneAppClient$ClientEndpoint: Failed to connect to master 192.168.118.121:8080
org.apache.spark.SparkException: Exception thrown in awaitResult


That is the key error; I won't paste the full stack trace here.

My guess is that the driver simply cannot connect to the master. If anyone is interested, please take a look; I'll be online around the clock. A quick reachability check is sketched below.
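Before blaming the network, it may help to test raw TCP reachability of both candidate ports. Below is a minimal sketch (the IP is the same placeholder as above, and MasterProbe is a name I made up). If 8080 accepts a TCP connection but Spark still fails with "connection ... is closed", the problem is likely the protocol rather than the network: 8080 serves the master's HTTP web UI, not its RPC endpoint.

import java.net.{InetSocketAddress, Socket}

object MasterProbe {
  // Returns true if a TCP connection to host:port succeeds within the timeout.
  def canConnect(host: String, port: Int, timeoutMs: Int = 3000): Boolean = {
    val socket = new Socket()
    try {
      socket.connect(new InetSocketAddress(host, port), timeoutMs)
      true
    } catch {
      case _: Exception => false
    } finally {
      socket.close()
    }
  }

  def main(args: Array[String]): Unit = {
    val host = "192.168.118.121" // placeholder IP from this post
    // 7077 is the standalone master's default RPC port; 8080 is its web UI
    Seq(7077, 8080).foreach { port =>
      println(s"$host:$port reachable: ${canConnect(host, port)}")
    }
  }
}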

  .master("spark://192.168.118.121:7077")