SparkSQL hits a seemingly unsolvable problem: Caused by: org.apache.hadoop.ipc.RemoteException

Code


```scala
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

def main(args: Array[String]): Unit = {
    // access HDFS as the root user
    System.setProperty("HADOOP_USER_NAME", "root")

    // TODO: create the SparkSQL runtime environment
    val conf: SparkConf = new SparkConf().setMaster("local[*]").setAppName("hice")
    conf.set("spark.sql.hive.convertMetastoreOrc", "true")
    conf.set("spark.sql.orc.impl", "native")

    // enable Hive support
    val spark = SparkSession.builder().enableHiveSupport().config(conf).getOrCreate()

    spark.sql("use default")

    // query the base data
    spark.sql("select * from uu").show()

    // TODO: close the environment
    spark.close()
}
```

The error is as follows:

```
Caused by: org.apache.hadoop.ipc.RemoteException(java.lang.NullPointerException): java.lang.NullPointerException
```

I suggest searching Baidu for this error. My guess is that the connection to the cluster was never established. Sorry, I have not used this myself.
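
If the guess about connectivity is right, one way to narrow it down is to test HDFS and the Hive metastore separately before querying the table. The sketch below is only an illustration, not a confirmed fix: the object name, app name, and the root-path listing are placeholders that do not come from the question. If the FileSystem call fails, the problem is between the client and HDFS; if it succeeds but `show tables` does not list `uu`, the session is probably not talking to the intended metastore (for example, hive-site.xml is missing from the classpath, so Spark falls back to a local embedded metastore).

```scala
import org.apache.hadoop.fs.{FileSystem, Path}
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

// Hypothetical diagnostic program; names are illustrative only.
object ConnectionCheck {
  def main(args: Array[String]): Unit = {
    System.setProperty("HADOOP_USER_NAME", "root")

    val conf = new SparkConf().setMaster("local[*]").setAppName("connection-check")
    val spark = SparkSession.builder().enableHiveSupport().config(conf).getOrCreate()

    // 1. Can we reach HDFS at all? Listing the root directory fails fast if the
    //    NameNode address picked up from the Hadoop configuration is wrong or unreachable.
    val fs = FileSystem.get(spark.sparkContext.hadoopConfiguration)
    fs.listStatus(new Path("/")).foreach(status => println(status.getPath))

    // 2. Can we reach the Hive metastore? If this shows only an empty "default"
    //    database, the session may be using a local metastore instead of the
    //    cluster's, i.e. hive-site.xml is not on the classpath.
    spark.sql("show databases").show()
    spark.sql("show tables in default").show()

    spark.close()
  }
}
```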