【hive on spark Error】Execution Error, return code 1 from org.apache.hadoop

Problem and background

Testing an insert with Hive on Spark:
insert into table student values(1,'abc');
The statement failed with:

Failed to monitor Job[-1] with exception 'java.lang.IllegalStateException(Connection to remote Spark driver was lost)' Last known state = QUEUED
Failed to execute spark task, with exception 'java.lang.IllegalStateException(RPC channel is closed.)'
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.spark.SparkTask. RPC channel is closed.


I also checked the logs.


Environment and software versions

hadoop-3.1.3
hive-3.1.2
spark-3.0.0

Attempted solutions

Changed yarn.scheduler.minimum-allocation-mb and yarn.scheduler.maximum-allocation-mb in yarn-site.xml:



    <!-- smallest and largest container size the scheduler will grant -->
    <property>
        <name>yarn.scheduler.minimum-allocation-mb</name>
        <value>512</value>
    </property>
    <property>
        <name>yarn.scheduler.maximum-allocation-mb</name>
        <value>4096</value>
    </property>
    <!-- total physical memory the NodeManager may hand out to containers on this node -->
    <property>
        <name>yarn.nodemanager.resource.memory-mb</name>
        <value>4096</value>
    </property>
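If memory is the limiting factor, the driver and executor sizes that Hive on Spark requests also have to fit inside these YARN caps; when a container exceeds its allocation, YARN kills it and the RPC channel to the remote driver closes, which matches the error above. A minimal sketch of matching settings in spark-defaults.conf (the values are assumptions sized to stay under the 4096 MB cap, leaving room for the off-heap overhead YARN adds on top of the JVM heap, roughly 10% by default):

```
# spark-defaults.conf -- hypothetical values; heap + overhead must stay
# below yarn.scheduler.maximum-allocation-mb (4096 MB here)
spark.driver.memory      2g
spark.executor.memory    2g
spark.executor.cores     2
```

Hive on Spark also reads these properties if they are set as spark.* entries in hive-site.xml, so either file works.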
Desired result

Insert data through Hive using the Spark execution engine.

Solution

I first shut down all the related processes, then deleted every Chinese comment from the Hive configuration files and from the Hadoop configuration files, and restarted Hadoop. After that, Hive worked.
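The apparent root cause is that non-ASCII (Chinese) comments in the XML configuration files were stored in an encoding the XML parsers could not handle. A quick, hedged way to find any config files that still contain non-ASCII bytes (the HADOOP_HOME/HIVE_HOME fallback paths are examples; point them at your own installation):

```shell
# List config files containing non-ASCII bytes (e.g. Chinese comments).
# The default paths below are assumptions; adjust to your install.
for d in "${HADOOP_HOME:-/opt/hadoop-3.1.3}/etc/hadoop" \
         "${HIVE_HOME:-/opt/hive-3.1.2}/conf"; do
    # -r: recurse, -l: print only file names, -P: Perl regex for byte range
    [ -d "$d" ] && grep -rlP '[^\x00-\x7F]' "$d"
done
```

Any file this prints still has non-ASCII content; either delete those comments or make sure the file is saved as UTF-8.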