IDEA reports an error on the Spark import

In IDEA, `import org.spark.sql.SparkSession` fails with "Cannot resolve symbol", and it still doesn't work after adding the Spark jars as a library.
The problem is probably in the pom file.

<!-- Spark dependencies -->
            <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core -->
            <dependency>
                <groupId>org.apache.spark</groupId>
                <artifactId>spark-core_2.12</artifactId>
                <version>3.0.3</version>
            </dependency>

            <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql -->
            <dependency>
                <groupId>org.apache.spark</groupId>
                <artifactId>spark-sql_2.12</artifactId>
                <version>3.0.3</version>
                <scope>provided</scope>
            </dependency>

            <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-streaming -->
            <dependency>
                <groupId>org.apache.spark</groupId>
                <artifactId>spark-streaming_2.12</artifactId>
                <version>3.0.3</version>
                <scope>provided</scope>
            </dependency>

            <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-mllib -->
            <dependency>
                <groupId>org.apache.spark</groupId>
                <artifactId>spark-mllib_2.12</artifactId>
                <version>3.0.3</version>
                <scope>provided</scope>
            </dependency>

This is the Spark section of my pom. How can I fix the error?


Check whether the corresponding Spark jars show up under External Libraries in IDEA.
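One thing worth checking first: the package in the failing import is `org.spark.sql`, but `SparkSession` lives in `org.apache.spark.sql`, so the line should read `import org.apache.spark.sql.SparkSession`. Beyond that, the dependencies above are marked `<scope>provided</scope>`, which keeps them off the runtime classpath when running inside IDEA; and if the snippet sits under `<dependencyManagement>` rather than a `<dependencies>` block, the jars are never actually pulled in. A minimal sketch of the Spark section with the scope dropped for local development (version numbers taken from the question; only two of the four artifacts shown):

```xml
<dependencies>
    <!-- Scope omitted (defaults to compile) so IDEA can both
         resolve the symbols and run the program locally -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.12</artifactId>
        <version>3.0.3</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.12</artifactId>
        <version>3.0.3</version>
    </dependency>
</dependencies>
```

After editing, reload the Maven project so IDEA re-imports the dependencies. If you later submit to a cluster that already ships Spark, restore `provided` for the packaged build, or keep it and enable the run-configuration option that adds "provided"-scope dependencies to the classpath.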

  • This blog post, the section 5.3.1 "Modifying the pom file" of Spark Quick Start (Spark快速入门), may solve your problem; you can read the excerpt below or jump to the source post:

            <dependencies>
                <dependency>
                    <groupId>org.apache.spark</groupId>
                    <artifactId>spark-core_2.12</artifactId>
                    <version>2.4.3</version>
                    <scope>provided</scope>
                </dependency>
            </dependencies>
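Note that the quoted snippet uses Spark 2.4.3 and also keeps `provided`; if you copy it, match the 3.0.3 version used in the question. Since all four artifacts must share the same Spark version and Scala binary suffix (mixing `_2.11` and `_2.12` jars also produces unresolved symbols), it can help to centralize both in properties. A sketch, with property names of my own choosing:

```xml
<properties>
    <scala.binary.version>2.12</scala.binary.version>
    <spark.version>3.0.3</spark.version>
</properties>

<dependencies>
    <!-- Every Spark artifact references the same two properties,
         so version and Scala suffix can never drift apart -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
</dependencies>
```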