Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/internal/Logging

What is going on here when I run this in IDEA?


```java
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/internal/Logging
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:756)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:473)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
    at com.lzj.bigdata.spark.sql.SQL_Hive_test1$.main(SQL_Hive_test1.scala:10)
    at com.lzj.bigdata.spark.sql.SQL_Hive_test1.main(SQL_Hive_test1.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.internal.Logging
    at java.net.URLClassLoader.findClass(URLClassLoader.java:387)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
    ... 14 more

Process finished with exit code 1
```

My code is:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

object SQL_Hive_test1 {

  def main(args: Array[String]): Unit = {
    System.setProperty("HADOOP_USER_NAME", "atcris")

    val sparkConf = new SparkConf().setMaster("local[*]").setAppName("SQL")
    val spark = SparkSession.builder().enableHiveSupport().config(sparkConf).getOrCreate()


    spark.sql("use project")
    //准备数据
    spark.sql(
      """
        |select
        |*
        |from(
        |    select
        |    * ,
        |    rank() over(partition by area order by clickcnt desc) as rank
        |    from(
        |        select
        |        area,
        |        product_name,
        |        count(*) as clickcnt
        |        from(
        |            select
        |                a.*,
        |                p.product_name,
        |                c.area,
        |                c.city_name
        |                from user_visit_action a
        |                join product_info p
        |                on a.click_product_id = p.product_id
        |                join city_info c
        |                on a.city_id = c.city_id
        |                where a.click_product_id > -1
        |            )t1 group by area,product_name
        |  )t2
        |)t3 where rank <= 3
        """.stripMargin).show

    // TODO: shut down the environment
    spark.stop()
  }
}
```


In Spark 1.x, Logging was the class org.apache.spark.Logging, but since Spark 2.x that class is no longer recognized; it was moved to org.apache.spark.internal.Logging. Try switching to that and see.
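Because org.apache.spark.internal.Logging only exists from Spark 2.x onward, this NoClassDefFoundError in IDEA usually means the Spark artifacts on the classpath resolve to mixed major versions (for example, a 1.x spark-core pulled in next to a 2.x spark-sql or spark-hive). A minimal sketch of keeping every Spark module on one version in sbt; the version numbers here (Spark 2.4.8, Scala 2.12.15) are placeholders, substitute whatever matches your cluster:

```scala
// build.sbt -- minimal sketch: pin one Spark version and reuse it for
// every Spark module so they all come from the same release.
val sparkVersion = "2.4.8" // placeholder: use your actual Spark version

scalaVersion := "2.12.15" // must match the Scala build of your Spark artifacts

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-sql"  % sparkVersion,
  // enableHiveSupport() needs the Hive module, at the same version
  "org.apache.spark" %% "spark-hive" % sparkVersion
)
```

If the versions are already aligned but the error persists when running from IDEA, check whether the Spark dependencies are marked as provided scope; for local runs you either need them at compile scope or, in recent IntelliJ versions, the run configuration option "Include dependencies with 'Provided' scope" enabled.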