SLF4J error when starting a Spark job: could my local Maven repository be the cause?

Symptoms and background
When I run my Spark job (philips.dw.stg_user_log) locally on Windows, SLF4J first warns that multiple bindings are on the classpath, and the program then aborts with a java.lang.UnsatisfiedLinkError from jansi before the SparkSession is created.

Console output (pasted as text rather than a screenshot):
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/D:/software/Program/Maven_Repository/Maven_Repository/Maven_Repository/org/apache/logging/log4j/log4j-slf4j-impl/2.4.1/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/D:/software/Program/Maven_Repository/Maven_Repository/Maven_Repository/org/slf4j/slf4j-log4j12/1.7.16/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Exception in thread "main" java.lang.UnsatisfiedLinkError: org.fusesource.jansi.internal.Kernel32.init()V
    at org.fusesource.jansi.internal.Kernel32.init(Native Method)
    at org.fusesource.jansi.internal.Kernel32.<clinit>(Kernel32.java:38)
    at org.fusesource.jansi.WindowsAnsiOutputStream.<clinit>(WindowsAnsiOutputStream.java:52)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.logging.log4j.core.appender.ConsoleAppender.getOutputStream(ConsoleAppender.java:205)
    at org.apache.logging.log4j.core.appender.ConsoleAppender.getManager(ConsoleAppender.java:178)
    at org.apache.logging.log4j.core.appender.ConsoleAppender.createDefaultAppenderForLayout(ConsoleAppender.java:106)
    at org.apache.logging.log4j.core.config.DefaultConfiguration.<init>(DefaultConfiguration.java:62)
    at org.apache.logging.log4j.core.LoggerContext.<init>(LoggerContext.java:75)
    at org.apache.logging.log4j.core.selector.ClassLoaderContextSelector.locateContext(ClassLoaderContextSelector.java:145)
    at org.apache.logging.log4j.core.selector.ClassLoaderContextSelector.getContext(ClassLoaderContextSelector.java:70)
    at org.apache.logging.log4j.core.selector.ClassLoaderContextSelector.getContext(ClassLoaderContextSelector.java:57)
    at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:141)
    at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:41)
    at org.apache.logging.log4j.LogManager.getContext(LogManager.java:185)
    at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:103)
    at org.apache.logging.slf4j.Log4jLoggerFactory.getContext(Log4jLoggerFactory.java:43)
    at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLogger(AbstractLoggerAdapter.java:42)
    at org.apache.logging.slf4j.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:29)
    at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:285)
    at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:305)
    at org.apache.spark.network.util.JavaUtils.<clinit>(JavaUtils.java:41)
    at org.apache.spark.internal.config.ConfigHelpers$.byteFromString(ConfigBuilder.scala:67)
    at org.apache.spark.internal.config.ConfigBuilder$$anonfun$bytesConf$1.apply(ConfigBuilder.scala:235)
    at org.apache.spark.internal.config.ConfigBuilder$$anonfun$bytesConf$1.apply(ConfigBuilder.scala:235)
    at org.apache.spark.internal.config.TypedConfigBuilder$$anonfun$transform$1.apply(ConfigBuilder.scala:101)
    at org.apache.spark.internal.config.TypedConfigBuilder$$anonfun$transform$1.apply(ConfigBuilder.scala:101)
    at org.apache.spark.internal.config.TypedConfigBuilder.createWithDefault(ConfigBuilder.scala:143)
    at org.apache.spark.internal.config.package$.<init>(package.scala:121)
    at org.apache.spark.internal.config.package$.<clinit>(package.scala)
    at org.apache.spark.SparkConf$.<init>(SparkConf.scala:716)
    at org.apache.spark.SparkConf$.<clinit>(SparkConf.scala)
    at org.apache.spark.SparkConf.set(SparkConf.scala:95)
    at org.apache.spark.SparkConf.set(SparkConf.scala:84)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7$$anonfun$apply$6.apply(SparkSession.scala:928)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7$$anonfun$apply$6.apply(SparkSession.scala:928)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
    at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
    at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
    at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:928)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:926)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
    at philips.dw.stg_user_log$.main(stg_user_log.scala:21)
    at philips.dw.stg_user_log.main(stg_user_log.scala)

Run result and error output
The full output is pasted above: SLF4J finds two bindings in my local repository, log4j-slf4j-impl 2.4.1 and slf4j-log4j12 1.7.16, selects the Log4j 2 one (Log4jLoggerFactory), and the job then fails in org.fusesource.jansi.internal.Kernel32.init while Log4j 2 is setting up its console appender.
My ideas and what I have tried
I suspect my local Maven repository is the cause: two SLF4J bindings end up on the classpath, and I am not sure which dependencies pull each of them in. My plan is to inspect the dependency tree (for example with mvn dependency:tree) and then exclude one of the bindings in the pom; a sketch of the kind of exclusion I have in mind follows, together with a possible workaround for the jansi error itself.
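A minimal exclusion sketch. The coordinates some.group:some-artifact:x.y.z are placeholders, since I have not yet confirmed which artifact actually drags in the second binding (mvn dependency:tree should show the real one), and whether to drop log4j-slf4j-impl or slf4j-log4j12 depends on which logging backend the project is supposed to keep:

<dependency>
    <!-- Placeholder coordinates: replace with the dependency that
         mvn dependency:tree reports as the source of the extra binding. -->
    <groupId>some.group</groupId>
    <artifactId>some-artifact</artifactId>
    <version>x.y.z</version>
    <exclusions>
        <!-- Remove the Log4j 2 binding so only slf4j-log4j12 remains on the
             classpath; exclude org.slf4j:slf4j-log4j12 instead if Log4j 2 is
             the backend that should stay. -->
        <exclusion>
            <groupId>org.apache.logging.log4j</groupId>
            <artifactId>log4j-slf4j-impl</artifactId>
        </exclusion>
    </exclusions>
</dependency>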
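Separately from the duplicate bindings, the actual crash happens while Log4j 2's console appender tries to load the jansi native code for Windows ANSI colours (Kernel32.init). From what I have read, Log4j 2 can be told to skip jansi via the log4j.skipJansi system property, e.g. by adding -Dlog4j.skipJansi=true to the JVM options of the run configuration; I have not verified that the 2.4.1 ConsoleAppender honours this property, so I am treating it as a workaround idea rather than a confirmed fix.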
What I want to achieve
I would like to understand whether the duplicate SLF4J bindings in my Maven repository are really what leads to the UnsatisfiedLinkError, and how to arrange the dependencies so that the Spark job starts cleanly.