After using pyspark in PyCharm, I made a mess of my environment (for example, I configured JAVA, Hadoop, and SPARK and then deleted them). Now running my code raises the error below, and I don't know how to fix it.
Traceback (most recent call last):
  File "E:\ps\pycharm\python-learning\eleventh try pyspark.py", line 9, in <module>
    sc = SparkContext(conf=conf)
         ^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\ps\Lib\site-packages\pyspark\context.py", line 198, in __init__
    SparkContext._ensure_initialized(self, gateway=gateway, conf=conf)
  File "E:\ps\Lib\site-packages\pyspark\context.py", line 432, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway(conf)
                            ^^^^^^^^^^^^^^^^^^^^
  File "E:\ps\Lib\site-packages\pyspark\java_gateway.py", line 99, in launch_gateway
    proc = Popen(command, **popen_kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\ps\Lib\subprocess.py", line 1026, in __init__
    self._execute_child(args, executable, preexec_fn, close_fds,
  File "E:\ps\Lib\subprocess.py", line 1538, in _execute_child
    hp, ht, pid, tid = _winapi.CreateProcess(executable, args,
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
FileNotFoundError: [WinError 2] The system cannot find the specified file.
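The traceback shows that `launch_gateway` failed inside `Popen`: PySpark tries to spawn the `java` executable, and Windows cannot find it because the Java installation (or its `PATH` entry) was deleted. The sketch below is a quick diagnostic you can run before creating the `SparkContext`; the `diagnose_java` helper is hypothetical, not part of PySpark.

```python
import os
import shutil


def diagnose_java():
    """Return a list of problems that would make PySpark's launch_gateway fail.

    Checks the two things Popen needs to start the JVM: a valid JAVA_HOME
    and a 'java' executable reachable via PATH.
    """
    problems = []
    java_home = os.environ.get("JAVA_HOME")
    if not java_home:
        problems.append("JAVA_HOME is not set")
    elif not os.path.isdir(java_home):
        problems.append("JAVA_HOME points to a missing directory: " + java_home)
    if shutil.which("java") is None:
        problems.append("no 'java' executable found on PATH")
    return problems


if __name__ == "__main__":
    for problem in diagnose_java():
        print("PROBLEM:", problem)
```

If this prints any problems, reinstall the JDK and restore `JAVA_HOME` and `PATH` before running the PySpark script again; an empty result means the JVM should be spawnable.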
[Recommended solution]
Download address:
http://spark.apache.org/downloads.html
This walkthrough uses the pre-built package from the official site. If you want to compile Spark yourself instead, follow the official build instructions at http://spark.apache.org/docs/latest/building-spark.html
(spark_demo) shylin ~/Desktop/work/spark_demo cd ~/Downloads/spark-2.4.0-bin-hadoop2.7/bin
(spark_demo) shylin ~/Downloads/spark-2.4.0-bin-hadoop2.7/bin ./pyspark
Python 3.6.6 (v3.6.6:4cf1f54eb7, Jun 26 2018, 19:50:54)
[GCC 4.2.1 Compatible Apple LLVM 6.0 (clang-600.0.57)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
2019-05-24 10:55:19 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Welcome to
____ __
/ __/__ ___ _____/ /__
_\ \/ _ \/ _ `/ __/ '_/
/__ / .__/\_,_/_/ /_/\_\ version 2.4.0
/_/
Using Python version 3.6.6 (v3.6.6:4cf1f54eb7, Jun 26 2018 19:50:54)
SparkSession available as 'spark'.
>>>