SPARK_HOME env not loaded correctly in Jupyter Notebook

Problem and background

Running the following in Jupyter Notebook:
import findspark
findspark.init()
raises the error: Couldn't find Spark, make sure SPARK_HOME env is set or Spark is in an expected location (e.g. from homebrew installation)

Code
import findspark
findspark.init()

Error output

Couldn't find Spark, make sure SPARK_HOME env is set or Spark is in an expected location (e.g. from homebrew installation)
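Before anything else, it is worth confirming that the kernel actually sees the variable; findspark.init() reads SPARK_HOME from os.environ, so if the kernel was launched without it, the lookup fails. A quick check:

```python
import os

# If this prints None, the Jupyter kernel was started without SPARK_HOME,
# so findspark.init() has nothing to read.
print(os.environ.get("SPARK_HOME"))
```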

Expected result

The code should run without any error about SPARK_HOME or Spark's location.

I normally set the SPARK_HOME environment variable at the very start:

import os

os.environ['SPARK_HOME'] = '/path/to/your/spark'
os.environ['PYSPARK_PYTHON'] = '/path/to/pyspark-venv/python'         # Python interpreter of the pyspark virtual environment
os.environ['PYSPARK_DRIVER_PYTHON'] = '/path/to/pyspark-venv/python'  # same interpreter for the driver
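Note that these assignments only help if they run in the same kernel, before findspark.init() is called; restarting the kernel discards them. findspark.init() also accepts the Spark path directly as its first argument, which sidesteps the environment variable entirely. A minimal sketch, with hypothetical placeholder paths you would replace with your own:

```python
import os

# Hypothetical paths -- substitute your actual Spark install and venv interpreter.
spark_home = "/path/to/your/spark"
venv_python = "/path/to/pyspark-venv/python"

# Must run in the same kernel BEFORE findspark.init().
os.environ["SPARK_HOME"] = spark_home
os.environ["PYSPARK_PYTHON"] = venv_python
os.environ["PYSPARK_DRIVER_PYTHON"] = venv_python

# Alternatively, pass the path explicitly instead of relying on the env var:
# import findspark
# findspark.init(spark_home)
```

Passing the path to findspark.init() is the more robust choice in notebooks, since it does not depend on how or where the kernel process was launched.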