Problem description and background
I installed the pyspark package for Python, but running pyspark fails with the following error:
Traceback (most recent call last):
  File "D:\spark\python\pyspark\shell.py", line 31, in <module>
    from pyspark import SparkConf
  File "D:\spark\python\pyspark\__init__.py", line 51, in <module>
    from pyspark.context import SparkContext
  File "D:\spark\python\pyspark\context.py", line 31, in <module>
    from pyspark import accumulators
  File "D:\spark\python\pyspark\accumulators.py", line 97, in <module>
    from pyspark.serializers import read_int, PickleSerializer
  File "D:\spark\python\pyspark\serializers.py", line 72, in <module>
    from pyspark import cloudpickle
  File "D:\spark\python\pyspark\cloudpickle.py", line 145, in <module>
    _cell_set_template_code = _make_cell_set_template_code()
  File "D:\spark\python\pyspark\cloudpickle.py", line 126, in _make_cell_set_template_code
    return types.CodeType(
TypeError: 'bytes' object cannot be interpreted as an integer
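A quick way to check which pyspark copy Python is actually picking up (a sketch using only the standard library; whether SPARK_HOME or PYTHONPATH are set on this machine is an assumption, the traceback only shows that the failing files live under D:\spark\python):

# Sketch: locate the pyspark package without executing its __init__.py
# (importing it normally is exactly what raises the TypeError above).
import importlib.util
import os

spec = importlib.util.find_spec("pyspark")
print("pyspark resolved to:", spec.origin if spec else "not found")
print("SPARK_HOME:", os.environ.get("SPARK_HOME"))
print("PYTHONPATH:", os.environ.get("PYTHONPATH"))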
Operating environment and software versions
Windows 11
Python 3.10.9
PySpark 3.3.2
Solutions I have tried
I tried downgrading to pyspark 2.4.8 and also launching it via pyspark-shell, but both attempts raise the same error (a minimal reproduction sketch is below).
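Both attempts fail at the same point, so the error can be reproduced with nothing more than the first import from the traceback (a minimal sketch; it assumes pyspark resolves to D:\spark\python\pyspark, as the traceback paths suggest):

# Minimal reproduction sketch: the first import in the chain already fails.
try:
    from pyspark import SparkConf  # noqa: F401
except TypeError as exc:
    # Observed on this environment:
    # TypeError: 'bytes' object cannot be interpreted as an integer
    print("import failed:", exc)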
The result I want to achieve
To be able to use pyspark normally.
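Concretely, "working normally" means a minimal script like the following runs without errors (a smoke-test sketch using only documented PySpark APIs; the app name and sample data are arbitrary):

# Smoke-test sketch: start a local SparkSession and run a trivial job.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .master("local[*]")
    .appName("smoke-test")  # arbitrary app name
    .getOrCreate()
)

df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])
print(df.count())  # expected output: 2
spark.stop()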