After installing Spark, spark-shell works fine and sc is available, but starting pyspark fails: the Spark welcome banner never appears and sc is undefined. What could be causing this? Is it a version incompatibility?
My Spark version is 3.3.0. When I check Python from the command line it reports version 3.6.
The error output is below:
hadoop@ubuntu:/usr/local/spark$ bin/pyspark
Python 3.5.1+ (default, Mar 30 2016, 22:46:26)
[GCC 5.3.1 20160330] on linux
Type "help", "copyright", "credits" or "license" for more information.
Traceback (most recent call last):
File "/usr/local/spark/python/pyspark/shell.py", line 29, in <module>
from pyspark.context import SparkContext
File "/usr/local/spark/python/pyspark/__init__.py", line 53, in <module>
from pyspark.conf import SparkConf
File "/usr/local/spark/python/pyspark/conf.py", line 110
_jconf: Optional[JavaObject]
^
SyntaxError: invalid syntax
I would like to know what is causing this and how to fix it.
I have already reinstalled Spark several times and hit the same error every time.
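My guess is that bin/pyspark is picking up the system Python 3.5.1 shown in the banner above rather than the 3.6 I see from the command line, and Spark 3.3.0's code uses variable annotations like _jconf: Optional[JavaObject] that Python older than 3.6 cannot parse (Spark 3.3.0 documents Python 3.7+ as the minimum). If that is right, would pointing pyspark at a newer interpreter via the PYSPARK_PYTHON and PYSPARK_DRIVER_PYTHON environment variables fix it? A minimal sketch of what I am thinking of trying, assuming a Python 3.8 interpreter exists at /usr/bin/python3.8 (that path is an assumption, check it with: which python3.8):

# tell pyspark which interpreter to use (assumed path, adjust to your machine)
hadoop@ubuntu:/usr/local/spark$ export PYSPARK_PYTHON=/usr/bin/python3.8
hadoop@ubuntu:/usr/local/spark$ export PYSPARK_DRIVER_PYTHON=/usr/bin/python3.8
hadoop@ubuntu:/usr/local/spark$ bin/pyspark

If that works, I assume the two export lines could go into ~/.bashrc or conf/spark-env.sh so they apply every time. Does this look like the right direction, or is something else going on?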