Problem: symptoms and background
I get an error when installing Spark in a VirtualBox virtual machine:
hadoop@pgm-VirtualBox:/usr/local/spark$ ./bin/run-example SparkPi
./bin/run-example: line 25: /uer/local/spark/bin/spark-submit: No such file or directory
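One thing worth noting: the failing path is `/uer/local/spark` ("uer" instead of "usr"), which suggests a typo in whichever script or environment variable sets Spark's home directory. A sketch of how the misspelling could be hunted down (the real search would target `/usr/local/spark/bin`, `/usr/local/spark/conf`, and shell startup files such as `~/.bashrc`; the demo below uses a throwaway file, since the actual locations are an assumption):

```shell
# The real check would look something like:
#   grep -rn "/uer/local" /usr/local/spark/bin /usr/local/spark/conf ~/.bashrc
#   echo "$SPARK_HOME"
# Demonstrated here on a stand-in file containing the suspected typo:
mkdir -p /tmp/spark-check
printf 'SPARK_HOME=/uer/local/spark\n' > /tmp/spark-check/spark-env.sh
grep -rn "/uer/local" /tmp/spark-check
# → /tmp/spark-check/spark-env.sh:1:SPARK_HOME=/uer/local/spark
```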
My approach and what I have tried
I reinstalled several times without success; each attempt raised a new error. After I deleted the ./spark/spark-2.4.0-bin-without-hadoop directory and ran the example again, the earlier "No such file or directory" error came back.
hadoop@pgm-VirtualBox:~$ cd /usr/local
hadoop@pgm-VirtualBox:/usr/local$ sudo mv ./spark-2.4.0-bin-without-hadoop/ ./spark
mv: cannot move './spark-2.4.0-bin-without-hadoop/' to './spark/spark-2.4.0-bin-without-hadoop': Directory not empty
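This mv failure happens because an old ./spark directory still exists, so mv tries to place the source *inside* it and collides with a non-empty leftover copy of the same name. A minimal reproduction with throwaway directories (names are stand-ins for the real /usr/local paths), plus the fix of clearing the stale directory first (only safe if nothing in the old install is still needed):

```shell
# Stand-in for: mv ./spark-2.4.0-bin-without-hadoop/ ./spark
# when an old, non-empty ./spark/spark-2.4.0-bin-without-hadoop remains.
cd /tmp && rm -rf demo && mkdir -p demo && cd demo
mkdir -p spark-2.4.0-bin-without-hadoop spark/spark-2.4.0-bin-without-hadoop
touch spark/spark-2.4.0-bin-without-hadoop/stale-file
mv spark-2.4.0-bin-without-hadoop spark 2>/dev/null \
  || echo "mv failed: Directory not empty"
rm -rf spark                              # clear the stale install first
mv spark-2.4.0-bin-without-hadoop spark   # now the rename succeeds
ls -d spark
```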
Reinstalling Spark from a backup I had made earlier then produced the error below:
hadoop@pgm-VirtualBox:/usr/local/spark$ bin/run-example SparkPi
Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/impl/StaticLoggerBinder
at org.apache.spark.internal.Logging$.org$apache$spark$internal$Logging$$isLog4j12(Logging.scala:205)
at org.apache.spark.internal.Logging$class.initializeLogging(Logging.scala:119)
at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:108)
at org.apache.spark.deploy.SparkSubmit.initializeLogIfNecessary(SparkSubmit.scala:71)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:79)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.slf4j.impl.StaticLoggerBinder
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 8 more
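For context, a NoClassDefFoundError for org.slf4j.impl.StaticLoggerBinder is typical of the "without-hadoop" Spark build: it ships no Hadoop jars (and thus no slf4j binding), and the Spark documentation's Hadoop-free-build setup has conf/spark-env.sh point SPARK_DIST_CLASSPATH at a local Hadoop installation so those jars are found. A sketch, assuming Hadoop is installed at /usr/local/hadoop (adjust to the actual path):

```shell
# conf/spark-env.sh — sketch; /usr/local/hadoop is an assumed install path.
# The "without-hadoop" Spark build borrows Hadoop's jars (including the
# slf4j binding) via SPARK_DIST_CLASSPATH.
export SPARK_DIST_CLASSPATH=$(/usr/local/hadoop/bin/hadoop classpath)
```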