魂落忘川犹在川 2022-01-11 11:04 · acceptance rate: 50%
49 views
Closed

Spark 3.0 runtime error

Running Spark code from IDEA on Windows throws an error.

  • spark-shell works fine
  • environment variables are configured
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
22/01/11 10:41:38 INFO SparkContext: Running Spark version 3.0.1
22/01/11 10:41:38 INFO ResourceUtils: ==============================================================
22/01/11 10:41:38 INFO ResourceUtils: Resources for spark.driver:

22/01/11 10:41:38 INFO ResourceUtils: ==============================================================
22/01/11 10:41:38 INFO SparkContext: Submitted application: HelloWorld
22/01/11 10:41:38 INFO SecurityManager: Changing view acls to: jiantang.y
22/01/11 10:41:38 INFO SecurityManager: Changing modify acls to: jiantang.y
22/01/11 10:41:38 INFO SecurityManager: Changing view acls groups to: 
22/01/11 10:41:38 INFO SecurityManager: Changing modify acls groups to: 
22/01/11 10:41:38 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(jiantang.y); groups with view permissions: Set(); users  with modify permissions: Set(jiantang.y); groups with modify permissions: Set()
22/01/11 10:41:39 INFO Utils: Successfully started service 'sparkDriver' on port 60984.
22/01/11 10:41:39 INFO SparkEnv: Registering MapOutputTracker
22/01/11 10:41:39 INFO SparkEnv: Registering BlockManagerMaster
22/01/11 10:41:39 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
22/01/11 10:41:39 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
java.lang.NoSuchFieldError: JAVA_9
  at org.apache.spark.storage.StorageUtils$.<init>(StorageUtils.scala:207)
  at org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala)
  at org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:93)
  at org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:370)
  at org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:311)
  at org.apache.spark.SparkEnv$.create(SparkEnv.scala:359)
  at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:189)
  at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:272)
  at org.apache.spark.SparkContext.<init>(SparkContext.scala:447)
  ... 28 elided

Source code

import org.apache.spark.{SparkConf, SparkContext}

object HelloWorld1 {

  def main(args: Array[String]): Unit = {
    // Run locally in a single JVM
    val conf = new SparkConf().setMaster("local").setAppName("HelloWorld")

    val sc = new SparkContext(conf)

    val helloWorld = sc.parallelize(List("Hello,World!", "Hello,Spark!", "Hello,BigData!"))

    helloWorld.foreach(println)

    sc.stop()
  }
}
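A `java.lang.NoSuchFieldError: JAVA_9` raised from `StorageUtils` is typically a dependency conflict rather than a bug in the code above: some other dependency is putting an old `commons-lang3` on the classpath, one too old to define the `JavaVersion.JAVA_9` enum constant that Spark 3.0.x references. A minimal sketch of a fix, assuming an sbt project (the exact version numbers are illustrative; match them to your Spark version):

```scala
// build.sbt — sketch, assuming an sbt project.
// Pin commons-lang3 to a release that defines JavaVersion.JAVA_9,
// so a stale transitive copy cannot shadow the one Spark expects.
libraryDependencies += "org.apache.spark" %% "spark-core" % "3.0.1"
dependencyOverrides += "org.apache.commons" % "commons-lang3" % "3.9"
```

For a Maven project, the equivalent is pinning `org.apache.commons:commons-lang3` under `<dependencyManagement>`; `mvn dependency:tree` will show which dependency pulled in the stale copy. It is also worth confirming the project SDK in IDEA is a Java version Spark 3.0 supports (Java 8 or 11).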