LEoe_ asked on 2017.12.07 19:45

Error when connecting to Spark remotely from Java (20C)

I set up a Docker cluster using sequenceiq/spark. Jobs run fine on the machine itself, but when I connect remotely from Java, I get an error.

The code is:

        SparkConf sparkConf = new SparkConf().setAppName("JavaTopGroup").setMaster("spark://10.73.21.221:7077");

        JavaSparkContext ctx = new JavaSparkContext(sparkConf);

The error is:

 17/12/07 19:17:47 ERROR StandaloneSchedulerBackend: Application has been killed. Reason: All masters are unresponsive! Giving up.
17/12/07 19:17:47 WARN StandaloneSchedulerBackend: Application ID is not initialized yet.
17/12/07 19:17:47 INFO SparkUI: Stopped Spark web UI at http://10.73.7.25:4040
17/12/07 19:17:47 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 8163.
17/12/07 19:17:47 INFO StandaloneSchedulerBackend: Shutting down all executors
17/12/07 19:17:47 INFO NettyBlockTransferService: Server created on 10.73.7.25:8163
17/12/07 19:17:47 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
17/12/07 19:17:47 INFO CoarseGrainedSchedulerBackend$DriverEndpoint: Asking each executor to shut down
17/12/07 19:17:47 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 10.73.7.25, 8163, None)
17/12/07 19:17:47 INFO BlockManagerMasterEndpoint: Registering block manager 10.73.7.25:8163 with 900.6 MB RAM, BlockManagerId(driver, 10.73.7.25, 8163, None)
17/12/07 19:17:47 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 10.73.7.25, 8163, None)
17/12/07 19:17:47 WARN StandaloneAppClient$ClientEndpoint: Drop UnregisterApplication(null) because has not yet connected to master
17/12/07 19:17:47 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 10.73.7.25, 8163, None)
17/12/07 19:17:47 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
17/12/07 19:17:47 INFO MemoryStore: MemoryStore cleared
17/12/07 19:17:47 INFO BlockManager: BlockManager stopped
17/12/07 19:17:47 INFO BlockManagerMaster: BlockManagerMaster stopped
17/12/07 19:17:47 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
17/12/07 19:17:47 ERROR TransportResponseHandler: Still have 3 requests outstanding when connection from /10.73.21.21:7077 is closed
17/12/07 19:17:47 INFO SparkContext: Successfully stopped SparkContext
17/12/07 19:17:47 ERROR SparkContext: Error initializing SparkContext.
java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem
    at scala.Predef$.require(Predef.scala:224)
    at org.apache.spark.metrics.MetricsSystem.getServletHandlers(MetricsSystem.scala:91)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:524)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
    at org.com.will.sparkl.App.main(App.java:24)
17/12/07 19:17:48 INFO SparkContext: SparkContext already stopped.
Exception in thread "main" java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem
    at scala.Predef$.require(Predef.scala:224)
    at org.apache.spark.metrics.MetricsSystem.getServletHandlers(MetricsSystem.scala:91)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:524)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
    at org.com.will.sparkl.App.main(App.java:24)
17/12/07 19:17:48 INFO ShutdownHookManager: Shutdown hook called
17/12/07 19:17:48 INFO ShutdownHookManager: Deleting directory C:\Users\will\AppData\Local\Temp\spark-c60f05a8-5476-469b-8c43-d8476796a1dd

1 answer

qq_41309368 answered on 2017.12.07 19:57

Use yarn-client mode.
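Beyond the one-line answer: "All masters are unresponsive" against a standalone master usually means the driver never completes the Spark RPC handshake. Common causes are a Spark version mismatch between the driver's jars and the one inside the sequenceiq image (the log line "Still have 3 requests outstanding when connection ... is closed" is consistent with the master dropping the handshake), or port 7077 not being reachable/forwarded from the Docker host. Also note the log shows the connection going to /10.73.21.21:7077 while the code sets 10.73.21.221; that may just be a paste typo, but it is worth double-checking. A quick way to rule out the network is a plain TCP probe from the driver machine; the sketch below is mine (class name `PortCheck` is hypothetical) and uses a local listener only so it runs standalone:

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.ServerSocket;
import java.net.Socket;

public class PortCheck {
    // Returns true if a TCP connection to host:port succeeds within timeoutMs.
    static boolean isReachable(String host, int port, int timeoutMs) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) throws IOException {
        // For the question's setup you would call
        // isReachable("10.73.21.221", 7077, 2000) from the driver machine.
        // Here a throwaway local listener stands in for the master.
        try (ServerSocket listener = new ServerSocket(0)) {
            System.out.println(
                isReachable("127.0.0.1", listener.getLocalPort(), 1000));
        }
    }
}
```

If the probe succeeds but Spark still fails, the mismatch is likely at the protocol level: make sure the spark-core dependency in your Java project matches the Spark version running in the containers before trying a different deploy mode.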

ludhi replied 7 days ago: yarn-client mode
LEoe_ replied 7 days ago: I'm just starting to learn this; could you explain in more detail?