Asked by 尘世壹俗人 on 2024-10-12 23:41 (question closed)

Spark ThriftServer fails to start: class org.apache.hadoop.hive.conf.HiveConf not found

After setting up Spark's ThriftServer service, the startup log reports that the class org.apache.hadoop.hive.conf.HiveConf cannot be found. I have already confirmed that the versions are compatible and configured the jar paths used at startup, yet the error persists and I cannot resolve it. Could someone who knows the open-source stack well point me in the right direction?
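
For reference, the metastore-jar settings are usually wired up in spark-defaults.conf along the lines of the sketch below. The exact configuration on this cluster is not shown in the question, so the paths are copied from the log further down and the colon-separated form is an assumption based on how Spark 2.1 documents spark.sql.hive.metastore.jars (a JVM-style classpath), not a verified fix:

# spark-defaults.conf (sketch only; paths taken from the log below, actual values may differ)
spark.sql.hive.metastore.version   1.2.1
# Spark 2.1 treats this value as a JVM-style classpath: entries separated by ':' on Linux,
# where a directory followed by a bare '*' (rather than '*.jar') is expanded to the jars inside it.
spark.sql.hive.metastore.jars      /opt/hive-1.2.1/lib/*:/opt/hadoop-2.7.2/share/hadoop/common/*:/opt/hadoop-2.7.2/share/hadoop/common/lib/*:/opt/hadoop-2.7.2/share/hadoop/yarn/lib/*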

[root@hdp2 spark-2.1.1]# cat /opt/spark-2.1.1/logs/spark-root-org.apache.spark.sql.hive.thriftserver.HiveThriftServer2-1-hdp2.out
Spark Command: /opt/jdk1.8.0_144/bin/java -cp /opt/spark-2.1.1/conf/:/opt/spark-2.1.1/jars/*:/opt/hadoop-2.7.2/etc/hadoop/ -Dspark.deploy.recoveryMode=ZOOKEEPER -Dspark.deploy.zookeeper.url=hdp1:2181,hdp2:2181,hdp3:2181 -Dspark.deploy.zookeeper.dir=/spark -Xmx512m org.apache.spark.deploy.SparkSubmit --master spark://hdp2:7077 --deploy-mode client --class org.apache.spark.sql.hive.thriftserver.HiveThriftServer2 --name Thrift JDBC/ODBC Server spark-internal
========================================
Warning: Ignoring non-spark config property: hive.execution.engine=spark
Warning: Ignoring non-spark config property: hive.server2.thrift.bind.host=0.0.0.0
Warning: Ignoring non-spark config property: hive.metastore.uris=thrift://hdp2:9083
Warning: Ignoring non-spark config property: hive.server2.thrift.port=10000
24/10/12 23:32:28 INFO HiveThriftServer2: Started daemon with process name: 4619@hdp2
24/10/12 23:32:28 INFO SignalUtils: Registered signal handler for TERM
24/10/12 23:32:28 INFO SignalUtils: Registered signal handler for HUP
24/10/12 23:32:28 INFO SignalUtils: Registered signal handler for INT
24/10/12 23:32:28 INFO HiveThriftServer2: Starting SparkContext
24/10/12 23:32:29 INFO SparkContext: Running Spark version 2.1.1
24/10/12 23:32:29 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
24/10/12 23:32:29 INFO SecurityManager: Changing view acls to: root
24/10/12 23:32:29 INFO SecurityManager: Changing modify acls to: root
24/10/12 23:32:29 INFO SecurityManager: Changing view acls groups to: 
24/10/12 23:32:29 INFO SecurityManager: Changing modify acls groups to: 
24/10/12 23:32:29 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
24/10/12 23:32:29 INFO Utils: Successfully started service 'sparkDriver' on port 41279.
24/10/12 23:32:29 INFO SparkEnv: Registering MapOutputTracker
24/10/12 23:32:29 INFO SparkEnv: Registering BlockManagerMaster
24/10/12 23:32:29 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
24/10/12 23:32:29 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
24/10/12 23:32:29 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-14831f0f-90c8-4f8d-aed0-b6df01b5533d
24/10/12 23:32:29 INFO MemoryStore: MemoryStore started with capacity 93.3 MB
24/10/12 23:32:29 INFO SparkEnv: Registering OutputCommitCoordinator
24/10/12 23:32:29 INFO Utils: Successfully started service 'SparkUI' on port 4040.
24/10/12 23:32:29 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.239.187:4040
24/10/12 23:32:29 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://hdp2:7077...
24/10/12 23:32:29 INFO TransportClientFactory: Successfully created connection to hdp2/192.168.239.187:7077 after 10 ms (0 ms spent in bootstraps)
24/10/12 23:32:29 INFO StandaloneSchedulerBackend: Connected to Spark cluster with app ID app-20241012233229-0002
24/10/12 23:32:29 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 39172.
24/10/12 23:32:29 INFO StandaloneAppClient$ClientEndpoint: Executor added: app-20241012233229-0002/0 on worker-20241012224515-192.168.239.187-35613 (192.168.239.187:35613) with 6 cores
24/10/12 23:32:29 INFO NettyBlockTransferService: Server created on 192.168.239.187:39172
24/10/12 23:32:29 INFO StandaloneSchedulerBackend: Granted executor ID app-20241012233229-0002/0 on hostPort 192.168.239.187:35613 with 6 cores, 1024.0 MB RAM
24/10/12 23:32:29 INFO StandaloneAppClient$ClientEndpoint: Executor added: app-20241012233229-0002/1 on worker-20241012224515-192.168.239.188-33985 (192.168.239.188:33985) with 6 cores
24/10/12 23:32:29 INFO StandaloneSchedulerBackend: Granted executor ID app-20241012233229-0002/1 on hostPort 192.168.239.188:33985 with 6 cores, 1024.0 MB RAM
24/10/12 23:32:29 INFO StandaloneAppClient$ClientEndpoint: Executor added: app-20241012233229-0002/2 on worker-20241012224515-192.168.239.186-44001 (192.168.239.186:44001) with 6 cores
24/10/12 23:32:29 INFO StandaloneSchedulerBackend: Granted executor ID app-20241012233229-0002/2 on hostPort 192.168.239.186:44001 with 6 cores, 1024.0 MB RAM
24/10/12 23:32:29 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
24/10/12 23:32:29 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.239.187, 39172, None)
24/10/12 23:32:29 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.239.187:39172 with 93.3 MB RAM, BlockManagerId(driver, 192.168.239.187, 39172, None)
24/10/12 23:32:29 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.239.187, 39172, None)
24/10/12 23:32:29 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.239.187, 39172, None)
24/10/12 23:32:29 INFO StandaloneAppClient$ClientEndpoint: Executor updated: app-20241012233229-0002/0 is now RUNNING
24/10/12 23:32:29 INFO StandaloneAppClient$ClientEndpoint: Executor updated: app-20241012233229-0002/2 is now RUNNING
24/10/12 23:32:29 INFO StandaloneAppClient$ClientEndpoint: Executor updated: app-20241012233229-0002/1 is now RUNNING
24/10/12 23:32:30 INFO EventLoggingListener: Logging events to hdfs://hdp1/spark/applicationHistorylog/app-20241012233229-0002
24/10/12 23:32:30 INFO StandaloneSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
24/10/12 23:32:30 INFO SharedState: Warehouse path is 'hdfs://hdp1/hiveData'.
24/10/12 23:32:31 INFO HiveUtils: Initializing HiveMetastoreConnection version 1.2.1 using file:/opt/hive-1.2.1/lib/*.jar,/path/to/hadoop/share/hadoop/common/lib/*.jar,/opt/hadoop-2.7.2/share/hadoop/common/lib/*.jar,/opt/hadoop-2.7.2/share/hadoop/yarn/lib/*.jar,/opt/spark-2.1.1/jars/*.jar
Exception in thread "main" java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':
        at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:981)
        at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
        at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
        at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
        at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
        at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
        at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
        at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
        at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
        at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:47)
        at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:81)
        at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.main(HiveThriftServer2.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:743)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
        ... 22 more
Caused by: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog':
        at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:169)
        at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
        at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
        at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
        at scala.Option.getOrElse(Option.scala:121)
        at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
        at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
        at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
        at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
        ... 27 more
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
        ... 35 more
Caused by: java.lang.ClassNotFoundException: java.lang.NoClassDefFoundError: org/apache/hadoop/hive/conf/HiveConf when creating Hive client using classpath: file:/opt/hive-1.2.1/lib/*.jar,/path/to/hadoop/share/hadoop/common/lib/*.jar,/opt/hadoop-2.7.2/share/hadoop/common/lib/*.jar,/opt/hadoop-2.7.2/share/hadoop/yarn/lib/*.jar,/opt/spark-2.1.1/jars/*.jar
Please make sure that jars for your version of hive and hadoop are included in the paths passed to spark.sql.hive.metastore.jars.
        at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:270)
        at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:358)
        at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:262)
        at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:66)
        ... 40 more
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
        ... 43 more
Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/hive/conf/HiveConf
        at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:97)
        ... 48 more
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1.doLoadClass(IsolatedClientLoader.scala:221)
        at org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1.loadClass(IsolatedClientLoader.scala:210)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        ... 49 more
24/10/12 23:32:31 INFO SparkContext: Invoking stop() from shutdown hook
24/10/12 23:32:31 INFO SparkUI: Stopped Spark web UI at http://192.168.239.187:4040
24/10/12 23:32:31 INFO StandaloneSchedulerBackend: Shutting down all executors
24/10/12 23:32:31 INFO CoarseGrainedSchedulerBackend$DriverEndpoint: Asking each executor to shut down
24/10/12 23:32:31 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
24/10/12 23:32:31 INFO MemoryStore: MemoryStore cleared
24/10/12 23:32:31 INFO BlockManager: BlockManager stopped
24/10/12 23:32:31 INFO BlockManagerMaster: BlockManagerMaster stopped
24/10/12 23:32:31 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
24/10/12 23:32:31 INFO SparkContext: Successfully stopped SparkContext
24/10/12 23:32:31 INFO ShutdownHookManager: Shutdown hook called
24/10/12 23:32:31 INFO ShutdownHookManager: Deleting directory /tmp/spark-7832ad16-3442-4a9f-99af-47f3a559098a

2 answers

  • 尘世壹俗人 2024-10-19 12:24

It was a version problem; after switching to a different Spark package it started fine.
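
A quick way to check whether a given Spark distribution already bundles the Hive client (in which case the default spark.sql.hive.metastore.jars value of "builtin" is sufficient) is to look under $SPARK_HOME/jars. A hedged sketch, assuming the install path from the question; a prebuilt Spark 2.1.1 package with Hive support ships hive-exec-1.2.1.spark2.jar and related jars there, while a build without Hive support does not:

# List any bundled Hive client jars in the Spark distribution from the question
ls /opt/spark-2.1.1/jars | grep -i hive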

This answer was accepted by the asker as the best answer.


Question timeline

  • Question closed: Oct 19
  • Answer accepted: Oct 19
  • Question created: Oct 12