Spark fails when reading Hive data with NoSuchMethodError: org.apache.spark.sql.catalyst.catalog.SessionCatalog.
The code is as follows:
import org.apache.spark.sql.SparkSession

val spark: SparkSession = SparkSession.builder
  .appName("distinct") // optional; autogenerated if not specified
  //.master("yarn") // avoid hardcoding the deployment environment
  .master("local[*]")
  .enableHiveSupport() // self-explanatory, isn't it?
  .getOrCreate()
spark.sql("use ods_hive_test")
The error log:
22/12/15 15:32:43 INFO BlockManagerMasterEndpoint: Registering block manager LAPTOP-62E6B54A:9900 with 1986.6 MB RAM, BlockManagerId(driver, LAPTOP-62E6B54A, 9900, None)
22/12/15 15:32:43 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, LAPTOP-62E6B54A, 9900, None)
22/12/15 15:32:43 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, LAPTOP-62E6B54A, 9900, None)
22/12/15 15:32:43 INFO SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/D:/JavaProject/SparkTest/Spark-restapi/springboot-spark-master/sparkjob/spark-warehouse/').
22/12/15 15:32:43 INFO SharedState: Warehouse path is 'file:/D:/JavaProject/SparkTest/Spark-restapi/springboot-spark-master/sparkjob/spark-warehouse/'.
22/12/15 15:32:43 INFO StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.sql.catalyst.catalog.SessionCatalog.<init>(Lscala/Function0;Lscala/Function0;Lorg/apache/spark/sql/catalyst/analysis/FunctionRegistry;Lorg/apache/spark/sql/internal/SQLConf;Lorg/apache/hadoop/conf/Configuration;Lorg/apache/spark/sql/catalyst/parser/ParserInterface;Lorg/apache/spark/sql/catalyst/catalog/FunctionResourceLoader;)V
at org.apache.spark.sql.hive.HiveSessionCatalog.<init>(HiveSessionCatalog.scala:50)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:53)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52)
at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anon$1.<init>(HiveSessionStateBuilder.scala:69)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.analyzer(HiveSessionStateBuilder.scala:69)
at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:79)
at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:79)
at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:57)
at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:55)
at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:47)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:74)
at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642)
at SparkTest$.main(SparkTest.scala:18)
at SparkTest.main(SparkTest.scala)
22/12/15 15:32:44 INFO SparkContext: Invoking stop() from shutdown hook
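A NoSuchMethodError on the SessionCatalog constructor like the one above typically means the JVM resolved org.apache.spark.sql.hive classes from one Spark version and org.apache.spark.sql.catalyst classes from another, so the constructor signature HiveSessionCatalog expects does not exist at runtime. A common way this happens is declaring spark-hive at a different version than spark-sql/spark-core in the build file. The following is a minimal sbt sketch of aligned dependencies; the version number and Scala binary version are assumptions and should be replaced with whatever your cluster actually runs:

```scala
// build.sbt (sketch) -- the one invariant is that every Spark artifact
// uses the same version; 2.4.8 / Scala 2.11 below are assumptions.
val sparkVersion = "2.4.8" // replace with your actual Spark version

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-sql"  % sparkVersion,
  // spark-hive must match spark-sql exactly: a mismatch changes the
  // SessionCatalog constructor signature and produces this NoSuchMethodError
  "org.apache.spark" %% "spark-hive" % sparkVersion
)
```

Checking the effective classpath (e.g. with `sbt dependencyTree` or `mvn dependency:tree`) will show whether a transitive dependency is pulling in a second, conflicting Spark version.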