Hive JDBC connection: SASL authentication problem

[Java code]
package org.neworigin.hive.Hive_JDBC;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class App {
    public static void main(String[] args) throws ClassNotFoundException, SQLException {
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        Connection conn = DriverManager.getConnection("jdbc:hive2://192.168.170.100:10000/student", " neworigin", "123456 ");
        PreparedStatement ppst = conn.prepareStatement("select * from jdbchive");
        ResultSet rs = ppst.executeQuery();
        while (rs.next()) {
            int id = rs.getInt("id");
            String name = rs.getString("name");
            int age = rs.getInt("age");
            System.out.println(id + "," + name + "," + age);
        }

        conn.close();
        ppst.cancel();
        rs.close();
    }
}
[Output]
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/C:/Users/Hw_PC/.m2/repository/org/apache/logging/log4j/log4j-slf4j-impl/2.4.1/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/C:/Users/Hw_PC/.m2/repository/org/slf4j/slf4j-log4j12/1.7.10/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
12,chw,22
22,zrt,23
Exception in thread "main" java.sql.SQLException: org.apache.thrift.transport.TTransportException: SASL authentication not complete
at org.apache.hive.jdbc.HiveStatement.cancel(HiveStatement.java:174)
at org.neworigin.hive.Hive_JDBC.App.main(App.java:24)
Caused by: org.apache.thrift.transport.TTransportException: SASL authentication not complete
at org.apache.thrift.transport.TSaslTransport.write(TSaslTransport.java:474)
at org.apache.thrift.transport.TSaslClientTransport.write(TSaslClientTransport.java:37)
at org.apache.thrift.protocol.TBinaryProtocol.writeI32(TBinaryProtocol.java:178)
at org.apache.thrift.protocol.TBinaryProtocol.writeMessageBegin(TBinaryProtocol.java:106)
at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:70)
at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:62)
at org.apache.hive.service.rpc.thrift.TCLIService$Client.send_CancelOperation(TCLIService.java:484)
at org.apache.hive.service.rpc.thrift.TCLIService$Client.CancelOperation(TCLIService.java:476)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.apache.hive.jdbc.HiveConnection$SynchronizedHandler.invoke(HiveConnection.java:1412)
at com.sun.proxy.$Proxy10.CancelOperation(Unknown Source)
at org.apache.hive.jdbc.HiveStatement.cancel(HiveStatement.java:168)
... 1 more

(All the rows in the table come back fine, but afterwards it always reports this authentication error... I don't get it...)

1 answer

The crux of this problem: you call conn.close() first, and only afterwards ppst.cancel() and rs.close(); the order is reversed! Closing the Connection tears down the underlying Thrift/SASL transport, so the subsequent cancel() RPC has no authenticated transport to run on and fails with "SASL authentication not complete" (the stack trace shows the failure happening inside HiveStatement.cancel). Release resources in the reverse order of creation: ResultSet, then Statement, then Connection. Also, cancel() is meant for interrupting a still-running query; once the query has finished, it can simply be dropped.
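For illustration, a minimal corrected version of the question's class (a sketch, with the stray spaces trimmed from the credentials). try-with-resources (Java 7+) closes resources automatically in reverse order of declaration, so the Connection always outlives the Statement and ResultSet:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class App {
    public static void main(String[] args) throws ClassNotFoundException, SQLException {
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        // Resources are closed automatically, last-declared first:
        // rs, then ppst, then conn -- the reverse of their creation order.
        try (Connection conn = DriverManager.getConnection(
                     "jdbc:hive2://192.168.170.100:10000/student", "neworigin", "123456");
             PreparedStatement ppst = conn.prepareStatement("select * from jdbchive");
             ResultSet rs = ppst.executeQuery()) {
            while (rs.next()) {
                System.out.println(rs.getInt("id") + "," + rs.getString("name") + "," + rs.getInt("age"));
            }
        } // no explicit close() calls, and no cancel() after the query has finished
    }
}
```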

Other related questions
Problems connecting Zeppelin to Hive and Spark

1. When connecting to Hive: Zeppelin goes through HiveServer2, and because there is a huge amount of metadata, Zeppelin seems to walk through all of it on every call, so each statement executes with a delay of over an hour. 2. Connecting to Spark SQL fails with: java.lang.NoSuchFieldError: HIVE_STATS_JDBC_TIMEOUT at org.apache.spark.sql.hive.HiveUtils$.hiveClientConfig

JDBC connection to Hive times out

hiveserver2 is started and its log looks normal, but connecting from Kettle or from my own Java code over JDBC both fail. The error log is as follows: java.sql.SQLException: Could not open connection to jdbc:hive2://192.168.162.129:10000/hivedb: java.net.ConnectException: Connection timed out: connect at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:206) at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:178) at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105) at java.sql.DriverManager.getConnection(DriverManager.java:582) at java.sql.DriverManager.getConnection(DriverManager.java:185) at com.ljq.hive.HiveJdbcClient.run(HiveJdbcClient.java:21) at com.ljq.hive.HiveJdbcClient.main(HiveJdbcClient.java:46) Caused by: org.apache.thrift.transport.TTransportException: java.net.ConnectException: Connection timed out: connect at org.apache.thrift.transport.TSocket.open(TSocket.java:185) at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:248) at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37) at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:203) ... 6 more Caused by: java.net.ConnectException: Connection timed out: connect at java.net.PlainSocketImpl.socketConnect(Native Method) at java.net.PlainSocketImpl.doConnect(PlainSocketImpl.java:351) at java.net.PlainSocketImpl.connectToAddress(PlainSocketImpl.java:213) at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:200) at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:366) at java.net.Socket.connect(Socket.java:529) at org.apache.thrift.transport.TSocket.open(TSocket.java:180) ... 9 more error. I really don't know what else to try.

Hive JDBC connection exception

Code: String driverName = "org.apache.hive.jdbc.HiveDriver"; String url = "jdbc:hive://192.168.1.108:10000/default"; String user = ""; String password = ""; String sql = ""; ResultSet res = null; Class.forName(driverName); Connection con = DriverManager.getConnection(url, user, password); Error: Exception in thread "main" java.sql.SQLException: No suitable driver found for jdbc:hive://192.168.1.108:10000/default at java.sql.DriverManager.getConnection(Unknown Source) at java.sql.DriverManager.getConnection(Unknown Source) All the required jars are present, yet it still throws... please help.
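For reference on that one: org.apache.hive.jdbc.HiveDriver is the HiveServer2 driver and registers for jdbc:hive2:// URLs, while jdbc:hive:// is the legacy HiveServer1 scheme (handled by the old org.apache.hadoop.hive.jdbc.HiveDriver), which is why DriverManager reports no suitable driver. A minimal sketch with the matching scheme, reusing the host and port from the question:

```java
// jdbc:hive2:// is the scheme org.apache.hive.jdbc.HiveDriver registers for;
// jdbc:hive:// belongs to the old HiveServer1 driver, so no driver matches it.
Class.forName("org.apache.hive.jdbc.HiveDriver");
Connection con = DriverManager.getConnection(
        "jdbc:hive2://192.168.1.108:10000/default", "", "");
```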

Saw this snippet in a script

HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-$SPARK_CONF_DIR/yarn-conf} HIVE_CONF_DIR=${HIVE_CONF_DIR:-/etc/hive/conf} if [ -d "$HIVE_CONF_DIR" ]; then HADOOP_CONF_DIR="$HADOOP_CONF_DIR:$HIVE_CONF_DIR" fi export HADOOP_CONF_DIR Some of the usages in here are unfamiliar to me and I'm afraid my understanding is off. Could someone good at shell give some pointers?

Connecting to Hive through the JDBC driver fails

Hive version: 1.2.1. Hive.java program snippet: Class.forName("org.apache.hive.jdbc.HiveDriver"); On CentOS, Hive.java was compiled with javac -classpath, with the relevant jars (hive-jdbc, hive-exec, hive-metastore) placed after -classpath. Compilation succeeded, and it was packaged into hive.jar with jar cvf. But running it with hadoop jar hive.jar com.aturbo.hadoop.Hive reports the error below: Exception in thread "main" java.lang.ClassNotFoundException: org.apache.hive.jdbc.HiveDriver at java.net.URLClassLoader$1.run(URLClassLoader.java:366) at java.net.URLClassLoader$1.run(URLClassLoader.java:355) at java.security.AccessController.doPrivileged(Native Method) at java.net.URLClassLoader.findClass(URLClassLoader.java:354) at java.lang.ClassLoader.loadClass(ClassLoader.java:425) at java.lang.ClassLoader.loadClass(ClassLoader.java:358) at java.lang.Class.forName0(Native Method) at java.lang.Class.forName(Class.java:195) at com.aturbo.hadoop.Hive.main(Hive.java:12) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:606) at org.apache.hadoop.util.RunJar.main(RunJar.java:160) No idea what the cause is, please help!

Spring Boot project throws NullPointerException after adding the hive-jdbc dependency?

1. A multi-module Spring Boot project that needs both HBase and Hive connections, one for matching data and one for writing data. 2. When the pom does not include the hive-jdbc dependency, the project starts normally, but when the interface flow is invoked the Hive driver cannot be found; the error is java.lang.ClassNotFoundException: org.apache.hive.jdbc.HiveDriver. So hive-jdbc has to be pulled in after all. 3. But after adding it, startup fails with a NullPointerException ``` org.springframework.context.ApplicationContextException: Unable to start embedded container; nested exception is java.lang.NullPointerException at org.springframework.boot.context.embedded.EmbeddedWebApplicationContext.onRefresh(EmbeddedWebApplicationContext.java:137) ~[spring-boot-1.5.2.RELEASE.jar:1.5.2.RELEASE] at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:536) ~[spring-context-4.3.7.RELEASE.jar:4.3.7.RELEASE] at org.springframework.boot.context.embedded.EmbeddedWebApplicationContext.refresh(EmbeddedWebApplicationContext.java:122) ~[spring-boot-1.5.2.RELEASE.jar:1.5.2.RELEASE] at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:737) [spring-boot-1.5.2.RELEASE.jar:1.5.2.RELEASE] at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:370) [spring-boot-1.5.2.RELEASE.jar:1.5.2.RELEASE] at org.springframework.boot.SpringApplication.run(SpringApplication.java:314) [spring-boot-1.5.2.RELEASE.jar:1.5.2.RELEASE] at org.springframework.boot.SpringApplication.run(SpringApplication.java:1162) [spring-boot-1.5.2.RELEASE.jar:1.5.2.RELEASE] at org.springframework.boot.SpringApplication.run(SpringApplication.java:1151) [spring-boot-1.5.2.RELEASE.jar:1.5.2.RELEASE] at cn.xxt.data.collect.DataCollectApplication.main(DataCollectApplication.java:12) [classes/:?] Caused by: java.lang.NullPointerException at org.springframework.util.ReflectionUtils.invokeMethod(ReflectionUtils.java:216) ~[spring-core-4.3.7.RELEASE.jar:4.3.7.RELEASE] at org.springframework.util.ReflectionUtils.invokeMethod(ReflectionUtils.java:201) ~[spring-core-4.3.7.RELEASE.jar:4.3.7.RELEASE] at org.springframework.boot.context.embedded.jetty.JettyEmbeddedServletContainerFactory$Jetty93SessionConfigurer.getSessionManager(JettyEmbeddedServletContainerFactory.java:1016) ~[spring-boot-1.5.2.RELEASE.jar:1.5.2.RELEASE] at org.springframework.boot.context.embedded.jetty.JettyEmbeddedServletContainerFactory$Jetty93SessionConfigurer.configure(JettyEmbeddedServletContainerFactory.java:1001) ~[spring-boot-1.5.2.RELEASE.jar:1.5.2.RELEASE] at org.springframework.boot.context.embedded.jetty.JettyEmbeddedServletContainerFactory.configureSession(JettyEmbeddedServletContainerFactory.java:376) ~[spring-boot-1.5.2.RELEASE.jar:1.5.2.RELEASE] at org.springframework.boot.context.embedded.jetty.JettyEmbeddedServletContainerFactory.configureWebAppContext(JettyEmbeddedServletContainerFactory.java:370) ~[spring-boot-1.5.2.RELEASE.jar:1.5.2.RELEASE] at org.springframework.boot.context.embedded.jetty.JettyEmbeddedServletContainerFactory.getEmbeddedServletContainer(JettyEmbeddedServletContainerFactory.java:174) ~[spring-boot-1.5.2.RELEASE.jar:1.5.2.RELEASE] at org.springframework.boot.context.embedded.EmbeddedWebApplicationContext.createEmbeddedServletContainer(EmbeddedWebApplicationContext.java:164) ~[spring-boot-1.5.2.RELEASE.jar:1.5.2.RELEASE] at org.springframework.boot.context.embedded.EmbeddedWebApplicationContext.onRefresh(EmbeddedWebApplicationContext.java:134) ~[spring-boot-1.5.2.RELEASE.jar:1.5.2.RELEASE] ... 8 more ``` 4. My pom file is below; can any expert see what's wrong?
``` <parent> <artifactId>demo-hbase</artifactId> <groupId>com.demo</groupId> <version>1.0.0-SNAPSHOT</version> </parent> <modelVersion>4.0.0</modelVersion> <artifactId>demo-hbase-common</artifactId> <dependencies> <dependency> <groupId>org.apache.hadoop</groupId> <artifactId>hadoop-client</artifactId> </dependency> <dependency> <groupId>org.apache.hbase</groupId> <artifactId>hbase-client</artifactId> </dependency> <dependency> <groupId>org.apache.hive</groupId> <artifactId>hive-jdbc</artifactId> <exclusions> <exclusion> <groupId>org.eclipse.jetty.aggregate</groupId> <artifactId>jetty-all</artifactId> </exclusion> <exclusion> <groupId>org.apache.hive</groupId> <artifactId>hive-shims</artifactId> </exclusion> </exclusions> </dependency> </dependencies> ```

Spark reading a Hive table over JDBC throws an error (running inside Zeppelin)

## Code: import org.apache.spark.sql.hive.HiveContext val pro = new java.util.Properties() pro.setProperty("user", "****") pro.setProperty("password", "*****") val driverName = "org.apache.hadoop.hive.jdbc.HiveDriver"; Class.forName(driverName); val hiveContext = new HiveContext(sc) val hivetable = hiveContext.read.jdbc("jdbc:hive://*****/default", "*****", pro); ## Error: import org.apache.spark.sql.hive.HiveContext pro: java.util.Properties = {} res15: Object = null res16: Object = null driverName: String = org.apache.hadoop.hive.jdbc.HiveDriver res17: Class[_] = class org.apache.hadoop.hive.jdbc.HiveDriver warning: there was one deprecation warning; re-run with -deprecation for details hiveContext: org.apache.spark.sql.hive.HiveContext = org.apache.spark.sql.hive.HiveContext@14f9cc13 java.sql.SQLException: Method not supported at org.apache.hadoop.hive.jdbc.HiveResultSetMetaData.isSigned(Unknown Source) at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.getSchema(JdbcUtils.scala:232) at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:64) at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation.<init>(JDBCRelation.scala:113) at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:45) at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:330) at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:152) at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:125) at org.apache.spark.sql.DataFrameReader.jdbc(DataFrameReader.scala:166) ... 46 elided
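A side note on the quoted trace: it fails inside org.apache.hadoop.hive.jdbc.HiveResultSetMetaData.isSigned, a metadata method the old HiveServer1 driver does not implement but which Spark's JDBC reader always calls. Since a HiveContext is already available in the snippet, reading the table natively through the metastore avoids JDBC entirely; a sketch under that assumption (the table name is a placeholder, and the usual Spark SQL imports of Dataset and Row are assumed):

```java
// Read the Hive table through the metastore instead of JDBC, sidestepping
// the unimplemented ResultSetMetaData.isSigned call in the old driver.
// "default.tablename" is a placeholder for the real table.
Dataset<Row> hivetable = hiveContext.table("default.tablename");
```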

When connecting to Hive over JDBC, the ResultSet returns no data; the row is null

The code is as follows: public class ExtractJob { public static void main(String[] args) { String driverName = "org.apache.hive.jdbc.HiveDriver"; String url = "jdbc:hive2://***.***.***.***:10000/default"; Connection conn = null; Statement state = null; ResultSet rs = null; try { Class.forName(driverName); conn = DriverManager.getConnection(url,"hive","hive"); state = conn.createStatement(); state.execute("use test"); rs = state.executeQuery("select * from test1"); int columnCount = rs.getMetaData().getColumnCount(); String str = ""; while(rs.next()){ for(int i = 0;i<columnCount;i++){ str+=rs.getString(i); } System.out.println(str); } } catch (SQLException e) { e.printStackTrace(); }catch (ClassNotFoundException e) { e.printStackTrace(); }finally{ try { rs.close(); state.close(); conn.close(); } catch (SQLException e) { // TODO Auto-generated catch block e.printStackTrace(); }finally{ rs = null; state = null; conn = null; } } } }
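One thing worth pointing out in the quoted loop, independent of the reported symptom: JDBC column indexes are 1-based, so rs.getString(0) throws rather than returning data, and str is never reset between rows. A corrected fragment with the same names as the question's code:

```java
// JDBC column indexes start at 1, not 0; also reset the buffer per row.
while (rs.next()) {
    StringBuilder row = new StringBuilder();
    for (int i = 1; i <= columnCount; i++) {
        row.append(rs.getString(i));
    }
    System.out.println(row);
}
```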

Running hive-jdbc code with INSERT/VALUES: why does it work only the first time and fail on the second run?

Question: why does the INSERT/VALUES syntax that the hive 1.1.0 shell does not support run in hive-jdbc code? And why does only the first run succeed while the second run fails? Description: INSERT/VALUES syntax not supported in the hive 1.1.0 shell: ![screenshot](https://img-ask.csdn.net/upload/201903/11/1552294834_736720.png) Why does it run only the first time through hive-jdbc: ![screenshot](https://img-ask.csdn.net/upload/201903/11/1552294947_619465.png) ======= ![screenshot](https://img-ask.csdn.net/upload/201903/11/1552294874_478366.png) And then the immediately following second execution reports a SQL syntax error? ![screenshot](https://img-ask.csdn.net/upload/201903/11/1552295708_930275.png) PS: line 67 corresponds to the execution of that SQL statement.

Java web application connecting to Hive

I created a Java project and tested the JDBC Hive connection method by calling it directly from main; that works and all queries are fine. But calling the same query method from a servlet throws java.lang.ClassNotFoundException: org.apache.hive.jdbc.HiveDriver. The Hive jars are all there and added to the project's build path. I really can't figure it out: the code is all in one project, the method runs fine from main but fails in the servlet. Has anyone hit a similar problem? What is the essential difference between invoking through a servlet and invoking from main?

Hive JDBC execution fails with a MoveTask error

Full error: java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MoveTask at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:296) at org.apache.hive.jdbc.HiveStatement.executeUpdate(HiveStatement.java:406) at com.dmp.dmp_hive.service.Media_toolsService.loadDate(Media_toolsService.java:96) at com.dmp.dmp_hive.service.Media_toolsService.run(Media_toolsService.java:38) at com.dmp.dmp_hive.test.Test.main(Test.java:48)

Connecting to a non-default Hive database over JDBC

Connection con = DriverManager.getConnection("jdbc:hive2://10.0.31.89:10000/wsx", "root", ""); Specifying the database in the URL has no effect; I can only operate on the default database. How do I switch databases?
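For reference: some older hive-jdbc versions ignored the database segment of the connection URL, in which case every session starts in default; the usual workaround is to issue a USE statement right after connecting. A sketch with the question's connection (the version caveat is from memory, so treat it as an assumption):

```java
// If the /wsx segment of the URL is ignored by an older driver,
// switch databases explicitly after connecting.
Connection con = DriverManager.getConnection(
        "jdbc:hive2://10.0.31.89:10000/wsx", "root", "");
try (Statement st = con.createStatement()) {
    st.execute("USE wsx"); // later statements on this connection run against wsx
}
```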

SQL runs correctly in Hive, but hive-jdbc reports a syntax error at an alias

The SQL executes correctly in Hive, but when run through hive-jdbc with ResultSet rs = st.executeQuery(sql), it reports a syntax error at the alias ![screenshot](https://img-ask.csdn.net/upload/202001/03/1578040942_99148.png) SQL: SELECT aa.customerid FROM ( (SELECT customerid FROM oder WHERE saleno = 101870 AND orderstatus NOT IN (1000, 1007, 1008) AND obcustomertype != 1004 AND source = 1001 AND zipcode != 'null') as `aa` LEFT JOIN (SELECT customerid FROM oder WHERE saleno IN ( 101345, 101955, 101000, 101099, 101362 ) AND orderstatus NOT IN (1000, 1007, 1008) AND obcustomertype != 1004 AND source = 1001 AND zipcode != 'null') as `bb` ON aa.customerid = bb.customerid ) WHERE bb.customerid IS NULL

[Hive question] How exactly do I use LOAD to load a local txt file into a table?

The company gave me a Hive connection address, and I connect to Hive over JDBC from my own machine. I have a map holding many rows that need to go into Hive. Inserting 500 rows with INSERT INTO took 10 minutes, painfully slow, so I want to try bulk loading with LOAD, and I found this statement: load data local inpath 'wyp.txt' into table wyp; Now the question: when using LOAD, does the local file have to be on the same server as Hive? I'm on CentOS, and no matter where I put the txt file, the statement reports that the file cannot be found. Is that because they are not on the same server?
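One likely explanation: when LOAD DATA LOCAL INPATH is submitted over JDBC, "local" is resolved on the machine where HiveServer2 runs, not on the JDBC client, so a file that exists only on the client is never found. Two common workarounds are copying the file to the HiveServer2 host first, or uploading it to HDFS and dropping the LOCAL keyword. A sketch of the second route over the same JDBC connection (the /tmp/wyp.txt HDFS path is hypothetical):

```java
// Assumes the file was first uploaded to HDFS, e.g. with:
//   hdfs dfs -put wyp.txt /tmp/wyp.txt
// Without LOCAL, the path is resolved inside HDFS, so which machine the
// JDBC client runs on no longer matters.
try (Statement st = conn.createStatement()) {
    st.execute("LOAD DATA INPATH '/tmp/wyp.txt' INTO TABLE wyp");
}
```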

Experts, please help me fix this bug!!

Exception in thread "main" java.sql.SQLException: Could not open client transport with JDBC Uri: jdbc:hive2://192.168.42.130:10000/hive_1: null at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:209) at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107) at java.sql.DriverManager.getConnection(Unknown Source) at java.sql.DriverManager.getConnection(Unknown Source) at hive.JDBC_Hive.main(JDBC_Hive.java:16) Caused by: org.apache.thrift.transport.TTransportException at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132) at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86) at org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:178) at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:307) at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37) at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:227) at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:182) ... 4 more The above is the error message. The code is as follows: package hive; import java.sql.Connection; import java.sql.DriverManager; import java.sql.PreparedStatement; import java.sql.ResultSet; public class JDBC_Hive { public static void main(String[] args) throws Throwable { try { Class.forName("org.apache.hive.jdbc.HiveDriver"); } catch (Exception e) { e.printStackTrace(); } String url = "jdbc:hive2://192.168.42.130:10000/hive_1"; Connection conn = DriverManager.getConnection(url,"root","root"); String sql = "select count(*) num from nnn"; PreparedStatement prepareStatement = conn.prepareStatement(sql); long time = System.currentTimeMillis(); ResultSet rs = prepareStatement.executeQuery(sql); while (rs.next()) { int num = rs.getInt("num"); System.out.println(num); } long timeUsed = System.currentTimeMillis() - time; System.out.println("time " + timeUsed + "mm"); rs.close(); prepareStatement.close(); conn.close(); } } hiveserver2 is up and thrift is on, but JDBC still cannot connect to Hive. Crowdsourcing a solution!

Hive JDBC connection fails

Connecting to Hive with JDBC fails. The server-side log shows the following: [HiveServer2-Handler-Pool: Thread-28]: server.TThreadPoolServer (TThreadPoolServer.java:run(253)) - Error occurred during processing of message. java.lang.RuntimeException: org.apache.thrift.transport.TTransportException: Invalid status -128 at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:219) at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:227) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) at java.lang.Thread.run(Thread.java:744) Caused by: org.apache.thrift.transport.TTransportException: Invalid status -128 at org.apache.thrift.transport.TSaslTransport.sendAndThrowMessage(TSaslTransport.java:230) at org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:184) at org.apache.thrift.transport.TSaslServerTransport.handleSaslStartMessage(TSaslServerTransport.java:125) at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:262) at org.apache.thrift.transport.TSaslServerTransport.open(TSaslServerTransport.java:41) at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:216) ... 4 more
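For reference, "Invalid status -128" during the TSaslServerTransport handshake usually means one side is speaking SASL and the other is not: -128 (0x80) is the first byte of a plain Thrift binary-protocol frame arriving where a SASL negotiation message was expected, e.g. a client configured with auth=noSasl (or a mismatched client version) talking to a SASL-enabled server. The fix is to make both sides agree; a fragment showing the client-side URL parameter for the NOSASL case (host and database are placeholders, and the diagnosis is an inference from the quoted trace):

```java
// Only use auth=noSasl if HiveServer2 itself is configured with
// hive.server2.authentication=NOSASL; otherwise leave it out so the
// client performs the SASL handshake the server expects.
Connection conn = DriverManager.getConnection(
        "jdbc:hive2://host:10000/default;auth=noSasl", "user", "");
```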

Unable to read HiveServer2 uri from ZooKeeper

Exception in thread "main" java.sql.SQLException: org.apache.hive.jdbc.ZooKeeperHiveClientException: Unable to read HiveServer2 uri from ZooKeeper at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:127) at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105) at java.sql.DriverManager.getConnection(DriverManager.java:664) at java.sql.DriverManager.getConnection(DriverManager.java:247) at JDBCExample.main(JDBCExample.java:142) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:497) at com.intellij.rt.execution.application.AppMain.main(AppMain.java:140) Caused by: org.apache.hive.jdbc.ZooKeeperHiveClientException: Unable to read HiveServer2 uri from ZooKeeper at org.apache.hive.jdbc.ZooKeeperHiveClientHelper.getNextServerUriFromZooKeeper(ZooKeeperHiveClientHelper.java:109) at org.apache.hive.jdbc.Utils.resolveAuthorityUsingZooKeeper(Utils.java:492) at org.apache.hive.jdbc.Utils.resolveAuthority(Utils.java:464) at org.apache.hive.jdbc.Utils.parseURL(Utils.java:371) at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:125) ... 9 more Caused by: org.apache.zookeeper.KeeperException$ConnectionLossException: KeeperErrorCode = ConnectionLoss for /hiveserver2 at org.apache.zookeeper.KeeperException.create(KeeperException.java:99) at org.apache.zookeeper.KeeperException.create(KeeperException.java:51) at org.apache.zookeeper.ZooKeeper.getChildren(ZooKeeper.java:2231) at org.apache.curator.framework.imps.GetChildrenBuilderImpl$3.call(GetChildrenBuilderImpl.java:214) at org.apache.curator.framework.imps.GetChildrenBuilderImpl$3.call(GetChildrenBuilderImpl.java:203) at org.apache.curator.RetryLoop.callWithRetry(RetryLoop.java:107) at org.apache.curator.framework.imps.GetChildrenBuilderImpl.pathInForeground(GetChildrenBuilderImpl.java:199) at org.apache.curator.framework.imps.GetChildrenBuilderImpl.forPath(GetChildrenBuilderImpl.java:191) at org.apache.curator.framework.imps.GetChildrenBuilderImpl.forPath(GetChildrenBuilderImpl.java:38) at org.apache.hive.jdbc.ZooKeeperHiveClientHelper.getNextServerUriFromZooKeeper(ZooKeeperHiveClientHelper.java:91) ... 13 more

Hive beeline connection: User: root is not allowed to impersonate root

beeline cannot connect; this has troubled me for half a month, please give me some pointers. My Hadoop deployment is single-node. Hive itself can run queries and create databases. Using !connect jdbc:hive2://devcrm:10000 hits a permission problem ``` beeline> !connect jdbc:hive2://devcrm:10000 Connecting to jdbc:hive2://devcrm:10000 Enter username for jdbc:hive2://devcrm:10000: hadoop Enter password for jdbc:hive2://devcrm:10000: ****** 19/04/23 15:36:53 [main]: WARN jdbc.HiveConnection: Failed to connect to devcrm:10000 Error: Could not open client transport with JDBC Uri: jdbc:hive2://devcrm:10000: Failed to open new session: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: root is not allowed to impersonate hadoop (state=08S01,code=0) ``` Connecting with beeline -u jdbc:hive2//devcrm:10000 -n hadoop does not work either ``` [root@devcrm hadoop]# beeline -u jdbc:hive2//devcrm:10000 -n hadoop SLF4J: Class path contains multiple SLF4J bindings. SLF4J: Found binding in [jar:file:/usr/local/kafka/hive/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class] SLF4J: Found binding in [jar:file:/usr/local/kafka/hadoop-2.7.6/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class] SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation. SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory] scan complete in 1ms scan complete in 963ms No known driver to handle "jdbc:hive2//devcrm:10000" Beeline version 2.3.0 by Apache Hive ``` hive-site.xml file ``` <configuration> <property> <name>javax.jdo.option.ConnectionUserName</name> <value>root</value> <description>username to use against metastore database</description> </property> <property> <name>javax.jdo.option.ConnectionPassword</name> <value>123</value> <description>password to use against metastore database</description> </property> <property> <name>javax.jdo.option.ConnectionURL</name> <value>jdbc:mysql://192.168.12.77:3306/hive?createDatabaseIfNotExist=true</value> <description>JDBC connect string for a JDBC metastore</description> </property> <property> <name>javax.jdo.option.ConnectionDriverName</name> <value>com.mysql.jdbc.Driver</value> <description>Driver class name for a JDBC metastore</description> </property> <property> <name>hive.server2.thrift.client.user</name> <value>hadoop</value> <description>Username to use against thrift client</description> </property> <property> <name>hive.server2.thrift.client.password</name> <value>hadoop</value> <description>Password to use against thrift client</description> </property> ``` core-site.xml file ``` <configuration> <!-- namenode address --> <property> <name>fs.defaultFS</name> <value>hdfs://192.168.11.207:9000</value> </property> <!-- directory for files generated while using hadoop --> <property> <name>hadoop.tmp.dir</name> <!--<value>file:/usr/local/kafka/hadoop-2.7.6/tmp</value>--> <value>file:/home/hadoop/temp</value> </property> <!-- maximum interval for checkpoint backup logs --> <!-- <name>fs.checkpoint.period</name> <value>3600</value> --> <!-- hadoop proxy-user settings --> <property> <!-- the proxy user root may access the hdfs cluster from any host --> <name>hadoop.proxyuser.root.hosts</name> <value>*</value> </property> <property> <!-- groups the proxied users may belong to --> <name>hadoop.proxyuser.root.groups</name> <value>*</value> </property> </configuration> ``` hdfs-site.xml file ``` <configuration> <!-- number of replicas for data stored in hdfs --> <property> <name>dfs.replication</name> <value>1</value> </property> <!-- namenode storage location --> <property> <name>dfs.namenode.name.dir</name> <value>file:/usr/local/kafka/hadoop-2.7.6/tmp/dfs/name</value> </property> <!-- datanode storage location --> <property> 
<name>dfs.datanode.data.dir</name> <value>file:/usr/local/kafka/hadoop-2.7.6/tmp/dfs/data</value> </property> <property> <name>dfs.secondary.http.address</name> <value>192.168.11.207:50090</value> </property> <property> <name>dfs.permissions</name> <value>false</value> </property> <!-- enable webhdfs --> <property> <name>dfs.webhdfs.enabled</name> <value>true</value> </property> </configuration> ``` The page at http://192.168.11.207:10002/ shows HiveServer2's start time ![screenshot](https://img-ask.csdn.net/upload/201904/23/1556005658_291513.png) Hive log: ``` 2019-04-24T09:20:11,829 INFO [main] http.HttpServer: Started HttpServer[hiveserver2] on port 10002 2019-04-24T09:20:50,464 INFO [HiveServer2-Handler-Pool: Thread-38] thrift.ThriftCLIService: Client protocol version: HIVE_CLI_SERVICE_PROTOCOL_V10 2019-04-24T09:20:50,494 INFO [HiveServer2-Handler-Pool: Thread-38] conf.HiveConf: Using the default value passed in for log id: b0f59ac1-d17a-404f-8bf5-fbe4693c9964 2019-04-24T09:20:50,494 INFO [b0f59ac1-d17a-404f-8bf5-fbe4693c9964 HiveServer2-Handler-Pool: Thread-38] conf.HiveConf: Using the default value passed in for log id: b0f59ac1-d17a-404f-8bf5-fbe4693c9964 2019-04-24T09:20:50,494 INFO [HiveServer2-Handler-Pool: Thread-38] conf.HiveConf: Using the default value passed in for log id: b0f59ac1-d17a-404f-8bf5-fbe4693c9964 2019-04-24T09:20:50,495 INFO [b0f59ac1-d17a-404f-8bf5-fbe4693c9964 HiveServer2-Handler-Pool: Thread-38] conf.HiveConf: Using the default value passed in for log id: b0f59ac1-d17a-404f-8bf5-fbe4693c9964 2019-04-24T09:20:50,494 WARN [HiveServer2-Handler-Pool: Thread-38] service.CompositeService: Failed to open session java.lang.RuntimeException: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: root is not allowed to impersonate hadoop at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:89) ~[hive-service-2.3.0.jar:2.3.0] at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:36) ~[hive-service-2.3.0.jar:2.3.0] at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:63) ~[hive-service-2.3.0.jar:2.3.0] at java.security.AccessController.doPrivileged(Native Method) ~[?:1.7.0_80] at javax.security.auth.Subject.doAs(Subject.java:415) ~[?:1.7.0_80] at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1758) ~[hadoop-common-2.7.6.jar:?] at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:59) ~[hive-service-2.3.0.jar:2.3.0] at com.sun.proxy.$Proxy36.open(Unknown Source) ~[?:?] 
at org.apache.hive.service.cli.session.SessionManager.createSession(SessionManager.java:410) ~[hive-service-2.3.0.jar:2.3.0] at org.apache.hive.service.cli.session.SessionManager.openSession(SessionManager.java:362) ~[hive-service-2.3.0.jar:2.3.0] at org.apache.hive.service.cli.CLIService.openSessionWithImpersonation(CLIService.java:193) ~[hive-service-2.3.0.jar:2.3.0] at org.apache.hive.service.cli.thrift.ThriftCLIService.getSessionHandle(ThriftCLIService.java:440) ~[hive-service-2.3.0.jar:2.3.0] at org.apache.hive.service.cli.thrift.ThriftCLIService.OpenSession(ThriftCLIService.java:322) ~[hive-service-2.3.0.jar:2.3.0] at org.apache.hive.service.rpc.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1377) ~[hive-exec-2.3.0.jar:2.3.0] at org.apache.hive.service.rpc.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1362) ~[hive-exec-2.3.0.jar:2.3.0] at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39) ~[hive-exec-2.3.0.jar:2.3.0] at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39) ~[hive-exec-2.3.0.jar:2.3.0] at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:56) ~[hive-service-2.3.0.jar:2.3.0] at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286) ~[hive-exec-2.3.0.jar:2.3.0] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [?:1.7.0_80] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [?:1.7.0_80] at java.lang.Thread.run(Thread.java:745) [?:1.7.0_80] Caused by: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: root is not allowed to impersonate hadoop at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:606) ~[hive-exec-2.3.0.jar:2.3.0] at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:544) ~[hive-exec-2.3.0.jar:2.3.0] at org.apache.hive.service.cli.session.HiveSessionImpl.open(HiveSessionImpl.java:164) ~[hive-service-2.3.0.jar:2.3.0] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.7.0_80] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) ~[?:1.7.0_80] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.7.0_80] at java.lang.reflect.Method.invoke(Method.java:606) ~[?:1.7.0_80] at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:78) ~[hive-service-2.3.0.jar:2.3.0] ... 21 more Caused by: org.apache.hadoop.ipc.RemoteException: User: root is not allowed to impersonate hadoop at org.apache.hadoop.ipc.Client.call(Client.java:1476) ~[hadoop-common-2.7.6.jar:?] at org.apache.hadoop.ipc.Client.call(Client.java:1413) ~[hadoop-common-2.7.6.jar:?] at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) ~[hadoop-common-2.7.6.jar:?] at com.sun.proxy.$Proxy29.getFileInfo(Unknown Source) ~[?:?] at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:776) ~[hadoop-hdfs-2.7.6.jar:?] 
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.7.0_80] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) ~[?:1.7.0_80] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.7.0_80] at java.lang.reflect.Method.invoke(Method.java:606) ~[?:1.7.0_80] at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) ~[hadoop-common-2.7.6.jar:?] at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.7.6.jar:?] at com.sun.proxy.$Proxy30.getFileInfo(Unknown Source) ~[?:?] at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2117) ~[hadoop-hdfs-2.7.6.jar:?] at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1305) ~[hadoop-hdfs-2.7.6.jar:?] at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301) ~[hadoop-hdfs-2.7.6.jar:?] at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) ~[hadoop-common-2.7.6.jar:?] at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1317) ~[hadoop-hdfs-2.7.6.jar:?] at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1425) ~[hadoop-common-2.7.6.jar:?] at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:704) ~[hive-exec-2.3.0.jar:2.3.0] at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:650) ~[hive-exec-2.3.0.jar:2.3.0] at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:582) ~[hive-exec-2.3.0.jar:2.3.0] at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:544) ~[hive-exec-2.3.0.jar:2.3.0] at org.apache.hive.service.cli.session.HiveSessionImpl.open(HiveSessionImpl.java:164) ~[hive-service-2.3.0.jar:2.3.0] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.7.0_80] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) ~[?:1.7.0_80] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.7.0_80] at java.lang.reflect.Method.invoke(Method.java:606) ~[?:1.7.0_80] at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:78) ~[hive-service-2.3.0.jar:2.3.0] ... 
21 more 2019-04-24T09:20:50,494 INFO [HiveServer2-Handler-Pool: Thread-38] session.SessionState: Updating thread name to b0f59ac1-d17a-404f-8bf5-fbe4693c9964 HiveServer2-Handler-Pool: Thread-38 2019-04-24T09:20:50,494 INFO [HiveServer2-Handler-Pool: Thread-38] session.SessionState: Resetting thread name to HiveServer2-Handler-Pool: Thread-38 2019-04-24T09:20:50,494 INFO [HiveServer2-Handler-Pool: Thread-38] session.SessionState: Updating thread name to b0f59ac1-d17a-404f-8bf5-fbe4693c9964 HiveServer2-Handler-Pool: Thread-38 2019-04-24T09:20:50,495 INFO [HiveServer2-Handler-Pool: Thread-38] session.SessionState: Resetting thread name to HiveServer2-Handler-Pool: Thread-38 2019-04-24T09:20:50,509 WARN [HiveServer2-Handler-Pool: Thread-38] thrift.ThriftCLIService: Error opening session: org.apache.hive.service.cli.HiveSQLException: Failed to open new session: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: root is not allowed to impersonate hadoop at org.apache.hive.service.cli.session.SessionManager.createSession(SessionManager.java:419) ~[hive-service-2.3.0.jar:2.3.0] at org.apache.hive.service.cli.session.SessionManager.openSession(SessionManager.java:362) ~[hive-service-2.3.0.jar:2.3.0] at org.apache.hive.service.cli.CLIService.openSessionWithImpersonation(CLIService.java:193) ~[hive-service-2.3.0.jar:2.3.0] at org.apache.hive.service.cli.thrift.ThriftCLIService.getSessionHandle(ThriftCLIService.java:440) ~[hive-service-2.3.0.jar:2.3.0] at org.apache.hive.service.cli.thrift.ThriftCLIService.OpenSession(ThriftCLIService.java:322) ~[hive-service-2.3.0.jar:2.3.0] at org.apache.hive.service.rpc.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1377) ~[hive-exec-2.3.0.jar:2.3.0] at org.apache.hive.service.rpc.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1362) ~[hive-exec-2.3.0.jar:2.3.0] at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39) ~[hive-exec-2.3.0.jar:2.3.0] at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39) ~[hive-exec-2.3.0.jar:2.3.0] at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:56) ~[hive-service-2.3.0.jar:2.3.0] at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286) ~[hive-exec-2.3.0.jar:2.3.0] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [?:1.7.0_80] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [?:1.7.0_80] at java.lang.Thread.run(Thread.java:745) [?:1.7.0_80] Caused by: java.lang.RuntimeException: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: root is not allowed to impersonate hadoop at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:89) ~[hive-service-2.3.0.jar:2.3.0] at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:36) ~[hive-service-2.3.0.jar:2.3.0] at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:63) ~[hive-service-2.3.0.jar:2.3.0] at java.security.AccessController.doPrivileged(Native Method) ~[?:1.7.0_80] at javax.security.auth.Subject.doAs(Subject.java:415) ~[?:1.7.0_80] at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1758) ~[hadoop-common-2.7.6.jar:?] 
at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:59) ~[hive-service-2.3.0.jar:2.3.0] at com.sun.proxy.$Proxy36.open(Unknown Source) ~[?:?] at org.apache.hive.service.cli.session.SessionManager.createSession(SessionManager.java:410) ~[hive-service-2.3.0.jar:2.3.0] ... 13 more Caused by: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: root is not allowed to impersonate hadoop at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:606) ~[hive-exec-2.3.0.jar:2.3.0] at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:544) ~[hive-exec-2.3.0.jar:2.3.0] at org.apache.hive.service.cli.session.HiveSessionImpl.open(HiveSessionImpl.java:164) ~[hive-service-2.3.0.jar:2.3.0] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.7.0_80] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) ~[?:1.7.0_80] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.7.0_80] at java.lang.reflect.Method.invoke(Method.java:606) ~[?:1.7.0_80] at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:78) ~[hive-service-2.3.0.jar:2.3.0] at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:36) ~[hive-service-2.3.0.jar:2.3.0] at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:63) ~[hive-service-2.3.0.jar:2.3.0] at java.security.AccessController.doPrivileged(Native Method) ~[?:1.7.0_80] at javax.security.auth.Subject.doAs(Subject.java:415) ~[?:1.7.0_80] at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1758) ~[hadoop-common-2.7.6.jar:?] at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:59) ~[hive-service-2.3.0.jar:2.3.0] at com.sun.proxy.$Proxy36.open(Unknown Source) ~[?:?] at org.apache.hive.service.cli.session.SessionManager.createSession(SessionManager.java:410) ~[hive-service-2.3.0.jar:2.3.0] ... 13 more Caused by: org.apache.hadoop.ipc.RemoteException: User: root is not allowed to impersonate hadoop at org.apache.hadoop.ipc.Client.call(Client.java:1476) ~[hadoop-common-2.7.6.jar:?] at org.apache.hadoop.ipc.Client.call(Client.java:1413) ~[hadoop-common-2.7.6.jar:?] at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) ~[hadoop-common-2.7.6.jar:?] at com.sun.proxy.$Proxy29.getFileInfo(Unknown Source) ~[?:?] at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:776) ~[hadoop-hdfs-2.7.6.jar:?] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.7.0_80] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) ~[?:1.7.0_80] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.7.0_80] at java.lang.reflect.Method.invoke(Method.java:606) ~[?:1.7.0_80] at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) ~[hadoop-common-2.7.6.jar:?] at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.7.6.jar:?] at com.sun.proxy.$Proxy30.getFileInfo(Unknown Source) ~[?:?] at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2117) ~[hadoop-hdfs-2.7.6.jar:?] 
at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1305) ~[hadoop-hdfs-2.7.6.jar:?] at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301) ~[hadoop-hdfs-2.7.6.jar:?] at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) ~[hadoop-common-2.7.6.jar:?] at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1317) ~[hadoop-hdfs-2.7.6.jar:?] at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1425) ~[hadoop-common-2.7.6.jar:?] at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:704) ~[hive-exec-2.3.0.jar:2.3.0] at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:650) ~[hive-exec-2.3.0.jar:2.3.0] at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:582) ~[hive-exec-2.3.0.jar:2.3.0] at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:544) ~[hive-exec-2.3.0.jar:2.3.0] at org.apache.hive.service.cli.session.HiveSessionImpl.open(HiveSessionImpl.java:164) ~[hive-service-2.3.0.jar:2.3.0] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.7.0_80] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) ~[?:1.7.0_80] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.7.0_80] at java.lang.reflect.Method.invoke(Method.java:606) ~[?:1.7.0_80] at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:78) ~[hive-service-2.3.0.jar:2.3.0] at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:36) ~[hive-service-2.3.0.jar:2.3.0] at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:63) ~[hive-service-2.3.0.jar:2.3.0] at java.security.AccessController.doPrivileged(Native Method) ~[?:1.7.0_80] at javax.security.auth.Subject.doAs(Subject.java:415) ~[?:1.7.0_80] at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1758) ~[hadoop-common-2.7.6.jar:?] at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:59) ~[hive-service-2.3.0.jar:2.3.0] at com.sun.proxy.$Proxy36.open(Unknown Source) ~[?:?] at org.apache.hive.service.cli.session.SessionManager.createSession(SessionManager.java:410) ~[hive-service-2.3.0.jar:2.3.0] ... 13 more ```

Spark JDBC connection to Impala reports Method not supported

Hello everyone. My Spark is 2.1.0, with hive-jdbc 2.1.0. When writing to Impala I now get the following error: java.sql.SQLException: Method not supported at org.apache.hive.jdbc.HivePreparedStatement.addBatch(HivePreparedStatement.java:75) at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.savePartition(JdbcUtils.scala:589) at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$saveTable$1.apply(JdbcUtils.scala:670) at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$saveTable$1.apply(JdbcUtils.scala:670) at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$29.apply(RDD.scala:925) at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$29.apply(RDD.scala:925) at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1944) at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1944) at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87) at org.apache.spark.scheduler.Task.run(Task.scala:99) at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:322) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) at java.lang.Thread.run(Thread.java:745) Driver stacktrace: at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1435) at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1423) at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1422) at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59) at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48) at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1422) at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:802) at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:802) at scala.Option.foreach(Option.scala:257) at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:802) at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1650) at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1605) at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1594) at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48) at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:628) at org.apache.spark.SparkContext.runJob(SparkContext.scala:1918) at org.apache.spark.SparkContext.runJob(SparkContext.scala:1931) at org.apache.spark.SparkContext.runJob(SparkContext.scala:1944) at org.apache.spark.SparkContext.runJob(SparkContext.scala:1958) at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:925) at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:923) at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151) at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112) at org.apache.spark.rdd.RDD.withScope(RDD.scala:362) at org.apache.spark.rdd.RDD.foreachPartition(RDD.scala:923) at org.apache.spark.sql.Dataset$$anonfun$foreachPartition$1.apply$mcV$sp(Dataset.scala:2305) at org.apache.spark.sql.Dataset$$anonfun$foreachPartition$1.apply(Dataset.scala:2305) at org.apache.spark.sql.Dataset$$anonfun$foreachPartition$1.apply(Dataset.scala:2305) 
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:57) at org.apache.spark.sql.Dataset.withNewExecutionId(Dataset.scala:2765) at org.apache.spark.sql.Dataset.foreachPartition(Dataset.scala:2304) at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.saveTable(JdbcUtils.scala:670) at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:77) at org.apache.spark.sql.execution.datasources.DataSource.write(DataSource.scala:518) at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:215) at org.apache.spark.sql.DataFrameWriter.jdbc(DataFrameWriter.scala:446) at com.aoyou.data.CustomerVisitProduct$.saveToHive(CustomerVisitProduct.scala:281) at com.aoyou.data.CustomerVisitProduct$.main(CustomerVisitProduct.scala:221) at com.aoyou.data.CustomerVisitProduct.main(CustomerVisitProduct.scala) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:497) at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738) at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187) at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212) at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126) at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) Caused by: java.sql.SQLException: Method not supported at org.apache.hive.jdbc.HivePreparedStatement.addBatch(HivePreparedStatement.java:75) at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.savePartition(JdbcUtils.scala:589) at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$saveTable$1.apply(JdbcUtils.scala:670) at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$saveTable$1.apply(JdbcUtils.scala:670) at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$29.apply(RDD.scala:925) at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$29.apply(RDD.scala:925) at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1944) at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1944) at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87) at org.apache.spark.scheduler.Task.run(Task.scala:99) at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:322) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) at java.lang.Thread.run(Thread.java:745) The implementation code is as follows: val sparkConf = new SparkConf().setAppName("save").set("spark.sql.crossJoin.enabled", "true"); val sparkSession = SparkSession .builder() .enableHiveSupport() .getOrCreate(); val dataframe = sparkSession.createDataFrame(rddSchema, new Row().getClass()) val property = new Properties(); property.put("user", "xxxxx") property.put("password", "xxxxx") dataframe.write.mode(SaveMode.Append).option("driver", "org.apache.hive.jdbc.HiveDriver").jdbc("jdbc:hive2://xxxx:21050/rawdata;auth=noSasl", "tablename", property) What is going on here? It feels like a driver version problem.
