大数据老兵 2023-07-27 15:14 · Acceptance rate: 50%
Views: 52
Closed

Querying a Hive table by its partition field works without quotes but fails when the value is quoted. How can this be resolved?

When querying a Hive table by its partition field, the query works without quotes but fails once quotes are added.
The Hive log is as follows:

NoViableAltException(155@[941:1: ddlStatement : ( createDatabaseStatement | switchDatabaseStatement | dropDatabaseStatement | createTableStatement | dropTableStatement | truncateTableStatement | alterStatement | descStatement | showStatement | metastoreCheck | createViewStatement | createMaterializedViewStatement | dropViewStatement | dropMaterializedViewStatement | createFunctionStatement | createMacroStatement | dropFunctionStatement | reloadFunctionStatement | dropMacroStatement | analyzeStatement | lockStatement | unlockStatement | lockDatabase | unlockDatabase | createRoleStatement | dropRoleStatement | ( grantPrivileges )=> grantPrivileges | ( revokePrivileges )=> revokePrivileges | showGrants | showRoleGrants | showRolePrincipals | showRoles | grantRole | revokeRole | setRole | showCurrentRole | abortTransactionStatement | killQueryStatement | resourcePlanDdlStatements );])
    at org.antlr.runtime.DFA.noViableAlt(DFA.java:158)
    at org.antlr.runtime.DFA.predict(DFA.java:116)
    at org.apache.hadoop.hive.ql.parse.HiveParser.ddlStatement(HiveParser.java:4480)
    at org.apache.hadoop.hive.ql.parse.HiveParser.execStatement(HiveParser.java:2578)
    at org.apache.hadoop.hive.ql.parse.HiveParser.statement(HiveParser.java:1433)
    at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:230)
    at org.apache.hadoop.hive.ql.parse.ParseUtils.parse(ParseUtils.java:79)
    at org.apache.hadoop.hive.ql.parse.ParseUtils.parse(ParseUtils.java:72)
    at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:617)
    at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1870)
    at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:1817)
    at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:1812)
    at org.apache.hadoop.hive.ql.reexec.ReExecDriver.compileAndRespond(ReExecDriver.java:126)
    at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:197)
    at org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperation.java:262)
    at org.apache.hive.service.cli.operation.Operation.run(Operation.java:260)
    at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:575)
    at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(HiveSessionImpl.java:561)
    at sun.reflect.GeneratedMethodAccessor76.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:78)
    at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:36)
    at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:63)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
    at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:59)
    at com.sun.proxy.$Proxy69.executeStatementAsync(Unknown Source)
    at org.apache.hive.service.cli.CLIService.executeStatementAsync(CLIService.java:315)
    at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:566)
    at org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1557)
    at org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1542)
    at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
    at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
    at org.apache.hadoop.hive.metastore.security.HadoopThriftAuthBridge$Server$TUGIAssumingProcessor.process(HadoopThriftAuthBridge.java:647)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:750)
FAILED: ParseException line 1:5 cannot recognize input near 'SHOW' 'INDEX' 'ON' in ddl statement

With the p_date partition value quoted, p_date='20230130', the query fails with the error below:

(screenshot: error message)

With the p_date partition value unquoted, p_date=20230130, the query runs normally:

(screenshot: query result)

The table was created with the partition column declared as string: partitioned by (p_date string) stored as orc;

How can this be set up so that the query works both with and without quotes?
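
For reference, a minimal sketch of the two query forms. The table and non-partition column names are hypothetical; only the PARTITIONED BY clause comes from the question, and the comments describe Hive's usual implicit-conversion behaviour rather than anything confirmed from the log above.

    -- Hypothetical table; only the partition clause matches the DDL quoted above
    CREATE TABLE orders_orc (
        order_id BIGINT,
        amount   DOUBLE
    )
    PARTITIONED BY (p_date STRING)
    STORED AS ORC;

    -- Quoted literal: the types match the STRING partition column, so the filter
    -- compares partition values directly and can prune partitions
    SELECT * FROM orders_orc WHERE p_date = '20230130';

    -- Unquoted literal: a STRING column is compared against a number, so Hive
    -- implicitly converts both sides for the comparison; the result and the
    -- partition-pruning behaviour may differ from the quoted form
    SELECT * FROM orders_orc WHERE p_date = 20230130;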


9 answers

  • Mr.Guoguo 2023-07-27 20:14

    So you are using an in-table column as the partition? Without quotes the query reads the column value directly, while with quotes it goes to look up the corresponding partition. Your partitions are probably the problem: the error means the matching partition was not found. If no partition starting with 20 can be found, the one for your specific date certainly won't be found either.
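
    As a rough way to check this (my_table is a placeholder for the real table name), list the partitions Hive has registered and compare them with the quoted filter:

        -- List the partitions that actually exist for the table
        SHOW PARTITIONS my_table;

        -- If p_date=20230130 appears in that list, the quoted filter should match it,
        -- since the partition column was declared as STRING
        SELECT * FROM my_table WHERE p_date = '20230130';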

    Accepted by the asker as the best answer.


Question events

  • Closed by the system on Aug 4
  • Answer accepted on Jul 27
  • Question edited on Jul 27
  • Question created on Jul 27