Hive CREATE TABLE fails, please help

Creating a table in Hive fails. The log shows:

```
Datastore.Schema (Log4JLogger.java:error(115)) - An exception was thrown while adding/validating class(es) : Column length too big for column 'PARAM_VALUE' (max = 21845); use BLOB or TEXT instead
```

I have been digging into this for a whole day. How do I fix it?

2 answers

The literal meaning is clear enough, but the Hive CREATE TABLE statement itself contains no column definition that exceeds this length.

Literally, the message says a column definition is too long and that BLOB or TEXT should be used instead: `Column length too big for column 'PARAM_VALUE' (max = 21845); use BLOB or TEXT instead`. Note that PARAM_VALUE is not a column of the table you are creating; it belongs to the metastore's own schema (tables such as TABLE_PARAMS), so the limit is being hit on the metastore database side, not in your Hive DDL.
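The most commonly reported root cause for this DataNucleus error is that the metastore's MySQL database was created with a multi-byte default character set (utf8/utf8mb4), which shrinks the maximum VARCHAR length DataNucleus is allowed to create and trips exactly this check during schema validation. A minimal sketch of the check and the usual fix, assuming the metastore database is named `hive` (adjust to your setup):

```
-- Run these against the metastore's MySQL server, not inside Hive.

-- 1. Check the database's default character set.
SELECT default_character_set_name
FROM information_schema.SCHEMATA
WHERE schema_name = 'hive';

-- 2. If it is utf8/utf8mb4, switch the default back to latin1 so that
--    DataNucleus can create its VARCHAR columns at full length.
ALTER DATABASE hive CHARACTER SET latin1;
```

After changing the default, retry the CREATE TABLE in Hive; existing metastore tables keep their own charsets, so only newly validated columns are affected.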

Other related questions
Spark SQL integrated with Hive: error creating an external table (please help)

When creating an external table through Spark SQL integrated with Hive, the statement below fails:

```
create external table if not exists bdm.itcast_bdm_order_goods(
    user_id string,     -- user ID
    order_id string,    -- order ID
    order_no string,    -- order number
    sku_id bigint,      -- SKU id
    sku_name string,    -- SKU name
    goods_id bigint,    -- goods id
)
partitioned by (dt string)
row format delimited fields terminated by ','
lines terminated by '\n'
location '/business/itcast_bdm_order_goods';
```

The error reported is:

```
Moved: 'hdfs://hann/business/itcast_bdm_order_goods' to trash at: hdfs://hann/user/root/.Trash/Current
Error in query: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:java.lang.IllegalArgumentException: java.net.UnknownHostException: nhann);
```

spark-sql is started with:

```
spark-sql --master spark://node01:7077 \
  --driver-class-path /export/servers/hive-1.1.0-cdh5.14.0/lib/mysql-connector-java-5.1.38.jar \
  --conf spark.sql.warehouse.dir=hdfs://hann/user/hive/warehouse
```

And hive-site.xml is configured as follows:

```
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://node03.hadoop.com:3306/hive?createDatabaseIfNotExist=true</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>root</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>123456</value>
  </property>
  <!--
  <property>
    <name>hive.cli.print.current.db</name>
    <value>true</value>
  </property>
  <property>
    <name>hive.cli.print.header</name>
    <value>true</value>
  </property>
  <property>
    <name>hive.server2.thrift.bind.host</name>
    <value>node03.hadoop.com</value>
  </property>
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://node03.hadoop.com:9083</value>
  </property>
  <property>
    <name>hive.metastore.client.socket.timeout</name>
    <value>3600</value>
  </property>
  -->
</configuration>
```

All my Hive tables have disappeared, please help

The tables were definitely there before, and I enter hive from the same directory as always, so it is not a problem with the command or the working directory. ![screenshot](https://img-ask.csdn.net/upload/201701/11/1484128929_69275.png) ![screenshot](https://img-ask.csdn.net/upload/201701/11/1484128751_148331.png)

Error creating an HBase-backed table in Hive

hadoop-2.5.2, hbase-1.0.1.1, hive-0.9.0

```
hive> CREATE TABLE hbase_table_1(key int, value string)
    STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
    WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf1:val")
    TBLPROPERTIES ("hbase.table.name" = "xyz");
```

```
FAILED: Error in metadata: MetaException(message:java.io.IOException: java.lang.reflect.InvocationTargetException
    at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:240)
    at org.apache.hadoop.hbase.client.ConnectionManager.createConnection(ConnectionManager.java:410)
    at org.apache.hadoop.hbase.client.ConnectionManager.createConnection(ConnectionManager.java:403)
    at org.apache.hadoop.hbase.client.ConnectionManager.getConnectionInternal(ConnectionManager.java:281)
    at org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:202)
    at org.apache.hadoop.hive.hbase.HBaseStorageHandler.getHBaseAdmin(HBaseStorageHandler.java:73)
    at org.apache.hadoop.hive.hbase.HBaseStorageHandler.preCreateTable(HBaseStorageHandler.java:147)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:398)
    at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:538)
    at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:3305)
    at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:242)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:134)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1326)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1118)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:951)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:258)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:215)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:406)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:689)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:557)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:238)
    ... 25 more
Caused by: java.lang.NoClassDefFoundError: org/apache/htrace/Trace
    at org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.exists(RecoverableZooKeeper.java:218)
    at org.apache.hadoop.hbase.zookeeper.ZKUtil.checkExists(ZKUtil.java:481)
    at org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:65)
    at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:86)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.retrieveClusterId(ConnectionManager.java:833)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:623)
    ... 30 more
Caused by: java.lang.ClassNotFoundException: org.apache.htrace.Trace
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    ... 36 more
)
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
```
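The root cause is the last `Caused by`: `ClassNotFoundException: org.apache.htrace.Trace`. HBase 1.0's client needs htrace-core on the classpath, and Hive 0.9 does not load it by default. A commonly suggested workaround is to copy the htrace-core jar from $HBASE_HOME/lib into $HIVE_HOME/lib, or to add it to the CLI session before issuing the DDL; a sketch (the jar path and version are assumptions, point it at the jar shipped with your HBase):

```
-- Put the missing htrace classes on the session classpath, then retry.
ADD JAR /usr/local/hbase/lib/htrace-core-3.1.0-incubating.jar;

CREATE TABLE hbase_table_1(key int, value string)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf1:val")
TBLPROPERTIES ("hbase.table.name" = "xyz");
```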

Hive on HBase: error creating a mapping table in Hive linked to HBase

Hive on HBase: when creating a Hive mapping table linked to an HBase table (the HBase row_key is already declared in the Hive DDL), the statement still fails with:

```
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Got exception: org.apache.hadoop.security.AccessControlException Permission denied: user=xlwang5, access=EXECUTE, inode="/user/hive/warehouse":hive:hive:drwxrwx--T
```

Error creating a non-native table through Hive!!

hive 1.0.0, hbase 1.0.0, hadoop 2.6.0 and zookeeper 3.4.6. Creating a non-native table fails, please help!! I have already copied every jar under $HBASE_HOME/lib into $HIVE_HOME/lib.

```
5/03/10 03:56:00 [htable-pool1-t1]: DEBUG ipc.AbstractRpcClient: Connecting to v2/192.168.81.131:16020
15/03/10 03:56:02 [main]: ERROR exec.DDLTask: java.lang.NoSuchMethodError: org.apache.hadoop.hbase.HTableDescriptor.addFamily(Lorg/apache/hadoop/hbase/HColumnDescriptor;)V
    at org.apache.hadoop.hive.hbase.HBaseStorageHandler.preCreateTable(HBaseStorageHandler.java:196)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:608)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:601)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:90)
    at com.sun.proxy.$Proxy7.createTable(Unknown Source)
    at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:671)
    at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:3973)
    at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:295)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1604)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1364)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1177)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1004)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:994)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:201)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:153)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:364)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:712)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:631)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:570)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. org.apache.hadoop.hbase.HTableDescriptor.addFamily(Lorg/apache/hadoop/hbase/HColumnDescriptor;)V
15/03/10 03:56:02 [main]: ERROR ql.Driver: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. org.apache.hadoop.hbase.HTableDescriptor.addFamily(Lorg/apache/hadoop/hbase/HColumnDescriptor;)V
15/03/10 03:56:02 [main]: DEBUG ql.Driver: Shutting down query CREATE TABLE hbase_table_1(key int, value string) STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf1:val") TBLPROPERTIES ("hbase.table.name" = "xyz")
15/03/10 03:56:02 [main]: INFO log.PerfLogger: </PERFLOG method=Driver.execute start=1425930957507 end=1425930962331 duration=4824 from=org.apache.hadoop.hive.ql.Driver>
```

Hive external table mapped to HBase: executing HQL fails

The CREATE TABLE statement:

```
create external table if not exists test_external(rowkey string, name string, age string)
stored by 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
with serdeproperties (
  'hbase.columns.mapping' = ':key,cf:name,cf:age',
  'hbase.table.default.storage.type' = 'binary'
)
tblproperties ('hbase.table.name' = 'default:test');
```

The HQL statement:

```
Select count(*) from (
  select t4.ROWKEY A14794456783233, t4.NAME A14794456783253, t4.AGE A14794456783282
  from DEFAULT.test_external t4
) a
where 1=2
```

The error:

```
java.io.IOException: java.util.concurrent.ExecutionException: java.io.IOException: cannot find dir = hdfs://odscluster/user/hadoop/-mr-10003default.test_external{} in pathToPartitionInfo: [-mr-10003default.test_external{}]
    at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getSplits(CombineHiveInputFormat.java:542)
    at org.apache.hadoop.mapreduce.JobSubmitter.writeOldSplits(JobSubmitter.java:624)
    at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:616)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:492)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1306)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1303)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1303)
    at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:564)
    at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:559)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
    at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:559)
    at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:550)
    at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:429)
    at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:137)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.run(TaskRunner.java:75)
```
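The split computation is failing inside CombineHiveInputFormat, which does not cope well with storage-handler (HBase-backed) tables. A workaround frequently reported for this `pathToPartitionInfo` error is to fall back to the plain input format for the session before rerunning the query; a sketch:

```
-- Disable split combining for this session, then rerun the query.
set hive.input.format=org.apache.hadoop.hive.ql.io.HiveInputFormat;

select count(*) from default.test_external;
```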

Hive 1.1.1 on Hadoop 1.2.1 under Ubuntu 14 fails to start, please help, thanks

```
Logging initialized using configuration in jar:file:/usr/local/hadoop/lib/hive-common-1.1.1.jar!/hive-log4j.properties
Exception in thread "main" java.lang.UnsupportedClassVersionError: com/facebook/fb303/FacebookService$Iface : Unsupported major.minor version 51.0
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:643)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
    at java.net.URLClassLoader.access$000(URLClassLoader.java:73)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:323)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:268)
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:643)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
    at java.net.URLClassLoader.access$000(URLClassLoader.java:73)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:323)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:268)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:274)
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.getClass(MetaStoreUtils.java:1451)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:71)
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2841)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2860)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:453)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:671)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:615)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:622)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:160)
```

A Hive programming exercise, could someone help?

Original data, 8 rows:

```
"1" "A" "R" 32 "H" "w" "2017-01-16" 1167 80 0 0 0 1200 0
"2" "A" "E" 67 "L" "b" "2016-06-12" 2488 50 1 1 3 4000 1
"3" "A" "E" 32 "H" "w" "2016-07-13" 2122 98 0 0 1 1500 1
"4" "A" "E" "H" "w" "2016-01-20" 2993 115 0 0 0 3200 1
"5" "A" "s" 22 "R" "b" "2016-08-31" 484 69 0 1 3 800 0
"6" "b" "H" 92 "H" "w" "2016-02-04" 1634 81 0 0 1 3400 1
"7" "A" "E" 76 "H" "w" "2016-06-14" 1127 92 0 0 0 6000 1
"8" "A" "R" 82 "r" "w" "2016-05-30" 196 146 1 3 4 2300 0
```

Expected result data:

```
2 ['A','E'] 67 ['L','B'] 2010-06-12 2488 50 {1,1,3} 4000 1
3 ['A','E'] 32 ['H','W'] 2010-07-13 2122 98 {0,0,1} 1500 1
4 ['A','E'] 82 ['H','W'] 2010-01-20 2993 115 {0,0,0} 3200 1
5 ['A','S'] 22 ['R','B'] 2010-08-31 484 69 {0,1,3} 800 0
6 ['B','H'] 92 ['H','W'] 2010-02-04 1634 81 {0,0,1} 3400 1
7 ['A','E'] 76 ['H','W'] 2010-06-14 1127 92 {0,0,0} 6000 1
8 ['A','R'] 82 ['R','W'] 2010-05-30 196 146 {1,3,4} 2300 0
```

Notes on the data:
The original data has 8 rows and 14 columns, separated by tabs (\t).
Strip the double quotes from columns 1 and 7; when the date in column 7 is later than 2017-01-01, the whole row is filtered out.
For columns 2, 3, 5, and 6, turn double quotes into single quotes and lowercase into uppercase; columns 2 and 3 form one string array, and columns 5 and 6 form another.
Wrap columns 10, 11, and 12 in "{}".
Separate the resulting columns with a single space.

Project requirements (a HiveQL sketch follows below):
1. Load the raw data into a Hive external table (use LOCATION to point at the data directory).
2. Use a UDF to filter the raw table so that it produces the result data above.
3. Save the result into a new Hive table.
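A minimal HiveQL sketch of the three steps, assuming the raw files sit under /data/exercise/raw and the quote-stripping/uppercasing logic lives in custom UDFs; the jar path, class names, function names, and table names here are all placeholders you would replace:

```
-- 1. External table over the raw tab-separated files.
create external table if not exists src_raw(
  c1 string, c2 string, c3 string, c4 string, c5 string, c6 string, c7 string,
  c8 int, c9 int, c10 int, c11 int, c12 int, c13 int, c14 int
)
row format delimited fields terminated by '\t'
location '/data/exercise/raw';

-- 2. Register the custom UDFs (hypothetical names; you write the classes).
add jar /path/to/my-udfs.jar;
create temporary function strip_q as 'com.example.udf.StripQuotes';      -- "1"  -> 1
create temporary function upper_q as 'com.example.udf.UpperSingleQuote'; -- "b" -> 'B'

-- 3. Filter and reshape into the result table.
create table if not exists result_tbl as
select
  strip_q(c1)                               as id,
  array(upper_q(c2), upper_q(c3))           as cols_2_3,
  c4                                        as col_4,
  array(upper_q(c5), upper_q(c6))           as cols_5_6,
  strip_q(c7)                               as dt,
  c8, c9,
  concat('{', c10, ',', c11, ',', c12, '}') as cols_10_12,
  c13, c14
from src_raw
where strip_q(c7) <= '2017-01-01';  -- rows dated after 2017-01-01 are dropped
```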

Hive will not start: remote MySQL connection fails, please take a look, many thanks!!!

My MySQL runs on the local Win7 machine; the service is fine and local login works. Hive is installed in a Linux VM; networking between the VM and the host is fine, the firewall is off, and the Hadoop installation is also fine. I installed a MySQL client in the Linux VM, and logging in with mysql -uhive -paxx1314 -h192.168.120.1 works:

```
[root@zchlinux conf]# mysql -uhive -paxx1314 -h192.168.120.1
Welcome to the MySQL monitor.  Commands end with ; or \g.
Your MySQL connection id is 9 to server version: 5.0.18-nt

Type 'help;' or '\h' for help. Type '\c' to clear the buffer.

mysql> show databases;
+--------------------+
| Database           |
+--------------------+
| information_schema |
| hivedb             |
| test               |
+--------------------+
3 rows in set (0.03 sec)

mysql>
```

But starting hive fails with:

```
[root@zchlinux conf]# hive
Logging initialized using configuration in jar:file:/opt/apache-hive-0.13.0-bin/lib/hive-common-0.13.0.jar!/hive-log4j.properties
Exception in thread "main" java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:344)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:681)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:160)
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1412)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:62)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:72)
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2444)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2456)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:338)
    ... 7 more
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:525)
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1410)
    ... 12 more
Caused by: javax.jdo.JDOFatalDataStoreException: Unable to open a test connection to the given database. JDBC url = jdbc:mysql://192.168.120.1:3306/hivedb?createDatabaseIfNotExist=true&useUnicode=true&characterEncoding=UTF-8, username = hive. Terminating connection pool (set lazyInit to true if you expect to start your database after your app).
Original Exception: ------
java.sql.SQLException: Access denied for user 'hive'@'192.168.120.128' (using password: YES)
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1073)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3609)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3541)
    ...
```

The hive-site.xml configuration:

```
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://192.168.120.1:3306/hivedb?createDatabaseIfNotExist=true&amp;useUnicode=true&amp;characterEncoding=UTF-8</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hive</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassWord</name>
    <value>axx1314</value>
  </property>
</configuration>
```
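Two details stand out. First, the property name javax.jdo.option.ConnectionPassWord (capital W) is not the standard javax.jdo.option.ConnectionPassword, so the configured password may never reach the JDBC driver. Second, MySQL grants are per-host: Access denied for user 'hive'@'192.168.120.128' means the account lacks privileges for connections arriving from the VM's address with that password. A sketch of the grant to run on the Windows-side MySQL server, with the host and password taken from the question (adjust as needed):

```
-- Allow the hive account to reach hivedb from the VM's address...
GRANT ALL PRIVILEGES ON hivedb.* TO 'hive'@'192.168.120.128' IDENTIFIED BY 'axx1314';
-- ...or, more loosely, from any host:
GRANT ALL PRIVILEGES ON hivedb.* TO 'hive'@'%' IDENTIFIED BY 'axx1314';
FLUSH PRIVILEGES;
```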

Website clickstream analysis: error creating an intermediate table

Extract refer_url into the intermediate table t_ods_tmp_referurl, splitting the referring URL into host, path, query, and query id. Extracting these from the visit data needs a function; parse_url_tuple() can be used:

```
drop table if exists t_ods_tmp_referurl;
create table t_ods_tmp_referurl as
SELECT a.*, b.*
FROM ods_origin_weblog a
LATERAL VIEW parse_url_tuple(regexp_replace(http_referer, "\"", ""), 'HOST', 'PATH', 'QUERY', 'QUERY:id') b as host, path, query, query_id;
```

What goes wrong:

```
0: jdbc:hive2://node-1:10000> create table t_ods_tmp_referurl as
0: jdbc:hive2://node-1:10000> SELECT a.*,b.*
0: jdbc:hive2://node-1:10000> FROM ods_weblog_origin a
0: jdbc:hive2://node-1:10000> LATERAL VIEW parse_url_tuple(regexp_replace(http_referer, "\"", ""), 'HOST', 'PATH','QUERY', 'QUERY:id') b as host, path, query, query_id;
INFO  : Number of reduce tasks is set to 0 since there's no reduce operator
```

It just sits on this screen. Any advice?
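Note that the script references ods_origin_weblog while the session that was actually run reads ods_weblog_origin; make sure the name matches. To rule out the UDTF itself while the job hangs, parse_url_tuple can be smoke-tested without touching the big table. A throwaway query with made-up values, assuming Hive 0.13+ where a FROM-less inner SELECT works:

```
select b.host, b.path, b.query, b.query_id
from (select 'http://example.com/a/b?id=42&x=1' as http_referer) a
lateral view parse_url_tuple(a.http_referer, 'HOST', 'PATH', 'QUERY', 'QUERY:id') b
  as host, path, query, query_id;
```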

Hive SELECT fails, please advise

```
hive> select * from user;
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.mapred.JobConf.unset(Ljava/lang/String;)V
    at org.apache.hadoop.hive.ql.io.HiveInputFormat.pushFilters(HiveInputFormat.java:432)
    at org.apache.hadoop.hive.ql.exec.FetchTask.initialize(FetchTask.java:76)
    at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:443)
    at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:303)
    at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1067)
```

Hive starts fine, but show databases; fails, please help

```
2018-02-01T09:46:28,400 WARN [9a4cc1b4-8396-471b-8df0-b1eb3ca1fd82 main] ql.Driver: Caught exception attempting to write metadata call information org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
    at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:236) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:388) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:332) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:312) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:354) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:350) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.ql.Driver.dumpMetaCallTimingWithoutEx(Driver.java:683) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:621) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1317) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1457) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1237) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1227) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:233) ~[hive-cli-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:184) ~[hive-cli-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403) ~[hive-cli-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:821) ~[hive-cli-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759) ~[hive-cli-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:686) ~[hive-cli-2.3.2.jar:2.3.2]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_151]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_151]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_151]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_151]
    at org.apache.hadoop.util.RunJar.run(RunJar.java:239) ~[hadoop-common-2.9.0.jar:?]
    at org.apache.hadoop.util.RunJar.main(RunJar.java:153) ~[hadoop-common-2.9.0.jar:?]
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1701) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:83) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:133) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3600) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3652) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3632) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3894) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:248) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:231) ~[hive-exec-2.3.2.jar:2.3.2]
    ... 23 more
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_151]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_151]
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_151]
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_151]
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1699) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:83) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:133) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3600) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3652) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3632) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3894) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:248) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:231) ~[hive-exec-2.3.2.jar:2.3.2]
    ... 23 more
Caused by: org.apache.hadoop.hive.metastore.api.MetaException: Version information not found in metastore.
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:83) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:92) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6893) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:164) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:70) ~[hive-exec-2.3.2.jar:2.3.2]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_151]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_151]
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_151]
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_151]
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1699) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:83) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:133) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3600) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3652) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3632) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3894) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:248) ~[hive-exec-2.3.2.jar:2.3.2]
    at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:231) ~[hive-exec-2.3.2.jar:2.3.2]
    ... 23 more
2018-02-01T09:46:28,400 INFO [9a4cc1b4-8396-471b-8df0-b1eb3ca1fd82 main] ql.Driver: Completed compiling command(queryId=root_20180201094627_6a378f28-ae24-4c00-8d15-a8df87e7020e); Time taken: 0.474 seconds
```
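The decisive line is the last `Caused by`: `MetaException: Version information not found in metastore.` The clean fix is to initialize the metastore schema with `schematool -dbType mysql -initSchema`; a manual stopgap some people use is to insert the version row by hand. A sketch against the metastore database (MySQL and a database named `hive` are assumptions; the '2.3.0' schema version matches Hive 2.3.x):

```
-- Run against the metastore database. First check whether VERSION is empty:
SELECT * FROM VERSION;

-- If it is, the schema was never initialized (or the row was lost);
-- inserting the version row for Hive 2.3.x by hand works as a stopgap:
INSERT INTO VERSION (VER_ID, SCHEMA_VERSION, VERSION_COMMENT)
VALUES (1, '2.3.0', 'Set manually');
```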

A Hive field-delimiter problem, please help

The setup: the Hadoop cluster is CDH 5.3.0. When importing data into HDFS, the text files (UTF-8) use ^A (Ctrl-A) as the field separator. The Hive SQL script is:

```
use default;
DROP TABLE IF EXISTS test;
create external table test
(
  test1 string,
  test2 string,
  test3 string
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001'
LOCATION '/tmp/test';
```

The test file contents:

```
1^A2^A3
A^AB^AC
我^A爱^A你
```

The resulting table looks like the attached screenshot: the data is simply not being split on \001 the way the Hive documentation says it should be. This has been bugging me for a long time, please help! ![screenshot](https://img-ask.csdn.net/upload/201504/17/1429252966_205380.jpg) I am out of coins, but thanks in advance to anyone who helps.

Hive + MongoDB fails with IllegalAccessError

Using Hive + MongoDB:

```
CREATE EXTERNAL TABLE test
(
  id string,
  test string
)
STORED BY 'com.mongodb.hadoop.hive.MongoStorageHandler'
WITH SERDEPROPERTIES('mongo.columns.mapping'='{"id":"_id","test":"test"}')
TBLPROPERTIES('mongo.uri'='mongodb://root:root@127.0.0.1:27017/test.test');
```

The table is created successfully, but running a query then fails with the error below. Can anyone take a look?

```
Exception in thread "main" java.lang.IllegalAccessError: tried to access field org.apache.hadoop.hive.ql.io.HiveInputFormat.LOG from class com.mongodb.hadoop.hive.input.HiveMongoInputFormat
    at com.mongodb.hadoop.hive.input.HiveMongoInputFormat.getSplits(HiveMongoInputFormat.java:81)
    at com.mongodb.hadoop.hive.input.HiveMongoInputFormat.getSplits(HiveMongoInputFormat.java:44)
    at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextSplits(FetchOperator.java:363)
    at org.apache.hadoop.hive.ql.exec.FetchOperator.getRecordReader(FetchOperator.java:295)
    at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:446)
    at org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOperator.java:415)
    at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:140)
    at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:1693)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:233)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:165)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:376)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:736)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:681)
```

How do I auto-create partitions when importing a CSV file into Hive with LOAD?

The table has 4 fields, name, year, month, day, and the last three are partition columns. The CSV file contents are:

```
jamie,1996,04,10
hims,1995,05,17
kash,1997,12,11
```

How can this be imported with LOAD DATA so that the partitions are created automatically? What would the statements look like?
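LOAD DATA only moves files and cannot derive partition values from the file contents, so the usual pattern is to load into a staging table and then do a dynamic-partition insert. A sketch, with the table and path names assumed:

```
-- Staging table holding the raw CSV rows.
create table person_staging(
  name string, year string, month string, day string
)
row format delimited fields terminated by ','
stored as textfile;

load data local inpath '/tmp/person.csv' into table person_staging;

-- Target table partitioned by the last three fields.
create table person(name string)
partitioned by (year string, month string, day string);

-- Dynamic partitioning creates the partitions from the data itself.
set hive.exec.dynamic.partition=true;
set hive.exec.dynamic.partition.mode=nonstrict;

insert overwrite table person partition(year, month, day)
select name, year, month, day from person_staging;
```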

Hive database character-set question

Is it possible to create a database in Hive and give that one database its own character set?
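As far as I know, Hive databases have no character set of their own; encodings live in the metastore RDBMS, so the adjustment is made there rather than per Hive database. For example, to store Chinese comments, people commonly convert the relevant metastore columns to utf8. A sketch, assuming MySQL and a metastore database named `hive` (column sizes follow the stock metastore schema; verify against your version first):

```
-- Run against the metastore database, not inside Hive.
USE hive;
ALTER TABLE COLUMNS_V2   MODIFY COLUMN `COMMENT`   varchar(256)  CHARACTER SET utf8;
ALTER TABLE TABLE_PARAMS MODIFY COLUMN PARAM_VALUE varchar(4000) CHARACTER SET utf8;
ALTER TABLE DBS          MODIFY COLUMN `DESC`      varchar(4000) CHARACTER SET utf8;
```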

Querying one table in the Hive database fails; every other table queries fine, only this one does not

Diagnostic Messages for this Task:

```
Error: java.io.IOException: java.lang.reflect.InvocationTargetException
    at org.apache.hadoop.hive.io.HiveIOExceptionHandlerChain.handleRecordReaderCreationException(HiveIOExceptionHandlerChain.java:97)
    at org.apache.hadoop.hive.io.HiveIOExceptionHandlerUtil.handleRecordReaderCreationException(HiveIOExceptionHandlerUtil.java:57)
    at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.initNextRecordReader(HadoopShimsSecure.java:266)
    at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.<init>(HadoopShimsSecure.java:213)
    at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileInputFormatShim.getRecordReader(HadoopShimsSecure.java:333)
    at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getRecordReader(CombineHiveInputFormat.java:720)
    at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.<init>(MapTask.java:169)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:432)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1785)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.initNextRecordReader(HadoopShimsSecure.java:252)
    ... 11 more
Caused by: java.lang.IndexOutOfBoundsException: toIndex = 608
    at java.util.ArrayList.subListRangeCheck(ArrayList.java:1004)
    at java.util.ArrayList.subList(ArrayList.java:996)
    at org.apache.hadoop.hive.ql.io.orc.RecordReaderFactory.getSchemaOnRead(RecordReaderFactory.java:161)
    at org.apache.hadoop.hive.ql.io.orc.RecordReaderFactory.createTreeReader(RecordReaderFactory.java:66)
    at org.apache.hadoop.hive.ql.io.orc.RecordReaderImpl.<init>(RecordReaderImpl.java:202)
    at org.apache.hadoop.hive.ql.io.orc.ReaderImpl.rowsOptions(ReaderImpl.java:541)
    at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat.createReaderFromFile(OrcInputFormat.java:232)
    at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$OrcRecordReader.<init>(OrcInputFormat.java:165)
    at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat.getRecordReader(OrcInputFormat.java:1156)
    at org.apache.hadoop.hive.ql.io.CombineHiveRecordReader.<init>(CombineHiveRecordReader.java:67)
    ... 16 more
```

Dynamically creating a Hive table to match the fields of a data file

File A carries a name plus field 1, field 2, field 3, ..., field n. The exact field count, the field names, and the file's name are unknown in advance; every field is to be typed as string.

What is needed is a Hive script that receives the parameters exported from file A (its name, its field names, and its field count), uses the file name as the name of the Hive table to create and the field names as column names, sizes the column list by the field count, and finally creates the matching Hive table from the outside.

Example: if file A exports the name mytest with three fields name1, name2, name3, the required table comes from test.sql:

```
use mydb;
create table if not exists tablename(
  name1 string,
  name2 string,
  name3 string
);
alter table tablename rename to '${hiveconf:tablename}';
```

with the table then created externally via:

```
hive -hiveconf tablename=mytest -f test.sql
```

The hope is to replace both tablename and the assembled "column name plus type" list with two variables, but it keeps failing somewhere; pointers would be much appreciated.
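Since ${hiveconf:...} is plain text substitution, both the table name and the entire column list can be injected directly into the CREATE TABLE, which avoids the rename step (and its quoting problem: ALTER TABLE ... RENAME TO takes an identifier, not a quoted string). A sketch, with the variable names being arbitrary choices:

```
-- test.sql: both placeholders are substituted as raw text before parsing.
use mydb;
create table if not exists ${hiveconf:tablename} (
  ${hiveconf:columns}
);
```

invoked along the lines of `hive -hiveconf tablename=mytest -hiveconf columns='name1 string, name2 string, name3 string' -f test.sql`, where the caller assembles the columns string from file A's field list.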

Error inserting data from Hive into an HBase-integrated table

In pseudo-distributed mode with Hadoop 2.2.0 (built from source on 64-bit Ubuntu) + HBase 0.98 + Hive 0.14, everything else works, but inserting data from Hive into a table stored in HBase fails. I have tried many of the fixes found online, none of which helped. The error:

```
java.lang.IllegalArgumentException: Can not create a Path from an empty string
    at org.apache.hadoop.fs.Path.checkPathArg(Path.java:127)
    at org.apache.hadoop.fs.Path.<init>(Path.java:135)
    at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:213)
    at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:300)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:387)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
    at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:562)
    at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:557)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:557)
    at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:548)
    at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:429)
    at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:137)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1604)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1364)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1177)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1004)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:994)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:247)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:199)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:410)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:783)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:677)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:616)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Job Submission failed with exception 'java.lang.IllegalArgumentException(Can not create a Path from an empty string)'
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
```

Has anyone run into the same problem?
