fascinatingGirl asked on 2016.03.04 17:25

Connecting from Java to a standalone HBase to read and write data

16/03/04 17:09:56 INFO support.ClassPathXmlApplicationContext: Refreshing org.springframework.context.support.ClassPathXmlApplicationContext@1f6ae4d: startup date [Fri Mar 04 17:09:56 CST 2016]; root of context hierarchy
16/03/04 17:09:56 INFO xml.XmlBeanDefinitionReader: Loading XML bean definitions from class path resource [applicationContest.xml]
16/03/04 17:09:57 INFO support.DefaultListableBeanFactory: Pre-instantiating singletons in org.springframework.beans.factory.support.DefaultListableBeanFactory@1d8fe20: defining beans [hbaseConfiguration,htemplate]; root of factory hierarchy
16/03/04 17:09:57 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/03/04 17:09:57 INFO zookeeper.RecoverableZooKeeper: Process identifier=hconnection-0xc4dc7c connecting to ZooKeeper ensemble=192.168.1.202:2181
16/03/04 17:09:57 INFO zookeeper.ZooKeeper: Client environment:zookeeper.version=3.4.5-cdh5.2.0--1, built on 10/11/2014 20:49 GMT
16/03/04 17:09:57 INFO zookeeper.ZooKeeper: Client environment:host.name=xiaoming
16/03/04 17:09:57 INFO zookeeper.ZooKeeper: Client environment:java.version=1.7.0_17
16/03/04 17:09:57 INFO zookeeper.ZooKeeper: Client environment:java.vendor=Oracle Corporation
16/03/04 17:09:57 INFO zookeeper.ZooKeeper: Client environment:java.home=C:\Program Files (x86)\Java\jdk1.7.0_17\jre
16/03/04 17:09:57 INFO zookeeper.ZooKeeper: Client environment:java.class.path=F:\mess\12Hadoop\workspace\myhbasetest\WebRoot\WEB-INF\classes;F:\mess\12Hadoop\workspace\myhbasetest\WebRoot\WEB-INF\lib\commons-codec-1.7.jar;F:\mess\12Hadoop\workspace\myhbasetest\WebRoot\WEB-INF\lib\commons-collections-3.2.1.jar;F:\mess\12Hadoop\workspace\myhbasetest\WebRoot\WEB-INF\lib\commons-configuration-1.6.jar;F:\mess\12Hadoop\workspace\myhbasetest\WebRoot\WEB-INF\lib\commons-lang-2.6.jar;F:\mess\12Hadoop\workspace\myhbasetest\WebRoot\WEB-INF\lib\commons-logging-1.1.1.jar;F:\mess\12Hadoop\workspace\myhbasetest\WebRoot\WEB-INF\lib\guava-12.0.1.jar;F:\mess\12Hadoop\workspace\myhbasetest\WebRoot\WEB-INF\lib\hadoop-auth-2.5.0-cdh5.2.0.jar;F:\mess\12Hadoop\workspace\myhbasetest\WebRoot\WEB-INF\lib\hadoop-common-2.5.0-cdh5.2.0.jar;F:\mess\12Hadoop\workspace\myhbasetest\WebRoot\WEB-INF\lib\hadoop-core-2.5.0-mr1-cdh5.2.0.jar;F:\mess\12Hadoop\workspace\myhbasetest\WebRoot\WEB-INF\lib\hbase-client-0.98.6-cdh5.2.0.jar;F:\mess\12Hadoop\workspace\myhbasetest\WebRoot\WEB-INF\lib\hbase-common-0.98.6-cdh5.2.0.jar;F:\mess\12Hadoop\workspace.......
16/03/04 17:09:57 INFO zookeeper.ClientCnxn: Opening socket connection to server 192.168.1.202/192.168.1.202:2181. Will not attempt to authenticate using SASL (unknown error)
16/03/04 17:09:57 INFO zookeeper.ClientCnxn: Socket connection established to 192.168.1.202/192.168.1.202:2181, initiating session
16/03/04 17:09:57 INFO zookeeper.ClientCnxn: Session establishment complete on server 192.168.1.202/192.168.1.202:2181, sessionid = 0x15341c9328d0012, negotiated timeout = 40000
16/03/04 17:17:50 WARN client.HConnectionManager$HConnectionImplementation: Encountered problems when prefetch hbase:meta table:
org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=31, exceptions:
.......Fri Mar 04 17:17:50 CST 2016, org.apache.hadoop.hbase.client.RpcRetryingCaller@b7b17a, java.net.UnknownHostException: unknown host: hbase
......at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:129)
at org.apache.hadoop.hbase.client.HTable.getRowOrBefore(HTable.java:714)
at org.apache.hadoop.hbase.client.MetaScanner.metaScan(MetaScanner.java:144)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.prefetchRegionCache(HConnectionManager.java:1140)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegionInMeta(HConnectionManager.java:1204)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:1092)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:1049)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getRegionLocation(HConnectionManager.java:890)
at org.apache.hadoop.hbase.client.RegionServerCallable.prepare(RegionServerCallable.java:72)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:113)
at org.apache.hadoop.hbase.client.HTable.get(HTable.java:780)
at org.springframework.data.hadoop.hbase.HbaseTemplate$2.doInTable(HbaseTemplate.java:182)
at org.springframework.data.hadoop.hbase.HbaseTemplate.execute(HbaseTemplate.java:58)
at org.springframework.data.hadoop.hbase.HbaseTemplate.get(HbaseTemplate.java:168)
at org.springframework.data.hadoop.hbase.HbaseTemplate.get(HbaseTemplate.java:158)
at com.bw.test.Mytest.get(Mytest.java:46)
at com.bw.test.Mytest.main(Mytest.java:40)
Caused by: java.net.UnknownHostException: unknown host: hbase
at org.apache.hadoop.hbase.ipc.RpcClient$Connection.<init>(RpcClient.java:385)
at org.apache.hadoop.hbase.ipc.RpcClient.createConnection(RpcClient.java:351)
at org.apache.hadoop.hbase.ipc.RpcClient.getConnection(RpcClient.java:1530)
at org.apache.hadoop.hbase.ipc.RpcClient.call(RpcClient.java:1442)
at org.apache.hadoop.hbase.ipc.RpcClient.callBlockingMethod(RpcClient.java:1661)
at org.apache.hadoop.hbase.ipc.RpcClient$BlockingRpcChannelImplementation.callBlockingMethod(RpcClient.java:1719)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.get(ClientProtos.java:29966)
at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRowOrBefore(ProtobufUtil.java:1562)
at org.apache.hadoop.hbase.client.HTable$2.call(HTable.java:710)
at org.apache.hadoop.hbase.client.HTable$2.call(HTable.java:708)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:114)
... 16 more
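
For context, the call at Mytest.java:46 that fails goes through Spring Data Hadoop's HbaseTemplate.get. A minimal sketch of such a client, reusing the config file and bean name shown in the log above, with a hypothetical table name, row key, and mapper, might look like this:

import org.apache.hadoop.hbase.client.Result;
import org.springframework.context.support.ClassPathXmlApplicationContext;
import org.springframework.data.hadoop.hbase.HbaseTemplate;
import org.springframework.data.hadoop.hbase.RowMapper;

public class Mytest {
    public static void main(String[] args) {
        // Load the Spring context from the log (beans: hbaseConfiguration, htemplate)
        ClassPathXmlApplicationContext ctx =
                new ClassPathXmlApplicationContext("applicationContest.xml");
        HbaseTemplate htemplate = ctx.getBean("htemplate", HbaseTemplate.class);

        // get(tableName, rowKey, mapper) -- "test_table" and "row1" are placeholders
        String row = htemplate.get("test_table", "row1", new RowMapper<String>() {
            @Override
            public String mapRow(Result result, int rowNum) throws Exception {
                return result.toString();
            }
        });
        System.out.println(row);
        ctx.close();
    }
}

The exception above is thrown inside this get call: the client looks up the location of hbase:meta, gets back the hostname hbase, and cannot resolve it.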

1 answer

fascinatingGirl   2016.03.04 17:37
Accepted

The key line in the error is: Caused by: java.net.UnknownHostException: unknown host: hbase

In my MyEclipse configuration I had used the IP address everywhere, but that alone didn't work; I also had to add a mapping in the Windows hosts file, because the hostname of my Linux VM is hbase.

Once the mapping was added it worked. What a painful one.
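
For example, assuming the addresses from the log above, the entry in the Windows hosts file (C:\Windows\System32\drivers\etc\hosts) would look like this; the IP is the ZooKeeper/HBase machine from the log, and hbase is the hostname the client failed to resolve:

192.168.1.202    hbase

The mapping is needed because the HBase server registers itself in ZooKeeper under its hostname, so even when the client configuration points at the IP address, the client still has to resolve that hostname before it can reach the server.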
