lgs_4112 asked on 2017.09.15 23:28

hbase mapreduce 报错 java.lang.NullPointerException

The error here is similar to the one in http://bbs.csdn.net/topics/390865764 — hoping the experts can help.

2017-09-15 23:19:15 [WARN]-[] Your hostname, admin-PC resolves to a loopback/non-reachable address: fe80:0:0:0:0:5efe:c0a8:164%23, but we couldn't find any external IP address!
2017-09-15 23:19:15 [INFO]-[org.apache.hadoop.conf.Configuration.deprecation] session.id is deprecated. Instead, use dfs.metrics.session-id
2017-09-15 23:19:15 [INFO]-[org.apache.hadoop.metrics.jvm.JvmMetrics] Initializing JVM Metrics with processName=JobTracker, sessionId=
Exception in thread "main" java.lang.NullPointerException
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1010)
at org.apache.hadoop.util.Shell.runCommand(Shell.java:487)
at org.apache.hadoop.util.Shell.run(Shell.java:460)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:720)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:813)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:796)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:656)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:444)
at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:308)
at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:133)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:147)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1307)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1304)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1304)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1325)
at TestOnlyMapper.main(TestOnlyMapper.java:35)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
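This stack trace (an NPE inside `Shell.runCommand` when `ProcessBuilder.start` receives a null command) is the classic symptom of launching a Hadoop job from a Windows IDE without `winutils.exe`: Hadoop's `Shell` cannot locate the native helper, so the command array it builds contains a null element. A common workaround, sketched here under the assumption that `winutils.exe` has been placed in a hypothetical `C:\hadoop\bin`, is to point `hadoop.home.dir` at that directory before the job is configured:

```java
public class HadoopHomeFix {
    public static void main(String[] args) {
        // Hypothetical install location: winutils.exe must exist at C:\hadoop\bin\winutils.exe.
        // This property must be set before any Hadoop class touches Shell.
        System.setProperty("hadoop.home.dir", "C:\\hadoop");

        System.out.println(System.getProperty("hadoop.home.dir"));
    }
}
```

Setting the `HADOOP_HOME` environment variable to the same directory (and adding `%HADOOP_HOME%\bin` to `PATH`) achieves the same effect without code changes.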

---------------------- Code -------------------------------------------------
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.Cell;
import org.apache.hadoop.hbase.CellUtil;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.hbase.mapreduce.TableMapper;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.output.NullOutputFormat;

import java.io.IOException;

/**
 * Created by admin on 2017/9/15.
 */
public class TestOnlyMapper {

    public static void main(String[] args) throws IOException, ClassNotFoundException, InterruptedException {
        Configuration conf = HBaseConfiguration.create();
        conf.set("hbase.rootdir", "hdfs://hadoop.master:8020/hdfs/hbase");
        conf.set("hbase.zookeeper.quorum", "hadoop.master,hadoop.slave11,hadoop.slave12");
        conf.set("hbase.zookeeper.property.clientPort", "2181");

        Job job = Job.getInstance(conf, "test");
        job.setJarByClass(TestOnlyMapper.class);
        job.setMapSpeculativeExecution(false);
        job.setReduceSpeculativeExecution(false);

        Scan scan = new Scan();
        // The mapper emits nothing, so no map output key/value classes are needed.
        TableMapReduceUtil.initTableMapperJob("test11", scan, OMapper.class, null, null, job);
        job.setOutputFormatClass(NullOutputFormat.class);
        job.waitForCompletion(true);
    }
}

class OMapper extends TableMapper<Text, Text> {
    @Override
    protected void map(ImmutableBytesWritable key, Result value, Context context)
            throws IOException, InterruptedException {
        for (Cell cell : value.listCells()) {
            System.out.println("---------------------");
            // Decode the qualifier bytes explicitly; calling toString() on a
            // byte[] prints the array's identity hash, not the column name.
            System.out.println("qualifier = " + Bytes.toString(CellUtil.cloneQualifier(cell)));
            System.out.println("---------------------");
        }
    }
}
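One pitfall worth noting in the mapper: `cell.getQualifier()` returns a `byte[]`, and calling `toString()` on a Java array prints its identity (something like `[B@1b6d3586`) rather than its contents. The bytes must be decoded, e.g. via `Bytes.toString(...)` or `new String(...)`. A minimal, HBase-free sketch of the difference:

```java
import java.nio.charset.StandardCharsets;

public class QualifierToStringDemo {
    public static void main(String[] args) {
        byte[] qualifier = "name".getBytes(StandardCharsets.UTF_8);

        // toString() on an array yields its identity string, e.g. "[B@1b6d3586"
        System.out.println(qualifier.toString().startsWith("[B@"));

        // Decoding the bytes recovers the actual qualifier text
        System.out.println(new String(qualifier, StandardCharsets.UTF_8));
    }
}
```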

1 answer

devmiao replied on 2017.09.16 11:04