weixin_37557938 asked on 2017.09.06 09:16

Error running a Hadoop MapReduce example in Eclipse

The bundled Hadoop examples run fine from the terminal and the Hadoop nodes are all up, but running the job from Eclipse fails with the error below:
17/09/05 20:20:16 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/09/05 20:20:16 INFO Configuration.deprecation: session.id is deprecated. Instead, use dfs.metrics.session-id
17/09/05 20:20:16 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
Exception in thread "main" java.net.ConnectException: Call From master/192.168.1.110 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:792)
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:732)
at org.apache.hadoop.ipc.Client.call(Client.java:1479)
at org.apache.hadoop.ipc.Client.call(Client.java:1412)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
at com.sun.proxy.$Proxy9.getFileInfo(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy9.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:707)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1785)
at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1068)
at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1064)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1064)
at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1426)
at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:145)
at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:266)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:139)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308)
at mapreduce.Temperature.main(Temperature.java:202)
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:614)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:712)
at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:375)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1528)
at org.apache.hadoop.ipc.Client.call(Client.java:1451)
... 28 more
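
The key line is "Call From master/192.168.1.110 to localhost:9000 failed": the client submitting from Eclipse is trying to reach a NameNode on localhost:9000, where nothing is listening, rather than the address the cluster actually uses. As a minimal diagnostic sketch (assuming the NameNode listens on master:9000; the class name and address here are illustrative, not taken from the original post):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Hypothetical connectivity check: points the HDFS client at the NameNode
// explicitly instead of the localhost:9000 default picked up in Eclipse.
// "hdfs://master:9000" is an assumption -- substitute the fs.defaultFS
// value from the cluster's core-site.xml.
public class HdfsConnectCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://master:9000");
        FileSystem fs = FileSystem.get(conf);
        // Succeeds only if the RPC address is reachable; otherwise it throws
        // the same ConnectException seen in the job submission above.
        System.out.println(fs.getFileStatus(new Path("/")));
        fs.close();
    }
}

The same reasoning applies to the job driver itself: either set fs.defaultFS on its Configuration as above, or put the cluster's core-site.xml and hdfs-site.xml on the Eclipse run classpath so the client stops defaulting to localhost.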

1 answer

qq_38494537 (Rxr) answered on 2017.09.06 09:23
Accepted