Java: Hadoop and IntelliJ path configuration problem
I've been learning Hadoop recently and tried to install Pig on my machine. During installation I found I had to set the Java path in my .bashrc. Pig is expected to live in the same place as Hadoop and Java, so I reinstalled the Oracle JDK and put it under /usr/local/Cellar together with Hadoop and Pig, then edited .bashrc accordingly. Pig now runs fine from the terminal. But when I open IntelliJ, a MapReduce program that used to run no longer builds; the error reads "cannot resolve symbol apache", as shown in the screenshot below. (screenshot)

I searched online; some people said the Maven project needed to be reloaded, but I can't get Maven going because the JDK import went wrong. My current Project Structure looks like this: (screenshot)

So I tried to re-import the JDK, but I found I couldn't install a JDK under /usr/local/Cellar: the package downloaded from Oracle's site is a .dmg, which I can't open and install from the command line. I tried `hdiutil attach jdk-8u201-macosx-x64.dmg`, without success.

How can I solve this? Should I move the Pig and Hadoop files into the Library directory instead?
My current .bashrc looks like this:

(screenshot)

Thanks, everyone! Any suggestions are much appreciated!

1 answer

Solved it: I had somehow misplaced the project's pom file, which is why the Apache packages could not be imported. I created a new Maven project, copied the original source files into it, and it runs now.

Edit 1:

The root cause turned out to be the configuration in the pom file:
the Apache Hadoop artifacts have to be declared under `<dependencies>` before the imports resolve.
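For reference, a minimal sketch of the kind of dependency block involved. The artifact and version here are illustrative (`hadoop-client` is the usual umbrella artifact for MapReduce client code; pick the version matching your installed Hadoop):

```xml
<dependencies>
  <!-- Illustrative: pulls in the Hadoop/MapReduce client classes whose
       org.apache imports IntelliJ could not resolve. Match the version
       to your cluster. -->
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.7.2</version>
  </dependency>
</dependencies>
```

After editing the pom, reload the Maven project in IntelliJ so the new dependency is indexed.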

Other related questions
Error connecting from Java to the Hadoop HDFS file system
Error message:

```
java.io.IOException: Failed on local exception: com.google.protobuf.InvalidProtocolBufferException: Protocol message end-group tag did not match expected tag.; Host Details : local host is: "localhost.localdomain/127.0.0.1"; destination host is: "172.16.6.57":9000;
	at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:763)
	at org.apache.hadoop.ipc.Client.call(Client.java:1229)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202)
	at $Proxy9.create(Unknown Source)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:601)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
	at $Proxy9.create(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:193)
	at org.apache.hadoop.hdfs.DFSOutputStream.<init>(DFSOutputStream.java:1324)
	at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1343)
	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1255)
	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1212)
	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:276)
	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:265)
	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:82)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:886)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:781)
	at com.zk.hdfs.FileCopyToHdfs.uploadToHdfs(FileCopyToHdfs.java:44)
	at com.zk.hdfs.FileCopyToHdfs.main(FileCopyToHdfs.java:21)
Caused by: com.google.protobuf.InvalidProtocolBufferException: Protocol message end-group tag did not match expected tag.
	at com.google.protobuf.InvalidProtocolBufferException.invalidEndTag(InvalidProtocolBufferException.java:73)
	at com.google.protobuf.CodedInputStream.checkLastTagWas(CodedInputStream.java:124)
	at com.google.protobuf.AbstractMessageLite$Builder.mergeFrom(AbstractMessageLite.java:213)
	at com.google.protobuf.AbstractMessage$Builder.mergeFrom(AbstractMessage.java:746)
	at com.google.protobuf.AbstractMessage$Builder.mergeFrom(AbstractMessage.java:238)
	at com.google.protobuf.AbstractMessageLite$Builder.mergeDelimitedFrom(AbstractMessageLite.java:282)
	at com.google.protobuf.AbstractMessage$Builder.mergeDelimitedFrom(AbstractMessage.java:760)
	at com.google.protobuf.AbstractMessageLite$Builder.mergeDelimitedFrom(AbstractMessageLite.java:288)
	at com.google.protobuf.AbstractMessage$Builder.mergeDelimitedFrom(AbstractMessage.java:752)
	at org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto.parseDelimitedFrom(RpcPayloadHeaderProtos.java:985)
	at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:938)
	at org.apache.hadoop.ipc.Client$Connection.run(Client.java:836)
```

The code was found online:

```java
package com.zk.hdfs;

import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.util.Progressable;

public class FileCopyToHdfs {

    public static void main(String[] args) throws Exception {
        try {
            uploadToHdfs();
            // deleteFromHdfs();
            // getDirectoryFromHdfs();
            // appendToHdfs();
            // readFromHdfs();
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            System.out.println("SUCCESS");
        }
    }

    /** Upload a file to HDFS. */
    public static void uploadToHdfs() throws FileNotFoundException, IOException {
        String localSrc = "e:/test.txt";
        String dst = "hdfs://172.16.6.57:9000/user/abc/zk/test1.txt";
        InputStream in = new BufferedInputStream(new FileInputStream(localSrc));
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(dst), conf);
        OutputStream out = fs.create(new Path(dst), new Progressable() {
            public void progress() {
                System.out.print(".");
            }
        });
        IOUtils.copyBytes(in, out, 4096, true);
    }
}
```

It keeps failing with this connection error and I can't find anything about it online. Could an expert lend a hand?
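The end-group-tag error above usually means the client reached a process on 172.16.6.57:9000 that is not speaking the HDFS RPC protocol (for example, the port belongs to a web UI, or the server's fs.defaultFS uses a different port). A first sanity check is simply whether the port accepts connections at all. The helper below is a hypothetical sketch using only the JDK; the host and port are the ones from the question:

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortCheck {
    // Returns true if a TCP connection to host:port succeeds within timeoutMs.
    static boolean isPortOpen(String host, int port, int timeoutMs) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            // Connection refused, unknown host, or timeout all land here.
            return false;
        }
    }

    public static void main(String[] args) {
        // Address taken from the question; replace with your NameNode's.
        String host = args.length > 0 ? args[0] : "172.16.6.57";
        int port = args.length > 1 ? Integer.parseInt(args[1]) : 9000;
        System.out.println(host + ":" + port + " reachable = "
                + isPortOpen(host, port, 2000));
    }
}
```

If the port is reachable but the protobuf error persists, compare the port with the `fs.defaultFS` value in the server's core-site.xml; a mismatch there is a common cause.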
A question about Hadoop and Hive
I haven't been working with Hadoop and Hive for long. I took the work over from someone else, so it's been a bit of a struggle. I've recently hit a problem I don't know how to solve and hope someone can help. The situation: I run a script that reads data from Hive and writes it to a CSV file; the HiveQL itself just reads from a single table, with some time-range parameters on a few fields. But the job fails intermittently, with logs like this:

```
Task with the most failures(4):
-----
Task ID: task_1513574350768_3535_m_000655
URL: http://hadoopnode102:8088/taskdetails.jsp?jobid=job_1513574350768_3535&tipid=task_1513574350768_3535_m_000655
-----
Diagnostic Messages for this Task:
Error: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (row data omitted here)
	at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:185)
	at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:450)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (row data omitted here)
	at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:503)
	at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:176)
	... 8 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.net.SocketTimeoutException: 75000 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/132.96.186.7:58295 remote=/132.96.186.9:50010]
	at org.apache.hadoop.hive.ql.exec.FileSinkOperator.processOp(FileSinkOperator.java:723)
	at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:815)
	at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:84)
	at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:815)
	at org.apache.hadoop.hive.ql.exec.FilterOperator.processOp(FilterOperator.java:120)
	at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:815)
	at org.apache.hadoop.hive.ql.exec.TableScanOperator.processOp(TableScanOperator.java:95)
	at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(MapOperator.java:157)
	at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:493)
	... 9 more
Caused by: java.net.SocketTimeoutException: 75000 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/132.96.186.7:58295 remote=/132.96.186.9:50010]
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
	at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
	at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
	at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:118)
	at java.io.FilterInputStream.read(FilterInputStream.java:83)
	at java.io.FilterInputStream.read(FilterInputStream.java:83)
	at org.apache.hadoop.hdfs.protocolPB.PBHelper.vintPrefixed(PBHelper.java:2201)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1439)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1361)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:588)
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
```

Rerunning the job a few times eventually succeeds, so I don't think the data is the problem; it looks like a configuration issue. The query is map-only (there is no reduce stage), and I'm not sure how to debug it further. Any guidance would be much appreciated, thanks!
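The `SocketTimeoutException` above is the HDFS client giving up after 75 s while writing a block to the DataNode at 132.96.186.9:50010, which points at slow or overloaded DataNodes rather than at Hive itself. One common mitigation, a sketch rather than a guaranteed fix, is to raise the client socket timeouts in hdfs-site.xml (property names as in Hadoop 2.x; the values are illustrative):

```xml
<!-- hdfs-site.xml, client side -->
<property>
  <!-- Read timeout toward DataNodes; this is the timeout behind the
       "millis timeout while waiting for channel to be ready for read" error. -->
  <name>dfs.client.socket-timeout</name>
  <value>300000</value>
</property>
<property>
  <!-- Write timeout toward DataNodes. -->
  <name>dfs.datanode.socket.write.timeout</name>
  <value>600000</value>
</property>
```

Raising timeouts papers over the symptom; if the cluster's DataNodes are genuinely overloaded, checking DataNode logs and disk/network load on 132.96.186.9 is the more durable route.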
NameNode won't start in a Hadoop 2.x pseudo-distributed setup on CentOS 6.8
Formatting the NameNode works, but the NameNode fails to start. The log contains the following error:

```
STARTUP_MSG:   build = Unknown -r Unknown; compiled by 'root' on 2017-05-22T10:49Z
STARTUP_MSG:   java = 1.8.0_144
************************************************************/
2020-01-31 16:37:06,931 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: registered UNIX signal handlers for [TERM, HUP, INT]
2020-01-31 16:37:06,935 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: createNameNode []
2020-01-31 16:37:07,161 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
2020-01-31 16:37:07,233 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
2020-01-31 16:37:07,233 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: NameNode metrics system started
2020-01-31 16:37:07,236 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: fs.defaultFS is hdfs://hadoop101:9000
2020-01-31 16:37:07,236 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: Clients are to use hadoop101:9000 to access this namenode/service.
2020-01-31 16:37:07,409 INFO org.apache.hadoop.hdfs.DFSUtil: Starting Web-server for hdfs at: http://huawei_mate_10-53013e4c60:50070
2020-01-31 16:37:07,457 INFO org.mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
2020-01-31 16:37:07,464 INFO org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets.
2020-01-31 16:37:07,469 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.namenode is not defined
2020-01-31 16:37:07,473 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2020-01-31 16:37:07,475 ERROR org.apache.hadoop.hdfs.server.namenode.NameNode: Failed to start namenode.
java.lang.IllegalArgumentException: The value of property bind.address must not be null
	at com.google.common.base.Preconditions.checkArgument(Preconditions.java:88)
	at org.apache.hadoop.conf.Configuration.set(Configuration.java:1134)
	at org.apache.hadoop.conf.Configuration.set(Configuration.java:1115)
	at org.apache.hadoop.http.HttpServer2.initializeWebServer(HttpServer2.java:398)
	at org.apache.hadoop.http.HttpServer2.<init>(HttpServer2.java:351)
	at org.apache.hadoop.http.HttpServer2.<init>(HttpServer2.java:114)
	at org.apache.hadoop.http.HttpServer2$Builder.build(HttpServer2.java:290)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeHttpServer.start(NameNodeHttpServer.java:126)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.startHttpServer(NameNode.java:752)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:638)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:811)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:795)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1488)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1554)
2020-01-31 16:37:07,477 INFO org.apache.hadoop.util.ExitUtil: Exiting with status 1
2020-01-31 16:37:07,479 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at hadoop101/192.168.117.101
************************************************************/
```

The key error is `java.lang.IllegalArgumentException: The value of property bind.address must not be null`. My core-site.xml:

```
<configuration>
    <!-- Address of the HDFS NameNode -->
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://hadoop101:9000</value>
    </property>
    <!-- hadoop101 is already configured in the hosts file -->
    <!-- Storage directory for files Hadoop generates at runtime -->
    <property>
        <name>hadoop.tmp.dir</name>
        <value>/opt/module/hadoop-2.7.2/data/tmp</value>
    </property>
</configuration>
```

Hoping someone can help me figure this out. Many thanks!
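One detail worth checking in the log above: the HDFS web server tries to start at `http://huawei_mate_10-53013e4c60:50070`, and that machine name contains underscores, which are not legal in hostnames; an invalid local hostname is a plausible (though not certain) cause of HttpServer2 failing to derive a bind address. A quick stdlib-only check, a sketch based on the RFC 1123 label syntax rather than anything Hadoop-specific:

```java
import java.util.regex.Pattern;

public class HostnameCheck {
    // RFC 952/1123 label: letters, digits, hyphens; must start and end
    // with a letter or digit; underscores are not allowed.
    static final Pattern LABEL =
            Pattern.compile("[A-Za-z0-9]([A-Za-z0-9-]*[A-Za-z0-9])?");

    static boolean isValidHostname(String host) {
        for (String label : host.split("\\.", -1)) {
            if (label.isEmpty() || !LABEL.matcher(label).matches()) {
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(isValidHostname("hadoop101"));                 // true
        System.out.println(isValidHostname("huawei_mate_10-53013e4c60")); // false: underscores
    }
}
```

If the check fails for your machine name, renaming the host (e.g. to `hadoop101`, matching fs.defaultFS and /etc/hosts) and restarting is a reasonable first experiment.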
Question about Hadoop's JAVA_HOME configuration
My setting is:

```
export JAVA_HOME="/cygdrive/C:/Program Files/Java/jdk1.6.0_22"
```

but it keeps saying the path does not exist. How do I fix this? I'm configuring and running Hadoop through Cygwin.
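The path above mixes the two naming schemes: under Cygwin the drive letter replaces `C:` entirely, so `C:\Program Files\...` becomes `/cygdrive/c/Program Files/...` (lowercase letter, no colon anywhere). Cygwin's `cygpath -u 'C:\...'` performs this conversion; written out by hand it looks like this:

```shell
# Windows form of the JDK path (illustrative):
#   C:\Program Files\Java\jdk1.6.0_22
# Cygwin form: "C:" becomes /cygdrive/c, backslashes become forward
# slashes, and there is no colon in the result.
export JAVA_HOME="/cygdrive/c/Program Files/Java/jdk1.6.0_22"
echo "$JAVA_HOME"
```

If Hadoop's shell scripts still choke on the space in `Program Files`, the DOS 8.3 short name (often `PROGRA~1`, check with `dir /x` on Windows) avoids the space entirely.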
Running the Pi example to verify the Hadoop cluster fails with "bash: hadoop: command not found". How do I fix it?
To verify the cluster I ran the Pi example:

```
[hfut@master ~]$ cd ~/hadoop-2.5.2/share/hadoop/mapreduce/
[hfut@master mapreduce]$ hadoop jar home/hfut/hadoop-2.5.2/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.5.2.jar pi 10 10
```

The second command fails with `bash: hadoop: command not found`, as shown below: ![screenshot](https://img-ask.csdn.net/upload/202001/30/1580370011_357824.png) How do I fix this?
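`bash: hadoop: command not found` means the `hadoop` launcher is not on the shell's PATH (and, separately, the jar argument `home/hfut/...` is missing its leading `/`, so it would be treated as a relative path). A sketch of the usual fix; the install prefix below is taken from the question, so adjust it to your layout:

```shell
# Append to ~/.bashrc, then run: source ~/.bashrc
export HADOOP_HOME="$HOME/hadoop-2.5.2"
export PATH="$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin"

# Corrected invocation (absolute jar path via $HADOOP_HOME):
#   hadoop jar $HADOOP_HOME/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.5.2.jar pi 10 10
echo "$PATH"
```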
Running Hive fails with `Exception in thread "main" java.lang.ClassNotFoundException: org.apache.hive.beeline.HiveSchemaTool`
```
[root@bigdata113 bin]# ./schematool -dbType mysql -initSchema
Exception in thread "main" java.lang.ClassNotFoundException: org.apache.hive.beeline.HiveSchemaTool
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at java.lang.Class.forName0(Native Method)
	at java.lang.Class.forName(Class.java:348)
	at org.apache.hadoop.util.RunJar.run(RunJar.java:214)
	at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
```

How do I solve this?
Question about a Hadoop pseudo-distributed installation
I've installed Hadoop six or seven times now, and every attempt fails in a different way. This time the problem is starting the NameNode and DataNode daemons. For the 0.0.0.0 issue I already disabled the firewall as suggested online, but it still won't run. Could someone help me out?
Problem running a jar with `hadoop jar`. How can this be solved? Thanks
When I run a jar with `hadoop jar` and pass it an HDFS file path, it complains that the argument is not a file or directory, yet running the same jar on Linux with `java -jar` and a local path works! ![screenshot](https://img-ask.csdn.net/upload/201905/21/1558439375_749706.png)
Hadoop DistCp error: queue problem
```
sudo -uxiaosi hadoop distcp hdfs:///user/xiaosi/tmp/data_group/histories/day=20161116 hdfs:///user/xiaosi/data_group/histories
```

Error:

```
17/01/17 19:18:46 ERROR security.UserGroupInformation: PriviledgedActionException as:xiaosi (auth:SIMPLE) cause:java.io.IOException: Failed to run job : User xiaosi cannot submit applications to queue root.default
17/01/17 19:18:46 ERROR tools.DistCp: Exception encountered
java.io.IOException: Failed to run job : User xiaosi cannot submit applications to queue root.default
	at org.apache.hadoop.mapred.YARNRunner.submitJob(YARNRunner.java:299)
	at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:430)
	at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
	at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
	at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
	at org.apache.hadoop.tools.DistCp.execute(DistCp.java:153)
	at org.apache.hadoop.tools.DistCp.run(DistCp.java:118)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
	at org.apache.hadoop.tools.DistCp.main(DistCp.java:375)
```
Sqoop 1.99.6 errors when starting a job
Running `start job -jid 1` fails with:

```
Exception has occurred during processing command
Exception: org.apache.sqoop.common.SqoopException Message: CLIENT_0001:Server has returned exception
Stack trace:
	at org.apache.sqoop.client.request.ResourceRequest (ResourceRequest.java:129)
	at org.apache.sqoop.client.request.ResourceRequest (ResourceRequest.java:179)
	at org.apache.sqoop.client.request.JobResourceRequest (JobResourceRequest.java:112)
	at org.apache.sqoop.client.request.SqoopResourceRequests (SqoopResourceRequests.java:157)
	at org.apache.sqoop.client.SqoopClient (SqoopClient.java:452)
	at org.apache.sqoop.shell.StartJobFunction (StartJobFunction.java:80)
	at org.apache.sqoop.shell.SqoopFunction (SqoopFunction.java:51)
	at org.apache.sqoop.shell.SqoopCommand (SqoopCommand.java:135)
	at org.apache.sqoop.shell.SqoopCommand (SqoopCommand.java:111)
	at org.codehaus.groovy.tools.shell.Command$execute (null:-1)
	at org.codehaus.groovy.runtime.callsite.CallSiteArray (CallSiteArray.java:42)
	at org.codehaus.groovy.tools.shell.Command$execute (null:-1)
	at org.codehaus.groovy.tools.shell.Shell (Shell.groovy:101)
	at org.codehaus.groovy.tools.shell.Groovysh (Groovysh.groovy:-1)
	at sun.reflect.GeneratedMethodAccessor23 (null:-1)
	at sun.reflect.DelegatingMethodAccessorImpl (DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method (Method.java:498)
	at org.codehaus.groovy.reflection.CachedMethod (CachedMethod.java:90)
	at groovy.lang.MetaMethod (MetaMethod.java:233)
	at groovy.lang.MetaClassImpl (MetaClassImpl.java:1054)
	at org.codehaus.groovy.runtime.ScriptBytecodeAdapter (ScriptBytecodeAdapter.java:128)
	at org.codehaus.groovy.tools.shell.Groovysh (Groovysh.groovy:173)
	at sun.reflect.GeneratedMethodAccessor22 (null:-1)
	at sun.reflect.DelegatingMethodAccessorImpl (DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method (Method.java:498)
	at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrapNoCoerce (PogoMetaMethodSite.java:267)
	at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite (PogoMetaMethodSite.java:52)
	at org.codehaus.groovy.runtime.callsite.AbstractCallSite (AbstractCallSite.java:141)
	at org.codehaus.groovy.tools.shell.Groovysh (Groovysh.groovy:121)
	at org.codehaus.groovy.tools.shell.Shell (Shell.groovy:114)
	at org.codehaus.groovy.tools.shell.Shell$leftShift$0 (null:-1)
	at org.codehaus.groovy.tools.shell.ShellRunner (ShellRunner.groovy:88)
	at org.codehaus.groovy.tools.shell.InteractiveShellRunner (InteractiveShellRunner.groovy:-1)
	at sun.reflect.GeneratedMethodAccessor20 (null:-1)
	at sun.reflect.DelegatingMethodAccessorImpl (DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method (Method.java:498)
	at org.codehaus.groovy.reflection.CachedMethod (CachedMethod.java:90)
	at groovy.lang.MetaMethod (MetaMethod.java:233)
	at groovy.lang.MetaClassImpl (MetaClassImpl.java:1054)
	at org.codehaus.groovy.runtime.ScriptBytecodeAdapter (ScriptBytecodeAdapter.java:128)
	at org.codehaus.groovy.runtime.ScriptBytecodeAdapter (ScriptBytecodeAdapter.java:148)
	at org.codehaus.groovy.tools.shell.InteractiveShellRunner (InteractiveShellRunner.groovy:100)
	at sun.reflect.GeneratedMethodAccessor19 (null:-1)
	at sun.reflect.DelegatingMethodAccessorImpl (DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method (Method.java:498)
	at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrapNoCoerce (PogoMetaMethodSite.java:267)
	at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite (PogoMetaMethodSite.java:52)
	at org.codehaus.groovy.runtime.callsite.AbstractCallSite (AbstractCallSite.java:137)
	at org.codehaus.groovy.tools.shell.ShellRunner (ShellRunner.groovy:57)
	at org.codehaus.groovy.tools.shell.InteractiveShellRunner (InteractiveShellRunner.groovy:-1)
	at sun.reflect.NativeMethodAccessorImpl (NativeMethodAccessorImpl.java:-2)
	at sun.reflect.NativeMethodAccessorImpl (NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl (DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method (Method.java:498)
	at org.codehaus.groovy.reflection.CachedMethod (CachedMethod.java:90)
	at groovy.lang.MetaMethod (MetaMethod.java:233)
	at groovy.lang.MetaClassImpl (MetaClassImpl.java:1054)
	at org.codehaus.groovy.runtime.ScriptBytecodeAdapter (ScriptBytecodeAdapter.java:128)
	at org.codehaus.groovy.runtime.ScriptBytecodeAdapter (ScriptBytecodeAdapter.java:148)
	at org.codehaus.groovy.tools.shell.InteractiveShellRunner (InteractiveShellRunner.groovy:66)
	at java_lang_Runnable$run (null:-1)
	at org.codehaus.groovy.runtime.callsite.CallSiteArray (CallSiteArray.java:42)
	at org.codehaus.groovy.runtime.callsite.AbstractCallSite (AbstractCallSite.java:108)
	at org.codehaus.groovy.runtime.callsite.AbstractCallSite (AbstractCallSite.java:112)
	at org.codehaus.groovy.tools.shell.Groovysh (Groovysh.groovy:463)
	at org.codehaus.groovy.tools.shell.Groovysh (Groovysh.groovy:402)
	at org.apache.sqoop.shell.SqoopShell (SqoopShell.java:130)
Caused by: Exception: org.apache.sqoop.common.SqoopException Message: GENERIC_HDFS_CONNECTOR_0007:Invalid output directory - Unexpected exception
Stack trace:
	at org.apache.sqoop.connector.hdfs.HdfsToInitializer (HdfsToInitializer.java:71)
	at org.apache.sqoop.connector.hdfs.HdfsToInitializer (HdfsToInitializer.java:35)
	at org.apache.sqoop.driver.JobManager (JobManager.java:449)
	at org.apache.sqoop.driver.JobManager (JobManager.java:373)
	at org.apache.sqoop.driver.JobManager (JobManager.java:276)
	at org.apache.sqoop.handler.JobRequestHandler (JobRequestHandler.java:380)
	at org.apache.sqoop.handler.JobRequestHandler (JobRequestHandler.java:116)
	at org.apache.sqoop.server.v1.JobServlet (JobServlet.java:96)
	at org.apache.sqoop.server.SqoopProtocolServlet (SqoopProtocolServlet.java:79)
	at javax.servlet.http.HttpServlet (HttpServlet.java:646)
	at javax.servlet.http.HttpServlet (HttpServlet.java:723)
	at org.apache.catalina.core.ApplicationFilterChain (ApplicationFilterChain.java:290)
	at org.apache.catalina.core.ApplicationFilterChain (ApplicationFilterChain.java:206)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter (AuthenticationFilter.java:644)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter (DelegationTokenAuthenticationFilter.java:304)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter (AuthenticationFilter.java:592)
	at org.apache.catalina.core.ApplicationFilterChain (ApplicationFilterChain.java:235)
	at org.apache.catalina.core.ApplicationFilterChain (ApplicationFilterChain.java:206)
	at org.apache.catalina.core.StandardWrapperValve (StandardWrapperValve.java:233)
	at org.apache.catalina.core.StandardContextValve (StandardContextValve.java:191)
	at org.apache.catalina.core.StandardHostValve (StandardHostValve.java:127)
	at org.apache.catalina.valves.ErrorReportValve (ErrorReportValve.java:103)
	at org.apache.catalina.core.StandardEngineValve (StandardEngineValve.java:109)
	at org.apache.catalina.connector.CoyoteAdapter (CoyoteAdapter.java:293)
	at org.apache.coyote.http11.Http11Processor (Http11Processor.java:861)
	at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler (Http11Protocol.java:606)
	at org.apache.tomcat.util.net.JIoEndpoint$Worker (JIoEndpoint.java:489)
	at java.lang.Thread (Thread.java:748)
Caused by: Exception: java.io.IOException Message: Failed on local exception: org.apache.hadoop.ipc.RpcException: RPC response exceeds maximum data length; Host Details : local host is: "node01/192.168.65.100"; destination host is: "node01":9870;
Stack trace:
	at org.apache.hadoop.net.NetUtils (NetUtils.java:818)
	at org.apache.hadoop.ipc.Client (Client.java:1549)
	at org.apache.hadoop.ipc.Client (Client.java:1491)
	at org.apache.hadoop.ipc.Client (Client.java:1388)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker (ProtobufRpcEngine.java:233)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker (ProtobufRpcEngine.java:118)
	at com.sun.proxy.$Proxy19 (null:-1)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB (ClientNamenodeProtocolTranslatorPB.java:907)
	at sun.reflect.NativeMethodAccessorImpl (NativeMethodAccessorImpl.java:-2)
	at sun.reflect.NativeMethodAccessorImpl (NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl (DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method (Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler (RetryInvocationHandler.java:422)
	at org.apache.hadoop.io.retry.RetryInvocationHandler$Call (RetryInvocationHandler.java:165)
	at org.apache.hadoop.io.retry.RetryInvocationHandler$Call (RetryInvocationHandler.java:157)
	at org.apache.hadoop.io.retry.RetryInvocationHandler$Call (RetryInvocationHandler.java:95)
	at org.apache.hadoop.io.retry.RetryInvocationHandler (RetryInvocationHandler.java:359)
	at com.sun.proxy.$Proxy20 (null:-1)
	at org.apache.hadoop.hdfs.DFSClient (DFSClient.java:1666)
	at org.apache.hadoop.hdfs.DistributedFileSystem$29 (DistributedFileSystem.java:1576)
	at org.apache.hadoop.hdfs.DistributedFileSystem$29 (DistributedFileSystem.java:1573)
	at org.apache.hadoop.fs.FileSystemLinkResolver (FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem (DistributedFileSystem.java:1588)
	at org.apache.hadoop.fs.FileSystem (FileSystem.java:1683)
	at org.apache.sqoop.connector.hdfs.HdfsToInitializer (HdfsToInitializer.java:58)
	at org.apache.sqoop.connector.hdfs.HdfsToInitializer (HdfsToInitializer.java:35)
	at org.apache.sqoop.driver.JobManager (JobManager.java:449)
	at org.apache.sqoop.driver.JobManager (JobManager.java:373)
	at org.apache.sqoop.driver.JobManager (JobManager.java:276)
	at org.apache.sqoop.handler.JobRequestHandler (JobRequestHandler.java:380)
	at org.apache.sqoop.handler.JobRequestHandler (JobRequestHandler.java:116)
	at org.apache.sqoop.server.v1.JobServlet (JobServlet.java:96)
	at org.apache.sqoop.server.SqoopProtocolServlet (SqoopProtocolServlet.java:79)
	at javax.servlet.http.HttpServlet (HttpServlet.java:646)
	at javax.servlet.http.HttpServlet (HttpServlet.java:723)
	at org.apache.catalina.core.ApplicationFilterChain (ApplicationFilterChain.java:290)
	at org.apache.catalina.core.ApplicationFilterChain (ApplicationFilterChain.java:206)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter (AuthenticationFilter.java:644)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter (DelegationTokenAuthenticationFilter.java:304)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter (AuthenticationFilter.java:592)
	at org.apache.catalina.core.ApplicationFilterChain (ApplicationFilterChain.java:235)
	at org.apache.catalina.core.ApplicationFilterChain (ApplicationFilterChain.java:206)
	at org.apache.catalina.core.StandardWrapperValve (StandardWrapperValve.java:233)
	at org.apache.catalina.core.StandardContextValve (StandardContextValve.java:191)
	at org.apache.catalina.core.StandardHostValve (StandardHostValve.java:127)
	at org.apache.catalina.valves.ErrorReportValve (ErrorReportValve.java:103)
	at org.apache.catalina.core.StandardEngineValve (StandardEngineValve.java:109)
	at org.apache.catalina.connector.CoyoteAdapter (CoyoteAdapter.java:293)
	at org.apache.coyote.http11.Http11Processor (Http11Processor.java:861)
	at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler (Http11Protocol.java:606)
	at org.apache.tomcat.util.net.JIoEndpoint$Worker (JIoEndpoint.java:489)
	at java.lang.Thread (Thread.java:748)
Caused by: Exception: java.lang.Throwable Message: RPC response exceeds maximum data length
Stack trace:
	at org.apache.hadoop.ipc.Client$IpcStreams (Client.java:1864)
	at org.apache.hadoop.ipc.Client$Connection (Client.java:1183)
	at org.apache.hadoop.ipc.Client$Connection (Client.java:1079)
```

Could someone take a look? The key line seems to be:

```
Caused by: Exception: java.io.IOException Message: Failed on local exception: org.apache.hadoop.ipc.RpcException: RPC response exceeds maximum data length; Host Details : local host is: "node01/192.168.65.100"; destination host is: "node01":9870;
```

but I can't tell where the problem is. My link configuration:

```
From database configuration
Schema name: mysql
Table name: help_topic
Table SQL statement:
Table column names:
Partition column name:
Null value allowed for the partition column:
Boundary query:
Incremental read
Check column:
Last value:
To HDFS configuration
Override null value:
Null value:
Output format:
  0 : TEXT_FILE
  1 : SEQUENCE_FILE
Choose: 0
Compression format:
  0 : NONE
  1 : DEFAULT
  2 : DEFLATE
  3 : GZIP
  4 : BZIP2
  5 : LZO
  6 : LZ4
  7 : SNAPPY
  8 : CUSTOM
Choose: 0
Custom compression format:
Output directory: hdfs://node01:9870/sqoop
Append mode:
Throttling resources
```
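A possibly relevant observation (an assumption to verify, not a confirmed diagnosis): `RPC response exceeds maximum data length` is a classic symptom of an HDFS client speaking the RPC protocol to a port that is serving something else, and `9870` is the default NameNode web UI (HTTP) port in Hadoop 3.x. Clients should use the RPC address from `fs.defaultFS` (shown by `hdfs getconf -confKey fs.defaultFS`, commonly port `8020` or `9000`). Under that assumption, the link's output directory would change like this:

```
# Before (points at the NameNode web UI port):
Output directory: hdfs://node01:9870/sqoop
# After (illustrative; use the port your fs.defaultFS actually reports):
Output directory: hdfs://node01:8020/sqoop
```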
Failed to start the Hadoop JobHistoryServer process
I've been learning Hadoop recently and rented a Baidu Cloud server to deploy it. The NameNode, DataNode, and ResourceManager all start fine. After configuring mapred-site.xml I tried to start the JobHistoryServer to look at job history, but depending on which address I configure, either it starts and the history page won't open, or it fails to start entirely. Screenshots below. Here mapred-site.xml is configured with my server's IP address (I only have one machine): ![screenshot](https://img-ask.csdn.net/upload/202001/21/1579569635_360875.png) With this configuration the startup log reports an error and the process exits; it says the port is already in use: ![screenshot](https://img-ask.csdn.net/upload/202001/21/1579569775_793904.png) But when I check, that port is not occupied (the other daemons are already running): ![screenshot](https://img-ask.csdn.net/upload/202001/21/1579570143_381425.png) If I change the address in mapred-site.xml as shown below, the JobHistoryServer starts successfully: ![screenshot](https://img-ask.csdn.net/upload/202001/21/1579570293_948304.png) The process is up and the log shows no errors: ![screenshot](https://img-ask.csdn.net/upload/202001/21/1579570409_667202.png) But clicking "history" I still can't see the job records: ![screenshot](https://img-ask.csdn.net/upload/202001/21/1579570530_855645.png) Clicking "history" gives access denied: ![screenshot](https://img-ask.csdn.net/upload/202001/21/1579570606_198892.png) I also wondered whether the server's hosts file was the problem; configured as below it still doesn't work (with the address in mapred-site.xml changed to hadoop44 it won't start): ![screenshot](https://img-ask.csdn.net/upload/202001/21/1579570741_726008.png) That's where I'm stuck. Could someone help? Thanks!
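A guess worth testing (an assumption, not a diagnosis from the screenshots): on cloud servers the public IP is usually NAT-mapped and not assigned to any local network interface, so a daemon told to bind to it fails even though the port itself is free, while binding to the internal IP or hostname succeeds; the "Cannot assign requested address" and "Address already in use" bind errors look similar in logs but have different causes. The JDK snippet below reproduces the second failure mode, a genuine port conflict, so the two can be told apart by the exception text:

```java
import java.io.IOException;
import java.net.BindException;
import java.net.ServerSocket;

public class BindDemo {
    // Binds an ephemeral port, then tries to bind the same port again.
    // Returns the message of the resulting bind failure.
    static String tryDoubleBind() throws IOException {
        try (ServerSocket first = new ServerSocket(0)) {
            int port = first.getLocalPort();
            try (ServerSocket second = new ServerSocket(port)) {
                return "unexpectedly bound port " + port + " twice";
            } catch (BindException e) {
                // Same class of failure a daemon reports when its configured
                // port really is taken by another process.
                return "port " + port + ": " + e.getMessage();
            }
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println(tryDoubleBind());
    }
}
```

If the JobHistoryServer's log instead shows "Cannot assign requested address", the configured `mapreduce.jobhistory.address` should use the internal IP/hostname (or 0.0.0.0), with the cloud provider's security group opened for the web UI port so "history" links are reachable from outside.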
Hadoop + HBase error: java.net.UnknownHostException
Problem: `java.net.UnknownHostException: Invalid host name: local host is: (unknown); destination host is: "master":9000; java.net.UnknownHostException`

Error log:

```
2017-07-13 21:26:45,915 FATAL [master:16000.activeMasterManager] master.HMaster: Failed to become active master
java.net.UnknownHostException: Invalid host name: local host is: (unknown); destination host is: "master":9000; java.net.UnknownHostException; For more details see: http://wiki.apache.org/hadoop/UnknownHost
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
	at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:792)
	at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:744)
	at org.apache.hadoop.ipc.Client$Connection.<init>(Client.java:409)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1518)
	at org.apache.hadoop.ipc.Client.call(Client.java:1451)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy18.setSafeMode(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.setSafeMode(ClientNamenodeProtocolTranslatorPB.java:666)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy19.setSafeMode(Unknown Source)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.hbase.fs.HFileSystem$1.invoke(HFileSystem.java:279)
	at com.sun.proxy.$Proxy20.setSafeMode(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.setSafeMode(DFSClient.java:2596)
	at org.apache.hadoop.hdfs.DistributedFileSystem.setSafeMode(DistributedFileSystem.java:1223)
	at org.apache.hadoop.hdfs.DistributedFileSystem.setSafeMode(DistributedFileSystem.java:1207)
	at org.apache.hadoop.hbase.util.FSUtils.isInSafeMode(FSUtils.java:525)
	at org.apache.hadoop.hbase.util.FSUtils.waitOnSafeMode(FSUtils.java:971)
	at org.apache.hadoop.hbase.master.MasterFileSystem.checkRootDir(MasterFileSystem.java:429)
	at org.apache.hadoop.hbase.master.MasterFileSystem.createInitialFileSystemLayout(MasterFileSystem.java:153)
	at org.apache.hadoop.hbase.master.MasterFileSystem.<init>(MasterFileSystem.java:128)
	at org.apache.hadoop.hbase.master.HMaster.finishActiveMasterInitialization(HMaster.java:693)
	at org.apache.hadoop.hbase.master.HMaster.access$600(HMaster.java:189)
	at org.apache.hadoop.hbase.master.HMaster$2.run(HMaster.java:1803)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.net.UnknownHostException
	... 32 more
2017-07-13 21:26:45,924 FATAL [master:16000.activeMasterManager] master.HMaster: Unhandled exception. Starting shutdown.
java.net.UnknownHostException: Invalid host name: local host is: (unknown); destination host is: "master":9000; java.net.UnknownHostException; For more details see: http://wiki.apache.org/hadoop/UnknownHost
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
	at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:792)
	at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:744)
	at org.apache.hadoop.ipc.Client$Connection.<init>(Client.java:409)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1518)
	at org.apache.hadoop.ipc.Client.call(Client.java:1451)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy18.setSafeMode(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.setSafeMode(ClientNamenodeProtocolTranslatorPB.java:666)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy19.setSafeMode(Unknown Source)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.hbase.fs.HFileSystem$1.invoke(HFileSystem.java:279)
	at com.sun.proxy.$Proxy20.setSafeMode(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.setSafeMode(DFSClient.java:2596)
	at org.apache.hadoop.hdfs.DistributedFileSystem.setSafeMode(DistributedFileSystem.java:1223)
	at org.apache.hadoop.hdfs.DistributedFileSystem.setSafeMode(DistributedFileSystem.java:1207)
	at org.apache.hadoop.hbase.util.FSUtils.isInSafeMode(FSUtils.java:525)
	at org.apache.hadoop.hbase.util.FSUtils.waitOnSafeMode(FSUtils.java:971)
	at org.apache.hadoop.hbase.master.MasterFileSystem.checkRootDir(MasterFileSystem.java:429)
	at org.apache.hadoop.hbase.master.MasterFileSystem.createInitialFileSystemLayout(MasterFileSystem.java:153)
	at org.apache.hadoop.hbase.master.MasterFileSystem.<init>(MasterFileSystem.java:128)
	at org.apache.hadoop.hbase.master.HMaster.finishActiveMasterInitialization(HMaster.java:693)
	at org.apache.hadoop.hbase.master.HMaster.access$600(HMaster.java:189)
	at org.apache.hadoop.hbase.master.HMaster$2.run(HMaster.java:1803)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.net.UnknownHostException
	... 32 more
2017-07-13 21:26:45,925 INFO [master:16000.activeMasterManager] regionserver.HRegionServer: STOPPED: Unhandled exception. Starting shutdown.
```

Additional details:

1. All firewalls are off, and everything runs with root privileges.
2. Hadoop starts normally (jps shows the processes), and ports 50070 and 8088 are reachable from a browser without any problem.
3. Zookeeper starts normally (confirmed with jps).
4. All Hadoop-related jars under hbase/lib were deleted, all Hadoop jars from hadoop/share were copied into hbase/lib, and aws-java-sdk-core-1.11.158.jar and aws-java-sdk-s3-1.11.155.jar were added.

Versions:

1. hadoop 2.7.2
2. hbase 1.2.6
3. zookeeper 3.4.2
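The `Invalid host name: ... destination host is: "master":9000` failure means the JVM on the HBase master cannot resolve the hostname `master` taken from `fs.defaultFS` (`hdfs://master:9000`); the wiki page in the log points the same way. The usual fix is an `/etc/hosts` entry mapping `master` to the node's IP on every machine. A minimal diagnostic sketch (the class name `HostCheck` is made up) that checks whether a hostname resolves:

```java
import java.net.InetAddress;
import java.net.UnknownHostException;

public class HostCheck {
    // Returns true if the given hostname resolves to an IP address,
    // printing the result either way.
    static boolean resolves(String host) {
        try {
            InetAddress addr = InetAddress.getByName(host);
            System.out.println(host + " -> " + addr.getHostAddress());
            return true;
        } catch (UnknownHostException e) {
            System.out.println(host + " does not resolve: " + e);
            return false;
        }
    }

    public static void main(String[] args) {
        // "master" is the hostname from fs.defaultFS; it must resolve on
        // every node, typically via an /etc/hosts entry like
        // "192.168.x.x  master" (the IP here is a placeholder).
        resolves(args.length > 0 ? args[0] : "master");
    }
}
```

Running `java HostCheck master` on the HBase master node should print an IP; if it reports an `UnknownHostException`, the problem is name resolution, not HBase itself.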
Problem running a jar with `hadoop jar`?
When I run the jar with `hadoop jar` and pass in a Hadoop file path, it reports that the path is not a file or directory; but running it with `java -jar` on Linux and passing a Linux path works! ![screenshot](https://img-ask.csdn.net/upload/201905/21/1558440862_303646.png)
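A likely cause: path arguments passed to a job run via `hadoop jar` are resolved against the filesystem named in `fs.defaultFS` (typically HDFS), while `java -jar` reads the local Linux filesystem, so a local path that works under `java -jar` may simply not exist in HDFS. A small sketch using plain `java.net.URI` (no Hadoop dependency; `schemeOf` is a made-up helper) showing how a path without a scheme falls back to the default filesystem:

```java
import java.net.URI;

public class PathScheme {
    // Returns the filesystem scheme a path would resolve against:
    // its explicit scheme if present, otherwise the cluster default
    // (e.g. "hdfs" when fs.defaultFS is hdfs://master:9000).
    static String schemeOf(String path, String defaultScheme) {
        URI uri = URI.create(path);
        return uri.getScheme() != null ? uri.getScheme() : defaultScheme;
    }

    public static void main(String[] args) {
        System.out.println(schemeOf("/user/myData", "hdfs"));                // hdfs
        System.out.println(schemeOf("file:///tmp/in.txt", "hdfs"));          // file
        System.out.println(schemeOf("hdfs://master:9000/user/out", "hdfs")); // hdfs
    }
}
```

So the input either has to be uploaded to HDFS first (e.g. `hdfs dfs -put`) or referenced with an explicit `file://` URI, if the job supports that.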
Hadoop java.io.IOException when executing a jar
![screenshot](http://img.bbs.csdn.net/upload/201703/15/1489518401_142809.png)

Error: java.io.IOException: Initialization of all the collectors failed. Error in last collector was :interface javax.xml.soap.Text

Running a simple jar:

```
hadoop jar Hadoop_Demo1.jar /user/myData/ /user/out/
17/03/15 02:52:37 INFO client.RMProxy: Connecting to ResourceManager at s0/192.168.253.130:8032
17/03/15 02:52:37 WARN mapreduce.JobResourceUploader: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
17/03/15 02:52:38 INFO input.FileInputFormat: Total input paths to process : 2
17/03/15 02:52:38 INFO mapreduce.JobSubmitter: number of splits:2
17/03/15 02:52:38 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1489512856623_0004
17/03/15 02:52:39 INFO impl.YarnClientImpl: Submitted application application_1489512856623_0004
17/03/15 02:52:39 INFO mapreduce.Job: The url to track the job: http://s0:8088/proxy/application_1489512856623_0004/
17/03/15 02:52:39 INFO mapreduce.Job: Running job: job_1489512856623_0004
17/03/15 02:52:50 INFO mapreduce.Job: Job job_1489512856623_0004 running in uber mode : false
17/03/15 02:52:50 INFO mapreduce.Job: map 0% reduce 0%
17/03/15 02:55:18 INFO mapreduce.Job: map 50% reduce 0%
17/03/15 02:55:18 INFO mapreduce.Job: Task Id : attempt_1489512856623_0004_m_000001_0, Status : FAILED
Error: java.io.IOException: Initialization of all the collectors failed. Error in last collector was :interface javax.xml.soap.Text
	at org.apache.hadoop.mapred.MapTask.createSortingCollector(MapTask.java:414)
	at org.apache.hadoop.mapred.MapTask.access$100(MapTask.java:81)
	at org.apache.hadoop.mapred.MapTask$NewOutputCollector.<init>(MapTask.java:698)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:770)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.lang.ClassCastException: interface javax.xml.soap.Text
	at java.lang.Class.asSubclass(Class.java:3404)
	at org.apache.hadoop.mapred.JobConf.getOutputKeyComparator(JobConf.java:887)
	at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.init(MapTask.java:1004)
	at org.apache.hadoop.mapred.MapTask.createSortingCollector(MapTask.java:402)
	... 9 more
Container killed by the ApplicationMaster.
17/03/15 02:55:18 INFO mapreduce.Job: Task Id : attempt_1489512856623_0004_m_000000_0, Status : FAILED
Error: java.io.IOException: Initialization of all the collectors failed. Error in last collector was :interface javax.xml.soap.Text
	at org.apache.hadoop.mapred.MapTask.createSortingCollector(MapTask.java:414)
	at org.apache.hadoop.mapred.MapTask.access$100(MapTask.java:81)
	at org.apache.hadoop.mapred.MapTask$NewOutputCollector.<init>(MapTask.java:698)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:770)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.lang.ClassCastException: interface javax.xml.soap.Text
	at java.lang.Class.asSubclass(Class.java:3404)
	at org.apache.hadoop.mapred.JobConf.getOutputKeyComparator(JobConf.java:887)
	at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.init(MapTask.java:1004)
	at org.apache.hadoop.mapred.MapTask.createSortingCollector(MapTask.java:402)
	... 9 more
17/03/15 02:55:19 INFO mapreduce.Job: map 0% reduce 0%
17/03/15 02:55:31 INFO mapreduce.Job: Task Id : attempt_1489512856623_0004_m_000000_1, Status : FAILED
Error: java.io.IOException: Initialization of all the collectors failed. Error in last collector was :interface javax.xml.soap.Text
	at org.apache.hadoop.mapred.MapTask.createSortingCollector(MapTask.java:414)
	at org.apache.hadoop.mapred.MapTask.access$100(MapTask.java:81)
	at org.apache.hadoop.mapred.MapTask$NewOutputCollector.<init>(MapTask.java:698)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:770)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
```
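`Caused by: java.lang.ClassCastException: interface javax.xml.soap.Text` inside `JobConf.getOutputKeyComparator` strongly suggests the job's source file imported the wrong `Text`: an IDE auto-import of `javax.xml.soap.Text` instead of `org.apache.hadoop.io.Text`, so the map output key class fails Hadoop's `Class.asSubclass(WritableComparable.class)` check. The mechanism can be reproduced in plain Java (all type names below are stand-ins, not the real Hadoop classes):

```java
public class AsSubclassDemo {
    interface WritableComparable {}                            // stand-in for Hadoop's interface
    static class HadoopText implements WritableComparable {}   // plays org.apache.hadoop.io.Text
    interface SoapText {}                                      // plays javax.xml.soap.Text

    // Mirrors the check inside JobConf.getOutputKeyComparator:
    // asSubclass throws ClassCastException for an unrelated type.
    static boolean isValidKeyClass(Class<?> keyClass) {
        try {
            keyClass.asSubclass(WritableComparable.class);
            return true;
        } catch (ClassCastException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(isValidKeyClass(HadoopText.class)); // true
        System.out.println(isValidKeyClass(SoapText.class));   // false -> the reported error
    }
}
```

Checking the imports at the top of the mapper/reducer source and replacing `javax.xml.soap.Text` with `org.apache.hadoop.io.Text` should let the collector initialize.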
My very first Hadoop program already runs into problems; could someone take a look?
If the program is packaged as a jar, it runs fine from the command line, but in IDEA it fails with this error:

```
17/03/11 15:21:37 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" java.lang.VerifyError: Bad type on operand stack
Exception Details:
  Location:
    org/apache/hadoop/mapred/JobTrackerInstrumentation.create(Lorg/apache/hadoop/mapred/JobTracker;Lorg/apache/hadoop/mapred/JobConf;)Lorg/apache/hadoop/mapred/JobTrackerInstrumentation; @5: invokestatic
  Reason:
    Type 'org/apache/hadoop/metrics2/lib/DefaultMetricsSystem' (current frame, stack[2]) is not assignable to 'org/apache/hadoop/metrics2/MetricsSystem'
  Current Frame:
    bci: @5
    flags: { }
    locals: { 'org/apache/hadoop/mapred/JobTracker', 'org/apache/hadoop/mapred/JobConf' }
    stack: { 'org/apache/hadoop/mapred/JobTracker', 'org/apache/hadoop/mapred/JobConf', 'org/apache/hadoop/metrics2/lib/DefaultMetricsSystem' }
  Bytecode:
    0x0000000: 2a2b b200 03b8 0004 b0

	at org.apache.hadoop.mapred.LocalJobRunner.<init>(LocalJobRunner.java:573)
	at org.apache.hadoop.mapred.JobClient.init(JobClient.java:494)
	at org.apache.hadoop.mapred.JobClient.<init>(JobClient.java:479)
	at org.apache.hadoop.mapreduce.Job$1.run(Job.java:563)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
	at org.apache.hadoop.mapreduce.Job.connect(Job.java:561)
	at org.apache.hadoop.mapreduce.Job.submit(Job.java:549)
	at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:580)
	at com.hadoop.maxtemperature.MaxTemperature.main(MaxTemperature.java:31)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
```

The pom dependencies:

```xml
<dependencies>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.7.3</version>
  </dependency>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-core</artifactId>
    <version>1.2.1</version>
  </dependency>
</dependencies>
```
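This `VerifyError` (a `DefaultMetricsSystem` where a `MetricsSystem` is expected) is characteristic of two incompatible Hadoop generations on one classpath: `hadoop-common` 2.7.3 and the pre-YARN `hadoop-core` 1.2.1 provide overlapping classes, and under IDEA both get loaded, while the packaged jar presumably ships only one. A hedged sketch of a consistent dependency set, assuming a 2.7.3 cluster: drop `hadoop-core` and use matching 2.x artifacts.

```xml
<dependencies>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.7.3</version>
  </dependency>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>2.7.3</version>
  </dependency>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-core</artifactId>
    <version>2.7.3</version>
  </dependency>
</dependencies>
```

After changing the pom, re-import the Maven project so IDEA rebuilds the module classpath.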
Using xxl-job and Hadoop together
1. How can xxl-job be used together with Hadoop's distributed framework? Has anyone done this, and are there any good resources on it? Looking forward to your replies.
Error building the hadoop eclipse plugin on Win10
Building the hadoop eclipse plugin on Win10 fails; could anyone help take a look?
```
D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin>ant jar -Dversion=2.8.3 -Declipse.home=C:\Users\Daybr\eclipse\java-neon\eclipse -Dhadoop.home=D:\hadoop-2.8.3\hadoop-2.8.3
Buildfile: D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\build.xml

check-contrib:

init:
     [echo] contrib: eclipse-plugin

init-contrib:

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
[ivy:configure] :: Ivy 2.1.0 - 20090925235825 :: http://ant.apache.org/ivy/ ::
[ivy:configure] :: loading settings :: file = D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\ivy\ivysettings.xml

ivy-resolve-common:

ivy-retrieve-common:
[ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
[ivy:cachepath] :: loading settings :: file = D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\ivy\ivysettings.xml

compile:
     [echo] contrib: eclipse-plugin
    [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\build.xml:76: warning: 'includeantruntime' was not set, defaulting to build.sysclasspath=last; set to false for repeatable builds
    [javac] Compiling 45 source files to D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\build\contrib\eclipse-plugin\classes
    [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\Activator.java:22: 错误: 程序包org.eclipse.ui.plugin不存在
    [javac] import org.eclipse.ui.plugin.AbstractUIPlugin;
    [javac] ^
    [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\Activator.java:28: 错误: 找不到符号
    [javac] public class Activator extends AbstractUIPlugin {
    [javac] ^
    [javac] 符号: 类 AbstractUIPlugin
[javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\ErrorMessageDialog.java:22: 错误: 程序包org.eclipse.swt.widgets不存在 [javac] import org.eclipse.swt.widgets.Display; [javac] ^ [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\HadoopPerspectiveFactory.java:21: 错误: 程序包org.eclipse.debug.ui不存在 [javac] import org.eclipse.debug.ui.IDebugUIConstants; [javac] ^ [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\HadoopPerspectiveFactory.java:22: 错误: 程序包org.eclipse.jdt.ui不存在 [javac] import org.eclipse.jdt.ui.JavaUI; [javac] ^ [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\HadoopPerspectiveFactory.java:23: 错误: 程序包org.eclipse.ui不存在 [javac] import org.eclipse.ui.IFolderLayout; [javac] ^ [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\HadoopPerspectiveFactory.java:24: 错误: 程序包org.eclipse.ui不存在 [javac] import org.eclipse.ui.IPageLayout; [javac] ^ [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\HadoopPerspectiveFactory.java:25: 错误: 程序包org.eclipse.ui不存在 [javac] import org.eclipse.ui.IPerspectiveFactory; [javac] ^ [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\HadoopPerspectiveFactory.java:26: 错误: 程序包org.eclipse.ui.console不存在 [javac] import org.eclipse.ui.console.IConsoleConstants; [javac] ^ [javac] 
D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\HadoopPerspectiveFactory.java:34: 错误: 找不到符号 [javac] public class HadoopPerspectiveFactory implements IPerspectiveFactory { [javac] ^ [javac] 符号: 类 IPerspectiveFactory [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\HadoopPerspectiveFactory.java:36: 错误: 找不到符号 [javac] public void createInitialLayout(IPageLayout layout) { [javac] ^ [javac] 符号: 类 IPageLayout [javac] 位置: 类 HadoopPerspectiveFactory [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\ImageLibrary.java:25: 错误: 程序包org.eclipse.core.runtime不存在 [javac] import org.eclipse.core.runtime.FileLocator; [javac] ^ [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\ImageLibrary.java:26: 错误: 程序包org.eclipse.core.runtime不存在 [javac] import org.eclipse.core.runtime.Path; [javac] ^ [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\ImageLibrary.java:28: 错误: 程序包org.eclipse.swt.graphics不存在 [javac] import org.eclipse.swt.graphics.Image; [javac] ^ [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\ImageLibrary.java:29: 错误: 程序包org.eclipse.ui不存在 [javac] import org.eclipse.ui.ISharedImages; [javac] ^ [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\ImageLibrary.java:30: 错误: 程序包org.eclipse.ui不存在 [javac] import org.eclipse.ui.PlatformUI; [javac] ^ [javac] 
D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\ImageLibrary.java:31: 错误: 程序包org.eclipse.ui.plugin不存在 [javac] import org.eclipse.ui.plugin.AbstractUIPlugin; [javac] ^ [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\ImageLibrary.java:46: 错误: 找不到符号 [javac] private ISharedImages sharedImages = [javac] ^ [javac] 符号: 类 ISharedImages [javac] 位置: 类 ImageLibrary [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\ImageLibrary.java:70: 错误: 找不到符号 [javac] public static Image getImage(String name) { [javac] ^ [javac] 符号: 类 Image [javac] 位置: 类 ImageLibrary [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\ImageLibrary.java:95: 错误: 找不到符号 [javac] private Map<String, Image> imageMap = new HashMap<String, Image>(); [javac] ^ [javac] 符号: 类 Image [javac] 位置: 类 ImageLibrary [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\ImageLibrary.java:154: 错误: 找不到符号 [javac] private Image getImageByName(String name) { [javac] ^ [javac] 符号: 类 Image [javac] 位置: 类 ImageLibrary [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\MapReduceNature.java:29: 错误: 程序包org.eclipse.core.resources不存在 [javac] import org.eclipse.core.resources.IProject; [javac] ^ [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\MapReduceNature.java:30: 错误: 程序包org.eclipse.core.resources不存在 [javac] import org.eclipse.core.resources.IProjectNature; [javac] ^ 
[javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\MapReduceNature.java:31: 错误: 程序包org.eclipse.core.runtime不存在 [javac] import org.eclipse.core.runtime.CoreException; [javac] ^ [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\MapReduceNature.java:32: 错误: 程序包org.eclipse.core.runtime不存在 [javac] import org.eclipse.core.runtime.NullProgressMonitor; [javac] ^ [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\MapReduceNature.java:33: 错误: 程序包org.eclipse.core.runtime不存在 [javac] import org.eclipse.core.runtime.Path; [javac] ^ [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\MapReduceNature.java:34: 错误: 程序包org.eclipse.core.runtime不存在 [javac] import org.eclipse.core.runtime.QualifiedName; [javac] ^ [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\MapReduceNature.java:35: 错误: 程序包org.eclipse.jdt.core不存在 [javac] import org.eclipse.jdt.core.IClasspathEntry; [javac] ^ [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\MapReduceNature.java:36: 错误: 程序包org.eclipse.jdt.core不存在 [javac] import org.eclipse.jdt.core.IJavaProject; [javac] ^ [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\MapReduceNature.java:37: 错误: 程序包org.eclipse.jdt.core不存在 [javac] import org.eclipse.jdt.core.JavaCore; [javac] ^ [javac] 
D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\MapReduceNature.java:44: 错误: 找不到符号 [javac] public class MapReduceNature implements IProjectNature { [javac] ^ [javac] 符号: 类 IProjectNature [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\MapReduceNature.java:48: 错误: 找不到符号 [javac] private IProject project; [javac] ^ [javac] 符号: 类 IProject [javac] 位置: 类 MapReduceNature [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\MapReduceNature.java:56: 错误: 找不到符号 [javac] public void configure() throws CoreException { [javac] ^ [javac] 符号: 类 CoreException [javac] 位置: 类 MapReduceNature [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\MapReduceNature.java:130: 错误: 找不到符号 [javac] public void deconfigure() throws CoreException { [javac] ^ [javac] 符号: 类 CoreException [javac] 位置: 类 MapReduceNature [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\MapReduceNature.java:137: 错误: 找不到符号 [javac] public IProject getProject() { [javac] ^ [javac] 符号: 类 IProject [javac] 位置: 类 MapReduceNature [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\MapReduceNature.java:145: 错误: 找不到符号 [javac] public void setProject(IProject project) { [javac] ^ [javac] 符号: 类 IProject [javac] 位置: 类 MapReduceNature [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\NewDriverWizard.java:21: 错误: 程序包org.eclipse.core.resources不存在 [javac] import 
org.eclipse.core.resources.IFile; [javac] ^ [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\NewDriverWizard.java:22: 错误: 程序包org.eclipse.core.runtime不存在 [javac] import org.eclipse.core.runtime.CoreException; [javac] ^ [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\NewDriverWizard.java:23: 错误: 程序包org.eclipse.core.runtime不存在 [javac] import org.eclipse.core.runtime.IProgressMonitor; [javac] ^ [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\NewDriverWizard.java:24: 错误: 程序包org.eclipse.jdt.core不存在 [javac] import org.eclipse.jdt.core.IJavaElement; [javac] ^ [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\NewDriverWizard.java:25: 错误: 程序包org.eclipse.jdt.internal.ui.wizards不存在 [javac] import org.eclipse.jdt.internal.ui.wizards.NewElementWizard; [javac] ^ [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\NewDriverWizard.java:28: 错误: 程序包org.eclipse.ui不存在 [javac] import org.eclipse.ui.INewWizard; [javac] ^ [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\NewDriverWizard.java:29: 错误: 程序包org.eclipse.ui不存在 [javac] import org.eclipse.ui.IWorkbench; [javac] ^ [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\NewDriverWizard.java:36: 错误: 找不到符号 [javac] public class NewDriverWizard extends NewElementWizard implements INewWizard, [javac] ^ [javac] 符号: 类 NewElementWizard [javac] 
D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\NewDriverWizard.java:36: 错误: 找不到符号 [javac] public class NewDriverWizard extends NewElementWizard implements INewWizard, [javac] ^ [javac] 符号: 类 INewWizard [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\NewDriverWizardPage.java:23: 错误: 程序包org.eclipse.core.runtime不存在 [javac] import org.eclipse.core.runtime.CoreException; [javac] ^ [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\NewDriverWizardPage.java:24: 错误: 程序包org.eclipse.core.runtime不存在 [javac] import org.eclipse.core.runtime.FileLocator; [javac] ^ [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\NewDriverWizardPage.java:25: 错误: 程序包org.eclipse.core.runtime不存在 [javac] import org.eclipse.core.runtime.IProgressMonitor; [javac] ^ [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\NewDriverWizardPage.java:26: 错误: 程序包org.eclipse.core.runtime不存在 [javac] import org.eclipse.core.runtime.IStatus; [javac] ^ [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\NewDriverWizardPage.java:27: 错误: 程序包org.eclipse.core.runtime不存在 [javac] import org.eclipse.core.runtime.Path; [javac] ^ [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\NewDriverWizardPage.java:28: 错误: 程序包org.eclipse.jdt.core不存在 [javac] import org.eclipse.jdt.core.IType; [javac] ^ [javac] 
D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\NewDriverWizardPage.java:29: 错误: 程序包org.eclipse.jdt.core不存在 [javac] import org.eclipse.jdt.core.JavaModelException; [javac] ^ [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\NewDriverWizardPage.java:30: 错误: 程序包org.eclipse.jdt.core.search不存在 [javac] import org.eclipse.jdt.core.search.SearchEngine; [javac] ^ [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\NewDriverWizardPage.java:31: 错误: 程序包org.eclipse.jdt.ui不存在 [javac] import org.eclipse.jdt.ui.IJavaElementSearchConstants; [javac] ^ [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\NewDriverWizardPage.java:32: 错误: 程序包org.eclipse.jdt.ui不存在 [javac] import org.eclipse.jdt.ui.JavaUI; [javac] ^ [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\NewDriverWizardPage.java:33: 错误: 程序包org.eclipse.jdt.ui.wizards不存在 [javac] import org.eclipse.jdt.ui.wizards.NewTypeWizardPage; [javac] ^ [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\NewDriverWizardPage.java:38: 错误: 程序包org.eclipse.swt不存在 [javac] import org.eclipse.swt.SWT; [javac] ^ [javac] D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse\NewDriverWizardPage.java:39: 错误: 程序包org.eclipse.swt.layout不存在 [javac] import org.eclipse.swt.layout.GridData; [javac] ^ [javac] 
All files below are under src\contrib\eclipse-plugin\src\java\org\apache\hadoop\eclipse:

NewDriverWizardPage.java:40: error: package org.eclipse.swt.layout does not exist (import org.eclipse.swt.layout.GridLayout)
NewDriverWizardPage.java:41: error: package org.eclipse.swt.widgets does not exist (import org.eclipse.swt.widgets.Button)
NewDriverWizardPage.java:42: error: package org.eclipse.swt.widgets does not exist (import org.eclipse.swt.widgets.Composite)
NewDriverWizardPage.java:43: error: package org.eclipse.swt.widgets does not exist (import org.eclipse.swt.widgets.Event)
NewDriverWizardPage.java:44: error: package org.eclipse.swt.widgets does not exist (import org.eclipse.swt.widgets.Label)
NewDriverWizardPage.java:45: error: package org.eclipse.swt.widgets does not exist (import org.eclipse.swt.widgets.Listener)
NewDriverWizardPage.java:46: error: package org.eclipse.swt.widgets does not exist (import org.eclipse.swt.widgets.Text)
NewDriverWizardPage.java:47: error: package org.eclipse.ui.dialogs does not exist (import org.eclipse.ui.dialogs.SelectionDialog)
NewDriverWizardPage.java:54: error: cannot find symbol: class NewTypeWizardPage
NewDriverWizard.java:43: error: cannot find symbol: class IProgressMonitor
NewDriverWizard.java:60: error: cannot find symbol: class IWorkbench
NewDriverWizard.java:90: error: cannot find symbol: class IProgressMonitor
NewDriverWizard.java:91: error: cannot find symbol: class CoreException
NewDriverWizard.java:96: error: cannot find symbol: class IJavaElement
NewDriverWizardPage.java:55: error: cannot find symbol: class Button
NewDriverWizardPage.java:57: error: cannot find symbol: class Text
NewDriverWizardPage.java:59: error: cannot find symbol: class Text
NewDriverWizardPage.java:85: error: cannot find symbol: class IProgressMonitor
NewDriverWizardPage.java:85: error: cannot find symbol: class CoreException
NewDriverWizardPage.java:91: error: cannot find symbol: class IType
NewDriverWizardPage.java:91: error: cannot find symbol: class ImportsManager
NewDriverWizardPage.java:92: error: cannot find symbol: class IProgressMonitor
NewDriverWizardPage.java:92: error: cannot find symbol: class CoreException
NewDriverWizardPage.java:145: error: cannot find symbol: class Composite
NewDriverWizardPage.java:199: error: cannot find symbol: class Composite
NewDriverWizardPage.java:204: error: cannot find symbol: class Composite
NewDriverWizardPage.java:209: error: cannot find symbol: class Composite
NewDriverWizardPage.java:209: error: cannot find symbol: class Text
NewMapReduceProjectWizard.java:29: error: package org.eclipse.core.resources does not exist (import org.eclipse.core.resources.IProject)
NewMapReduceProjectWizard.java:30: error: package org.eclipse.core.resources does not exist (import org.eclipse.core.resources.IProjectDescription)
NewMapReduceProjectWizard.java:31: error: package org.eclipse.core.resources does not exist (import org.eclipse.core.resources.ResourcesPlugin)
NewMapReduceProjectWizard.java:32: error: package org.eclipse.core.runtime does not exist (import org.eclipse.core.runtime.CoreException)
NewMapReduceProjectWizard.java:33: error: package org.eclipse.core.runtime does not exist (import org.eclipse.core.runtime.IConfigurationElement)
NewMapReduceProjectWizard.java:34: error: package org.eclipse.core.runtime does not exist (import org.eclipse.core.runtime.IExecutableExtension)
NewMapReduceProjectWizard.java:36: error: package org.eclipse.core.runtime does not exist (import org.eclipse.core.runtime.NullProgressMonitor)
NewMapReduceProjectWizard.java:37: error: package org.eclipse.core.runtime does not exist (import org.eclipse.core.runtime.Path)
NewMapReduceProjectWizard.java:38: error: package org.eclipse.core.runtime does not exist (import org.eclipse.core.runtime.QualifiedName)
NewMapReduceProjectWizard.java:39: error: package org.eclipse.core.runtime does not exist (import org.eclipse.core.runtime.SubProgressMonitor)
NewMapReduceProjectWizard.java:40: error: package org.eclipse.jdt.ui.wizards does not exist (import org.eclipse.jdt.ui.wizards.NewJavaProjectWizardPage)
NewMapReduceProjectWizard.java:49: error: package org.eclipse.swt does not exist (import org.eclipse.swt.SWT)
NewMapReduceProjectWizard.java:50: error: package org.eclipse.swt.events does not exist (import org.eclipse.swt.events.SelectionEvent)
NewMapReduceProjectWizard.java:51: error: package org.eclipse.swt.events does not exist (import org.eclipse.swt.events.SelectionListener)

Note: dfs\DFSFolder.java uses or overrides a deprecated API; recompile with -Xlint:deprecation for details.
Note: actions\DFSActionImpl.java uses unchecked or unsafe operations; recompile with -Xlint:unchecked for details.
Note: some messages have been simplified; recompile with -Xdiags:verbose to get full output.

100 errors

BUILD FAILED
D:\hadoop2x-eclipse-plugin-master-master\hadoop2x-eclipse-plugin-master-master\src\contrib\eclipse-plugin\build.xml:76: Compile failed; see the compiler error output for details.

Total time: 4 seconds
```
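Every failing import above is an Eclipse platform class (`org.eclipse.swt.*`, `org.eclipse.core.*`, `org.eclipse.jdt.*`), which points at the plugin build not finding the Eclipse jars rather than at the source itself. The plugin's `build.xml` resolves those jars through the `eclipse.home` property, so the usual fix is to pass your Eclipse and Hadoop install locations explicitly. A minimal sketch, assuming hypothetical install paths (substitute your own):

```shell
#!/bin/sh
# Sketch: build the hadoop2x-eclipse-plugin with explicit property overrides.
# ECLIPSE_HOME and HADOOP_HOME below are placeholder assumptions.
ECLIPSE_HOME=${ECLIPSE_HOME:-/d/eclipse}
HADOOP_HOME=${HADOOP_HOME:-/d/hadoop-2.5.0}
# eclipse.home lets build.xml put $ECLIPSE_HOME/plugins (SWT, JDT, core.runtime
# jars) on the javac classpath; version should match your Hadoop release.
CMD="ant jar -Dversion=2.5.0 -Declipse.home=$ECLIPSE_HOME -Dhadoop.home=$HADOOP_HOME"
echo "$CMD"    # run this from src\contrib\eclipse-plugin
# eval "$CMD"  # uncomment to actually run the build
```

If the same errors persist, check that `$ECLIPSE_HOME/plugins` really contains `org.eclipse.swt_*.jar` and `org.eclipse.jdt.ui_*.jar`; a JDK-only Eclipse unpack without those bundles cannot compile the plugin.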
A question about installing Hadoop
After running `hadoop namenode -format` and then `start-dfs.sh`, the NameNode log reports this error:

```
2011-08-11 18:27:20,179 ERROR org.apache.hadoop.hdfs.server.namenode.FSNamesystem: FSNamesystem initialization failed.
java.io.IOException: NameNode is not formatted.
	at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverTransitionRead(FSImage.java:317)
	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.loadFSImage(FSDirectory.java:87)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.initialize(FSNamesystem.java:311)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:292)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:201)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:279)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:956)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:965)
```

core-site.xml:

```
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/opt/hadoop/hadooptmp</value>
  </property>
</configuration>
```

hdfs-site.xml:

```
<configuration>
  <property>
    <name>dfs.name.dir</name>
    <value>/opt/hadoop/hdfs/name</value>
  </property>
  <property>
    <name>dfs.data.dir</name>
    <value>/opt/hadoop/hdfs/data</value>
  </property>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
```

mapred-site.xml:

```
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:8021</value>
  </property>
  <property>
    <name>mapred.local.dir</name>
    <value>/opt/hadoop/mapred/local</value>
  </property>
  <property>
    <name>mapred.system.dir</name>
    <value>/opt/hadoop/mapred/system</value>
  </property>
</configuration>
```

The log says the NameNode was never formatted, but I definitely ran the format command. The name directory is not created automatically, so I created it by hand and ran `chmod -R 777 hadoop`; read/write permissions should therefore not be the problem. If I do not create the directory by hand, I get a different error instead:

```
2011-08-11 18:37:09,401 ERROR org.apache.hadoop.hdfs.server.namenode.FSNamesystem: FSNamesystem initialization failed.
org.apache.hadoop.hdfs.server.common.InconsistentFSStateException: Directory /opt/hadoop/hdfs/name is in an inconsistent state: storage directory does not exist or is not accessible.
	at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverTransitionRead(FSImage.java:290)
	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.loadFSImage(FSDirectory.java:87)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.initialize(FSNamesystem.java:311)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:292)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:201)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:279)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:956)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:965)
```

Could anyone help me solve this? I suspect it is a Hadoop permissions problem. Many thanks in advance.
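This pair of errors ("NameNode is not formatted" vs. "storage directory does not exist or is not accessible") usually means `hadoop namenode -format` ran against a `dfs.name.dir` that did not exist or was not writable at format time, or ran as a different unix user than the one launching `start-dfs.sh`. A minimal sketch of preparing the directory before reformatting, assuming the path from the question's hdfs-site.xml and that everything runs as the same user:

```shell
#!/bin/sh
# Sketch: ensure dfs.name.dir exists and is writable by the HDFS user
# *before* formatting. NAME_DIR defaults to the path from the question.
NAME_DIR=${NAME_DIR:-/opt/hadoop/hdfs/name}
mkdir -p "$NAME_DIR" && chmod 755 "$NAME_DIR" \
  || echo "cannot create $NAME_DIR -- run as a user that can write it"
if [ -d "$NAME_DIR" ]; then ls -ld "$NAME_DIR"; fi
# Then, as that same user:
#   hadoop namenode -format    # answer Y if prompted to re-format
#   start-dfs.sh
```

The order matters: formatting writes a `current/VERSION` file into the directory, so creating or chmod-ing the directory *after* formatting (or formatting as root and starting as another user) leaves a directory the NameNode considers unformatted.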
Problems encountered while setting up a Hadoop cluster
I have been learning Hadoop recently and set up a cluster following a tutorial, but whenever I start it the DataNode process never appears. I have searched and tried many things with no success; everything else runs, only the DataNode process is missing. Any pointers would be appreciated. The log reports:

```
org.apache.hadoop.util.DiskChecker$DiskErrorException: Directory is not readable: /opt/modules/hadoop-2.5.0/data/tmp/dfs/data
	at org.apache.hadoop.util.DiskChecker.checkAccessByFileMethods(DiskChecker.java:174)
	at org.apache.hadoop.util.DiskChecker.checkDirAccess(DiskChecker.java:160)
	at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:143)
	at org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.checkDir(DataNode.java:1866)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:1908)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1890)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1782)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1829)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2005)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2029)
2018-04-03 16:09:50,281 FATAL org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in secureMain
java.io.IOException: All directories in dfs.datanode.data.dir are invalid: "/opt/modules/hadoop-2.5.0/data/tmp/dfs/data/"
	at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:1917)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1890)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1782)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1829)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2005)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2029)
```
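"Directory is not readable" from DiskChecker means the unix user running the DataNode cannot read `dfs.datanode.data.dir`, so the DataNode rejects its only storage directory and exits, which is why the process never shows up in `jps`. A sketch of the usual fix, assuming the path from the log and a hypothetical `hadoop` user (use whichever user actually runs `start-dfs.sh`):

```shell
#!/bin/sh
# Sketch: give the DataNode's unix user ownership of the data directory and
# restrictive owner-only permissions. DATA_DIR comes from the question's log.
DATA_DIR=${DATA_DIR:-/opt/modules/hadoop-2.5.0/data/tmp/dfs/data}
mkdir -p "$DATA_DIR" || echo "cannot create $DATA_DIR -- check parent dir permissions"
# chown -R hadoop:hadoop "$DATA_DIR"   # hypothetical user; match the one running the datanode
if [ -d "$DATA_DIR" ]; then
  chmod -R 700 "$DATA_DIR"   # the DataNode enforces dfs.datanode.data.dir.perm (700 by default in 2.x)
  ls -ld "$DATA_DIR"
fi
```

If the directory was previously populated by a different NameNode format, also check that the cluster IDs in the DataNode's `current/VERSION` and the NameNode's match; a mismatch after re-formatting is the other common cause of a silently missing DataNode.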