centos7 svn: E000013 Can't open file '/data/svn/project/format': Permission denied

I installed CentOS 7 in a virtual machine and then installed SVN. After the installation, connecting to the repository fails with the error in the title (screenshots not included).

I have tried the fixes I found online and none of them work. Any help would be appreciated.

1 answer

Problem solved: SELinux needed to be configured.
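The answer doesn't show the actual commands. For reference, a minimal sketch of one common way to do this, assuming svnserve is serving a repository under /data/svn on a stock CentOS 7 targeted policy (the svnserve_content_t type and the semanage tool from policycoreutils-python are assumptions about that setup):

```
# E000013 on a repository outside the default labeled locations is a
# classic SELinux symptom; check the current mode first.
getenforce

# Switch to permissive mode temporarily to confirm SELinux is the cause.
sudo setenforce 0

# If the client can now connect, relabel the repository permanently
# instead of leaving SELinux off, then re-enable enforcing mode.
sudo semanage fcontext -a -t svnserve_content_t "/data/svn(/.*)?"
sudo restorecon -Rv /data/svn
sudo setenforce 1
```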

Other related questions
Setting up Phabricator: svn reports svn: E175002

While setting up Phabricator, viewing file information in the web UI fails with: svn: E175002: OPTIONS of 'https://zhuzhusvn.com/svn/server/branches/branch2/tinyNine/app/zz/model/ModZzData.php': Server certificate verification failed: certificate issued for a different hostname, issuer is not trusted (https://xxxxx). The same command also fails the first time on the CentOS server, but after running svn ls https://zhuzhusvn.com/svn/server and saving the username and password it succeeds. The server runs under Apache; do I need to save the credentials under the apache user in the same way? ![screenshot](https://img-ask.csdn.net/upload/201512/07/1449449327_914909.png)
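For reference, the certificate acceptance is cached per user in ~/.subversion/auth, so the account that actually invokes svn has to accept it once. A sketch, assuming the web server runs as the apache user (Phabricator daemons may run as a different account, and the apache account may need HOME pointed at a writable directory first):

```
# Accept the certificate once as the apache user so it is cached in
# that user's ~/.subversion/auth.
sudo -u apache svn ls https://zhuzhusvn.com/svn/server

# Non-interactive alternative for daemon/scripted use (svn 1.6+):
sudo -u apache svn ls https://zhuzhusvn.com/svn/server \
    --non-interactive --trust-server-cert
```

The cleaner long-term fix is to serve a certificate whose hostname actually matches zhuzhusvn.com, so no per-user trust cache is needed.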

Installing svn on CentOS: checkout fails with Authentication failed

After installing svn on CentOS, checkout fails with: svn: E215004: Authentication failed. The configuration files are as follows: ![screenshot](https://img-ask.csdn.net/upload/201801/04/1515059857_793558.png) ![screenshot](https://img-ask.csdn.net/upload/201801/04/1515059868_205870.png) ![screenshot](https://img-ask.csdn.net/upload/201801/04/1515059891_167337.png) The error is: ![screenshot](https://img-ask.csdn.net/upload/201801/04/1515059935_409151.png)
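Since the configuration screenshots are unavailable, here is a minimal sketch of the svnserve settings that E215004 usually hinges on (the realm, user name, and password are placeholders):

```
# conf/svnserve.conf
[general]
anon-access = none
auth-access = write
password-db = passwd    # must be uncommented
realm = MyRepo

# conf/passwd
[users]
alice = alicepassword   # the checkout must send exactly this pair
```

Two classic causes of E215004 are a still-commented password-db line and forgetting to restart svnserve after editing the files.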

vsftpd 500 OOPS: cannot change directory:/home/***

I'm running Fedora 21. In the vsftpd configuration I want all local users to be chrooted by default, with the accounts exempt from chroot listed in chroot_list. vsftpd.conf contains:

chroot_local_user=YES
chroot_list_enable=YES
chroot_list_file=/etc/vsftpd/chroot_list

The exempt-user list (vim /etc/vsftpd/chroot_list) contains: javatest

But after restarting, login fails with: 500 OOPS: cannot change directory:/home/javatest. An online search suggested the SELinux fix setsebool -P ftp_home_dir 1, but the same error persists. Please help.
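For reference, a quick way to narrow this down before trying more booleans, assuming auditd is running (the Fedora default):

```
# The directory itself is the first suspect: it must exist and be
# traversable by the logging-in user.
ls -ld /home/javatest

# If SELinux is still the blocker, a denial will show up here:
sudo ausearch -m avc -ts recent

# One-off test: if the login works in permissive mode, it is a
# labeling/boolean problem rather than a vsftpd misconfiguration.
sudo setenforce 0
```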

500 OOPS: cannot change directory:/root/ftp/home

I've been searching online for a long time and have already disabled SELinux, but it still doesn't work!! ![screenshot](https://img-ask.csdn.net/upload/201506/30/1435656454_921023.png) ![screenshot](https://img-ask.csdn.net/upload/201506/30/1435656578_38306.png)
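With SELinux already disabled, the usual remaining causes are path permissions and FTP's root-login blacklist. A sketch of what to check, assuming stock vsftpd file locations:

```
# root is refused FTP logins by default on most installs:
grep root /etc/vsftpd/ftpusers /etc/vsftpd/user_list

# Every path component must be traversable by the logging-in user;
# /root is typically mode 550 or 700, so a non-root FTP user cannot
# chdir into /root/ftp/home however open the leaf directory is.
ls -ld / /root /root/ftp /root/ftp/home
```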

Failed to retrieve data from /webhdfs/v1/?op=LISTSTATUS: Server Error, and unable to put files to HDFS

The Hadoop version is 3.1, Ubuntu 18. Problem 1: browsing the HDFS directory shows: Failed to retrieve data from /webhdfs/v1/?op=LISTSTATUS: Server Error. Problem 2: the namenode log is as follows:
```
438 WARN org.eclipse.jetty.servlet.ServletHandler: Error for /webhdfs/v1/ java.lang.NoClassDefFoundError: javax/activation/DataSource at com.sun.xml.bind.v2.model.impl.RuntimeBuiltinLeafInfoImpl.<clinit>(RuntimeBuiltinLeafInfoImpl.java:457) at com.sun.xml.bind.v2.model.impl.RuntimeTypeInfoSetImpl.<init>(RuntimeTypeInfoSetImpl.java:65) at com.sun.xml.bind.v2.model.impl.RuntimeModelBuilder.createTypeInfoSet(RuntimeModelBuilder.java:133) at com.sun.xml.bind.v2.model.impl.RuntimeModelBuilder.createTypeInfoSet(RuntimeModelBuilder.java:85) at com.sun.xml.bind.v2.model.impl.ModelBuilder.<init>(ModelBuilder.java:156) at com.sun.xml.bind.v2.model.impl.RuntimeModelBuilder.<init>(RuntimeModelBuilder.java:93) at com.sun.xml.bind.v2.runtime.JAXBContextImpl.getTypeInfoSet(JAXBContextImpl.java:473) at com.sun.xml.bind.v2.runtime.JAXBContextImpl.<init>(JAXBContextImpl.java:319) at com.sun.xml.bind.v2.runtime.JAXBContextImpl$JAXBContextBuilder.build(JAXBContextImpl.java:1170) at com.sun.xml.bind.v2.ContextFactory.createContext(ContextFactory.java:145) at com.sun.xml.bind.v2.ContextFactory.createContext(ContextFactory.java:236) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at javax.xml.bind.ContextFinder.newInstance(ContextFinder.java:186) at javax.xml.bind.ContextFinder.newInstance(ContextFinder.java:146) at javax.xml.bind.ContextFinder.find(ContextFinder.java:350) at javax.xml.bind.JAXBContext.newInstance(JAXBContext.java:446) at javax.xml.bind.JAXBContext.newInstance(JAXBContext.java:409) at com.sun.jersey.server.impl.wadl.WadlApplicationContextImpl.<init>(WadlApplicationContextImpl.java:103) at com.sun.jersey.server.impl.wadl.WadlFactory.init(WadlFactory.java:100) at com.sun.jersey.server.impl.application.RootResourceUriRules.initWadl(RootResourceUriRules.java:169) at com.sun.jersey.server.impl.application.RootResourceUriRules.<init>(RootResourceUriRules.java:106) at com.sun.jersey.server.impl.application.WebApplicationImpl._initiate(WebApplicationImpl.java:1359) at com.sun.jersey.server.impl.application.WebApplicationImpl.access$700(WebApplicationImpl.java:180) at com.sun.jersey.server.impl.application.WebApplicationImpl$13.f(WebApplicationImpl.java:799) at com.sun.jersey.server.impl.application.WebApplicationImpl$13.f(WebApplicationImpl.java:795) at com.sun.jersey.spi.inject.Errors.processWithErrors(Errors.java:193) at com.sun.jersey.server.impl.application.WebApplicationImpl.initiate(WebApplicationImpl.java:795) at com.sun.jersey.server.impl.application.WebApplicationImpl.initiate(WebApplicationImpl.java:790) at com.sun.jersey.spi.container.servlet.ServletContainer.initiate(ServletContainer.java:509) at com.sun.jersey.spi.container.servlet.ServletContainer$InternalWebComponent.initiate(ServletContainer.java:339) at com.sun.jersey.spi.container.servlet.WebComponent.load(WebComponent.java:605) at com.sun.jersey.spi.container.servlet.WebComponent.init(WebComponent.java:207) at com.sun.jersey.spi.container.servlet.ServletContainer.init(ServletContainer.java:394) at com.sun.jersey.spi.container.servlet.ServletContainer.init(ServletContainer.java:577) at
javax.servlet.GenericServlet.init(GenericServlet.java:244) at org.eclipse.jetty.servlet.ServletHolder.initServlet(ServletHolder.java:643) at org.eclipse.jetty.servlet.ServletHolder.getServlet(ServletHolder.java:499) at org.eclipse.jetty.servlet.ServletHolder.ensureInstance(ServletHolder.java:791) at org.eclipse.jetty.servlet.ServletHolder.prepare(ServletHolder.java:776) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:579) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:226) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1180) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:512) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1112) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:119) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134) at org.eclipse.jetty.server.Server.handle(Server.java:539) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:333) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:251) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:283) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:108) at org.eclipse.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:93) at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.executeProduceConsume(ExecuteProduceConsume.java:303) at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.produceConsume(ExecuteProduceConsume.java:148) at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:136) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:671) at org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:589) at java.base/java.lang.Thread.run(Thread.java:834) Caused by: java.lang.ClassNotFoundException: javax.activation.DataSource at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:583) at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178) at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521) ... 
65 more 2019-06-18 15:35:01,950 WARN org.eclipse.jetty.servlet.ServletHandler: /webhdfs/v1/ java.lang.NullPointerException at com.sun.jersey.spi.container.ContainerRequest.<init>(ContainerRequest.java:189) at com.sun.jersey.spi.container.servlet.WebComponent.createRequest(WebComponent.java:446) at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:373) at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558) at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:848) at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1772) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:644) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.apache.hadoop.hdfs.web.AuthFilter.doFilter(AuthFilter.java:90) at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1609) at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:582) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:226) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1180) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:512) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1112) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:119) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134) at org.eclipse.jetty.server.Server.handle(Server.java:539) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:333) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:251) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:283) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:108) at org.eclipse.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:93) at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.executeProduceConsume(ExecuteProduceConsume.java:303) at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.produceConsume(ExecuteProduceConsume.java:148) at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:136) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:671) at org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:589) at java.base/java.lang.Thread.run(Thread.java:834) 2019-06-18 15:39:17,698 INFO org.apache.hadoop.hdfs.server.namenode.FSEditLog: Number of transactions: 3 Total time for 
transactions(ms): 56 Number of transactions batched in Syncs: 0 Number of syncs: 2 SyncTimes(ms): 22 2019-06-18 15:39:25,202 WARN org.eclipse.jetty.servlet.ServletHandler: /webhdfs/v1/ java.lang.NullPointerException at com.sun.jersey.spi.container.ContainerRequest.<init>(ContainerRequest.java:189) at com.sun.jersey.spi.container.servlet.WebComponent.createRequest(WebComponent.java:446) at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:373) at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558) at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:848) at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1772) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:644) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.apache.hadoop.hdfs.web.AuthFilter.doFilter(AuthFilter.java:90) at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1609) at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:582) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:226) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1180) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:512) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1112) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:119) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134) at org.eclipse.jetty.server.Server.handle(Server.java:539) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:333) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:251) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:283) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:108) at org.eclipse.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:93) at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.executeProduceConsume(ExecuteProduceConsume.java:303) at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.produceConsume(ExecuteProduceConsume.java:148) at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:136) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:671) at org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:589) at java.base/java.lang.Thread.run(Thread.java:834) 2019-06-18 15:39:45,858 WARN 
org.eclipse.jetty.servlet.ServletHandler: /webhdfs/v1/ java.lang.NullPointerException at com.sun.jersey.spi.container.ContainerRequest.<init>(ContainerRequest.java:189) at com.sun.jersey.spi.container.servlet.WebComponent.createRequest(WebComponent.java:446) at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:373) at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558) at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:848) at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1772) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:644) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.apache.hadoop.hdfs.web.AuthFilter.doFilter(AuthFilter.java:90) at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1609) at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:582) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:226) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1180) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:512) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1112) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:119) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134) at org.eclipse.jetty.server.Server.handle(Server.java:539) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:333) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:251) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:283) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:108) at org.eclipse.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:93) at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.executeProduceConsume(ExecuteProduceConsume.java:303) at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.produceConsume(ExecuteProduceConsume.java:148) at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:136) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:671) at org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:589) at java.base/java.lang.Thread.run(Thread.java:834)
```
Attached datanode log:
2019-06-18 14:52:36,785 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
/************************************************************ STARTUP_MSG: Starting DataNode STARTUP_MSG: host = gx-virtual-machine/127.0.1.1 STARTUP_MSG: args = [] STARTUP_MSG: version = 3.2.0 STARTUP_MSG: classpath = /usr/local/hadoop/etc/hadoop:/usr/local/hadoop/share/hadoop/common/lib/kerby-xdr-1.0.1.jar:/usr/local/hadoop/share/hadoop/common/lib/kerb-core-1.0.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-util-9.3.24.v20180605.jar:/usr/local/hadoop/share/hadoop/common/lib/avro-1.7.7.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/common/lib/kerb-crypto-1.0.1.jar:/usr/local/hadoop/share/hadoop/common/lib/re2j-1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-databind-2.9.5.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-api-1.7.25.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-annotations-2.9.5.jar:/usr/local/hadoop/share/hadoop/common/lib/audience-annotations-0.5.0.jar:/usr/local/hadoop/share/hadoop/common/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-client-2.12.0.jar:/usr/local/hadoop/share/hadoop/common/lib/json-smart-2.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jcip-annotations-1.0-1.jar:/usr/local/hadoop/share/hadoop/common/lib/kerby-pkix-1.0.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-annotations-3.2.0.jar:/usr/local/hadoop/share/hadoop/common/lib/token-provider-1.0.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/jsr311-api-1.1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-core-1.19.jar:/usr/local/hadoop/share/hadoop/common/lib/htrace-core4-4.1.0-incubating.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-lang3-3.7.jar:/usr/local/hadoop/share/hadoop/common/lib/dnsjava-2.1.7.jar:/usr/local/hadoop/share/hadoop/common/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-collections-3.2.2.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-1.9.3.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/javax.servlet-api-3.1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-server-9.3.24.v20180605.jar:/usr/local/hadoop/share/hadoop/common/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-recipes-2.12.0.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar:/usr/local/hadoop/share/hadoop/common/lib/kerb-client-1.0.1.jar:/usr/local/hadoop/share/hadoop/common/lib/woodstox-core-5.0.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/kerb-admin-1.0.1.jar:/usr/local/hadoop/share/hadoop/common/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-security-9.3.24.v20180605.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-math3-3.1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-net-3.6.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-text-1.4.jar:/usr/local/hadoop/share/hadoop/common/lib/httpclient-4.5.2.jar:/usr/local/hadoop/share/hadoop/common/lib/kerby-config-1.0.1.jar:/usr/local/hadoop/share/hadoop/common/lib/kerb-server-1.0.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-servlet-9.3.24.v20180605.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/co
mmon/lib/kerb-util-1.0.1.jar:/usr/local/hadoop/share/hadoop/common/lib/asm-5.0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/zookeeper-3.4.13.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-codec-1.11.jar:/usr/local/hadoop/share/hadoop/common/lib/kerb-identity-1.0.1.jar:/usr/local/hadoop/share/hadoop/common/lib/accessors-smart-1.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-xml-9.3.24.v20180605.jar:/usr/local/hadoop/share/hadoop/common/lib/jul-to-slf4j-1.7.25.jar:/usr/local/hadoop/share/hadoop/common/lib/kerby-util-1.0.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-http-9.3.24.v20180605.jar:/usr/local/hadoop/share/hadoop/common/lib/metrics-core-3.2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-configuration2-2.1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-io-2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-framework-2.12.0.jar:/usr/local/hadoop/share/hadoop/common/lib/kerb-common-1.0.1.jar:/usr/local/hadoop/share/hadoop/common/lib/netty-3.10.5.Final.jar:/usr/local/hadoop/share/hadoop/common/lib/nimbus-jose-jwt-4.41.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/snappy-java-1.0.5.jar:/usr/local/hadoop/share/hadoop/common/lib/httpcore-4.4.4.jar:/usr/local/hadoop/share/hadoop/common/lib/kerby-asn1-1.0.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-json-1.19.jar:/usr/local/hadoop/share/hadoop/common/lib/jsr305-3.0.0.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-server-1.19.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-core-2.9.5.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-auth-3.2.0.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-api-2.2.11.jar:/usr/local/hadoop/share/hadoop/common/lib/jsch-0.1.54.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-webapp-9.3.24.v20180605.jar:/usr/local/hadoop/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/common/lib/stax2-api-3.1.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-servlet-1.19.jar:/usr/local/hadoop/share/hadoop/common/lib/kerb-simplekdc-1.0.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-io-9.3.24.v20180605.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-3.2.0-tests.jar:/usr/local/hadoop/share/hadoop/common/hadoop-kms-3.2.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-3.2.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-nfs-3.2.0.jar:/usr/local/hadoop/share/hadoop/hdfs:/usr/local/hadoop/share/hadoop/hdfs/lib/kerby-xdr-1.0.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/kerb-core-1.0.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-util-9.3.24.v20180605.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/avro-1.7.7.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/kerb-crypto-1.0.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/re2j-1.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/json-simple-1.1.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-databind-2.9.5.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-annotations-2.9.5.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/audience-annotations-0.5.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/curator-client-2.12.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/json-smart-2.3.jar:/usr/local/hadoo
p/share/hadoop/hdfs/lib/jcip-annotations-1.0-1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/kerby-pkix-1.0.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/hadoop-annotations-3.2.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/token-provider-1.0.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsr311-api-1.1.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-core-1.19.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/htrace-core4-4.1.0-incubating.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-lang3-3.7.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/dnsjava-2.1.7.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-collections-3.2.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-beanutils-1.9.3.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/javax.servlet-api-3.1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-server-9.3.24.v20180605.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/curator-recipes-2.12.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/kerb-client-1.0.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/okhttp-2.7.5.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/woodstox-core-5.0.3.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/kerb-admin-1.0.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-security-9.3.24.v20180605.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-math3-3.1.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-net-3.6.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-text-1.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/httpclient-4.5.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/kerby-config-1.0.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/kerb-server-1.0.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-servlet-9.3.24.v20180605.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/okio-1.6.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-util-ajax-9.3.24.v20180605.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/kerb-util-1.0.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/asm-5.0.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/zookeeper-3.4.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-codec-1.11.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/kerb-identity-1.0.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/accessors-smart-1.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-xml-9.3.24.v20180605.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/netty-all-4.0.52.Final.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/kerby-util-1.0.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-http-9.3.24.v20180605.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-configuration2-2.1.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-io-2.5.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/curator-framework-2.12.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/kerb-common-1.0.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/netty-3.10.5.Final.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/nimbus-jose-jwt-4.41.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/h
dfs/lib/snappy-java-1.0.5.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/httpcore-4.4.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/kerby-asn1-1.0.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-json-1.19.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsr305-3.0.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-server-1.19.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-core-2.9.5.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/hadoop-auth-3.2.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jaxb-api-2.2.11.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsch-0.1.54.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-webapp-9.3.24.v20180605.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/stax2-api-3.1.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-servlet-1.19.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/kerb-simplekdc-1.0.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-io-9.3.24.v20180605.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-3.2.0.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-client-3.2.0.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-nfs-3.2.0.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-native-client-3.2.0.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-native-client-3.2.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-rbf-3.2.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-httpfs-3.2.0.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-client-3.2.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-3.2.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-rbf-3.2.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-common-3.2.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-nativetask-3.2.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-3.2.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-3.2.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-uploader-3.2.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-core-3.2.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-3.2.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-app-3.2.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-3.2.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-3.2.0-tests.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-3.2.0.jar:/usr/local/hadoop/share/hadoop/yarn:/usr/local/hadoop/share/hadoop/yarn/lib/guice-servlet-4.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/json-io-2.5.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-client-1.19.jar:/usr/local/hadoop/share/hadoop/yarn/lib/mssql-jdbc-6.2.1.jre7.jar:/usr/local/hadoop/share/hadoop/yarn/lib/java-util-1.9.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/ehcache-3.3.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/swagger-annotations-1.5.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/HikariCP-java7-2.4.12.jar:/usr/local/hadoop/share/hadoop/yarn/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/fst-2.50.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-guice-1.19.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-4.
0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/snakeyaml-1.16.jar:/usr/local/hadoop/share/hadoop/yarn/lib/metrics-core-3.2.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/objenesis-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-jaxrs-base-2.9.5.jar:/usr/local/hadoop/share/hadoop/yarn/lib/geronimo-jcache_1.0_spec-1.0-alpha-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-module-jaxb-annotations-2.9.5.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-jaxrs-json-provider-2.9.5.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-nodemanager-3.2.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-registry-3.2.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-3.2.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-sharedcachemanager-3.2.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-3.2.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-submarine-3.2.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-3.2.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-client-3.2.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-api-3.2.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-timeline-pluginstorage-3.2.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-services-api-3.2.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-common-3.2.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-services-core-3.2.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-tests-3.2.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-common-3.2.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-router-3.2.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-3.2.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-web-proxy-3.2.0.jar STARTUP_MSG: build = https://github.com/apache/hadoop.git -r e97acb3bd8f3befd27418996fa5d4b50bf2e17bf; compiled by 'sunilg' on 2019-01-08T06:08Z STARTUP_MSG: java = 11.0.3 ************************************************************/ 2019-06-18 14:52:36,863 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT] 2019-06-18 14:52:41,503 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/usr/local/hadoop/tmp/dfs/data 2019-06-18 14:52:42,424 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: Loaded properties from hadoop-metrics2.properties 2019-06-18 14:52:44,123 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s). 2019-06-18 14:52:44,123 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started 2019-06-18 14:52:46,504 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling 2019-06-18 14:52:46,511 INFO org.apache.hadoop.hdfs.server.datanode.BlockScanner: Initialized block scanner with targetBytesPerSec 1048576 2019-06-18 14:52:46,566 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is gx-virtual-machine 2019-06-18 14:52:46,567 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. 
Disabling file IO profiling 2019-06-18 14:52:46,592 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting DataNode with maxLockedMemory = 0 2019-06-18 14:52:46,798 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened streaming server at /0.0.0.0:9866 2019-06-18 14:52:46,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s 2019-06-18 14:52:46,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50 2019-06-18 14:52:47,198 INFO org.eclipse.jetty.util.log: Logging initialized @15269ms 2019-06-18 14:52:48,022 INFO org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. 2019-06-18 14:52:48,062 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.datanode is not defined 2019-06-18 14:52:48,161 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter) 2019-06-18 14:52:48,174 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context datanode 2019-06-18 14:52:48,174 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static 2019-06-18 14:52:48,174 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context logs 2019-06-18 14:52:48,556 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 44121 2019-06-18 14:52:48,580 INFO org.eclipse.jetty.server.Server: jetty-9.3.24.v20180605, build timestamp: 2018-06-06T01:11:56+08:00, git hash: 84205aa28f11a4f31f2a3b86d1bba2cc8ab69827 2019-06-18 14:52:49,011 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@7876d598{/logs,file:///usr/local/hadoop/logs/,AVAILABLE} 2019-06-18 14:52:49,018 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@5af28b27{/static,file:///usr/local/hadoop/share/hadoop/hdfs/webapps/static/,AVAILABLE} 2019-06-18 14:52:50,151 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.w.WebAppContext@547e29a4{/,file:///usr/local/hadoop/share/hadoop/hdfs/webapps/datanode/,AVAILABLE}{/datanode} 2019-06-18 14:52:50,242 INFO org.eclipse.jetty.server.AbstractConnector: Started ServerConnector@6f45a1a0{HTTP/1.1,[http/1.1]}{localhost:44121} 2019-06-18 14:52:50,243 INFO org.eclipse.jetty.server.Server: Started @18329ms 2019-06-18 14:52:52,165 INFO org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer: Listening HTTP traffic on /0.0.0.0:9864 2019-06-18 14:52:52,273 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: dnUserName = hadoop 2019-06-18 14:52:52,273 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: supergroup = supergroup 2019-06-18 14:52:52,242 INFO org.apache.hadoop.util.JvmPauseMonitor: Starting JVM pause monitor 2019-06-18 14:52:52,720 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue, queueCapacity: 1000, scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler, ipcBackoff: false. 
2019-06-18 14:52:52,880 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 9867 2019-06-18 14:52:54,839 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened IPC server at /0.0.0.0:9867 2019-06-18 14:52:55,160 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Refresh request received for nameservices: null 2019-06-18 14:52:55,365 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting BPOfferServices for nameservices: <default> 2019-06-18 14:52:55,418 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool <registering> (Datanode Uuid unassigned) service to localhost/127.0.0.1:9000 starting to offer service 2019-06-18 14:52:55,532 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting 2019-06-18 14:52:55,561 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9867: starting 2019-06-18 14:52:58,314 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Acknowledging ACTIVE Namenode during handshakeBlock pool <registering> (Datanode Uuid unassigned) service to localhost/127.0.0.1:9000 2019-06-18 14:52:58,329 INFO org.apache.hadoop.hdfs.server.common.Storage: Using 1 threads to upgrade data directories (dfs.datanode.parallel.volumes.load.threads.num=1, dataDirs=1) 2019-06-18 14:52:58,458 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /usr/local/hadoop/tmp/dfs/data/in_use.lock acquired by nodename 55815@gx-virtual-machine 2019-06-18 14:52:58,478 INFO org.apache.hadoop.hdfs.server.common.Storage: Storage directory with location [DISK]file:/usr/local/hadoop/tmp/dfs/data is not formatted for namespace 317473294. Formatting... 2019-06-18 14:52:58,479 INFO org.apache.hadoop.hdfs.server.common.Storage: Generated new storageID DS-8b3e1e6d-135a-433a-93bb-3e62598daf5e for directory /usr/local/hadoop/tmp/dfs/data 2019-06-18 14:52:58,749 INFO org.apache.hadoop.hdfs.server.common.Storage: Analyzing storage directories for bpid BP-200946205-127.0.1.1-1560840480894 2019-06-18 14:52:58,750 INFO org.apache.hadoop.hdfs.server.common.Storage: Locking is disabled for /usr/local/hadoop/tmp/dfs/data/current/BP-200946205-127.0.1.1-1560840480894 2019-06-18 14:52:58,753 INFO org.apache.hadoop.hdfs.server.common.Storage: Block pool storage directory for location [DISK]file:/usr/local/hadoop/tmp/dfs/data and block pool id BP-200946205-127.0.1.1-1560840480894 is not formatted. Formatting ... 
2019-06-18 14:52:58,753 INFO org.apache.hadoop.hdfs.server.common.Storage: Formatting block pool BP-200946205-127.0.1.1-1560840480894 directory /usr/local/hadoop/tmp/dfs/data/current/BP-200946205-127.0.1.1-1560840480894/current 2019-06-18 14:52:58,772 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Setting up storage: nsid=317473294;bpid=BP-200946205-127.0.1.1-1560840480894;lv=-57;nsInfo=lv=-65;cid=CID-eb45654d-0bc6-4348-b02f-e03603e1ae37;nsid=317473294;c=1560840480894;bpid=BP-200946205-127.0.1.1-1560840480894;dnuuid=null 2019-06-18 14:52:58,776 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Generated and persisted new Datanode UUID 6a2049c6-1a18-437a-97bd-51c5bb65a639 2019-06-18 14:52:59,549 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added new volume: DS-8b3e1e6d-135a-433a-93bb-3e62598daf5e 2019-06-18 14:52:59,553 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added volume - [DISK]file:/usr/local/hadoop/tmp/dfs/data, StorageType: DISK 2019-06-18 14:52:59,615 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Registered FSDatasetState MBean 2019-06-18 14:52:59,680 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for /usr/local/hadoop/tmp/dfs/data 2019-06-18 14:52:59,801 INFO org.apache.hadoop.hdfs.server.datanode.checker.DatasetVolumeChecker: Scheduled health check for volume /usr/local/hadoop/tmp/dfs/data 2019-06-18 14:52:59,809 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Adding block pool BP-200946205-127.0.1.1-1560840480894 2019-06-18 14:52:59,839 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Scanning block pool BP-200946205-127.0.1.1-1560840480894 on volume /usr/local/hadoop/tmp/dfs/data... 2019-06-18 14:53:00,166 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time taken to scan block pool BP-200946205-127.0.1.1-1560840480894 on /usr/local/hadoop/tmp/dfs/data: 327ms 2019-06-18 14:53:00,168 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Total time to scan all replicas for block pool BP-200946205-127.0.1.1-1560840480894: 359ms 2019-06-18 14:53:00,181 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Adding replicas to map for block pool BP-200946205-127.0.1.1-1560840480894 on volume /usr/local/hadoop/tmp/dfs/data... 
2019-06-18 14:53:00,181 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.BlockPoolSlice: Replica Cache file: /usr/local/hadoop/tmp/dfs/data/current/BP-200946205-127.0.1.1-1560840480894/current/replicas doesn't exist 2019-06-18 14:53:00,198 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time to add replicas to map for block pool BP-200946205-127.0.1.1-1560840480894 on volume /usr/local/hadoop/tmp/dfs/data: 17ms 2019-06-18 14:53:00,198 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Total time to add all replicas to map for block pool BP-200946205-127.0.1.1-1560840480894: 27ms 2019-06-18 14:53:00,208 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: Now scanning bpid BP-200946205-127.0.1.1-1560840480894 on volume /usr/local/hadoop/tmp/dfs/data 2019-06-18 14:53:00,221 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: VolumeScanner(/usr/local/hadoop/tmp/dfs/data, DS-8b3e1e6d-135a-433a-93bb-3e62598daf5e): finished scanning block pool BP-200946205-127.0.1.1-1560840480894 2019-06-18 14:53:00,401 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: VolumeScanner(/usr/local/hadoop/tmp/dfs/data, DS-8b3e1e6d-135a-433a-93bb-3e62598daf5e): no suitable block pools found to scan. Waiting 1814399799 ms. 2019-06-18 14:53:00,418 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: Periodic Directory Tree Verification scan starting at 2019/6/18 下午8:05 with interval of 21600000ms 2019-06-18 14:53:00,463 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool BP-200946205-127.0.1.1-1560840480894 (Datanode Uuid 6a2049c6-1a18-437a-97bd-51c5bb65a639) service to localhost/127.0.0.1:9000 beginning handshake with NN 2019-06-18 14:53:00,825 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool Block pool BP-200946205-127.0.1.1-1560840480894 (Datanode Uuid 6a2049c6-1a18-437a-97bd-51c5bb65a639) service to localhost/127.0.0.1:9000 successfully registered with NN 2019-06-18 14:53:00,825 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: For namenode localhost/127.0.0.1:9000 using BLOCKREPORT_INTERVAL of 21600000msec CACHEREPORT_INTERVAL of 10000msec Initial delay: 0msec; heartBeatInterval=3000 2019-06-18 14:53:01,524 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0xb210af820fa10abf, containing 1 storage report(s), of which we sent 1. The reports had 0 total blocks and used 1 RPC(s). This took 19 msec to generate and 231 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5. 
2019-06-18 14:53:01,525 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-200946205-127.0.1.1-1560840480894 2019-06-18 15:44:37,567 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741825_1001 src: /127.0.0.1:34774 dest: /127.0.0.1:9866 2019-06-18 15:44:37,733 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:34774, dest: /127.0.0.1:9866, bytes: 8260, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1864191814_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741825_1001, duration(ns): 75831098 2019-06-18 15:44:37,737 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741825_1001, type=LAST_IN_PIPELINE terminating 2019-06-18 15:44:38,256 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741826_1002 src: /127.0.0.1:34776 dest: /127.0.0.1:9866 2019-06-18 15:44:38,266 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:34776, dest: /127.0.0.1:9866, bytes: 953, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1864191814_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741826_1002, duration(ns): 5252820 2019-06-18 15:44:38,266 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741826_1002, type=LAST_IN_PIPELINE terminating 2019-06-18 15:44:38,340 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741827_1003 src: /127.0.0.1:34778 dest: /127.0.0.1:9866 2019-06-18 15:44:38,365 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:34778, dest: /127.0.0.1:9866, bytes: 11392, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1864191814_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741827_1003, duration(ns): 19816531 2019-06-18 15:44:38,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741827_1003, type=LAST_IN_PIPELINE terminating 2019-06-18 15:44:38,428 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741828_1004 src: /127.0.0.1:34780 dest: /127.0.0.1:9866 2019-06-18 15:44:38,455 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:34780, dest: /127.0.0.1:9866, bytes: 1061, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1864191814_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741828_1004, duration(ns): 9820674 2019-06-18 15:44:38,464 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741828_1004, type=LAST_IN_PIPELINE terminating 2019-06-18 15:44:38,517 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741829_1005 src: /127.0.0.1:34782 dest: /127.0.0.1:9866 2019-06-18 15:44:38,537 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:34782, dest: /127.0.0.1:9866, bytes: 620, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1864191814_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: 
BP-200946205-127.0.1.1-1560840480894:blk_1073741829_1005, duration(ns): 9424051
2019-06-18 15:44:38,537 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741829_1005, type=LAST_IN_PIPELINE terminating
2019-06-18 15:44:38,569 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741830_1006 src: /127.0.0.1:34784 dest: /127.0.0.1:9866
2019-06-18 15:44:38,579 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:34784, dest: /127.0.0.1:9866, bytes: 3518, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1864191814_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741830_1006, duration(ns): 6662498
2019-06-18 15:44:38,579 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741830_1006, type=LAST_IN_PIPELINE terminating
2019-06-18 15:44:38,642 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741831_1007 src: /127.0.0.1:34786 dest: /127.0.0.1:9866
2019-06-18 15:44:38,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:34786, dest: /127.0.0.1:9866, bytes: 682, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1864191814_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741831_1007, duration(ns): 5047916
2019-06-18 15:44:38,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741831_1007, type=LAST_IN_PIPELINE terminating
2019-06-18 15:44:38,713 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741832_1008 src: /127.0.0.1:34788 dest: /127.0.0.1:9866
2019-06-18 15:44:38,726 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:34788, dest: /127.0.0.1:9866, bytes: 758, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1864191814_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741832_1008, duration(ns): 8532382
2019-06-18 15:44:38,727 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741832_1008, type=LAST_IN_PIPELINE terminating
2019-06-18 15:44:38,789 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741833_1009 src: /127.0.0.1:34790 dest: /127.0.0.1:9866
2019-06-18 15:44:38,807 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:34790, dest: /127.0.0.1:9866, bytes: 690, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1864191814_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741833_1009, duration(ns): 5589094
2019-06-18 15:44:38,813 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741833_1009, type=LAST_IN_PIPELINE terminating
2019-06-19 09:54:01,961 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741834_1010 src: /127.0.0.1:36578 dest: /127.0.0.1:9866
2019-06-19 09:54:02,003 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:36578, dest: /127.0.0.1:9866, bytes: 8260, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_853656761_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741834_1010, duration(ns): 32739756
2019-06-19 09:54:02,011 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741834_1010, type=LAST_IN_PIPELINE terminating
2019-06-19 09:54:02,125 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741835_1011 src: /127.0.0.1:36580 dest: /127.0.0.1:9866
2019-06-19 09:54:02,154 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:36580, dest: /127.0.0.1:9866, bytes: 953, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_853656761_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741835_1011, duration(ns): 12137675
2019-06-19 09:54:02,154 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741835_1011, type=LAST_IN_PIPELINE terminating
2019-06-19 09:54:02,235 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741836_1012 src: /127.0.0.1:36582 dest: /127.0.0.1:9866
2019-06-19 09:54:02,249 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:36582, dest: /127.0.0.1:9866, bytes: 11392, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_853656761_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741836_1012, duration(ns): 8740891
2019-06-19 09:54:02,249 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741836_1012, type=LAST_IN_PIPELINE terminating
2019-06-19 09:54:02,307 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741837_1013 src: /127.0.0.1:36584 dest: /127.0.0.1:9866
2019-06-19 09:54:02,322 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:36584, dest: /127.0.0.1:9866, bytes: 1061, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_853656761_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741837_1013, duration(ns): 8680367
2019-06-19 09:54:02,323 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741837_1013, type=LAST_IN_PIPELINE terminating
2019-06-19 09:54:02,399 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741838_1014 src: /127.0.0.1:36586 dest: /127.0.0.1:9866
2019-06-19 09:54:02,413 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:36586, dest: /127.0.0.1:9866, bytes: 620, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_853656761_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741838_1014, duration(ns): 8474258
2019-06-19 09:54:02,413 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741838_1014, type=LAST_IN_PIPELINE terminating
2019-06-19 09:54:02,491 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741839_1015 src: /127.0.0.1:36588 dest: /127.0.0.1:9866
2019-06-19 09:54:02,502 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:36588, dest: /127.0.0.1:9866, bytes: 3518, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_853656761_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741839_1015, duration(ns): 6946259
2019-06-19 09:54:02,503 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741839_1015, type=LAST_IN_PIPELINE terminating
2019-06-19 09:54:02,560 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741840_1016 src: /127.0.0.1:36590 dest: /127.0.0.1:9866
2019-06-19 09:54:02,571 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:36590, dest: /127.0.0.1:9866, bytes: 682, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_853656761_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741840_1016, duration(ns): 6602106
2019-06-19 09:54:02,571 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741840_1016, type=LAST_IN_PIPELINE terminating
2019-06-19 09:54:02,635 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741841_1017 src: /127.0.0.1:36592 dest: /127.0.0.1:9866
2019-06-19 09:54:02,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:36592, dest: /127.0.0.1:9866, bytes: 758, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_853656761_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741841_1017, duration(ns): 9690339
2019-06-19 09:54:02,654 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder:

When setting up the ROS Kinetic environment: bash: /opt/ros/kinetic/setup.bash: No such file or directory?

I really did follow the instructions online: I checked with `gedit ~/.bashrc`, fixed the capitalization mistakes, and removed the unrelated entries. But after correcting the errors and running `source ~/.bashrc` again, the same message still appears. Could it be that I genuinely don't have the file /opt/ros/kinetic/setup.bash? I remember a few packages failed to download earlier (just a guess). The output after running `gedit ~/.bashrc` is as follows:
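If that file really is missing, no amount of editing ~/.bashrc will make the `source` line succeed. A minimal check-and-reinstall sketch, assuming a Ubuntu/apt-based install (the package name ros-kinetic-desktop-full is only an example; use whichever variant you originally installed):

```
# Confirm whether the file actually exists
ls /opt/ros/kinetic/setup.bash
# See which ros-kinetic packages are installed
dpkg -l | grep ros-kinetic | head
# If the file is missing, reinstalling the meta package usually restores it
sudo apt-get update
sudo apt-get install --reinstall ros-kinetic-desktop-full
source /opt/ros/kinetic/setup.bash
```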

MySQL keeps reporting errors after starting normally, until mysqld dies

MySQL starts normally but then keeps logging errors until mysqld dies; once it is dead, the errors stop. When I start MySQL again, the errors resume. Hoping someone can help, thanks. ![图片说明](https://img-ask.csdn.net/upload/201801/07/1515326909_823451.png)

```
/usr/local/mysql/bin/mysqld_safe: line 183: 8747 Killed nohup /usr/local/mysql/bin/mysqld --basedir=/usr/local/mysql --datadir=/usr/local/mysql/data --plugin-dir=/usr/local/mysqlb/plugin --log-error=VM_161_52_centos.err --pid-file=/usr/local/mysql/data/VM_161_52_centos.pid --socket=/tmp/mysql.sock --port=3306 < /dev/null >> /usr/local/mysql/data/VM_161_52_centos.err 2>&1 >> /usr/local/mysql/data/VM_161_52_centos.err 2>&1
/usr/local/mysql/bin/mysqld_safe: line 183: 9058 Killed nohup /usr/local/mysql/bin/mysqld --basedir=/usr/local/mysql --datadir=/usr/local/mysql/data --plugin-dir=/usr/local/mysqlb/plugin --log-error=VM_161_52_centos.err --pid-file=/usr/local/mysql/data/VM_161_52_centos.pid --socket=/tmp/mysql.sock --port=3306 < /dev/null >> /usr/local/mysql/data/VM_161_52_centos.err 2>&1 >> /usr/local/mysql/data/VM_161_52_centos.err 2>&1 >> /usr/local/mysql/data/VM_161_52_centos.err 2>&1
/usr/local/mysql/bin/mysqld_safe: line 183: 9506 Killed nohup /usr/local/mysql/bin/mysqld --basedir=/usr/local/mysql --datadir=/usr/local/mysql/data --plugin-dir=/usr/local/mysqlb/plugin --log-error=VM_161_52_centos.err --pid-file=/usr/local/mysql/data/VM_161_52_centos.pid --socket=/tmp/mysql.sock --port=3306 < /dev/null >> /usr/local/mysql/data/VM_161_52_centos.err 2>&1 >> /usr/local/mysql/data/VM_161_52_centos.err 2>&1 >> /usr/local/mysql/data/VM_161_52_centos.err 2>&1 >> /usr/local/mysql/data/VM_161_52_centos.err 2>&1
/usr/local/mysql/bin/mysqld_safe: line 183: 10076 Killed nohup /usr/local/mysql/bin/mysqld --basedir=/usr/local/mysql --datadir=/usr/local/mysql/data --plugin-dir=/usr/local/mysqlb/plugin --log-error=VM_161_52_centos.err --pid-file=/usr/local/mysql/data/VM_161_52_centos.pid --socket=/tmp/mysql.sock --port=3306 < /dev/null >> /usr/local/mysql/data/VM_161_52_centos.err 2>&1 >> /usr/local/mysql/data/VM_161_52_centos.err 2>&1 >> /usr/local/mysql/data/VM_161_52_centos.err 2>&1 >> /usr/local/mysql/data/VM_161_52_centos.err 2>&1 >> /usr/local/mysql/data/VM_161_52_centos.err 2>&1
/usr/local/mysql/bin/mysqld_safe: line 183: 10231 Killed nohup /usr/local/mysql/bin/mysqld --basedir=/usr/local/mysql --datadir=/usr/local/mysql/data --plugin-dir=/usr/local/mysqlb/plugin --log-error=VM_161_52_centos.err --pid-file=/usr/local/mysql/data/VM_161_52_centos.pid --socket=/tmp/mysql.sock --port=3306 < /dev/null >> /usr/local/mysql/data/VM_161_52_centos.err 2>&1 >> /usr/local/mysql/data/VM_161_52_centos.err 2>&1 >> /usr/local/mysql/data/VM_161_52_centos.err 2>&1 >> /usr/local/mysql/data/VM_161_52_centos.err 2>&1 >> /usr/local/mysql/data/VM_161_52_centos.err 2>&1 >> /usr/local/mysql/data/VM_161_52_centos.err 2>&1
```
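The repeated "Killed" lines mean mysqld is receiving an external SIGKILL; on a small-memory VPS that is most often the kernel OOM killer. A hedged diagnosis sketch, not a confirmed cause:

```
# Look for OOM-killer records in the kernel ring buffer and system log
dmesg | grep -i -E 'oom|killed process'
grep -i 'killed process' /var/log/messages
# How much RAM/swap is available?
free -m
# And check the error log named on the mysqld command line above
tail -n 100 /usr/local/mysql/data/VM_161_52_centos.err
```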

SOAP-ERROR: Parsing WSDL: Couldn't load from 'http://10.31.1.106:8080/gss-0.1/FileUtilities?wsdl'

I hope someone can help me, because I have been looking for the solution for 2 working days...

I have a PHP website which needs to request a distant webservice. When I try it on my computer (Win7 + WAMP), it works well. When I try it on the webserver (CentOS 6), it returns the following error:

```
SOAP-ERROR: Parsing WSDL: Couldn't load from 'http://ip_address:8080/gss-0.1/FileUtilities?wsdl' : failed to load external entity "http://ip_address:8080/gss-0.1/FileUtilities?wsdl"
```

NB: the webservice is on the same network as my webserver (I do not use the same IP/port).

I know that the WSDL is accessible from the webserver, because it is accessible using the curl Linux command:

```
curl http://ip_address:8080
```

My PHP code is:

```
<?php
$wsdl = "http://ip_address:8080/gss-0.1/FileUtilities?wsdl";
try {
    $client = new SoapClient($wsdl, array('trace' => 1, 'exceptions' => 1, 'cache_wsdl' => WSDL_CACHE_NONE));
} catch (Exception $e) {
    echo $e->getMessage();
}
?>
```
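When curl works from a shell but SoapClient cannot fetch the same URL, one CentOS-specific suspect is SELinux denying outbound network connections to the Apache/PHP process. A hedged check sketch (the boolean name is the standard one on CentOS):

```
# Shell curl runs unconfined; httpd may be blocked by SELinux
getsebool httpd_can_network_connect
sudo setsebool -P httpd_can_network_connect 1   # -P makes it persistent
# Retry the SoapClient call; /var/log/audit/audit.log records any denials
```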

svn checkout fails: repository URL does not exist

I installed an SVN server with yum on a CentOS system, version 1.6. Locally I am on macOS, using the bundled svn, version 1.7. On checkout I get: svn: E170000: URL 'svn://123.57.69.204/opt/svn/repositories' doesn't exist. Any advice would be appreciated.
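With svnserve, paths in svn:// URLs are resolved relative to the root passed via -r, so if svnserve was started with -r pointing at /opt/svn/repositories, repeating that filesystem path in the URL will fail exactly like this. A sketch, with "project" as a placeholder repository name:

```
# Serve everything under this root
svnserve -d -r /opt/svn/repositories
# Then check out by repository name, not by the full filesystem path
svn checkout svn://123.57.69.204/project
# Only if svnserve was started WITHOUT -r would the full-path form apply:
# svn checkout svn://123.57.69.204/opt/svn/repositories/project
```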

zabbix 3.4 fails to start after installation: Can't open PID file /run/zabbix/zabbix_server.pid

zabbix 3.4 fails to start after installation. The error says the zabbix_server.pid file cannot be found; I checked the PidFile setting in zabbix_server.conf, and the file is indeed missing from that directory. Some posts say to set SELINUX=disabled and reboot, but after rebooting it is still missing. Has anyone hit this and can help? After the change:

```
[root@centos04 ~]# getenforce
Disabled
[root@centos04 ~]# systemctl start zabbix-server
Job for zabbix-server.service failed because a configured resource limit was exceeded. See "systemctl status zabbix-server.service" and "journalctl -xe" for details.
[root@centos04 ~]# journalctl -xe
-- Support: http://lists.freedesktop.org/mailman/listinfo/systemd-devel
--
-- Unit zabbix-server.service has begun starting up.
Oct 19 00:17:48 centos04 systemd[1]: Can't open PID file /run/zabbix/zabbix_server.pid (yet?) after start: No such file or directory
Oct 19 00:17:48 centos04 zabbix_server[1608]: zabbix_server [1610]: cannot open "/usr/local/logs/zabbix/zabbix_server.log": [13] Permission denied
Oct 19 00:17:48 centos04 systemd[1]: zabbix-server.service never wrote its PID file. Failing.
Oct 19 00:17:48 centos04 systemd[1]: Failed to start Zabbix Server.
-- Subject: Unit zabbix-server.service has failed
-- Defined-By: systemd
-- Support: http://lists.freedesktop.org/mailman/listinfo/systemd-devel
--
-- Unit zabbix-server.service has failed.
--
-- The result is failed.
Oct 19 00:17:48 centos04 systemd[1]: Unit zabbix-server.service entered failed state.
Oct 19 00:17:48 centos04 systemd[1]: zabbix-server.service failed.
Oct 19 00:17:59 centos04 systemd[1]: zabbix-server.service holdoff time over, scheduling restart.
Oct 19 00:17:59 centos04 systemd[1]: Stopped Zabbix Server.
-- Subject: Unit zabbix-server.service has finished shutting down
-- Defined-By: systemd
-- Support: http://lists.freedesktop.org/mailman/listinfo/systemd-devel
--
-- Unit zabbix-server.service has finished shutting down.
Oct 19 00:17:59 centos04 systemd[1]: Starting Zabbix Server...
-- Subject: Unit zabbix-server.service has begun start-up
-- Defined-By: systemd
-- Support: http://lists.freedesktop.org/mailman/listinfo/systemd-devel
--
-- Unit zabbix-server.service has begun starting up.
Oct 19 00:17:59 centos04 systemd[1]: Can't open PID file /run/zabbix/zabbix_server.pid (yet?) after start: No such file or directory
Oct 19 00:17:59 centos04 zabbix_server[1612]: zabbix_server [1614]: cannot open "/usr/local/logs/zabbix/zabbix_server.log": [13] Permission denied
Oct 19 00:17:59 centos04 systemd[1]: zabbix-server.service never wrote its PID file. Failing.
Oct 19 00:17:59 centos04 systemd[1]: Failed to start Zabbix Server.
-- Subject: Unit zabbix-server.service has failed
-- Defined-By: systemd
-- Support: http://lists.freedesktop.org/mailman/listinfo/systemd-devel
--
-- Unit zabbix-server.service has failed.
--
-- The result is failed.
Oct 19 00:17:59 centos04 systemd[1]: Unit zabbix-server.service entered failed state.
Oct 19 00:17:59 centos04 systemd[1]: zabbix-server.service failed.
```
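Note that the fatal line in the journal is the "Permission denied" on /usr/local/logs/zabbix/zabbix_server.log: the server exits before it can write its PID file, so the PID message is only a symptom. A hedged fix sketch, assuming the package created the usual zabbix service user:

```
# Create the log and runtime directories and hand them to the zabbix user
sudo mkdir -p /usr/local/logs/zabbix /run/zabbix
sudo chown -R zabbix:zabbix /usr/local/logs/zabbix /run/zabbix
sudo systemctl restart zabbix-server
sudo systemctl status zabbix-server
```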

Building an RPM with rpmbuild on CentOS 7 fails (urgent)

CentOS 7.1, php-5.6.25.

```
$ ls rpmbuild/
BUILD BUILDROOT RPMS SOURCES SPECS SRPMS
$ cat rpmbuild/SPECS/php.spec
Name: php
Version: 5.6.25
Release: 1%{?dist}
Summary: compiled from 5.6.25 by Kevin
Group: System Environment/Daemons
License: GPL
URL: https://secure.php.net
Source0: php-5.6.25.tar.gz
BuildRoot: %(mktemp -ud %{_tmppath}/%{name}-%{version}-%{release}-XXXXXX)
BuildRequires: gcc, gcc-c++, openssl-devel
Requires: wireshark-gnome

%description
php server. Compiled from 5.6.25 by Kevin

%prep
%setup -q

%build
./configure '--with-libdir=lib64' '--prefix=/usr/local/php5.6' '--with-fpm-systemd' '--enable-fpm' '--enable-mbstring' '--with-mysql' '--with-mysqli' '--with-pdo-mysql' '--with-gd' '--enable-gd-native-ttf' '--with-freetype-dir' '--with-curl' '--with-openssl' '--with-mcrypt' '--enable-zip' '--enable-intl'
make %{?_smp_mflags}

%install
rm -rf %{buildroot}
make install DESTDIR=%{buildroot}

%clean
rm -rf %{buildroot}

%files
%defattr(-,root,root,-)
/usr/local/php5.6/bin/*
/usr/local/php5.6/sbin/*
/usr/local/php5.6/include/*
/usr/local/php5.6/php/php/fpm/*
/usr/local/php5.6/php/man/man1/*
/usr/local/php5.6/php/man/man8/*
/usr/local/php5.6/lib/php/*
/usr/local/php5.6/lib/php/extensions/*
/usr/local/php5.6/lib/php/extensions/no-debug-non-zts-20131226/*
/usr/local/php5.6/lib/php/build/*
/usr/local/php5.6/lib/php/build/shtool/*
/usr/local/php5.6/lib/php/.registry/*
/usr/local/php5.6/lib/php/.channels/*
/usr/local/php5.6/lib/php/Archive/*
/usr/local/php5.6/lib/php/doc/*
/usr/local/php5.6/lib/php/doc/Archive_Tar/*
/usr/local/php5.6/lib/php/doc/Archive_Tar/docs/*
/usr/local/php5.6/lib/php/doc/Structures_Graph/*
/usr/local/php5.6/lib/php/doc/Structures_Graph/docs/*
/usr/local/php5.6/lib/php/doc/Structures_Graph/docs/tutorials/*
/usr/local/php5.6/lib/php/doc/Structures_Graph/docs/tutorials/Structures_Graph/*
/usr/local/php5.6/lib/php/doc/Structures_Graph/LICENSE/*
/usr/local/php5.6/lib/php/doc/XML_Util/*
/usr/local/php5.6/lib/php/doc/XML_Util/examples/*
/usr/local/php5.6/lib/php/doc/PEAR/*
/usr/local/php5.6/lib/php/doc/PEAR/LICENSE/*
/usr/local/php5.6/lib/php/doc/PEAR/INSTALL/*
/usr/local/php5.6/lib/php/Console/*
/usr/local/php5.6/lib/php/test/*
/usr/local/php5.6/lib/php/test/Console_Getopt/*
/usr/local/php5.6/lib/php/test/Console_Getopt/tests/*
/usr/local/php5.6/lib/php/test/Structures_Graph/*
/usr/local/php5.6/lib/php/test/Structures_Graph/tests/*
/usr/local/php5.6/lib/php/test/XML_Util/*
/usr/local/php5.6/lib/php/test/XML_Util/tests/*
/usr/local/php5.6/lib/php/Structures/*
/usr/local/php5.6/lib/php/Structures/Graph/*
/usr/local/php5.6/lib/php/Structures/Graph/Manipulator/*
/usr/local/php5.6/lib/php/XML/*
/usr/local/php5.6/lib/php/OS/*
/usr/local/php5.6/lib/php/PEAR/*
/usr/local/php5.6/lib/php/PEAR/ChannelFile/*
/usr/local/php5.6/lib/php/PEAR/Command/*
/usr/local/php5.6/lib/php/PEAR/Downloader/*
/usr/local/php5.6/lib/php/PEAR/Frontend/*
/usr/local/php5.6/lib/php/PEAR/Installer/*
/usr/local/php5.6/lib/php/PEAR/Installer/Role/*
/usr/local/php5.6/lib/php/PEAR/PackageFile/*
/usr/local/php5.6/lib/php/PEAR/PackageFile/Generator/*
/usr/local/php5.6/lib/php/PEAR/PackageFile/Parser/*
/usr/local/php5.6/lib/php/PEAR/PackageFile/v2/*
/usr/local/php5.6/lib/php/PEAR/REST/*
/usr/local/php5.6/lib/php/PEAR/Task/*
/usr/local/php5.6/lib/php/PEAR/Task/Postinstallscript/*
/usr/local/php5.6/lib/php/PEAR/Task/Replace/*
/usr/local/php5.6/lib/php/PEAR/Task/Unixeol/*
/usr/local/php5.6/lib/php/PEAR/Task/Windowseol/*
/usr/local/php5.6/lib/php/PEAR/Validator/*
/usr/local/php5.6/lib/php/data/*
/usr/local/php5.6/lib/php/data/PEAR/*
%config /usr/local/php5.6/etc/*
/usr/local/php5.6/var/run/*
/usr/local/php5.6/lib/*
%dir /usr/local/php5.6/var/log/

%post
cp /usr/local/php5.6/etc/php-fpm.conf.default /usr/local/php5.6/etc/php-fpm.conf
ln -s /usr/local/php5.6/bin/php /usr/bin/php

%changelog
* Tue Aug 30 2016 Kevin<kevin_liao@163.com> 5.6.25
- first rpm from php-5.6.25
```

Then, as a normal user named centos (not root), I ran:

```
$ rpmbuild -ba rpmbuild/SPECS/php.spec
```

and at the end it fails with:

```
Build complete.
Don't forget to run 'make test'.
+ exit 0
Executing(%install): /bin/sh -e /var/tmp/rpm-tmp.QjJHDq
+ umask 022
+ cd /home/centos/rpmbuild/BUILD
+ '[' /home/centos/rpmbuild/BUILDROOT/php-5.6.25-1.el7.centos.x86_64 '!=' / ']'
+ rm -rf /home/centos/rpmbuild/BUILDROOT/php-5.6.25-1.el7.centos.x86_64
++ dirname /home/centos/rpmbuild/BUILDROOT/php-5.6.25-1.el7.centos.x86_64
+ mkdir -p /home/centos/rpmbuild/BUILDROOT
+ mkdir /home/centos/rpmbuild/BUILDROOT/php-5.6.25-1.el7.centos.x86_64
+ cd php-5.6.25
+ rm -rf /home/centos/rpmbuild/BUILDROOT/php-5.6.25-1.el7.centos.x86_64
+ make install DESTDIR=/home/centos/rpmbuild/BUILDROOT/php-5.6.25-1.el7.centos.x86_64
Installing shared extensions: /usr/local/php5.6/lib/php/extensions/no-debug-non-zts-20131226/
Installing PHP CLI binary: /usr/local/php5.6/bin/
Installing PHP CLI man page: /usr/local/php5.6/php/man/man1/
Installing PHP FPM binary: /usr/local/php5.6/sbin/
Installing PHP FPM config: /usr/local/php5.6/etc/
Installing PHP FPM man page: /usr/local/php5.6/php/man/man8/
Installing PHP FPM status page: /usr/local/php5.6/php/php/fpm/
Installing PHP CGI binary: /usr/local/php5.6/bin/
Installing PHP CGI man page: /usr/local/php5.6/php/man/man1/
Installing build environment: /usr/local/php5.6/lib/php/build/
Installing header files: /usr/local/php5.6/include/php/
Installing helper programs: /usr/local/php5.6/bin/
  program: phpize
  program: php-config
Installing man pages: /usr/local/php5.6/php/man/man1/
  page: phpize.1
  page: php-config.1
Installing PEAR environment: /usr/local/php5.6/lib/php/
[PEAR] Archive_Tar - already installed: 1.4.0
[PEAR] Console_Getopt - already installed: 1.4.1
[PEAR] Structures_Graph - already installed: 1.1.1
[PEAR] XML_Util - already installed: 1.3.0
[PEAR] PEAR - already installed: 1.10.1
Wrote PEAR system config file at: /usr/local/php5.6/etc/pear.conf
You may want to add: /usr/local/php5.6/lib/php to your php.ini include_path
/home/centos/rpmbuild/BUILD/php-5.6.25/build/shtool install -c ext/phar/phar.phar /usr/local/php5.6/bin
ln -s -f phar.phar /usr/local/php5.6/bin/phar
Installing PDO headers: /usr/local/php5.6/include/php/ext/pdo/
+ /usr/lib/rpm/find-debuginfo.sh --strict-build-id -m --run-dwz --dwz-low-mem-die-limit 10000000 --dwz-max-die-limit 110000000 /home/centos/rpmbuild/BUILD/php-5.6.25
find: '/home/centos/rpmbuild/BUILDROOT/php-5.6.25-1.el7.centos.x86_64': No such file or directory
error: Bad exit status from /var/tmp/rpm-tmp.QjJHDq (%install)

RPM build errors:
    Bad exit status from /var/tmp/rpm-tmp.QjJHDq (%install)
```
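The install log above shows files landing directly under /usr/local/php5.6 instead of under the buildroot: PHP's generated Makefile ignores DESTDIR and honors INSTALL_ROOT instead, so %{buildroot} stays empty and find-debuginfo.sh later fails on the missing BUILDROOT directory. A hedged sketch of a corrected %install section:

```
# PHP's Makefile honors INSTALL_ROOT, not DESTDIR
%install
rm -rf %{buildroot}
make install INSTALL_ROOT=%{buildroot}
```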

CentOS 7.6: ./configure for Python 3 fails with a gcc error

When installing Python, running ./configure for the initial build reports that gcc does not work. Error message:
```
checking build system type... x86_64-unknown-linux-gnu
checking host system type... x86_64-unknown-linux-gnu
checking for python3.6... no
checking for python3... no
checking for python... python
checking for --enable-universalsdk... no
checking for --with-universal-archs... no
checking MACHDEP... linux
checking for --without-gcc... no
checking for --with-icc... no
checking for gcc... gcc
checking whether the C compiler works... no
configure: error: in `/opt/Python-3.6.0':
configure: error: C compiler cannot create executables
```
Information from config.log: ![图片说明](https://img-ask.csdn.net/upload/201909/09/1568015863_225982.png) Looking through the log, ld cannot be found, and running which ld confirms it is indeed missing. ![图片说明](https://img-ask.csdn.net/upload/201909/09/1568015884_257870.png) Everything I could find online says to just install gcc, but I have already installed all of Python's build dependencies and the C compiler still does not work. I suspect it is an ld problem; I also ran into an ld issue while installing pydoop. Very frustrating, please help!
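The linker ld is shipped in the binutils package, not in gcc, so the usual "install gcc" advice does not cover a missing /usr/bin/ld. A minimal sketch:

```
# ld comes from binutils; install it alongside the compilers
sudo yum install -y binutils gcc gcc-c++
which ld        # should now print /usr/bin/ld
# then retry from the Python source directory
./configure
```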

/bin/sh: rm: command not found when building from a Dockerfile

* While practicing Dockerfile commands on my own, I ran into `/bin/sh: rm: command not found`.
* Code
```
FROM centos:latest
MAINTAINER my<my@163.com>
ENV PATH /tmp
WORKDIR $PATH
RUN rm -rf /tmp
CMD /bin/bash
```
* Build output
```
Sending build context to Docker daemon 2.048 kB
Step 1/6 : FROM centos:latest
 ---> 1e1148e4cc2c
Step 2/6 : MAINTAINER my<my@163.com>
 ---> Running in dc6073af05a2
 ---> b6f7e3b7bca8
Removing intermediate container dc6073af05a2
Step 3/6 : ENV PATH /tmp
 ---> Running in 7d000413c304
 ---> f643d6b20259
Removing intermediate container 7d000413c304
Step 4/6 : WORKDIR $PATH
 ---> 340ef4b29f1c
Removing intermediate container 658c7d7a96b9
Step 5/6 : RUN rm -rf /tmp
 ---> Running in 3aad9ab202d1
/bin/sh: rm: command not found
The command '/bin/sh -c rm -rf /tmp' returned a non-zero code: 127
```
* Any pointers would be appreciated.
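The cause is the `ENV PATH /tmp` line: it overwrites the image's executable search PATH, so the shell in the next RUN step can no longer locate /bin/rm. A hedged fix sketch that simply renames the variable (APP_DIR is an arbitrary example name):

```
# ENV PATH /tmp clobbers the search PATH; use any other variable name
cat > Dockerfile <<'EOF'
FROM centos:latest
MAINTAINER my<my@163.com>
ENV APP_DIR /tmp
WORKDIR $APP_DIR
RUN rm -rf /tmp/*
CMD /bin/bash
EOF
docker build -t dockerfile-demo .
```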

make fails with: fatal error: gnu/stubs-64.h

Compiling the unixbench tool on CentOS 7, make fails with:
```
In file included from /usr/include/features.h:399:0,
                 from /usr/include/string.h:27,
                 from ./src/arith.c:32:
/usr/include/gnu/stubs.h:10:27: fatal error: gnu/stubs-64.h:
```
Could someone please help? Thanks!
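gnu/stubs-64.h is shipped by the 64-bit glibc development headers, so this error usually means glibc-devel for x86_64 is missing (or only the 32-bit headers are present). A hedged sketch:

```
# Install the glibc headers that provide gnu/stubs-64.h
sudo yum install -y glibc-devel
ls /usr/include/gnu/stubs-64.h   # should exist afterwards
make                             # retry the unixbench build
```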

go/pkg/tool/linux_amd64/link: running gcc failed: exit status 1: /usr/bin/ld: cannot find -lgdal

I am trying to run `go build` on my sources.

```
go/pkg/tool/linux_amd64/link: running gcc failed: exit status 1
/usr/bin/ld: cannot find -lgdal
collect2: error: ld returned 1 exit status
```

My `LD_LIBRARY_PATH` variable contains `/home/fzd/project/lib64`, the path to the dir of the `libgdal.so` file. My `PKG_CONFIG_PATH` contains the path to the dir of a `.pc` file with the following contents:

```
prefix=/home/fzd/project
exec_prefix=${prefix}
libdir=${prefix}/lib64
deplibdir=${prefix}/lib64
includedir=${prefix}/include

Name: myLibs
Description: Libs
Requires:
Version: v1.0
Libs: -L${deplibdir} -lgdal
Cflags: -I${includedir}
```

I don't know which variable to check. Everything seems fine, and the fun part is that, when I clone my repo elsewhere, I don't have the issue (same `LD_LIBRARY_PATH`, etc.)

Does anyone have a clue about what I could check?

I am using go1.11.1, on CentOS 7.6.
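One thing worth knowing here: LD_LIBRARY_PATH is consulted at run time, not at link time, so it does not give ld the -L path it needs. cgo reads link-time flags from CGO_LDFLAGS, and Go's build cache can keep stale flags around, which may explain why a fresh clone behaves differently. A hedged sketch ("myLibs" as the pkg-config name is an assumption; the actual .pc filename is not shown):

```
# Give the linker an explicit -L at link time via cgo
export CGO_LDFLAGS="-L/home/fzd/project/lib64 -lgdal"
# Sanity-check that the .pc file resolves (name assumed from the Name: field)
pkg-config --libs myLibs
# Discard objects cached with the old flags, then rebuild
go clean -cache
go build ./...
```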

Apache fails to start on CentOS: Syntax error on line 235 of /usr/local/apache2/conf/httpd.conf

Starting Apache on CentOS:
```
service httpd start
Syntax error on line 235 of /usr/local/apache2/conf/httpd.conf:
Cannot load /usr/local/apache2/modules/libphp5.so into server: /usr/local/apache2/modules/libphp5.so: symbol deflate, version libmysqlclient_16 not defined in file libmysqlclient.so.16 with link time reference
```
Apache, PHP and MySQL 5 were already installed and working. Yesterday, because MySQL would not start, I replaced the compiled-from-source build with a binary install, and everything worked at the time. Today Apache failed to come up on system boot. I recompiled PHP, but the error is unchanged, and rebooting did not help either. Does anyone know what is going on?
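This message typically means the libmysqlclient.so.16 the loader picks at runtime is not the one libphp5.so was linked against, which fits the MySQL install having been swapped. A hedged diagnosis sketch (the /usr/local/mysql/lib path is an example; use wherever the new MySQL's libraries actually live):

```
# Which copy of the MySQL client library is actually resolved?
ldd /usr/local/apache2/modules/libphp5.so | grep mysql
# List every copy on the system
find / -name 'libmysqlclient.so.16*' 2>/dev/null
# If the correct copy lives in the new MySQL's lib dir, register it:
echo '/usr/local/mysql/lib' | sudo tee /etc/ld.so.conf.d/mysql.conf
sudo ldconfig
```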

Error: fork/exec: no such file or directory, when running Golang code in Docker

first of all, any help is appreciated

I want to execute Go code in a container, but when I try to execute it in a container I get the following error:

```
[root@ping-app-3618687774-abqmo ping-app]# go run main.go start check_server
INFO[0000] http.startHttpServer ok, listening 0.0.0.0:8080  source="api.go:106"
INFO[0000] Index=ipdata exists=true  source="elastic.go:164"
INFO[0000] Index=ipdata Type=entity exists=true  source="elastic.go:184"
fork/exec : no such file or directory
fork/exec : no such file or directory
fork/exec : no such file or directory
fork/exec : no such file or directory
fork/exec : no such file or directory
fork/exec : no such file or directory
fork/exec : no such file or directory
fork/exec : no such file or directory
```

I can't understand why this is happening, and it only happens in Docker; when I run the same Go code on the host it is fine.

```
Environment:
Centos: 7.0.1406
Kubernetes: 1.2.0
docker: 1.10.3
```

Thanks a lot for any suggestions.
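A hedged guess, not a confirmed diagnosis: the blank between "fork/exec" and the colon suggests the program is exec'ing an external binary whose path resolved to an empty string, e.g. an exec.LookPath that succeeds on the host but fails inside the slimmer container image. A quick check sketch (the container id and the ping binary name are placeholders inferred from the app name):

```
# Is the external binary actually present inside the container image?
docker exec -it <container-id> /bin/sh -c 'which ping; ls /bin /usr/bin | head'
# If it is missing, install it in the image or bake the binary in at build time
```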

Installing Oracle 11g Grid: root.sh hangs at "Adding Clusterware entries to inittab"

Installing Oracle 11g (11.2.0.4) Grid standalone. Running root.sh hangs at "Adding Clusterware entries to inittab" and never moves on, and there are no files under /tmp/.oracle. Looking at /oracle/11204/grid_1/home/cfgtoollogs/crsconfig/roothas.log:

......
2019-08-21 10:41:28: Executing /oracle/11204/grid_1/home/bin/crsctl create scr grid
2019-08-21 10:41:28: Executing cmd: /oracle/11204/grid_1/home/bin/crsctl create scr grid
2019-08-21 10:41:28: Registering ohasd
2019-08-21 10:41:28: Executing cmd: /bin/rpm -q sles-release
2019-08-21 10:41:28: Command output:
> package sles-release is not installed
>End Command output
2019-08-21 10:41:28: init file = /oracle/11204/grid_1/home/crs/init/init.ohasd
2019-08-21 10:41:28: Copying file /oracle/11204/grid_1/home/crs/init/init.ohasd to /etc/init.d directory
2019-08-21 10:41:28: Setting init.ohasd permission in /etc/init.d directory
2019-08-21 10:41:28: init file = /oracle/11204/grid_1/home/crs/init/ohasd
2019-08-21 10:41:28: Copying file /oracle/11204/grid_1/home/crs/init/ohasd to /etc/init.d directory
2019-08-21 10:41:28: Setting ohasd permission in /etc/init.d directory
2019-08-21 10:41:28: Executing cmd: /bin/rpm -q sles-release
2019-08-21 10:41:28: Command output:
> package sles-release is not installed
>End Command output
2019-08-21 10:41:28: Removing "/etc/rc.d/rc3.d/S96ohasd"
2019-08-21 10:41:28: Removing file /etc/rc.d/rc3.d/S96ohasd
2019-08-21 10:41:28: Successfully removed file: /etc/rc.d/rc3.d/S96ohasd
2019-08-21 10:41:28: Creating a link "/etc/rc.d/rc3.d/S96ohasd" pointing to /etc/init.d/ohasd
2019-08-21 10:41:28: Removing "/etc/rc.d/rc5.d/S96ohasd"
2019-08-21 10:41:28: Removing file /etc/rc.d/rc5.d/S96ohasd
2019-08-21 10:41:28: Successfully removed file: /etc/rc.d/rc5.d/S96ohasd
2019-08-21 10:41:28: Creating a link "/etc/rc.d/rc5.d/S96ohasd" pointing to /etc/init.d/ohasd
2019-08-21 10:41:28: Removing "/etc/rc.d/rc0.d/K15ohasd"
2019-08-21 10:41:28: Removing file /etc/rc.d/rc0.d/K15ohasd
2019-08-21 10:41:28: Successfully removed file: /etc/rc.d/rc0.d/K15ohasd
2019-08-21 10:41:28: Creating a link "/etc/rc.d/rc0.d/K15ohasd" pointing to /etc/init.d/ohasd
2019-08-21 10:41:28: Removing "/etc/rc.d/rc1.d/K15ohasd"
2019-08-21 10:41:28: Removing file /etc/rc.d/rc1.d/K15ohasd
2019-08-21 10:41:28: Successfully removed file: /etc/rc.d/rc1.d/K15ohasd
2019-08-21 10:41:28: Creating a link "/etc/rc.d/rc1.d/K15ohasd" pointing to /etc/init.d/ohasd
2019-08-21 10:41:28: Removing "/etc/rc.d/rc2.d/K15ohasd"
2019-08-21 10:41:28: Removing file /etc/rc.d/rc2.d/K15ohasd
2019-08-21 10:41:28: Successfully removed file: /etc/rc.d/rc2.d/K15ohasd
2019-08-21 10:41:28: Creating a link "/etc/rc.d/rc2.d/K15ohasd" pointing to /etc/init.d/ohasd
2019-08-21 10:41:28: Removing "/etc/rc.d/rc4.d/K15ohasd"
2019-08-21 10:41:28: Removing file /etc/rc.d/rc4.d/K15ohasd
2019-08-21 10:41:28: Successfully removed file: /etc/rc.d/rc4.d/K15ohasd
2019-08-21 10:41:28: Creating a link "/etc/rc.d/rc4.d/K15ohasd" pointing to /etc/init.d/ohasd
2019-08-21 10:41:28: Removing "/etc/rc.d/rc6.d/K15ohasd"
2019-08-21 10:41:28: Removing file /etc/rc.d/rc6.d/K15ohasd
2019-08-21 10:41:28: Successfully removed file: /etc/rc.d/rc6.d/K15ohasd
2019-08-21 10:41:28: Creating a link "/etc/rc.d/rc6.d/K15ohasd" pointing to /etc/init.d/ohasd
2019-08-21 10:41:28: The file ohasd has been successfully linked to the RC directories
2019-08-21 10:41:28: Executing cmd: /bin/rpm -q sles-release
2019-08-21 10:41:28: Command output:
> package sles-release is not installed
>End Command output
SS S_OHASD -state SUCCESS '
2019-08-21 10:41:28: Removing file /tmp/filea21cdK
2019-08-21 10:41:28: Successfully removed file: /tmp/filea21cdK
2019-08-21 10:41:28: /bin/su successfully executed
2019-08-21 10:41:28: Succeeded in writing the checkpoint:'ROOTCRS_OHASD' with status:SUCCESS
2019-08-21 10:41:28: CkptFile: /oracle/11204/grid_1/base/Clusterware/ckptGridHA_edippsdr.xml
2019-08-21 10:41:28: Sync the checkpoint file '/oracle/11204/grid_1/base/Clusterware/ckptGridHA_edippsdr.xml'
2019-08-21 10:41:28: Sync '/oracle/11204/grid_1/base/Clusterware/ckptGridHA_edippsdr.xml' to the physical disk
2019-08-21 10:41:28: Starting ohasd
2019-08-21 10:41:28: Executing cmd: /bin/rpm -qf /sbin/init
2019-08-21 10:41:28: Command output:
> SysVinit-2.86-15.el5
>End Command output
2019-08-21 10:41:28: itab entries=cssd|evmd|crsd|ohasd
2019-08-21 10:41:28: Executing /sbin/init q
2019-08-21 10:41:28: Executing cmd: /sbin/init q
2019-08-21 10:41:33: Created backup /etc/inittab.no_crs
2019-08-21 10:41:33: Appending to /etc/inittab.tmp:
2019-08-21 10:41:33: h1:35:respawn:/etc/init.d/init.ohasd run >/dev/null 2>&1 </dev/null
2019-08-21 10:41:33: Done updating /etc/inittab.tmp
2019-08-21 10:41:33: Saved /etc/inittab.crs
2019-08-21 10:41:33: Installed new /etc/inittab
2019-08-21 10:41:33: Executing /sbin/init q
2019-08-21 10:41:33: Executing cmd: /sbin/init q
2019-08-21 10:41:33: Executing cmd: /oracle/11204/grid_1/home/bin/crsctl start has
2019-08-21 10:41:34: Command output:
> CRS-4640: Oracle High Availability Services is already active
> CRS-4000: Command Start failed, or completed with errors.
>End Command output
2019-08-21 10:41:34: Executing /etc/init.d/ohasd install
2019-08-21 10:41:34: Executing cmd: /etc/init.d/ohasd install
2019-08-21 10:41:34: ohasd is already active
2019-08-21 10:41:34: Started service 'ohasd'
2019-08-21 10:41:34: Checking ohasd
2019-08-21 10:41:34: Executing cmd: /oracle/11204/grid_1/home/bin/crsctl check has

This server used to be one node of an Oracle 10g RAC cluster and some parameters were changed back then. I don't know whether something was left misconfigured, but I have run this many times and it always hangs here. Any help would be appreciated.
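A hedged diagnostic sketch, not a confirmed fix: the log shows "CRS-4640: ... already active" right after the inittab update, which on a machine that was once a 10g RAC node often points at leftover state from the earlier install. Worth checking whether init.ohasd is really being respawned and whether stale CRS sockets remain:

```
# Is init.ohasd actually respawning from /etc/inittab?
ps -ef | grep -E 'init.ohasd|ohasd.bin' | grep -v grep
grep ohasd /etc/inittab
# Stale CRS sockets from the old 10g install can linger here
ls /var/tmp/.oracle /tmp/.oracle 2>/dev/null
# A common cleanup before retrying (standalone/Oracle Restart):
# /oracle/11204/grid_1/home/crs/install/roothas.pl -deconfig -force
# ./root.sh
```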

Accidentally deleted glibc; every command on CentOS fails with -bash: /bin/ls: /lib64/ld-linux-x86-64.so.2: bad ELF interpreter: No such file or directory

Due to a careless operation I deleted the glibc library, after which every command fails with -bash: /bin/ls: /lib64/ld-linux-x86-64.so.2: bad ELF interpreter: No such file or directory. I went into rescue mode, but mounting the installation image fails with "unknown filesystem type ios9660". How can I recover the system?
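Note that "ios9660" in the error is a misspelling of the filesystem type iso9660, so the mount command itself was probably mistyped. A hedged recovery sketch run from a rescue environment booted off the install media (the device names /dev/cdrom and /dev/sda1 are examples; adjust to your system):

```
# The filesystem type is iso9660, not "ios9660" as in the failed command
mkdir -p /mnt/cdrom /mnt/sysimage
mount -t iso9660 /dev/cdrom /mnt/cdrom
mount /dev/sda1 /mnt/sysimage          # the damaged root filesystem
# Reinstall glibc directly into the damaged root
rpm -Uvh --force --root /mnt/sysimage /mnt/cdrom/Packages/glibc-*.rpm
```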
