$.ajax in a JSP page won't execute on the live site, but runs fine in Eclipse

The $.ajax call in my JSP page won't execute on the live site, although it runs fine in Eclipse; the browser reports "Can't find variable: $".

Below is the code inside the function in the JSP:

```
$.ajax({
    type : "post",
    async : false, // execute synchronously
    url : "bar.do",
    data : para,
    dataType : "json",
    success : function(result) {
        if (result) {
            // ...
        }
    }
});
```

1 answer

Right-click the page and view the source to check whether the path of the jQuery reference is correct.
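"Can't find variable: $" means the page executed your script before jQuery was loaded, almost always because the `<script src>` for jQuery resolved to a 404 once the app was deployed under a different context path. A minimal sketch of a deployment-proof include, assuming jQuery sits under `js/` in the web root (the file name is illustrative):

```
<%-- Build the script URL from the context path so it resolves the same way
     in Eclipse (http://localhost:8080/MyApp/...) and on the live site. --%>
<script type="text/javascript"
        src="${pageContext.request.contextPath}/js/jquery-1.11.3.min.js"></script>
```

You can confirm the path by opening the src URL from the page source directly in the browser; if it returns 404, jQuery never loads and every `$` call fails.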

Other related questions
Eclipse JSP page changes don't get updated on the Tomcat server
This is the JSP page as written in Eclipse; I changed the URL of this ajax request. ![screenshot](https://img-ask.csdn.net/upload/201908/07/1565143126_355610.png) This screenshot is of the Java file that Tomcat generated from the compiled JSP under work\Catalina\; it still shows the request URL from before my change. ![screenshot](https://img-ask.csdn.net/upload/201908/07/1565143150_871234.png) What I have tried so far: 1. reinstalling Tomcat; 2. cleaning the project; 3. clearing the browser cache. None of it worked. Could anyone suggest what else might fix this? It would be greatly appreciated!
ajax + Eclipse + JSP page with a Java backend
I want to paginate with ajax, 10 items per page. Once the ajax is written, what method should the backend provide for it to call? I'm a beginner and don't understand this yet; I'm using the Spring + Spring MVC + Hibernate stack. Please help.

```
<script type="text/javascript">
$(function() {
    // this demo loads the paginated elements via Ajax
    var initPagination = function() {
        var num_entries = $("#hiddenresult").size();
        // create the pagination
        $("#Pagination").pagination(num_entries, {
            num_edge_entries: 1,      // number of edge pages
            num_display_entries: 4,   // number of pages shown in the middle
            callback: pageselectCallback,
            items_per_page: 10,       // items shown per page
            prev_text: "prev page",
            next_text: "next page"
        });
        function pageselectCallback(page_index, jq) {
            var new_content = $("#hiddenresult :eq(" + page_index + ")").clone();
            $("#Searchresult").empty().append(new_content); // load the content for that page
            alert($("#hiddenresult :eq(" + page_index + ")") + "2");
            return false;
        }
    };
    // load via ajax
    $("#hiddenresult").load("/news/user/all.htmls", null, initPagination);
});
</script>
```

The ajax above was copied from the internet. Backend method that queries all rows:

```
@SuppressWarnings({ "unchecked" })
@RequestMapping("/all")
public @ResponseBody List<AcctNewscontext> ceshi() {
    String hql = "from AcctNewscontext";
    Query query = this.getCurrentSession().createQuery(hql);
    System.out.println("all rows: " + query.list());
    List<AcctNewscontext> q = query.list();
    return q;
}
```

Backend Hibernate paged query:

```
@SuppressWarnings({ "unchecked" })
@RequestMapping(value = "/listPage")
public @ResponseBody List<AcctNewscontext> next(Integer page_index) {
    LOGGER.info("next page");
    String hql = "from AcctNewscontext";
    Query query = this.getCurrentSession().createQuery(hql);
    query.setFirstResult(page_index); // start from this row offset
    query.setMaxResults(10);          // fetch 10 rows
    List<AcctNewscontext> q = query.list();
    System.out.println("next page: " + q);
    return q;
}
```

Backend LIMIT query:

```
@SuppressWarnings("rawtypes")
@RequestMapping(value = "/fenye")
public String fenye(ModelMap map) {
    LOGGER.info("all kinds of paging, all kinds of lists");
    String hql = "select * from newscontent limit 0,10";
    Query query = this.getCurrentSession().createSQLQuery(hql);
    List list = query.list();
    map.addAttribute("list", list);
    System.out.println("paged: " + list);
    return "/user/listnews";
}
```
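For the backend, the `listPage` handler above is already close to what's needed; the one subtlety is that Hibernate's `setFirstResult()` takes a row offset, not a page number, so either the client or the server must multiply by the page size. A minimal client-side sketch of a `pageselectCallback` wired to that handler (the `/news/user` prefix and `.htmls` suffix are taken from the question's own `load()` call; the `title` property is an assumed example field, since AcctNewscontext's fields aren't shown):

```
function pageselectCallback(page_index, jq) {
    $.ajax({
        type : "post",
        url : "/news/user/listPage.htmls",
        data : { page_index : page_index * 10 }, // row offset = page number * page size
        dataType : "json",
        success : function(rows) {
            var html = "";
            $.each(rows, function(i, n) {
                // "title" is illustrative; substitute a real field of AcctNewscontext
                html += "<div>" + n.title + "</div>";
            });
            $("#Searchresult").html(html);
        }
    });
    return false;
}
```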
The ajax in a JSP page in MyEclipse never works; could someone take a look?
```
<script type="text/javascript">
function ajax(){
    // declare an empty variable to hold the XMLHttpRequest object
    var xmlhttpreq = null;
    // first check which browser this is, then assign and instantiate the XMLHttpRequest object
    if (window.ActiveXObject){
        xmlhttpreq = new ActiveObject("Microsoft.XMLHTTP");
    } else if (window.XMLHTTPRequest){
        xmlhttpreq = new XMLHTTPRequest();
    }
    // after instantiating, initialize the object: true = asynchronous, false = synchronous
    // specify the target and method of the submission by calling open()
    xmlhttpreq.open("GET", "ajax01.jsp", true);
    // set the callback that handles the response whenever the request's state changes
    xmlhttpreq.onreadystatechange = RequestCallBack;
    // send the request with send()
    xmlhttpreq.send();
    // when the state changes, handle the response; first check readyState and the HTTP status
    function RequestCallBack(){
        if (xmlhttpreq.readystate == 4){
            if (xmlhttpreq,status == 200){
                // write the response text into the element whose id is "restext"
                document.getElementById("restext").innerHTML = xmlhttpreq.responseText;
            }
        }
    }
}
</script>
</head>
<body>
<input type="button" value="ajax submit" onclick="ajax()">
<div id="restext"></div>
</body>
```
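Four typos in the snippet above would each break it: `ActiveObject` should be `ActiveXObject`, `window.XMLHTTPRequest` / `new XMLHTTPRequest()` should be `window.XMLHttpRequest` / `new XMLHttpRequest()` (the name is case-sensitive), `readystate` should be `readyState`, and `xmlhttpreq,status` uses a comma where a dot is needed. A corrected sketch of the same function:

```
function ajax() {
    // standards-compliant browsers first, old IE as the fallback
    var xmlhttpreq = window.XMLHttpRequest
            ? new XMLHttpRequest()
            : new ActiveXObject("Microsoft.XMLHTTP");
    xmlhttpreq.open("GET", "ajax01.jsp", true); // true = asynchronous
    xmlhttpreq.onreadystatechange = function () {
        // readyState 4 = request finished, status 200 = HTTP OK
        if (xmlhttpreq.readyState == 4 && xmlhttpreq.status == 200) {
            document.getElementById("restext").innerHTML = xmlhttpreq.responseText;
        }
    };
    xmlhttpreq.send(null);
}
```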
A question about ajax requests in JSP
I'm using Eclipse; the servlets I create work without any web.xml configuration, but when I call them with ajax I don't know how to write the URL.
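When Eclipse creates a servlet without a web.xml entry, the mapping comes from the `@WebServlet` annotation on the servlet class (Servlet 3.0+). The ajax URL is then just the context path plus that mapped value; a sketch inside a JSP, using a hypothetical `@WebServlet("/BarServlet")` mapping:

```
// Prepend the context path so the URL still resolves after deployment.
// "/BarServlet" stands in for whatever value your @WebServlet annotation declares.
$.ajax({
    type : "post",
    url : "${pageContext.request.contextPath}/BarServlet",
    data : { name : "test" },
    success : function(result) {
        console.log(result);
    }
});
```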
A concrete, simple ajax implementation (Eclipse)
Implement an ajax username-availability check with servlet, JDBC, XML, Navicat and JSP; code wanted.
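The client half of such a check is only a few lines of jQuery; a minimal sketch, assuming a hypothetical `CheckUserServlet` that runs the JDBC lookup and writes back `"true"` when the name is already taken (the `#username` and `#tip` element ids are also illustrative):

```
// Fires when the username field loses focus. The servlet URL and the
// "true"/"false" response format are assumptions for illustration.
$("#username").blur(function() {
    $.post("${pageContext.request.contextPath}/CheckUserServlet",
        { username : $(this).val() },
        function(result) {
            $("#tip").text(result == "true" ? "username already taken"
                                            : "username available");
        });
});
```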
Chinese in a JSON string received by a JSP shows as question marks, even though the encoding is UTF-8
In the JSON string that ajax receives from the method, the Chinese displays as question marks! UTF-8 is set in Tomcat, Eclipse and the JSP, and a character-set filter is configured too. Where else does the character encoding need to be set? ![servlet code](https://img-ask.csdn.net/upload/201612/02/1480669118_996738.png) ![tomcat](https://img-ask.csdn.net/upload/201612/02/1480669173_563449.png) ![jsp ajax](https://img-ask.csdn.net/upload/201612/02/1480669226_750188.png)
In an SSM project, some ajax requests to the backend log Did not find handler method for [xxx] while others succeed; why?
In an SSM project I request backend data with ajax. Within the same controller and the same JSP page, the ajax in some functions succeeds while in others it fails, and the backend log shows Did not find handler method for [/xxx.controller].
A drop-down list in a JSP page doesn't display the values it fetched
The drop-down never receives the returned values. The code:

```
<td>
    <select id="enterpriseNature" >
        <option value="">-please choose-</option>
    </select>
</td>
```

The jQuery code:

```
<script type="text/javascript" src="${pageContext.request.contextPath }/js/jquery-1.11.3.min.js"></script>
<script type="text/javascript">
$(function(){
    // runs as soon as the page loads and queries the dictionary
    $.post("${pageContext.request.contextPath }/baseDict_findByTypeCode.action",{"dict_type_code":"007"},function(data){
        // iterate over the JSON data
        $(date).each(function(i,n){
            $("#enterpriseNature").append("<option value='"+n.dict_id+"'>"+n.dict_item_name+"</option>");
        });
    },"json");
});
</script>
```

The console does print the values:

```
BaseDictAction中的findByTypeCode方法执行了....
Hibernate: select basedict0_.dict_id as dict_id1_0_, basedict0_.dict_type_code as dict_typ2_0_, basedict0_.dict_type_name as dict_typ3_0_, basedict0_.dict_item_name as dict_ite4_0_, basedict0_.dict_item_code as dict_ite5_0_, basedict0_.dict_sort as dict_sor6_0_, basedict0_.dict_enable as dict_ena7_0_, basedict0_.dict_memo as dict_mem8_0_ from base_dict basedict0_ where basedict0_.dict_type_code=?
[{"dict_id":"16","dict_item_code":"","dict_item_name":"私营","dict_type_code":"007","dict_type_name":"企业性质"},{"dict_id":"17","dict_item_code":"","dict_item_name":"国企","dict_type_code":"007","dict_type_name":"企业性质"}]
```

Please help a newbie.
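The likely culprit is a one-letter typo: the success callback's parameter is `data`, but the loop iterates over `$(date)`, and `date` is undefined, so nothing is ever appended. Iterating the JSON array directly with `$.each` also reads more naturally than wrapping it in `$()`. A corrected sketch of the same callback:

```
$.post("${pageContext.request.contextPath}/baseDict_findByTypeCode.action",
    {"dict_type_code": "007"},
    function(data) {
        // "data" is the JSON array shown in the console output above
        $.each(data, function(i, n) {
            $("#enterpriseNature").append(
                "<option value='" + n.dict_id + "'>" + n.dict_item_name + "</option>");
        });
    }, "json");
```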
Spring MVC configuration reports a RequestMappingHandlerAdapter error at startup
The springMVC.xml configuration:

```
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xmlns:p="http://www.springframework.org/schema/p"
    xmlns:context="http://www.springframework.org/schema/context"
    xmlns:mvc="http://www.springframework.org/schema/mvc"
    xsi:schemaLocation="http://www.springframework.org/schema/beans
        http://www.springframework.org/schema/beans/spring-beans-3.1.xsd
        http://www.springframework.org/schema/context
        http://www.springframework.org/schema/context/spring-context-3.1.xsd
        http://www.springframework.org/schema/mvc
        http://www.springframework.org/schema/mvc/spring-mvc-4.0.xsd">

    <!-- scan this package so Spring MVC treats classes annotated with @Controller as controllers -->
    <mvc:annotation-driven/>
    <context:component-scan base-package="cn.sunky.controller" />

    <!-- avoid IE offering the returned JSON as a file download during AJAX -->
    <!-- view resolver: prefix and suffix for forwarded views -->
    <bean class="org.springframework.web.servlet.view.InternalResourceViewResolver">
        <!-- as I understand it, this automatically wraps the string returned by a controller
             method with the prefix and suffix, turning it into a usable URL -->
        <property name="prefix" value="/WEB-INF/jsp/" />
        <property name="suffix" value=".jsp" />
    </bean>

    <!-- file upload configuration; optional if uploads aren't used (and if it is omitted,
         the upload component jars don't need to be on the classpath either) -->
    <bean id="multipartResolver"
        class="org.springframework.web.multipart.commons.CommonsMultipartResolver">
        <!-- default encoding -->
        <property name="defaultEncoding" value="utf-8" />
        <!-- maximum upload size -->
        <property name="maxUploadSize" value="10485760000" />
        <!-- maximum size held in memory -->
        <property name="maxInMemorySize" value="40960" />
    </bean>
</beans>
```

The error:

```
[org.springframework.web.context.support.XmlWebApplicationContext] - Exception encountered during context initialization - cancelling refresh attempt:
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter': Instantiation of bean failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter]: Constructor threw exception; nested exception is javax.json.bind.JsonbException: JSON Binding provider org.eclipse.yasson.JsonBindingProvider not found
[org.springframework.web.servlet.DispatcherServlet] - Context initialization failed
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter': Instantiation of bean failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter]: Constructor threw exception; nested exception is javax.json.bind.JsonbException: JSON Binding provider org.eclipse.yasson.JsonBindingProvider not found
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateBean(AbstractAutowireCapableBeanFactory.java:1231)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1130)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:545)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:502)
    at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:312)
    at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:228)
    at org.springframework.beans.facto ....
```
Under SSM, how can multiple hyperlinks send ajax requests and then switch the page content
Under the SSM framework, how can several hyperlinks each send an ajax request and then switch the page content rendered by
```
<jsp:include page="#"></jsp:include>
```
The code: ![screenshot](https://img-ask.csdn.net/upload/201904/06/1554535838_475529.png) I don't know how to go about this; please advise.
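One point worth keeping in mind: `<jsp:include>` is evaluated on the server while the JSP renders, so it cannot be re-targeted from the browser afterwards. What usually achieves the intended effect is a placeholder element that each link fills via ajax; a minimal sketch (the `a.nav` selector and `#content` id are illustrative):

```
// Replace the <jsp:include> with a placeholder such as <div id="content"></div>
// and let every navigation link load a server-rendered fragment into it.
$("a.nav").click(function(e) {
    e.preventDefault();                         // suppress the normal page navigation
    $("#content").load($(this).attr("href"));   // ajax GET; injects the returned HTML
});
```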
Sending data from the front end to the backend with ajax and inserting it into the database?
Posting data from the front end to the backend with ajax returns a 404 for http://127.0.0.1:8080/ssm9/projects/add ![screenshot](https://img-ask.csdn.net/upload/201909/10/1568085935_17869.png)![screenshot](https://img-ask.csdn.net/upload/201909/10/1568085946_693580.png) ![screenshot](https://img-ask.csdn.net/upload/201909/10/1568085623_717068.png)![screenshot](https://img-ask.csdn.net/upload/201909/10/1568085718_289245.png) In the JSP I wrote <c:set var="path" value="${pageContext.request.contextPath}"></c:set>, so the url uses ${path}.
Failed to retrieve data from /webhdfs/v1/?op=LISTSTATUS: Server Error, and files can't be put to HDFS either
The Hadoop version is 3.1, on Ubuntu 18. Problem 1: browsing the HDFS directory shows: Failed to retrieve data from /webhdfs/v1/?op=LISTSTATUS: Server Error. Problem 2: the namenode log is as follows: ``` 438 WARN org.eclipse.jetty.servlet.ServletHandler: Error for /webhdfs/v1/ java.lang.NoClassDefFoundError: javax/activation/DataSource at com.sun.xml.bind.v2.model.impl.RuntimeBuiltinLeafInfoImpl.<clinit>(RuntimeBuiltinLeafInfoImpl.java:457) at com.sun.xml.bind.v2.model.impl.RuntimeTypeInfoSetImpl.<init>(RuntimeTypeInfoSetImpl.java:65) at com.sun.xml.bind.v2.model.impl.RuntimeModelBuilder.createTypeInfoSet(RuntimeModelBuilder.java:133) at com.sun.xml.bind.v2.model.impl.RuntimeModelBuilder.createTypeInfoSet(RuntimeModelBuilder.java:85) at com.sun.xml.bind.v2.model.impl.ModelBuilder.<init>(ModelBuilder.java:156) at com.sun.xml.bind.v2.model.impl.RuntimeModelBuilder.<init>(RuntimeModelBuilder.java:93) at com.sun.xml.bind.v2.runtime.JAXBContextImpl.getTypeInfoSet(JAXBContextImpl.java:473) at com.sun.xml.bind.v2.runtime.JAXBContextImpl.<init>(JAXBContextImpl.java:319) at com.sun.xml.bind.v2.runtime.JAXBContextImpl$JAXBContextBuilder.build(JAXBContextImpl.java:1170) at com.sun.xml.bind.v2.ContextFactory.createContext(ContextFactory.java:145) at com.sun.xml.bind.v2.ContextFactory.createContext(ContextFactory.java:236) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at javax.xml.bind.ContextFinder.newInstance(ContextFinder.java:186) at javax.xml.bind.ContextFinder.newInstance(ContextFinder.java:146) at javax.xml.bind.ContextFinder.find(ContextFinder.java:350) at javax.xml.bind.JAXBContext.newInstance(JAXBContext.java:446) at javax.xml.bind.JAXBContext.newInstance(JAXBContext.java:409) at com.sun.jersey.server.impl.wadl.WadlApplicationContextImpl.<init>(WadlApplicationContextImpl.java:103) at com.sun.jersey.server.impl.wadl.WadlFactory.init(WadlFactory.java:100) at com.sun.jersey.server.impl.application.RootResourceUriRules.initWadl(RootResourceUriRules.java:169) at com.sun.jersey.server.impl.application.RootResourceUriRules.<init>(RootResourceUriRules.java:106) at com.sun.jersey.server.impl.application.WebApplicationImpl._initiate(WebApplicationImpl.java:1359) at com.sun.jersey.server.impl.application.WebApplicationImpl.access$700(WebApplicationImpl.java:180) at com.sun.jersey.server.impl.application.WebApplicationImpl$13.f(WebApplicationImpl.java:799) at com.sun.jersey.server.impl.application.WebApplicationImpl$13.f(WebApplicationImpl.java:795) at com.sun.jersey.spi.inject.Errors.processWithErrors(Errors.java:193) at com.sun.jersey.server.impl.application.WebApplicationImpl.initiate(WebApplicationImpl.java:795) at com.sun.jersey.server.impl.application.WebApplicationImpl.initiate(WebApplicationImpl.java:790) at com.sun.jersey.spi.container.servlet.ServletContainer.initiate(ServletContainer.java:509) at com.sun.jersey.spi.container.servlet.ServletContainer$InternalWebComponent.initiate(ServletContainer.java:339) at com.sun.jersey.spi.container.servlet.WebComponent.load(WebComponent.java:605) at com.sun.jersey.spi.container.servlet.WebComponent.init(WebComponent.java:207) at com.sun.jersey.spi.container.servlet.ServletContainer.init(ServletContainer.java:394) at com.sun.jersey.spi.container.servlet.ServletContainer.init(ServletContainer.java:577) at 
javax.servlet.GenericServlet.init(GenericServlet.java:244) at org.eclipse.jetty.servlet.ServletHolder.initServlet(ServletHolder.java:643) at org.eclipse.jetty.servlet.ServletHolder.getServlet(ServletHolder.java:499) at org.eclipse.jetty.servlet.ServletHolder.ensureInstance(ServletHolder.java:791) at org.eclipse.jetty.servlet.ServletHolder.prepare(ServletHolder.java:776) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:579) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:226) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1180) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:512) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1112) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:119) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134) at org.eclipse.jetty.server.Server.handle(Server.java:539) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:333) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:251) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:283) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:108) at org.eclipse.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:93) at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.executeProduceConsume(ExecuteProduceConsume.java:303) at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.produceConsume(ExecuteProduceConsume.java:148) at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:136) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:671) at org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:589) at java.base/java.lang.Thread.run(Thread.java:834) Caused by: java.lang.ClassNotFoundException: javax.activation.DataSource at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:583) at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178) at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521) ... 
65 more 2019-06-18 15:35:01,950 WARN org.eclipse.jetty.servlet.ServletHandler: /webhdfs/v1/ java.lang.NullPointerException at com.sun.jersey.spi.container.ContainerRequest.<init>(ContainerRequest.java:189) at com.sun.jersey.spi.container.servlet.WebComponent.createRequest(WebComponent.java:446) at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:373) at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558) at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:848) at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1772) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:644) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.apache.hadoop.hdfs.web.AuthFilter.doFilter(AuthFilter.java:90) at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1609) at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:582) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:226) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1180) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:512) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1112) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:119) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134) at org.eclipse.jetty.server.Server.handle(Server.java:539) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:333) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:251) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:283) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:108) at org.eclipse.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:93) at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.executeProduceConsume(ExecuteProduceConsume.java:303) at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.produceConsume(ExecuteProduceConsume.java:148) at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:136) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:671) at org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:589) at java.base/java.lang.Thread.run(Thread.java:834) 2019-06-18 15:39:17,698 INFO org.apache.hadoop.hdfs.server.namenode.FSEditLog: Number of transactions: 3 Total time for 
transactions(ms): 56 Number of transactions batched in Syncs: 0 Number of syncs: 2 SyncTimes(ms): 22 2019-06-18 15:39:25,202 WARN org.eclipse.jetty.servlet.ServletHandler: /webhdfs/v1/ java.lang.NullPointerException at com.sun.jersey.spi.container.ContainerRequest.<init>(ContainerRequest.java:189) at com.sun.jersey.spi.container.servlet.WebComponent.createRequest(WebComponent.java:446) at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:373) at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558) at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:848) at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1772) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:644) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.apache.hadoop.hdfs.web.AuthFilter.doFilter(AuthFilter.java:90) at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1609) at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:582) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:226) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1180) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:512) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1112) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:119) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134) at org.eclipse.jetty.server.Server.handle(Server.java:539) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:333) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:251) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:283) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:108) at org.eclipse.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:93) at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.executeProduceConsume(ExecuteProduceConsume.java:303) at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.produceConsume(ExecuteProduceConsume.java:148) at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:136) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:671) at org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:589) at java.base/java.lang.Thread.run(Thread.java:834) 2019-06-18 15:39:45,858 WARN 
org.eclipse.jetty.servlet.ServletHandler: /webhdfs/v1/ java.lang.NullPointerException at com.sun.jersey.spi.container.ContainerRequest.<init>(ContainerRequest.java:189) at com.sun.jersey.spi.container.servlet.WebComponent.createRequest(WebComponent.java:446) at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:373) at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558) at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733) at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:848) at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1772) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:644) at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592) at org.apache.hadoop.hdfs.web.AuthFilter.doFilter(AuthFilter.java:90) at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759) at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1609) at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759) at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45) at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:582) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:226) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1180) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:512) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1112) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:119) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134) at org.eclipse.jetty.server.Server.handle(Server.java:539) at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:333) at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:251) at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:283) at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:108) at org.eclipse.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:93) at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.executeProduceConsume(ExecuteProduceConsume.java:303) at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.produceConsume(ExecuteProduceConsume.java:148) at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:136) at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:671) at org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:589) at java.base/java.lang.Thread.run(Thread.java:834) ``` The datanode log is attached: 2019-06-18 14:52:36,785 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG: 
/************************************************************ STARTUP_MSG: Starting DataNode STARTUP_MSG: host = gx-virtual-machine/127.0.1.1 STARTUP_MSG: args = [] STARTUP_MSG: version = 3.2.0 STARTUP_MSG: classpath = /usr/local/hadoop/etc/hadoop:/usr/local/hadoop/share/hadoop/common/lib/kerby-xdr-1.0.1.jar:/usr/local/hadoop/share/hadoop/common/lib/kerb-core-1.0.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-util-9.3.24.v20180605.jar:/usr/local/hadoop/share/hadoop/common/lib/avro-1.7.7.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/common/lib/kerb-crypto-1.0.1.jar:/usr/local/hadoop/share/hadoop/common/lib/re2j-1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-databind-2.9.5.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-api-1.7.25.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-annotations-2.9.5.jar:/usr/local/hadoop/share/hadoop/common/lib/audience-annotations-0.5.0.jar:/usr/local/hadoop/share/hadoop/common/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-client-2.12.0.jar:/usr/local/hadoop/share/hadoop/common/lib/json-smart-2.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jcip-annotations-1.0-1.jar:/usr/local/hadoop/share/hadoop/common/lib/kerby-pkix-1.0.1.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-annotations-3.2.0.jar:/usr/local/hadoop/share/hadoop/common/lib/token-provider-1.0.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/jsr311-api-1.1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-core-1.19.jar:/usr/local/hadoop/share/hadoop/common/lib/htrace-core4-4.1.0-incubating.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-lang3-3.7.jar:/usr/local/hadoop/share/hadoop/common/lib/dnsjava-2.1.7.jar:/usr/local/hadoop/share/hadoop/common/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-collections-3.2.2.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-1.9.3.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/javax.servlet-api-3.1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-server-9.3.24.v20180605.jar:/usr/local/hadoop/share/hadoop/common/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-recipes-2.12.0.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar:/usr/local/hadoop/share/hadoop/common/lib/kerb-client-1.0.1.jar:/usr/local/hadoop/share/hadoop/common/lib/woodstox-core-5.0.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/kerb-admin-1.0.1.jar:/usr/local/hadoop/share/hadoop/common/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-security-9.3.24.v20180605.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-math3-3.1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-net-3.6.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-text-1.4.jar:/usr/local/hadoop/share/hadoop/common/lib/httpclient-4.5.2.jar:/usr/local/hadoop/share/hadoop/common/lib/kerby-config-1.0.1.jar:/usr/local/hadoop/share/hadoop/common/lib/kerb-server-1.0.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-servlet-9.3.24.v20180605.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/co
mmon/lib/kerb-util-1.0.1.jar:/usr/local/hadoop/share/hadoop/common/lib/asm-5.0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/zookeeper-3.4.13.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-codec-1.11.jar:/usr/local/hadoop/share/hadoop/common/lib/kerb-identity-1.0.1.jar:/usr/local/hadoop/share/hadoop/common/lib/accessors-smart-1.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-xml-9.3.24.v20180605.jar:/usr/local/hadoop/share/hadoop/common/lib/jul-to-slf4j-1.7.25.jar:/usr/local/hadoop/share/hadoop/common/lib/kerby-util-1.0.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-http-9.3.24.v20180605.jar:/usr/local/hadoop/share/hadoop/common/lib/metrics-core-3.2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-configuration2-2.1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-io-2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-framework-2.12.0.jar:/usr/local/hadoop/share/hadoop/common/lib/kerb-common-1.0.1.jar:/usr/local/hadoop/share/hadoop/common/lib/netty-3.10.5.Final.jar:/usr/local/hadoop/share/hadoop/common/lib/nimbus-jose-jwt-4.41.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/snappy-java-1.0.5.jar:/usr/local/hadoop/share/hadoop/common/lib/httpcore-4.4.4.jar:/usr/local/hadoop/share/hadoop/common/lib/kerby-asn1-1.0.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-json-1.19.jar:/usr/local/hadoop/share/hadoop/common/lib/jsr305-3.0.0.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-server-1.19.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-core-2.9.5.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-auth-3.2.0.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-api-2.2.11.jar:/usr/local/hadoop/share/hadoop/common/lib/jsch-0.1.54.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-webapp-9.3.24.v20180605.jar:/usr/local/hadoop/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/common/lib/stax2-api-3.1.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-servlet-1.19.jar:/usr/local/hadoop/share/hadoop/common/lib/kerb-simplekdc-1.0.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-io-9.3.24.v20180605.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-3.2.0-tests.jar:/usr/local/hadoop/share/hadoop/common/hadoop-kms-3.2.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-3.2.0.jar:/usr/local/hadoop/share/hadoop/common/hadoop-nfs-3.2.0.jar:/usr/local/hadoop/share/hadoop/hdfs:/usr/local/hadoop/share/hadoop/hdfs/lib/kerby-xdr-1.0.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/kerb-core-1.0.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-util-9.3.24.v20180605.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/avro-1.7.7.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/kerb-crypto-1.0.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/re2j-1.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/json-simple-1.1.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-databind-2.9.5.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-annotations-2.9.5.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/audience-annotations-0.5.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/curator-client-2.12.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/json-smart-2.3.jar:/usr/local/hadoo
p/share/hadoop/hdfs/lib/jcip-annotations-1.0-1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/kerby-pkix-1.0.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/hadoop-annotations-3.2.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/token-provider-1.0.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsr311-api-1.1.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-core-1.19.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/htrace-core4-4.1.0-incubating.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-lang3-3.7.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/dnsjava-2.1.7.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-collections-3.2.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-beanutils-1.9.3.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/javax.servlet-api-3.1.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-server-9.3.24.v20180605.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/curator-recipes-2.12.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/kerb-client-1.0.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/okhttp-2.7.5.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/woodstox-core-5.0.3.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/kerb-admin-1.0.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-security-9.3.24.v20180605.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-math3-3.1.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-net-3.6.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-text-1.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/httpclient-4.5.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/kerby-config-1.0.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/kerb-server-1.0.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-servlet-9.3.24.v20180605.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/okio-1.6.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-util-ajax-9.3.24.v20180605.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/kerb-util-1.0.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/asm-5.0.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/zookeeper-3.4.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-codec-1.11.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/kerb-identity-1.0.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/accessors-smart-1.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-xml-9.3.24.v20180605.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/netty-all-4.0.52.Final.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/kerby-util-1.0.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-http-9.3.24.v20180605.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-configuration2-2.1.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-io-2.5.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/curator-framework-2.12.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/kerb-common-1.0.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/netty-3.10.5.Final.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/nimbus-jose-jwt-4.41.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/h
dfs/lib/snappy-java-1.0.5.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/httpcore-4.4.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/kerby-asn1-1.0.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-json-1.19.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsr305-3.0.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-server-1.19.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-core-2.9.5.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/hadoop-auth-3.2.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jaxb-api-2.2.11.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsch-0.1.54.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-webapp-9.3.24.v20180605.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/stax2-api-3.1.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-servlet-1.19.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/kerb-simplekdc-1.0.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-io-9.3.24.v20180605.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-3.2.0.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-client-3.2.0.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-nfs-3.2.0.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-native-client-3.2.0.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-native-client-3.2.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-rbf-3.2.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-httpfs-3.2.0.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-client-3.2.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-3.2.0-tests.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-rbf-3.2.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-common-3.2.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-nativetask-3.2.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-3.2.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-3.2.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-uploader-3.2.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-core-3.2.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-3.2.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-app-3.2.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-3.2.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-3.2.0-tests.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-3.2.0.jar:/usr/local/hadoop/share/hadoop/yarn:/usr/local/hadoop/share/hadoop/yarn/lib/guice-servlet-4.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/json-io-2.5.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-client-1.19.jar:/usr/local/hadoop/share/hadoop/yarn/lib/mssql-jdbc-6.2.1.jre7.jar:/usr/local/hadoop/share/hadoop/yarn/lib/java-util-1.9.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/ehcache-3.3.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/swagger-annotations-1.5.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/HikariCP-java7-2.4.12.jar:/usr/local/hadoop/share/hadoop/yarn/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/fst-2.50.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-guice-1.19.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-4.
0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/snakeyaml-1.16.jar:/usr/local/hadoop/share/hadoop/yarn/lib/metrics-core-3.2.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/objenesis-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-jaxrs-base-2.9.5.jar:/usr/local/hadoop/share/hadoop/yarn/lib/geronimo-jcache_1.0_spec-1.0-alpha-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-module-jaxb-annotations-2.9.5.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-jaxrs-json-provider-2.9.5.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-nodemanager-3.2.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-registry-3.2.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-3.2.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-sharedcachemanager-3.2.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-3.2.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-submarine-3.2.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-3.2.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-client-3.2.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-api-3.2.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-timeline-pluginstorage-3.2.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-services-api-3.2.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-common-3.2.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-services-core-3.2.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-tests-3.2.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-common-3.2.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-router-3.2.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-3.2.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-web-proxy-3.2.0.jar STARTUP_MSG: build = https://github.com/apache/hadoop.git -r e97acb3bd8f3befd27418996fa5d4b50bf2e17bf; compiled by 'sunilg' on 2019-01-08T06:08Z STARTUP_MSG: java = 11.0.3 ************************************************************/ 2019-06-18 14:52:36,863 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT] 2019-06-18 14:52:41,503 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/usr/local/hadoop/tmp/dfs/data 2019-06-18 14:52:42,424 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: Loaded properties from hadoop-metrics2.properties 2019-06-18 14:52:44,123 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s). 2019-06-18 14:52:44,123 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started 2019-06-18 14:52:46,504 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling 2019-06-18 14:52:46,511 INFO org.apache.hadoop.hdfs.server.datanode.BlockScanner: Initialized block scanner with targetBytesPerSec 1048576 2019-06-18 14:52:46,566 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is gx-virtual-machine 2019-06-18 14:52:46,567 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. 
Disabling file IO profiling 2019-06-18 14:52:46,592 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting DataNode with maxLockedMemory = 0 2019-06-18 14:52:46,798 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened streaming server at /0.0.0.0:9866 2019-06-18 14:52:46,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s 2019-06-18 14:52:46,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50 2019-06-18 14:52:47,198 INFO org.eclipse.jetty.util.log: Logging initialized @15269ms 2019-06-18 14:52:48,022 INFO org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. 2019-06-18 14:52:48,062 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.datanode is not defined 2019-06-18 14:52:48,161 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter) 2019-06-18 14:52:48,174 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context datanode 2019-06-18 14:52:48,174 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static 2019-06-18 14:52:48,174 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context logs 2019-06-18 14:52:48,556 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 44121 2019-06-18 14:52:48,580 INFO org.eclipse.jetty.server.Server: jetty-9.3.24.v20180605, build timestamp: 2018-06-06T01:11:56+08:00, git hash: 84205aa28f11a4f31f2a3b86d1bba2cc8ab69827 2019-06-18 14:52:49,011 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@7876d598{/logs,file:///usr/local/hadoop/logs/,AVAILABLE} 2019-06-18 14:52:49,018 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@5af28b27{/static,file:///usr/local/hadoop/share/hadoop/hdfs/webapps/static/,AVAILABLE} 2019-06-18 14:52:50,151 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.w.WebAppContext@547e29a4{/,file:///usr/local/hadoop/share/hadoop/hdfs/webapps/datanode/,AVAILABLE}{/datanode} 2019-06-18 14:52:50,242 INFO org.eclipse.jetty.server.AbstractConnector: Started ServerConnector@6f45a1a0{HTTP/1.1,[http/1.1]}{localhost:44121} 2019-06-18 14:52:50,243 INFO org.eclipse.jetty.server.Server: Started @18329ms 2019-06-18 14:52:52,165 INFO org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer: Listening HTTP traffic on /0.0.0.0:9864 2019-06-18 14:52:52,273 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: dnUserName = hadoop 2019-06-18 14:52:52,273 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: supergroup = supergroup 2019-06-18 14:52:52,242 INFO org.apache.hadoop.util.JvmPauseMonitor: Starting JVM pause monitor 2019-06-18 14:52:52,720 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue, queueCapacity: 1000, scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler, ipcBackoff: false. 
2019-06-18 14:52:52,880 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 9867 2019-06-18 14:52:54,839 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened IPC server at /0.0.0.0:9867 2019-06-18 14:52:55,160 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Refresh request received for nameservices: null 2019-06-18 14:52:55,365 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting BPOfferServices for nameservices: <default> 2019-06-18 14:52:55,418 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool <registering> (Datanode Uuid unassigned) service to localhost/127.0.0.1:9000 starting to offer service 2019-06-18 14:52:55,532 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting 2019-06-18 14:52:55,561 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9867: starting 2019-06-18 14:52:58,314 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Acknowledging ACTIVE Namenode during handshakeBlock pool <registering> (Datanode Uuid unassigned) service to localhost/127.0.0.1:9000 2019-06-18 14:52:58,329 INFO org.apache.hadoop.hdfs.server.common.Storage: Using 1 threads to upgrade data directories (dfs.datanode.parallel.volumes.load.threads.num=1, dataDirs=1) 2019-06-18 14:52:58,458 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /usr/local/hadoop/tmp/dfs/data/in_use.lock acquired by nodename 55815@gx-virtual-machine 2019-06-18 14:52:58,478 INFO org.apache.hadoop.hdfs.server.common.Storage: Storage directory with location [DISK]file:/usr/local/hadoop/tmp/dfs/data is not formatted for namespace 317473294. Formatting... 2019-06-18 14:52:58,479 INFO org.apache.hadoop.hdfs.server.common.Storage: Generated new storageID DS-8b3e1e6d-135a-433a-93bb-3e62598daf5e for directory /usr/local/hadoop/tmp/dfs/data 2019-06-18 14:52:58,749 INFO org.apache.hadoop.hdfs.server.common.Storage: Analyzing storage directories for bpid BP-200946205-127.0.1.1-1560840480894 2019-06-18 14:52:58,750 INFO org.apache.hadoop.hdfs.server.common.Storage: Locking is disabled for /usr/local/hadoop/tmp/dfs/data/current/BP-200946205-127.0.1.1-1560840480894 2019-06-18 14:52:58,753 INFO org.apache.hadoop.hdfs.server.common.Storage: Block pool storage directory for location [DISK]file:/usr/local/hadoop/tmp/dfs/data and block pool id BP-200946205-127.0.1.1-1560840480894 is not formatted. Formatting ... 
2019-06-18 14:52:58,753 INFO org.apache.hadoop.hdfs.server.common.Storage: Formatting block pool BP-200946205-127.0.1.1-1560840480894 directory /usr/local/hadoop/tmp/dfs/data/current/BP-200946205-127.0.1.1-1560840480894/current 2019-06-18 14:52:58,772 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Setting up storage: nsid=317473294;bpid=BP-200946205-127.0.1.1-1560840480894;lv=-57;nsInfo=lv=-65;cid=CID-eb45654d-0bc6-4348-b02f-e03603e1ae37;nsid=317473294;c=1560840480894;bpid=BP-200946205-127.0.1.1-1560840480894;dnuuid=null 2019-06-18 14:52:58,776 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Generated and persisted new Datanode UUID 6a2049c6-1a18-437a-97bd-51c5bb65a639 2019-06-18 14:52:59,549 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added new volume: DS-8b3e1e6d-135a-433a-93bb-3e62598daf5e 2019-06-18 14:52:59,553 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added volume - [DISK]file:/usr/local/hadoop/tmp/dfs/data, StorageType: DISK 2019-06-18 14:52:59,615 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Registered FSDatasetState MBean 2019-06-18 14:52:59,680 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for /usr/local/hadoop/tmp/dfs/data 2019-06-18 14:52:59,801 INFO org.apache.hadoop.hdfs.server.datanode.checker.DatasetVolumeChecker: Scheduled health check for volume /usr/local/hadoop/tmp/dfs/data 2019-06-18 14:52:59,809 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Adding block pool BP-200946205-127.0.1.1-1560840480894 2019-06-18 14:52:59,839 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Scanning block pool BP-200946205-127.0.1.1-1560840480894 on volume /usr/local/hadoop/tmp/dfs/data... 2019-06-18 14:53:00,166 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time taken to scan block pool BP-200946205-127.0.1.1-1560840480894 on /usr/local/hadoop/tmp/dfs/data: 327ms 2019-06-18 14:53:00,168 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Total time to scan all replicas for block pool BP-200946205-127.0.1.1-1560840480894: 359ms 2019-06-18 14:53:00,181 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Adding replicas to map for block pool BP-200946205-127.0.1.1-1560840480894 on volume /usr/local/hadoop/tmp/dfs/data... 
2019-06-18 14:53:00,181 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.BlockPoolSlice: Replica Cache file: /usr/local/hadoop/tmp/dfs/data/current/BP-200946205-127.0.1.1-1560840480894/current/replicas doesn't exist
2019-06-18 14:53:00,198 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time to add replicas to map for block pool BP-200946205-127.0.1.1-1560840480894 on volume /usr/local/hadoop/tmp/dfs/data: 17ms
2019-06-18 14:53:00,198 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Total time to add all replicas to map for block pool BP-200946205-127.0.1.1-1560840480894: 27ms
2019-06-18 14:53:00,208 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: Now scanning bpid BP-200946205-127.0.1.1-1560840480894 on volume /usr/local/hadoop/tmp/dfs/data
2019-06-18 14:53:00,221 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: VolumeScanner(/usr/local/hadoop/tmp/dfs/data, DS-8b3e1e6d-135a-433a-93bb-3e62598daf5e): finished scanning block pool BP-200946205-127.0.1.1-1560840480894
2019-06-18 14:53:00,401 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: VolumeScanner(/usr/local/hadoop/tmp/dfs/data, DS-8b3e1e6d-135a-433a-93bb-3e62598daf5e): no suitable block pools found to scan. Waiting 1814399799 ms.
2019-06-18 14:53:00,418 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: Periodic Directory Tree Verification scan starting at 2019/6/18 下午8:05 with interval of 21600000ms
2019-06-18 14:53:00,463 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool BP-200946205-127.0.1.1-1560840480894 (Datanode Uuid 6a2049c6-1a18-437a-97bd-51c5bb65a639) service to localhost/127.0.0.1:9000 beginning handshake with NN
2019-06-18 14:53:00,825 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool Block pool BP-200946205-127.0.1.1-1560840480894 (Datanode Uuid 6a2049c6-1a18-437a-97bd-51c5bb65a639) service to localhost/127.0.0.1:9000 successfully registered with NN
2019-06-18 14:53:00,825 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: For namenode localhost/127.0.0.1:9000 using BLOCKREPORT_INTERVAL of 21600000msec CACHEREPORT_INTERVAL of 10000msec Initial delay: 0msec; heartBeatInterval=3000
2019-06-18 14:53:01,524 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0xb210af820fa10abf, containing 1 storage report(s), of which we sent 1. The reports had 0 total blocks and used 1 RPC(s). This took 19 msec to generate and 231 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
2019-06-18 14:53:01,525 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-200946205-127.0.1.1-1560840480894
2019-06-18 15:44:37,567 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741825_1001 src: /127.0.0.1:34774 dest: /127.0.0.1:9866
2019-06-18 15:44:37,733 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:34774, dest: /127.0.0.1:9866, bytes: 8260, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1864191814_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741825_1001, duration(ns): 75831098
2019-06-18 15:44:37,737 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741825_1001, type=LAST_IN_PIPELINE terminating
2019-06-18 15:44:38,256 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741826_1002 src: /127.0.0.1:34776 dest: /127.0.0.1:9866
2019-06-18 15:44:38,266 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:34776, dest: /127.0.0.1:9866, bytes: 953, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1864191814_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741826_1002, duration(ns): 5252820
2019-06-18 15:44:38,266 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741826_1002, type=LAST_IN_PIPELINE terminating
2019-06-18 15:44:38,340 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741827_1003 src: /127.0.0.1:34778 dest: /127.0.0.1:9866
2019-06-18 15:44:38,365 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:34778, dest: /127.0.0.1:9866, bytes: 11392, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1864191814_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741827_1003, duration(ns): 19816531
2019-06-18 15:44:38,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741827_1003, type=LAST_IN_PIPELINE terminating
2019-06-18 15:44:38,428 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741828_1004 src: /127.0.0.1:34780 dest: /127.0.0.1:9866
2019-06-18 15:44:38,455 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:34780, dest: /127.0.0.1:9866, bytes: 1061, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1864191814_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741828_1004, duration(ns): 9820674
2019-06-18 15:44:38,464 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741828_1004, type=LAST_IN_PIPELINE terminating
2019-06-18 15:44:38,517 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741829_1005 src: /127.0.0.1:34782 dest: /127.0.0.1:9866
2019-06-18 15:44:38,537 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:34782, dest: /127.0.0.1:9866, bytes: 620, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1864191814_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741829_1005, duration(ns): 9424051
2019-06-18 15:44:38,537 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741829_1005, type=LAST_IN_PIPELINE terminating
2019-06-18 15:44:38,569 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741830_1006 src: /127.0.0.1:34784 dest: /127.0.0.1:9866
2019-06-18 15:44:38,579 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:34784, dest: /127.0.0.1:9866, bytes: 3518, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1864191814_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741830_1006, duration(ns): 6662498
2019-06-18 15:44:38,579 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741830_1006, type=LAST_IN_PIPELINE terminating
2019-06-18 15:44:38,642 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741831_1007 src: /127.0.0.1:34786 dest: /127.0.0.1:9866
2019-06-18 15:44:38,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:34786, dest: /127.0.0.1:9866, bytes: 682, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1864191814_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741831_1007, duration(ns): 5047916
2019-06-18 15:44:38,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741831_1007, type=LAST_IN_PIPELINE terminating
2019-06-18 15:44:38,713 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741832_1008 src: /127.0.0.1:34788 dest: /127.0.0.1:9866
2019-06-18 15:44:38,726 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:34788, dest: /127.0.0.1:9866, bytes: 758, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1864191814_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741832_1008, duration(ns): 8532382
2019-06-18 15:44:38,727 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741832_1008, type=LAST_IN_PIPELINE terminating
2019-06-18 15:44:38,789 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741833_1009 src: /127.0.0.1:34790 dest: /127.0.0.1:9866
2019-06-18 15:44:38,807 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:34790, dest: /127.0.0.1:9866, bytes: 690, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1864191814_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741833_1009, duration(ns): 5589094
2019-06-18 15:44:38,813 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741833_1009, type=LAST_IN_PIPELINE terminating
2019-06-19 09:54:01,961 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741834_1010 src: /127.0.0.1:36578 dest: /127.0.0.1:9866
2019-06-19 09:54:02,003 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:36578, dest: /127.0.0.1:9866, bytes: 8260, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_853656761_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741834_1010, duration(ns): 32739756
2019-06-19 09:54:02,011 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741834_1010, type=LAST_IN_PIPELINE terminating
2019-06-19 09:54:02,125 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741835_1011 src: /127.0.0.1:36580 dest: /127.0.0.1:9866
2019-06-19 09:54:02,154 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:36580, dest: /127.0.0.1:9866, bytes: 953, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_853656761_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741835_1011, duration(ns): 12137675
2019-06-19 09:54:02,154 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741835_1011, type=LAST_IN_PIPELINE terminating
2019-06-19 09:54:02,235 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741836_1012 src: /127.0.0.1:36582 dest: /127.0.0.1:9866
2019-06-19 09:54:02,249 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:36582, dest: /127.0.0.1:9866, bytes: 11392, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_853656761_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741836_1012, duration(ns): 8740891
2019-06-19 09:54:02,249 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741836_1012, type=LAST_IN_PIPELINE terminating
2019-06-19 09:54:02,307 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741837_1013 src: /127.0.0.1:36584 dest: /127.0.0.1:9866
2019-06-19 09:54:02,322 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:36584, dest: /127.0.0.1:9866, bytes: 1061, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_853656761_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741837_1013, duration(ns): 8680367
2019-06-19 09:54:02,323 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741837_1013, type=LAST_IN_PIPELINE terminating
2019-06-19 09:54:02,399 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741838_1014 src: /127.0.0.1:36586 dest: /127.0.0.1:9866
2019-06-19 09:54:02,413 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:36586, dest: /127.0.0.1:9866, bytes: 620, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_853656761_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741838_1014, duration(ns): 8474258
2019-06-19 09:54:02,413 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741838_1014, type=LAST_IN_PIPELINE terminating
2019-06-19 09:54:02,491 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741839_1015 src: /127.0.0.1:36588 dest: /127.0.0.1:9866
2019-06-19 09:54:02,502 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:36588, dest: /127.0.0.1:9866, bytes: 3518, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_853656761_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741839_1015, duration(ns): 6946259
2019-06-19 09:54:02,503 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741839_1015, type=LAST_IN_PIPELINE terminating
2019-06-19 09:54:02,560 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741840_1016 src: /127.0.0.1:36590 dest: /127.0.0.1:9866
2019-06-19 09:54:02,571 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:36590, dest: /127.0.0.1:9866, bytes: 682, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_853656761_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741840_1016, duration(ns): 6602106
2019-06-19 09:54:02,571 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741840_1016, type=LAST_IN_PIPELINE terminating
2019-06-19 09:54:02,635 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741841_1017 src: /127.0.0.1:36592 dest: /127.0.0.1:9866
2019-06-19 09:54:02,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:36592, dest: /127.0.0.1:9866, bytes: 758, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_853656761_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741841_1017, duration(ns): 9690339
2019-06-19 09:54:02,654 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder:
Tomcat fails to start, causing Eclipse to stop responding

I've run into a puzzling problem. I'm using the SSH framework, and the JSP page uses ajax to fetch a backend action; the JSON definitely contains data. But unless I comment out the block of code in the middle, Tomcat will not start and Eclipse stops responding; with it commented out, the data is fetched normally. Why is that? I suspect the part underlined in red in the screenshot, yet it passes when I test the action on its own. Can anyone explain this?
Strange Acegi errors in a composite environment, please advise!!

First, a brief description of the project environments.

Development: Tomcat 6.0 + apache2.2 + eclipse3.4 + jdk 6
Production: Tomcat 6.0 cluster + apache2.2 + jk mod + Ehcache page caching + jdk 6
[b]Note: the production environment runs apache and several tomcats on a single machine[/b]

Acegi version 1.0.7. Access control is configured so that every page is viewable, but submitting data requires a login. The home page is index.jsp; every other link is a ***.action URL. The login box on the home page performs an Acegi ajax login: on success it switches to username + greeting + logout; on failure it reports a wrong account or password and lets the user retry.

Everything works in the development environment. After deploying to production, two problems appear:

1. After entering the username and password, login succeeds but the login box does not switch. After clicking into any section, the top of the page shows "Welcome <username>" (so login did succeed; when not logged in, this area shows login and register links).

2. Production uses apache + jk + a tomcat cluster, with sticky_session = 1 in the worker configuration and everything else standard. Here something strange happens: on the very first visit to the home page, the login box already shows "Welcome so-and-so" even though I have not logged in, and the username shown is random. Clicking "profile" then redirects to the login screen, so the current session is clearly not authenticated. After actually logging in, every page shows the correct username at the top, except index.jsp.

The page header is implemented in header.jsp, which uses <authz:authorize ifAllGranted="ANONYMOUS"> to decide whether a user is logged in and then fetches the username; this never misbehaves anywhere else. The home page uses exactly the same code, yet shows the problems above.

Could the composite environment be causing these errors? I have run the following tests:

1. Shut the cluster down to a single tomcat: other users' names no longer appear on index.jsp, but the login problem remains.
2. Copied the header.jsp code into index.jsp: no effect.
3. Disabled the Ehcache page cache: the problem improved noticeably. Can page caching keep index.jsp from refreshing properly?

Also, production cannot run with the page cache disabled, as performance would drop badly. Any pointers would be much appreciated!!
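If the Ehcache page cache does turn out to be the cause (test 3 points that way), the usual remedy is to keep personalized pages out of the caching filter's scope, so index.jsp is always rendered per-session. A minimal sketch, assuming the cache is wired up through Ehcache's SimplePageCachingFilter in web.xml (the filter name and the /news/* pattern here are hypothetical; the point is that the pattern must not cover index.jsp or any page containing the login box):

```
<filter>
	<filter-name>pageCachingFilter</filter-name>
	<filter-class>net.sf.ehcache.constructs.web.filter.SimplePageCachingFilter</filter-class>
</filter>
<filter-mapping>
	<filter-name>pageCachingFilter</filter-name>
	<!-- cache only content that is identical for every user -->
	<url-pattern>/news/*</url-pattern>
</filter-mapping>
```

This would also explain symptom 2: a cached copy of index.jsp, rendered while some other user's session was logged in, gets served verbatim to the next visitor, which is exactly "a random username in the login box".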
SSM integration unit test fails with Failed to load ApplicationContext

The error output is as follows:

SEVERE: Caught exception while allowing TestExecutionListener [org.springframework.test.context.support.DependencyInjectionTestExecutionListener@6108b2d7] to prepare test instance [com.atguigu.crud.test.MapperTest@49b0b76]
java.lang.IllegalStateException: Failed to load ApplicationContext
    at org.springframework.test.context.cache.DefaultCacheAwareContextLoaderDelegate.loadContext(DefaultCacheAwareContextLoaderDelegate.java:124)
    at org.springframework.test.context.support.DefaultTestContext.getApplicationContext(DefaultTestContext.java:83)
    at org.springframework.test.context.support.DependencyInjectionTestExecutionListener.injectDependencies(DependencyInjectionTestExecutionListener.java:117)
    at org.springframework.test.context.support.DependencyInjectionTestExecutionListener.prepareTestInstance(DependencyInjectionTestExecutionListener.java:83)
    at org.springframework.test.context.TestContextManager.prepareTestInstance(TestContextManager.java:230)
    at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.createTest(SpringJUnit4ClassRunner.java:228)
    at org.springframework.test.context.junit4.SpringJUnit4ClassRunner$1.runReflectiveCall(SpringJUnit4ClassRunner.java:287)
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.methodBlock(SpringJUnit4ClassRunner.java:289)
    at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.runChild(SpringJUnit4ClassRunner.java:247)
    at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.runChild(SpringJUnit4ClassRunner.java:94)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
    at org.springframework.test.context.junit4.statements.RunBeforeTestClassCallbacks.evaluate(RunBeforeTestClassCallbacks.java:61)
    at org.springframework.test.context.junit4.statements.RunAfterTestClassCallbacks.evaluate(RunAfterTestClassCallbacks.java:70)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
    at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.run(SpringJUnit4ClassRunner.java:191)
    at org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:86)
    at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:538)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:760)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:460)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:206)
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'org.springframework.context.event.internalEventListenerProcessor': BeanPostProcessor before instantiation of bean failed; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'org.springframework.aop.support.DefaultBeanFactoryPointcutAdvisor#0': Cannot resolve reference to bean 'txPoint' while setting bean property 'pointcut'; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'txPoint': Failed to introspect bean class [org.springframework.aop.aspectj.AspectJExpressionPointcut] for lookup method metadata: could not find class that it depends on; nested exception is java.lang.NoClassDefFoundError: org/aspectj/weaver/reflect/ReflectionWorld$ReflectionWorldException
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:479)
    at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:306)
    at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:230)
    at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:302)
    at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:197)
    at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:761)
    at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:866)
    at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:542)
    at org.springframework.test.context.support.AbstractGenericContextLoader.loadContext(AbstractGenericContextLoader.java:128)
    at org.springframework.test.context.support.AbstractGenericContextLoader.loadContext(AbstractGenericContextLoader.java:60)
    at org.springframework.test.context.support.AbstractDelegatingSmartContextLoader.delegateLoading(AbstractDelegatingSmartContextLoader.java:108)
    at org.springframework.test.context.support.AbstractDelegatingSmartContextLoader.loadContext(AbstractDelegatingSmartContextLoader.java:251)
    at org.springframework.test.context.cache.DefaultCacheAwareContextLoaderDelegate.loadContextInternal(DefaultCacheAwareContextLoaderDelegate.java:98)
    at org.springframework.test.context.cache.DefaultCacheAwareContextLoaderDelegate.loadContext(DefaultCacheAwareContextLoaderDelegate.java:116)
    ... 25 more
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'org.springframework.aop.support.DefaultBeanFactoryPointcutAdvisor#0': Cannot resolve reference to bean 'txPoint' while setting bean property 'pointcut'; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'txPoint': Failed to introspect bean class [org.springframework.aop.aspectj.AspectJExpressionPointcut] for lookup method metadata: could not find class that it depends on; nested exception is java.lang.NoClassDefFoundError: org/aspectj/weaver/reflect/ReflectionWorld$ReflectionWorldException
    at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:359)
    at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:108)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyPropertyValues(AbstractAutowireCapableBeanFactory.java:1531)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1276)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:553)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:483)
    at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:306)
    at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:230)
    at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:302)
    at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:202)
    at org.springframework.aop.framework.autoproxy.BeanFactoryAdvisorRetrievalHelper.findAdvisorBeans(BeanFactoryAdvisorRetrievalHelper.java:92)
    at org.springframework.aop.framework.autoproxy.AbstractAdvisorAutoProxyCreator.findCandidateAdvisors(AbstractAdvisorAutoProxyCreator.java:102)
    at org.springframework.aop.aspectj.autoproxy.AspectJAwareAdvisorAutoProxyCreator.shouldSkip(AspectJAwareAdvisorAutoProxyCreator.java:103)
    at org.springframework.aop.framework.autoproxy.AbstractAutoProxyCreator.postProcessBeforeInstantiation(AbstractAutoProxyCreator.java:248)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInstantiation(AbstractAutowireCapableBeanFactory.java:1037)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.resolveBeforeInstantiation(AbstractAutowireCapableBeanFactory.java:1011)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:473)
    ... 38 more
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'txPoint': Failed to introspect bean class [org.springframework.aop.aspectj.AspectJExpressionPointcut] for lookup method metadata: could not find class that it depends on; nested exception is java.lang.NoClassDefFoundError: org/aspectj/weaver/reflect/ReflectionWorld$ReflectionWorldException
    at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor.determineCandidateConstructors(AutowiredAnnotationBeanPostProcessor.java:269)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.determineConstructorsFromBeanPostProcessors(AbstractAutowireCapableBeanFactory.java:1118)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1091)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:513)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:483)
    at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:325)
    at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:197)
    at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:351)
    ... 54 more
Caused by: java.lang.NoClassDefFoundError: org/aspectj/weaver/reflect/ReflectionWorld$ReflectionWorldException
    at java.lang.Class.getDeclaredMethods0(Native Method)
    at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
    at java.lang.Class.getDeclaredMethods(Class.java:1975)
    at org.springframework.util.ReflectionUtils.getDeclaredMethods(ReflectionUtils.java:613)
    at org.springframework.util.ReflectionUtils.doWithMethods(ReflectionUtils.java:524)
    at org.springframework.util.ReflectionUtils.doWithMethods(ReflectionUtils.java:510)
    at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor.determineCandidateConstructors(AutowiredAnnotationBeanPostProcessor.java:247)
    ... 61 more
Caused by: java.lang.ClassNotFoundException: org.aspectj.weaver.reflect.ReflectionWorld$ReflectionWorldException
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 68 more

The configuration files are as follows.

applicationContext.xml

<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
	xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xmlns:context="http://www.springframework.org/schema/context"
	xmlns:aop="http://www.springframework.org/schema/aop"
	xmlns:tx="http://www.springframework.org/schema/tx"
	xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
		http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context-4.3.xsd
		http://www.springframework.org/schema/aop http://www.springframework.org/schema/aop/spring-aop-4.3.xsd
		http://www.springframework.org/schema/tx http://www.springframework.org/schema/tx/spring-tx-4.3.xsd">

	<!-- Spring configuration file -->
	<context:component-scan base-package="com.atguigu">
		<context:exclude-filter type="annotation" expression="org.springframework.stereotype.Controller" />
	</context:component-scan>

	<!-- Spring configuration file: mainly business-logic related configuration -->
	<!-- =================== data source, transaction control, xxx ================ -->
	<context:property-placeholder location="classpath:dbconfig.properties" />
	<bean id="pooledDataSource" class="com.mchange.v2.c3p0.ComboPooledDataSource">
		<property name="jdbcUrl" value="${jdbc.jdbcUrl}"></property>
		<property name="driverClass" value="${jdbc.driverClass}"></property>
		<property name="user" value="${jdbc.user}"></property>
		<property name="password" value="${jdbc.password}"></property>
	</bean>

	<!-- ================== MyBatis integration =============== -->
	<bean id="sqlSessionFactory" class="org.mybatis.spring.SqlSessionFactoryBean">
		<!-- location of the mybatis global configuration file -->
		<property name="configLocation" value="classpath:mybatis-config.xml"></property>
		<property name="dataSource" ref="pooledDataSource"></property>
		<!-- location of the mybatis mapper files -->
		<property name="mapperLocations" value="classpath:mapper/*.xml"></property>
	</bean>

	<!-- scanner that adds the mybatis mapper interface implementations to the IoC container -->
	<bean class="org.mybatis.spring.mapper.MapperScannerConfigurer">
		<!-- scan all dao interface implementations into the IoC container -->
		<property name="basePackage" value="com.atguigu.crud.dao"></property>
	</bean>

	<!-- a sqlSession capable of batch execution -->
	<bean id="sqlSession" class="org.mybatis.spring.SqlSessionTemplate">
		<constructor-arg name="sqlSessionFactory" ref="sqlSessionFactory"></constructor-arg>
		<constructor-arg name="executorType" value="BATCH"></constructor-arg>
	</bean>

	<!-- ============================================= -->
	<!-- =============== transaction control =============== -->
	<bean id="transactionManager" class="org.springframework.jdbc.datasource.DataSourceTransactionManager">
		<!-- control the data source -->
		<property name="dataSource" ref="pooledDataSource"></property>
	</bean>

	<!-- enable transactions, configured in XML (the essential ones are configured in XML) -->
	<aop:config>
		<!-- pointcut expression -->
		<aop:pointcut expression="execution(* com.atguigu.crud.service..*(..))" id="txPoint"/>
		<!-- transaction advice -->
		<aop:advisor advice-ref="txAdvice" pointcut-ref="txPoint"/>
	</aop:config>

	<!-- configure the transaction advice: how transactions cut in -->
	<tx:advice id="txAdvice" transaction-manager="transactionManager">
		<tx:attributes>
			<!-- all methods are transactional -->
			<tx:method name="*"/>
			<!-- all methods starting with get are read-only -->
			<tx:method name="get*" read-only="true"/>
		</tx:attributes>
	</tx:advice>
	<!-- core of the Spring configuration: data source, mybatis integration, transaction control -->
</beans>

dispatcherServlet-servlet.xml

<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
	xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xmlns:context="http://www.springframework.org/schema/context"
	xmlns:mvc="http://www.springframework.org/schema/mvc"
	xsi:schemaLocation="http://www.springframework.org/schema/mvc http://www.springframework.org/schema/mvc/spring-mvc-4.3.xsd
		http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
		http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context-4.3.xsd">

	<!-- SpringMVC configuration: controls the site's navigation logic -->
	<context:component-scan base-package="com.atguigu" use-default-filters="false">
		<!-- scan controllers only -->
		<context:include-filter type="annotation" expression="org.springframework.stereotype.Controller"/>
	</context:component-scan>

	<!-- view resolver, for convenient page returns -->
	<bean class="org.springframework.web.servlet.view.InternalResourceViewResolver">
		<property name="prefix" value="/WEB-INF/views/"></property>
		<property name="suffix" value=".jsp"></property>
	</bean>

	<!-- two standard settings -->
	<!-- hand requests springmvc cannot handle over to tomcat -->
	<mvc:default-servlet-handler/>
	<!-- enables more advanced springmvc features: JSR303 validation, convenient ajax, dynamic request mapping -->
	<mvc:annotation-driven/>
</beans>

mybatis-config.xml

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE configuration PUBLIC "-//mybatis.org//DTD Config 3.0//EN" "http://mybatis.org/dtd/mybatis-3-config.dtd">
<configuration>
	<settings>
		<setting name="mapUnderscoreToCamelCase" value="true"/>
	</settings>
	<typeAliases>
		<package name="com.atguigu.crud.bean"/>
	</typeAliases>
</configuration>

MapperTest.java

package com.atguigu.crud.test;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;

import com.atguigu.crud.dao.DepartmentMapper;

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = {"classpath:applicationContext.xml"})
public class MapperTest {

	@Autowired
	DepartmentMapper departmentMapper;

	@Test
	public void testCRUD() {
		/*ApplicationContext ioc = new ClassPathXmlApplicationContext("applicationContext.xml");
		DepartmentMapper departmentMapper = ioc.getBean(DepartmentMapper.class);*/
		System.out.println(departmentMapper);
	}
}

Please help me fix this; it has been bothering me for three days.
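The last Caused by in the trace (java.lang.ClassNotFoundException: org.aspectj.weaver.reflect.ReflectionWorld$ReflectionWorldException) says the AspectJ weaver jar is missing from the test classpath; Spring needs it to parse the execution(...) expression behind the txPoint pointcut. A minimal sketch of the likely fix, assuming the project is built with Maven (the 1.8.13 version shown is an assumption; pick one matching your Spring release):

```
<!-- hypothetical pom.xml addition: AspectJ weaver required by
     org.springframework.aop.aspectj.AspectJExpressionPointcut -->
<dependency>
	<groupId>org.aspectj</groupId>
	<artifactId>aspectjweaver</artifactId>
	<version>1.8.13</version>
</dependency>
```

If the project is not Maven-based, adding aspectjweaver.jar to the build path by hand should have the same effect.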
SSM controller does not navigate to a new page; the controller method has no @ResponseBody, yet the page still does not change. What should I do?

```
<servlet>
	<!-- front controller -->
	<servlet-name>springmvc</servlet-name>
	<servlet-class>org.springframework.web.servlet.DispatcherServlet</servlet-class>
	<!-- configuration file path -->
	<init-param>
		<param-name>contextConfigLocation</param-name>
		<param-value>classpath:spring/spring-*.xml</param-value>
	</init-param>
</servlet>
<servlet-mapping>
	<servlet-name>springmvc</servlet-name>
	<url-pattern>/</url-pattern>
</servlet-mapping>
```
Above is the servlet configuration.
```
<bean class="org.springframework.web.servlet.view.InternalResourceViewResolver">
	<property name="viewClass" value="org.springframework.web.servlet.view.JstlView" />
	<property name="prefix" value="/WEB-INF/jsp/" />
	<property name="suffix" value=".jsp" />
</bean>
```
Above is the SpringMVC view resolver.
```
@Controller
@RequestMapping("/user")
public class UserController {

	@Autowired
	private UserService userServiceImpl;

	/**
	 * Login
	 */
	@RequestMapping("/isLogin")
	public String isLogin(User userData, HttpSession session) {
		User user = null;
		user = userServiceImpl.queryByUserName(userData.getUserName());
		if (user == null) {
			session.setAttribute("false", "用户名或密码错误");
			//return "redirect:/user/isLogin";
		}
		return "index";
	}
}
```
This is the controller. It implements my login feature; I want to navigate on the server side because all the other pages live under WEB-INF.
```
var username = $("input[name='userName']").val();
var password = $("input[name='passWord']").val();
$.ajax({
	type: "POST",
	url: "user/isLogin",
	data: {"userName": username, "passWord": password}
});
```
This is the JSP page; I send the data with ajax. But after a successful login the page does not change. I'd be grateful if some expert could help; many thanks in advance.
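One thing worth noting about this symptom: the forward to index.jsp does happen, but only inside the XMLHttpRequest. The success callback receives the rendered HTML of /WEB-INF/jsp/index.jsp and silently discards it; an ajax response never changes the browser's address bar. So either submit a normal HTML form instead of calling $.ajax, or keep the ajax call and navigate explicitly in the callback. A minimal sketch of the second option (the "ok" marker returned via @ResponseBody and the user/toIndex mapping are hypothetical):

```
$.ajax({
	type: "POST",
	url: "user/isLogin",
	data: {"userName": username, "passWord": password},
	success: function (result) {
		if (result === "ok") {
			// a normal GET, so the view resolver can render /WEB-INF/jsp/index.jsp
			window.location.href = "user/toIndex";
		} else {
			alert("用户名或密码错误");
		}
	}
});
```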
Ajax partial refresh of a div leaves the div empty

I use ajax to partially refresh a div, but after the refresh the div is empty. The page is reached by a server-side action forward from another page after login; could that be a factor?

The div contents (it also loads empty when display is set to block):
```
<div class="content" id="abnormal" style="display:none">
	<!-- <font size="3">车牌号码:</font>
	<input type="text"></input>
	<input type="submit" value="登记"></input>
	<input type="text" value="输入车位号码"> -->
	<!-- <div style="width:70%;margin-bottom:20px;text-align:center">
		<input type="text" name="word" placeholder="请输入车位号码"/>
		<input type="submit" value="Search"/><br/>
	</div> -->
	<s:form name="fm">
		<table id="box-table-a" style="width:70%;margin:auto">
			<tr>
				<th>车位号码</th>
				<th>车位状态</th>
				<th>车牌号码</th>
				<th>状态修改</th>
			</tr>
			<s:iterator value="#plist" status="st" id="list" begin="0" end="8">
				<s:if test="#st.last!=true && #st.index<11">
					<tr>
						<td id="pid"><s:property value="#list.id"/></td>
						<td><s:if test="#list.status == 'idle' ">空闲</s:if>
							<s:elseif test="#list.status == 'assigned' ">已分配</s:elseif>
							<s:else>异常</s:else></td>
						<td><s:if test="slist[#st.index] == null ">无车辆</s:if>
							<s:else><s:property value="slist[#st.index]"/></s:else></td>
						<td><input class="corner-button" type="button" onclick="javascript:updateLoc(this)" value="Click me"></td>
					</tr>
				</s:if>
				<s:else>
					<tr>
						<td id="pid"><s:property value="plist[#st.index].id"/></td>
						<td><s:if test="plist[#st.index].status == 'idle' ">空闲</s:if>
							<s:elseif test="plist[#st.index].status == 'assigned' ">已分配</s:elseif>
							<s:else>异常</s:else></td>
						<td><s:if test="slist[#st.index] == null ">无车辆</s:if>
							<s:else><s:property value="slist[#st.index]"/></s:else></td>
						<td><input class="corner-button" type="button" onclick="javascript:updateLoc(this)" value="Click me"></td>
					</tr>
				</s:else>
			</s:iterator>
		</table>
	</s:form>
</div>
```
Submitting the modification (if I move this page out of WEB-INF and set the address to $('#abnormal').load('http://localhost:8088/parking/operator.jsp #abnormal');, it reports an error):
```
function updateLocation(){
	var loc = document.getElementById("formloc").text;
	var type = document.getElementById("type").value;
	//alert(user+type);
	$('#myModal').modal('hide')
	$.ajax({
		type: "POST",
		url: "setAbnormal.action",
		data: {
			"loc": loc,
			"type": type
		},
		dataType: "json",
		cache: false,
		success: function(data){
			//alert("");
			//window.location.reload();
		}
	});
	alert("修改成功!");
	$('#abnormal').load('http://localhost:8088/parking/login.action #abnormal');
}
```
The code that runs after the ajax submit:
```
public String execute(){
	ParkinfoDao dao = new ParkinfoDao();
	//System.out.println(getUser()+":"+getType());
	dao.updateParking_info(dao.findLocById(loc), type);
	plist = pdao.getPiList();
	slist = rdao.getCarNum();
	HttpServletRequest request = ServletActionContext.getRequest();
	HttpSession session = request.getSession();
	session.setAttribute("plist", plist);
	//String s = (String)session.getAttribute("username");
	//System.out.println(s);
	//request.getSession(false);
	//session.
	return SUCCESS;
}
```
The struts.xml file:
```
<action name="login" class="com.admin.LoginAction" method="execute">
	<result name="success">/WEB-INF/operator.jsp</result><!-- /WEB-INF -->
	<result name="login">/WEB-INF/sysadmin.jsp</result>
	<result name="error">/login.jsp</result>
</action>
<!-- set abnormal status -->
<action name="setAbnormal" class="com.admin.UpdateAbnormalAction" method="execute">
	<result name="success">/WEB-INF/operator.jsp</result>
</action>
```
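A note on what .load('... #abnormal') actually does: jQuery fetches the whole response, looks for an element with id abnormal in it, and inserts that element's contents into the target. The div therefore ends up empty whenever the response contains no populated #abnormal, for instance when the request loses the session and login.action sends back the login page instead of operator.jsp. A minimal diagnostic sketch (the handler logic is an assumption; the URL is the one from the question, made relative so the session cookie is reused):

```
$('#abnormal').show();   // the div starts as display:none
$('#abnormal').load('login.action #abnormal', function (response, status, xhr) {
	if (status === 'error') {
		console.log('load failed: ' + xhr.status + ' ' + xhr.statusText);
	} else if ($.trim($('#abnormal').html()) === '') {
		// the returned page had no #abnormal content, e.g. a login redirect
		console.log('empty fragment, response starts with: ' + response.substring(0, 200));
	}
});
```

Also note that the alert("修改成功!") and the .load() run before the asynchronous setAbnormal.action call has finished, so moving the refresh into the ajax success callback may be necessary for the reloaded table to show the new state.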
activemq-web error, ActiveMQ version 5.9

**Error output**

java.lang.IllegalStateException: A filter or servlet of the current chain does not support asynchronous operations.
    at org.apache.catalina.connector.Request.startAsync(Request.java:1675)
    at org.apache.catalina.connector.Request.startAsync(Request.java:1668)
    at org.apache.catalina.connector.RequestFacade.startAsync(RequestFacade.java:1022)
    at org.eclipse.jetty.continuation.Servlet3Continuation.suspend(Servlet3Continuation.java:202)
    at org.apache.activemq.web.MessageListenerServlet.doMessages(MessageListenerServlet.java:349)
    at org.apache.activemq.web.MessageListenerServlet.doGet(MessageListenerServlet.java:250)
    at org.apache.activemq.web.AjaxServlet.doGet(AjaxServlet.java:47)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:624)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:731)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
    at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
    at org.eclipse.jetty.continuation.ContinuationFilter.doFilter(ContinuationFilter.java:137)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:218)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
    at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:505)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:169)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
    at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:958)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:452)
    at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1087)
    at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:637)
    at org.apache.tomcat.util.net.AprEndpoint$SocketProcessor.doRun(AprEndpoint.java:2517)
    at org.apache.tomcat.util.net.AprEndpoint$SocketProcessor.run(AprEndpoint.java:2506)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
    at java.lang.Thread.run(Thread.java:745)

**My web.xml configuration**

<?xml version="1.0" encoding="UTF-8"?>
<web-app xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xmlns="http://java.sun.com/xml/ns/javaee"
	xsi:schemaLocation="http://java.sun.com/xml/ns/javaee http://java.sun.com/xml/ns/javaee/web-app_3_0.xsd"
	id="WebApp_ID" version="3.0">

	<display-name>mqajax</display-name>

	<context-param>
		<description>Whether we should include an embedded broker or not</description>
		<param-name>org.apache.activemq.brokerURL</param-name>
		<param-value>tcp://127.0.0.1:61616</param-value>
		<!-- a form like tcp://192.168.1.111:61616 can be used here to connect to an ActiveMQ server on another machine -->
	</context-param>

	<servlet>
		<servlet-name>AjaxServlet</servlet-name>
		<servlet-class>org.apache.activemq.web.AjaxServlet</servlet-class>
		<load-on-startup>1</load-on-startup>
		<async-supported>true</async-supported>
	</servlet>
	<servlet>
		<servlet-name>MessageServlet</servlet-name>
		<servlet-class>org.apache.activemq.web.MessageServlet</servlet-class>
		<load-on-startup>1</load-on-startup>
	</servlet>
	<servlet>
		<servlet-name>QueueBrowseServlet</servlet-name>
		<servlet-class>org.apache.activemq.web.QueueBrowseServlet</servlet-class>
	</servlet>
	<servlet>
		<servlet-name>PortfolioPublishServlet</servlet-name>
		<servlet-class>org.apache.activemq.web.PortfolioPublishServlet</servlet-class>
		<load-on-startup>1</load-on-startup>
	</servlet>

	<servlet-mapping>
		<servlet-name>AjaxServlet</servlet-name>
		<url-pattern>/amq/*</url-pattern>
	</servlet-mapping>
	<servlet-mapping>
		<servlet-name>MessageServlet</servlet-name>
		<url-pattern>/message/*</url-pattern>
	</servlet-mapping>
	<servlet-mapping>
		<servlet-name>QueueBrowseServlet</servlet-name>
		<url-pattern>/queueBrowse/*</url-pattern>
	</servlet-mapping>
	<servlet-mapping>
		<servlet-name>PortfolioPublishServlet</servlet-name>
		<url-pattern>/portfolioPublish</url-pattern>
	</servlet-mapping>

	<!-- this setup targets versions below Tomcat 7: ajax + activemq needs Servlet 3.0, which is only supported from Tomcat 7, so the jetty continuation package is added to make it run on Tomcat 6 -->
	<filter>
		<filter-name>session</filter-name>
		<filter-class>org.eclipse.jetty.continuation.ContinuationFilter</filter-class>
	</filter>
	<filter-mapping>
		<filter-name>session</filter-name>
		<url-pattern>/*</url-pattern>
	</filter-mapping>

	<welcome-file-list>
		<welcome-file>index.jsp</welcome-file>
	</welcome-file-list>
</web-app>

**JS code**

<script type="text/javascript">
	var amq = org.activemq.Amq;
	amq.init({
		uri: 'amq',
		logging: true,
		timeout: 20
	});

	var myHandler = function(message){
		$("#msgDiv").append(message);
		$("#msgDiv").append("<br>");
	}

	amq.addListener("smeguangdong", "topic://FirstTopic", myHandler);

	function send(){
		var nickname = $("#nickname").val();
		var content = $("#content").val();
		var msg = nickname + " : " + content;
		//alert(msg);
		amq.sendMessage("topic://FirstTopic", "<message>" + msg + "</message>");
	}
</script>
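The IllegalStateException is thrown by Tomcat's Request.startAsync: on a Servlet 3.0 container, every servlet and every filter in the processing chain must opt in to asynchronous processing, and in the web.xml above only AjaxServlet does. A minimal sketch of the usual fix, assuming you stay on Tomcat 7+ (the ContinuationFilter workaround is only needed for Tomcat 6):

```
<servlet>
	<servlet-name>MessageServlet</servlet-name>
	<servlet-class>org.apache.activemq.web.MessageServlet</servlet-class>
	<load-on-startup>1</load-on-startup>
	<async-supported>true</async-supported>
</servlet>
<filter>
	<filter-name>session</filter-name>
	<filter-class>org.eclipse.jetty.continuation.ContinuationFilter</filter-class>
	<async-supported>true</async-supported>
</filter>
```

Any other filter mapped to /* (security filters, encoding filters, and so on) would need the same <async-supported>true</async-supported> element.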
500 error when adding data: Request processing failed; nested exception is org.springframework.web.multipart.MultipartException: The current request is not a multipart request

The error (please advise):

![图片说明](https://img-ask.csdn.net/upload/201912/11/1576028291_313681.png)

JSP code:

// add one record
function addLea(){
	//addImg();
	//alert("test");
	var renttitle1 = $("#renttitle1").val();
	var rentarea1 = $("#rentarea1").val();
	var areadetail1 = $("#areadetail1").val();
	var rentroomType1 = $("#rentroomtype1").val();
	var rentroomtype1;
	if(rentroomType1 == 0){
		rentroomtype1 = "整租";
	}else if(rentroomType1 == 1){
		rentroomtype1 = "主卧";
	}else if(rentroomType1 == 2){
		rentroomtype1 = "次卧";
	}
	var rentprize1 = $("#rentprize1").val();
	var genderRequire1 = $("#genderrequire1").val();
	var genderrequire1;
	if(genderRequire1 == 0){
		genderrequire1 = "不限男女";
	}else if(genderRequire1 == 1){
		genderrequire1 = "男";
	}else if(genderRequire1 == 2){
		genderrequire1 = "女";
	}
	var rentstartdate1 = $("#rentstartdate1").val();
	var rentenddate1 = $("#rentenddate1").val();
	var formData = new FormData();
	for (var i = 0; i < $('#roompictureurl1')[0].files.length; i++){
		formData.append('file', $('#roompictureurl1')[0].files[i]);
	}
	/* console.log(formData.getAll('file')); */
	formData.append('renttitle1', renttitle1);
	formData.append('rentpublisher1', 1004);
	formData.append('rentarea1', rentarea1);
	formData.append('areadetail1', areadetail1);
	formData.append('rentroomtype1', rentroomtype1);
	formData.append('rentprize1', rentprize1);
	formData.append('genderRequire1', genderRequire1);
	formData.append('rentstartdate1', rentstartdate1);
	formData.append('rentenddate1', rentenddate1);
	$.ajax({
		url: '<%=request.getContextPath()%>/insertOne.do',
		type: 'post',
		data: formData,
		dataType: 'json',
		cache: false,
		processData: false,
		contentType: 'multipart/form-data',
		success: function(data) {
			$('#myModal').modal('hide');
			$('#back').hide();
			$('#sure').hide();
			$("#showContent").text('添加成功!');
			setTimeout(function () { $('#delModal').modal('show'); }, 500);
			setTimeout(function () { $("#delModal").modal('hide'); }, 1500);
			setTimeout(function () { window.location.reload(); }, 2000);
		}
	})
}

The error occurs when the submit button is clicked.
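Two usual culprits for "The current request is not a multipart request" with a FormData upload: jQuery is given contentType: 'multipart/form-data' by hand, so the request goes out without the boundary parameter the server-side parser needs (usually it should be contentType: false, letting the browser generate the full header), and the Spring context may lack a multipartResolver bean. A minimal sketch of both changes (the resolver bean id multipartResolver is what Spring MVC looks up; CommonsMultipartResolver additionally needs commons-fileupload on the classpath):

```
$.ajax({
	url: '<%=request.getContextPath()%>/insertOne.do',
	type: 'post',
	data: formData,
	cache: false,
	processData: false,  // keep the FormData object intact
	contentType: false,  // browser sets "multipart/form-data; boundary=..."
	success: function (data) {
		// ...
	}
});
```

and in the SpringMVC configuration:

```
<bean id="multipartResolver"
	class="org.springframework.web.multipart.commons.CommonsMultipartResolver">
	<property name="defaultEncoding" value="UTF-8"/>
</bean>
```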