Can someone write a util that reads the JSON data below (info.json) via I/O and stores it into the Redis cache? Waiting online, urgent!! There are roughly 1,000,000 records. (20C bounty)

```json
{
  "RECORDS": [
    {
      "id": 1000000059,
      "account": "0040140072",
      "content": "[{\"SH600345\":{\"IR\":\"\",\"LT\":\"20.8\",\"GT\":\"20.83\",\"DR\":\"\",\"interval\":\"12\"}},{\"SH600837\":{\"IR\":\"1\",\"LT\":\"16.18\",\"GT\":\"16.18\",\"DR\":\"1\",\"interval\":\"12\"}},{\"SZ000886\":{\"IR\":\"\",\"LT\":\"6.41\",\"GT\":\"6.43\",\"DR\":\"\",\"interval\":\"12\"}},{\"SZ300509\":{\"IR\":\"\",\"LT\":\"87.45\",\"GT\":\"87.65\",\"DR\":\"\",\"interval\":\"12\"}},{\"SZ300076\":{\"IR\":\"\",\"LT\":\"15.40\",\"GT\":\"15.40\",\"DR\":\"\",\"interval\":\"12\"}},{\"SZ002452\":{\"IR\":\"\",\"LT\":\"11.90\",\"GT\":\"12.15\",\"DR\":\"\",\"interval\":\"12\"}},{\"SZ300515\":{\"IR\":\"\",\"LT\":\"61.89\",\"GT\":\"61.91\",\"DR\":\"\",\"interval\":\"12\"}},{\"SH603861\":{\"IR\":\"\",\"LT\":\"40.90\",\"GT\":\"41.23\",\"DR\":\"\",\"interval\":\"12\"}},{\"SH600466\":{\"IR\":\"\",\"LT\":\"8.06\",\"GT\":\"8.08\",\"DR\":\"\",\"interval\":\"12\"}},{\"SZ002127\":{\"IR\":\"\",\"LT\":\"10.63\",\"GT\":\"10.66\",\"DR\":\"\",\"interval\":\"12\"}}]",
      "cid": "8ffa3310da8101a5b2ab1c47a039a77c",
      "updatetime": "2017-02-10 10:29:51"
    },
    {
      "id": 1000000062,
      "account": "0040127361",
      "content": "[{\"SZ002032\":{\"IR\":\"3.32\",\"DR\":\"3.42\",\"interval\":\"12\",\"GG\":\"1\",\"HS\":\"77\",\"GTZS\":\"2\",\"LTZS\":\"3\"}},{\"SZ002881\":{\"GT\":\"12.91\",\"interval\":\"12\",\"GG\":\"0\"}},{\"SZ002881\":{\"interval\":\"12\",\"GG\":\"0\"}},{\"SZ300368\":{\"GT\":\"5\",\"LT\":\"6\",\"interval\":\"12\",\"GG\":\"0\"}},{\"SZ300368\":{\"GT\":\"9\",\"LT\":\"7\",\"interval\":\"12\",\"GG\":\"0\"}},{\"SZ300368\":{\"IR\":\"3.12\",\"DR\":\"2.18\",\"interval\":\"12\",\"GG\":\"0\"}},{\"SH600004\":{\"IR\":\"0.20\",\"DR\":\"0.25\",\"interval\":\"12\",\"GG\":\"0\"}},{\"SH600837\":{\"GT\":\"1\",\"LT\":\"16\",\"IR\":\"0.10\",\"DR\":\"1.05\",\"interval\":\"12\",\"GG\":\"0\"}},{\"SH600876\":{\"IR\":\"2\",\"interval\":\"12\",\"GG\":\"0\"}},{\"SH601519\":{\"GT\":\"9\",\"LT\":\"8\",\"IR\":\"3.13\",\"DR\":\"5.11\",\"interval\":\"0\",\"GG\":\"0\"}},{\"SH603227\":{\"GT\":\"7\",\"LT\":\"6.9\",\"interval\":\"12\",\"GG\":\"0\"}},{\"SH603227\":{\"GT\":\"6.92\",\"LT\":\"6.91\",\"interval\":\"12\",\"GG\":\"0\"}}]",
      "cid": "6a9d921606967559337ede0e64f3be93",
      "updatetime": "2017-12-26 15:45:36"
    }
  ]
}
```
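
For concreteness, here is a minimal sketch of what such a util could look like, assuming Jackson (streaming API) for parsing and Jedis for Redis access; the key scheme `alert:<account>` and the batch size are illustrative choices, not something the question specifies. Streaming the `RECORDS` array one object at a time keeps memory flat at ~1,000,000 records, and pipelining the writes amortizes the Redis round trips:

```java
import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.JsonToken;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import redis.clients.jedis.Jedis;
import redis.clients.jedis.Pipeline;

import java.io.File;

public class JsonToRedisUtil {

    private static final int BATCH_SIZE = 1000; // flush the pipeline every 1000 records

    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        JsonFactory factory = mapper.getFactory();

        try (Jedis jedis = new Jedis("localhost", 6379);
             JsonParser parser = factory.createParser(new File("info.json"))) {

            // Advance to the start of the "RECORDS" array without loading the
            // whole file into memory -- important with ~1,000,000 records.
            while (parser.nextToken() != JsonToken.START_ARRAY) {
                // skip tokens until the array begins
            }

            Pipeline pipeline = jedis.pipelined();
            int count = 0;

            // At this level each nextToken() is either START_OBJECT (a record)
            // or END_ARRAY (done).
            while (parser.nextToken() == JsonToken.START_OBJECT) {
                JsonNode record = mapper.readTree(parser); // reads one record
                String account = record.get("account").asText();
                // Store the raw record JSON under a per-account key
                // (illustrative key scheme, not from the question).
                pipeline.set("alert:" + account, record.toString());

                if (++count % BATCH_SIZE == 0) {
                    pipeline.sync(); // flush the batched commands in one round trip
                }
            }
            pipeline.sync(); // flush the final partial batch
            System.out.println("Loaded " + count + " records into Redis");
        }
    }
}
```

Flushing every 1,000 commands is a judgment call: larger batches mean fewer round trips but more client-side buffering.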

6 answers

The approaches are all much the same, and 100k records counts as a small volume, so you won't be able to tell them apart at the I/O level; performance only really needs thinking about at the million-record scale.

An I/O bottleneck only appears where two sides interact, e.g. browser/server (BS), disk, or database.
JSON is merely the payload being exchanged; the time spent parsing the payload is almost negligible.
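
To make the round-trip point concrete on the Redis side: with this many SET commands, the per-command network round trip dominates, not the JSON parsing. A small sketch (assuming Jedis against a local Redis) contrasting naive writes with a pipelined batch:

```java
import redis.clients.jedis.Jedis;
import redis.clients.jedis.Pipeline;

public class PipelineDemo {
    public static void main(String[] args) {
        try (Jedis jedis = new Jedis("localhost", 6379)) {
            int n = 100_000;

            // Naive: one network round trip per command.
            long t0 = System.currentTimeMillis();
            for (int i = 0; i < n; i++) {
                jedis.set("naive:" + i, "value");
            }
            long naiveMs = System.currentTimeMillis() - t0;

            // Pipelined: commands are buffered and flushed together,
            // so the round-trip cost is amortized across many commands.
            long t1 = System.currentTimeMillis();
            Pipeline p = jedis.pipelined();
            for (int i = 0; i < n; i++) {
                p.set("piped:" + i, "value");
            }
            p.sync();
            long pipedMs = System.currentTimeMillis() - t1;

            System.out.printf("naive: %d ms, pipelined: %d ms%n", naiveMs, pipedMs);
        }
    }
}
```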

It depends on whether you need to do further operations on the data afterwards. 100k records is not a large volume, so it's hard to draw a concrete line. Suggestion: osw isr

Just parse this big string with a JSON library and map it into objects; that's all there is to it.
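
One wrinkle when mapping into objects: the `content` field is itself a JSON string (an array of one-entry objects keyed by stock code), so it needs a second parse. A minimal sketch assuming Jackson; the `AlertRule` class and the field meanings in its comments are guesses read off the sample data:

```java
import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;

import java.util.List;
import java.util.Map;

public class ContentParser {

    // One price-alert rule; extra keys seen in some records
    // (GG, HS, GTZS, LTZS, ...) are simply ignored here.
    @JsonIgnoreProperties(ignoreUnknown = true)
    public static class AlertRule {
        public String IR;       // presumably an upward-rate threshold
        public String DR;       // presumably a downward-rate threshold
        public String LT;       // presumably a lower price threshold
        public String GT;       // presumably an upper price threshold
        public String interval;
    }

    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        // A shortened sample of one record's "content" field.
        String content = "[{\"SH600345\":{\"IR\":\"\",\"LT\":\"20.8\",\"GT\":\"20.83\",\"DR\":\"\",\"interval\":\"12\"}}]";

        // The content field is an array of single-entry maps: stock code -> rule.
        List<Map<String, AlertRule>> rules =
                mapper.readValue(content, new TypeReference<List<Map<String, AlertRule>>>() {});

        for (Map<String, AlertRule> entry : rules) {
            entry.forEach((code, rule) ->
                    System.out.println(code + " -> GT=" + rule.GT + ", LT=" + rule.LT));
        }
    }
}
```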

file:/C:/Users/amdin/.m2/repository/org/apache/tomcat/embed/tomcat-embed-core/9.0.22/tomcat-embed-core-9.0.22.jar, file:/C:/Users/amdin/.m2/repository/org/apache/tomcat/embed/tomcat-embed-el/9.0.22/tomcat-embed-el-9.0.22.jar, file:/C:/Users/amdin/.m2/repository/org/apache/tomcat/embed/tomcat-embed-websocket/9.0.22/tomcat-embed-websocket-9.0.22.jar, file:/C:/Users/amdin/.m2/repository/org/hibernate/validator/hibernate-validator/6.0.17.Final/hibernate-validator-6.0.17.Final.jar, file:/C:/Users/amdin/.m2/repository/javax/validation/validation-api/2.0.1.Final/validation-api-2.0.1.Final.jar, file:/C:/Users/amdin/.m2/repository/org/jboss/logging/jboss-logging/3.3.2.Final/jboss-logging-3.3.2.Final.jar, file:/C:/Users/amdin/.m2/repository/com/fasterxml/classmate/1.4.0/classmate-1.4.0.jar, file:/C:/Users/amdin/.m2/repository/org/springframework/spring-web/5.1.9.RELEASE/spring-web-5.1.9.RELEASE.jar, file:/C:/Users/amdin/.m2/repository/org/springframework/spring-beans/5.1.9.RELEASE/spring-beans-5.1.9.RELEASE.jar, file:/C:/Users/amdin/.m2/repository/org/springframework/spring-webmvc/5.1.9.RELEASE/spring-webmvc-5.1.9.RELEASE.jar, file:/C:/Users/amdin/.m2/repository/org/springframework/spring-aop/5.1.9.RELEASE/spring-aop-5.1.9.RELEASE.jar, file:/C:/Users/amdin/.m2/repository/org/springframework/spring-context/5.1.9.RELEASE/spring-context-5.1.9.RELEASE.jar, file:/C:/Users/amdin/.m2/repository/org/springframework/spring-expression/5.1.9.RELEASE/spring-expression-5.1.9.RELEASE.jar, file:/C:/Users/amdin/.m2/repository/junit/junit/4.12/junit-4.12.jar, file:/C:/Users/amdin/.m2/repository/org/hamcrest/hamcrest-core/1.3/hamcrest-core-1.3.jar, file:/C:/Users/amdin/.m2/repository/org/springframework/spring-core/5.1.9.RELEASE/spring-core-5.1.9.RELEASE.jar, file:/C:/Users/amdin/.m2/repository/org/springframework/spring-jcl/5.1.9.RELEASE/spring-jcl-5.1.9.RELEASE.jar, file:/C:/Users/amdin/.m2/repository/org/slf4j/slf4j-api/1.7.26/slf4j-api-1.7.26.jar, file:/C:/Users/amdin/.m2/repository/org/slf4j/log4j-over-slf4j/1.7.26/log4j-over-slf4j-1.7.26.jar, file:/C:/Users/amdin/.m2/repository/org/slf4j/jcl-over-slf4j/1.7.26/jcl-over-slf4j-1.7.26.jar, file:/C:/Users/amdin/.m2/repository/com/github/pagehelper/pagehelper-spring-boot-starter/1.2.5/pagehelper-spring-boot-starter-1.2.5.jar, file:/C:/Users/amdin/.m2/repository/com/github/pagehelper/pagehelper-spring-boot-autoconfigure/1.2.5/pagehelper-spring-boot-autoconfigure-1.2.5.jar, file:/C:/Users/amdin/.m2/repository/org/apache/commons/commons-lang3/3.9/commons-lang3-3.9.jar, file:/C:/Users/amdin/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.8/jackson-databind-2.9.8.jar, file:/C:/Users/amdin/.m2/repository/com/fasterxml/jackson/core/jackson-annotations/2.9.0/jackson-annotations-2.9.0.jar, file:/C:/Users/amdin/.m2/repository/com/fasterxml/jackson/core/jackson-core/2.9.9/jackson-core-2.9.9.jar, file:/C:/Users/amdin/.m2/repository/org/apache/httpcomponents/httpclient/4.5.8/httpclient-4.5.8.jar, file:/C:/Users/amdin/.m2/repository/org/json/json/20180813/json-20180813.jar, file:/C:/Users/amdin/.m2/repository/commons-fileupload/commons-fileupload/1.4/commons-fileupload-1.4.jar, file:/C:/Users/amdin/.m2/repository/commons-io/commons-io/2.2/commons-io-2.2.jar, file:/C:/Users/amdin/.m2/repository/org/bouncycastle/bcprov-jdk15on/1.61/bcprov-jdk15on-1.61.jar, file:/C:/Users/amdin/.m2/repository/commons-codec/commons-codec/1.12/commons-codec-1.12.jar, file:/C:/Users/amdin/.m2/repository/jdom/jdom/1.0/jdom-1.0.jar, 
file:/C:/Users/amdin/.m2/repository/com/google/code/gson/gson/2.8.5/gson-2.8.5.jar, file:/C:/Users/amdin/.m2/repository/commons-logging/commons-logging/1.2/commons-logging-1.2.jar, file:/C:/Users/amdin/.m2/repository/dom4j/dom4j/1.6.1/dom4j-1.6.1.jar, file:/C:/Users/amdin/.m2/repository/xml-apis/xml-apis/1.4.01/xml-apis-1.4.01.jar, file:/C:/Users/amdin/.m2/repository/org/apache/httpcomponents/httpcore/4.4.11/httpcore-4.4.11.jar, file:/C:/Users/amdin/.m2/repository/org/apache/httpcomponents/httpmime/4.5.8/httpmime-4.5.8.jar, file:/C:/Users/amdin/.m2/repository/com/googlecode/json-simple/json-simple/1.1.1/json-simple-1.1.1.jar, file:/C:/Users/amdin/.m2/repository/org/apache/shiro/shiro-spring/1.3.2/shiro-spring-1.3.2.jar, file:/C:/Users/amdin/.m2/repository/org/apache/shiro/shiro-core/1.3.2/shiro-core-1.3.2.jar, file:/C:/Users/amdin/.m2/repository/commons-beanutils/commons-beanutils/1.8.3/commons-beanutils-1.8.3.jar, file:/C:/Users/amdin/.m2/repository/org/apache/shiro/shiro-web/1.3.2/shiro-web-1.3.2.jar, file:/C:/Users/amdin/.m2/repository/com/github/pagehelper/pagehelper/5.1.2/pagehelper-5.1.2.jar, file:/C:/Users/amdin/.m2/repository/com/github/jsqlparser/jsqlparser/1.0/jsqlparser-1.0.jar, file:/E:/IDEA2018.2.3/lib/idea_rt.jar] 17:55:12.492 [background-preinit] DEBUG org.hibernate.validator.messageinterpolation.ResourceBundleMessageInterpolator - Loaded expression factory via original TCCL 17:55:12.497 [background-preinit] DEBUG org.hibernate.validator.internal.engine.ValidatorFactoryImpl - HV000234: Using org.hibernate.validator.messageinterpolation.ResourceBundleMessageInterpolator as ValidatorFactory-scoped message interpolator. 17:55:12.498 [background-preinit] DEBUG org.hibernate.validator.internal.engine.ValidatorFactoryImpl - HV000234: Using org.hibernate.validator.internal.engine.resolver.TraverseAllTraversableResolver as ValidatorFactory-scoped traversable resolver. 17:55:12.498 [background-preinit] DEBUG org.hibernate.validator.internal.engine.ValidatorFactoryImpl - HV000234: Using org.hibernate.validator.internal.util.ExecutableParameterNameProvider as ValidatorFactory-scoped parameter name provider. 17:55:12.498 [background-preinit] DEBUG org.hibernate.validator.internal.engine.ValidatorFactoryImpl - HV000234: Using org.hibernate.validator.internal.engine.DefaultClockProvider as ValidatorFactory-scoped clock provider. 17:55:12.498 [background-preinit] DEBUG org.hibernate.validator.internal.engine.ValidatorFactoryImpl - HV000234: Using org.hibernate.validator.internal.engine.scripting.DefaultScriptEvaluatorFactory as ValidatorFactory-scoped script evaluator factory. 
17:55:12.755 [main] ERROR org.springframework.boot.SpringApplication - Application run failed
java.lang.IllegalStateException: Failed to load property source from location 'classpath:/application.yml'
    at org.springframework.boot.context.config.ConfigFileApplicationListener$Loader.load(ConfigFileApplicationListener.java:525)
    at org.springframework.boot.context.config.ConfigFileApplicationListener$Loader.loadForFileExtension(ConfigFileApplicationListener.java:474)
    at org.springframework.boot.context.config.ConfigFileApplicationListener$Loader.load(ConfigFileApplicationListener.java:444)
    at org.springframework.boot.context.config.ConfigFileApplicationListener$Loader.lambda$null$6(ConfigFileApplicationListener.java:426)
    at java.lang.Iterable.forEach(Iterable.java:75)
    at org.springframework.boot.context.config.ConfigFileApplicationListener$Loader.lambda$load$7(ConfigFileApplicationListener.java:426)
    at java.lang.Iterable.forEach(Iterable.java:75)
    at org.springframework.boot.context.config.ConfigFileApplicationListener$Loader.load(ConfigFileApplicationListener.java:423)
    at org.springframework.boot.context.config.ConfigFileApplicationListener$Loader.load(ConfigFileApplicationListener.java:322)
    at org.springframework.boot.context.config.ConfigFileApplicationListener.addPropertySources(ConfigFileApplicationListener.java:203)
    at org.springframework.boot.context.config.ConfigFileApplicationListener.postProcessEnvironment(ConfigFileApplicationListener.java:187)
    at org.springframework.boot.context.config.ConfigFileApplicationListener.onApplicationEnvironmentPreparedEvent(ConfigFileApplicationListener.java:177)
    at org.springframework.boot.context.config.ConfigFileApplicationListener.onApplicationEvent(ConfigFileApplicationListener.java:165)
    at org.springframework.context.event.SimpleApplicationEventMulticaster.doInvokeListener(SimpleApplicationEventMulticaster.java:172)
    at org.springframework.context.event.SimpleApplicationEventMulticaster.invokeListener(SimpleApplicationEventMulticaster.java:165)
    at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:139)
    at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:127)
    at org.springframework.boot.context.event.EventPublishingRunListener.environmentPrepared(EventPublishingRunListener.java:76)
    at org.springframework.boot.SpringApplicationRunListeners.environmentPrepared(SpringApplicationRunListeners.java:53)
    at org.springframework.boot.SpringApplication.prepareEnvironment(SpringApplication.java:341)
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:305)
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:1214)
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:1203)
    at com.example.kexie.KexieApplication.main(KexieApplication.java:14)
Caused by: org.yaml.snakeyaml.error.YAMLException: java.nio.charset.MalformedInputException: Input length = 1
    at org.yaml.snakeyaml.reader.StreamReader.update(StreamReader.java:218)
    at org.yaml.snakeyaml.reader.StreamReader.ensureEnoughData(StreamReader.java:176)
    at org.yaml.snakeyaml.reader.StreamReader.ensureEnoughData(StreamReader.java:171)
    at org.yaml.snakeyaml.reader.StreamReader.peek(StreamReader.java:126)
    at org.yaml.snakeyaml.scanner.ScannerImpl.scanToNextToken(ScannerImpl.java:1177)
    at org.yaml.snakeyaml.scanner.ScannerImpl.fetchMoreTokens(ScannerImpl.java:287)
    at org.yaml.snakeyaml.scanner.ScannerImpl.checkToken(ScannerImpl.java:227)
    at org.yaml.snakeyaml.parser.ParserImpl$ParseImplicitDocumentStart.produce(ParserImpl.java:195)
    at org.yaml.snakeyaml.parser.ParserImpl.peekEvent(ParserImpl.java:158)
    at org.yaml.snakeyaml.parser.ParserImpl.checkEvent(ParserImpl.java:148)
    at org.yaml.snakeyaml.composer.Composer.checkNode(Composer.java:72)
    at org.yaml.snakeyaml.constructor.BaseConstructor.checkData(BaseConstructor.java:112)
    at org.yaml.snakeyaml.Yaml$1.hasNext(Yaml.java:542)
    at org.springframework.beans.factory.config.YamlProcessor.process(YamlProcessor.java:160)
    at org.springframework.beans.factory.config.YamlProcessor.process(YamlProcessor.java:134)
    at org.springframework.boot.env.OriginTrackedYamlLoader.load(OriginTrackedYamlLoader.java:75)
    at org.springframework.boot.env.YamlPropertySourceLoader.load(YamlPropertySourceLoader.java:50)
    at org.springframework.boot.context.config.ConfigFileApplicationListener$Loader.loadDocuments(ConfigFileApplicationListener.java:543)
    at org.springframework.boot.context.config.ConfigFileApplicationListener$Loader.load(ConfigFileApplicationListener.java:498)
    ... 23 common frames omitted
Caused by: java.nio.charset.MalformedInputException: Input length = 1
    at java.nio.charset.CoderResult.throwException(CoderResult.java:281)
    at sun.nio.cs.StreamDecoder.implRead(StreamDecoder.java:339)
    at sun.nio.cs.StreamDecoder.read(StreamDecoder.java:178)
    at java.io.InputStreamReader.read(InputStreamReader.java:184)
    at org.yaml.snakeyaml.reader.UnicodeReader.read(UnicodeReader.java:125)
    at org.yaml.snakeyaml.reader.StreamReader.update(StreamReader.java:183)
    ... 41 common frames omitted

Process finished with exit code 1
```

## The yml file

![screenshot](https://img-ask.csdn.net/upload/201908/26/1566781610_481748.png)
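The `Caused by: java.nio.charset.MalformedInputException: Input length = 1` line is the real failure: Spring Boot reads `application.yml` as UTF-8, and this exception means the file on disk is not valid UTF-8 — typically it was saved as GBK/ANSI, or as UTF-8 with a BOM, which is common when the file was created with a Windows editor. Re-saving it as plain UTF-8 (in IDEA: File → Settings → Editor → File Encodings, or the encoding selector in the status bar) should let the property source load. As a one-off alternative, a small converter can rewrite the file; this is a minimal sketch assuming the file was saved as GBK (the path is a placeholder):

```
import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

// One-off helper: re-encode a GBK-saved application.yml as UTF-8 so
// SnakeYAML (which decodes config files as UTF-8) can parse it.
public class YmlReencode {
    public static void main(String[] args) throws Exception {
        Path yml = Paths.get("src/main/resources/application.yml"); // placeholder path
        // Read with the encoding the file was actually saved in (assumed GBK here) ...
        String content = new String(Files.readAllBytes(yml), Charset.forName("GBK"));
        // ... and write it back as plain UTF-8 without a BOM.
        Files.write(yml, content.getBytes(StandardCharsets.UTF_8));
    }
}
```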
json ssh ztree — no data returned to the page
```
package com.action;

import java.io.PrintWriter;
import java.util.ArrayList;
import java.util.List;

import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.apache.struts2.ServletActionContext;
import org.apache.struts2.convention.annotation.Action;
import org.apache.struts2.convention.annotation.Result;
import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.query.Query;
import org.springframework.beans.factory.annotation.Autowired;

import com.alibaba.fastjson.JSON;
import com.base.BaseService;
import com.entity.Administrator;
import com.entity.PowerEntity;
import com.instrument.Parameter;

public class IndexAction {

    @Autowired
    BaseService baseservice;
    @Autowired
    SessionFactory sessionFactory;

    List<String> pnamelist = new ArrayList<String>();
    List<ZtreeJS> ztreelist = null;

    @Action(value = "index", results = {
            @Result(name = "success", location = "/WEB-INF/jsp/index.jsp"),
            @Result(name = "defeated", location = "/login.jsp") })
    // login method
    public String index() {
        HttpServletRequest request = ServletActionContext.getRequest();
        Parameter.CoustomParameter(Administrator.class, request);
        String hql = "from admin where account=? and apassword=? ";
        Session session = sessionFactory.openSession();
        Query<Administrator> query = session.createQuery(hql);
        query.setParameter(0, request.getParameter("username"));
        query.setParameter(1, request.getParameter("password"));
        List<Administrator> adminlist = query.list();
        if (adminlist.size() > 0) {
            System.err.println("login succeeded");
            session.close();
            for (Administrator administrator : adminlist) {
                qureyPostId(administrator.getAid());
            }
            return "success";
        } else {
            System.err.println("login failed");
            session.close();
            return "defeated";
        }
    }

    public void qureyPostId(int uid) {
        List<Integer> pidlist = new ArrayList();
        Session session = sessionFactory.openSession();
        Query<Object[]> query = session.createQuery("select r.rid from role r join r.listadmin u where u.aid=" + uid);
        List<Object[]> userlist = query.list();
        for (Object ob : userlist) {
            pidlist.add((Integer) ob);
            session.close();
        }
        qureyjuriId(pidlist);
        session.close();
    }

    public void qureyjuriId(List<Integer> pidlist) {
        List<Integer> powerlist = new ArrayList();
        Session session = sessionFactory.openSession();
        int pidlsitsize = pidlist.size() - 1;
        Query<Object[]> query = session.createSQLQuery("select pid from role_power p where rid=" + pidlist.get(pidlsitsize));
        List<Object[]> userlist = query.list();
        for (Object ob : userlist) {
            powerlist.add((Integer) ob);
            session.close();
        }
        qureyjuriName(powerlist);
        session.close();
    }

    public void qureyjuriName(List<Integer> jidlist) {
        pnamelist = new ArrayList();
        Session session = sessionFactory.openSession();
        for (Integer integer : jidlist) {
            Query<PowerEntity> query = session.createQuery("from power p where p.pid=" + integer);
            List<PowerEntity> userlist = query.list();
            for (PowerEntity ob : userlist) {
                pnamelist.add(ob.getPname());
            }
        }
        for (String integer : pnamelist) {
            System.out.println(integer);
        }
        session.close();
    }

    @Action(value = "ztree")
    public void ztree() {
        HttpServletResponse response = ServletActionContext.getResponse();
        List<ZtreeJS> ztreelist = new ArrayList<ZtreeJS>();
        for (String string : pnamelist) {
            ZtreeJS z = new ZtreeJS();
            z.setName(string);
            ztreelist.add(z);
        }
        String data = JSON.toJSONString(ztreelist);
        try {
            response.setCharacterEncoding("UTF-8");
            PrintWriter out = response.getWriter();
            out.write(data);
            out.flush();
            out.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
```

The backend query returns data, but the frontend receives nothing.
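A likely cause, given that the backend prints the names correctly: Struts2 builds a new `IndexAction` instance for every request, so the `pnamelist` instance field filled by `index()` during the login request is a fresh, empty list by the time the browser sends the separate `ztree` request — `ztree()` then serializes an empty array and zTree has nothing to render. One minimal fix is to stash the names in the `HttpSession` during login and read them back in `ztree()`; a sketch under that assumption (the attribute name `"pnames"` is arbitrary):

```
// Minimal sketch inside IndexAction.

// 1) At the end of qureyjuriName(...), after pnamelist has been filled:
//        ServletActionContext.getRequest().getSession()
//                .setAttribute("pnames", pnamelist);

// 2) ztree() then reads the list from the session instead of the instance field:
@Action(value = "ztree")
public void ztree() {
    HttpServletRequest request = ServletActionContext.getRequest();
    HttpServletResponse response = ServletActionContext.getResponse();

    @SuppressWarnings("unchecked")
    List<String> names = (List<String>) request.getSession().getAttribute("pnames");
    if (names == null) {
        names = new ArrayList<String>(); // session expired or index() never ran
    }

    List<ZtreeJS> nodes = new ArrayList<ZtreeJS>();
    for (String name : names) {
        ZtreeJS z = new ZtreeJS();
        z.setName(name);
        nodes.add(z);
    }

    try {
        response.setContentType("application/json;charset=UTF-8");
        PrintWriter out = response.getWriter();
        out.write(JSON.toJSONString(nodes));
        out.flush();
    } catch (Exception e) {
        e.printStackTrace();
    }
}
```

Setting the response `Content-Type` to `application/json` also helps the jQuery/zTree side parse the body as JSON instead of plain text.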
Javaweb province/city cascading dropdowns — fetching JSON data yields undefined
```
public class Province {
    private int ProvinceID;
    private String Name;

    public int getProvinceID() {
        return ProvinceID;
    }

    public void setProvinceID(int provinceID) {
        ProvinceID = provinceID;
    }

    public String getName() {
        return Name;
    }

    public void setName(String name) {
        Name = name;
    }

    public Province() {
        super();
    }

    public Province(int provinceID, String name) {
        super();
        this.ProvinceID = provinceID;
        this.Name = name;
    }
}
```

```
package com.web.servlet;

import java.io.IOException;
import java.sql.SQLException;
import java.util.List;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import com.web.domain.Province;
import com.web.service.ProvinceService;

import net.sf.json.JSONArray;

/**
 * Province/city cascade - provinces
 * Servlet implementation class ProvinceSelectServlet
 */
public class ProvinceSelectServlet extends HttpServlet {
    private static final long serialVersionUID = 1L;

    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        // Call the service to query all provinces
        List<Province> list = null;
        try {
            list = new ProvinceService().findAll();
        } catch (SQLException e) {
            e.printStackTrace();
        }
        // Write all provinces back as JSON
        response.setContentType("text/html;charset=utf-8");
        if (list != null && list.size() > 0) {
            response.getWriter().print(JSONArray.fromObject(list));
        }
    }

    protected void doPost(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        doGet(request, response);
    }
}
```

```
package com.web.service;

import java.sql.SQLException;
import java.util.List;

import com.web.dao.Provincedao;
import com.web.domain.Province;

public class ProvinceService {
    public List<Province> findAll() throws SQLException {
        return new Provincedao().findAll();
    }
}
```

```
package com.web.dao;

import java.sql.SQLException;
import java.util.List;

import org.apache.commons.dbutils.QueryRunner;
import org.apache.commons.dbutils.handlers.BeanListHandler;

import com.web.domain.Province;

import Utils.DatasourceUtils;

public class Provincedao {
    public List<Province> findAll() throws SQLException {
        QueryRunner qr = new QueryRunner(DatasourceUtils.getDatasource());
        String sql = "select * from province";
        return qr.query(sql, new BeanListHandler<Province>(Province.class));
    }
}
```

```
<%@ page language="java" contentType="text/html; charset=UTF-8" pageEncoding="UTF-8"%>
<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
<title>Province/city cascade</title>
<script type="text/javascript" src="${pageContext.request.contextPath }/js/jquery-1.11.3.min.js"></script>
<script type="text/javascript">
    $(function () {
        // On page load, query all provinces
        $.get("/Day15/ProvinceSelectServlet", function (obj) {
            //alert(obj)
            var $pro = $("#proId");
            $(obj).each(function () {
                $pro.append($("<option value=" + this.ProvinceID + "> " + this.Name + "</option>"));
            });
        }, "json");
    })
</script>
</head>
<body>
    <select id="proId" name="province">
        <option>-Province-</option>
    </select>
    <select id="citId" name="city">
        <option>-City-</option>
    </select>
</body>
</html>
```
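The `undefined` most likely comes from property-name casing: json-lib (`JSONArray.fromObject`) derives JSON keys from the getters using JavaBean rules, so `getProvinceID()`/`getName()` are serialized as `provinceID` and `name`, not `ProvinceID` and `Name`, and `this.ProvinceID` in the callback is therefore `undefined`. Either read `this.provinceID` and `this.name` in the JSP, or rename the bean fields to the standard lower-camel style so the keys are unambiguous; a sketch of the bean rewritten that way (behavior otherwise unchanged):

```
// Sketch: bean renamed to standard JavaBean style so the serialized JSON keys
// ("provinceID", "name") are predictable and match what the page reads.
public class Province {
    private int provinceID;
    private String name;

    public Province() {
    }

    public Province(int provinceID, String name) {
        this.provinceID = provinceID;
        this.name = name;
    }

    public int getProvinceID() {
        return provinceID;
    }

    public void setProvinceID(int provinceID) {
        this.provinceID = provinceID;
    }

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }
}
```

A quick `alert(JSON.stringify(obj))` in the callback, or a look at the response body in the browser's network tab, confirms the actual key casing the server sends.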
Failed to retrieve data from /webhdfs/v1/?op=LISTSTATUS: Server Error, and files cannot be put to HDFS either
Hadoop version is 3.1, Ubuntu is 18.

Problem 1: browsing the HDFS directory in the web UI shows: Failed to retrieve data from /webhdfs/v1/?op=LISTSTATUS: Server Error

Problem 2: the namenode log is as follows:
```
438 WARN org.eclipse.jetty.servlet.ServletHandler: Error for /webhdfs/v1/
java.lang.NoClassDefFoundError: javax/activation/DataSource
    at com.sun.xml.bind.v2.model.impl.RuntimeBuiltinLeafInfoImpl.<clinit>(RuntimeBuiltinLeafInfoImpl.java:457)
    at com.sun.xml.bind.v2.model.impl.RuntimeTypeInfoSetImpl.<init>(RuntimeTypeInfoSetImpl.java:65)
    at com.sun.xml.bind.v2.model.impl.RuntimeModelBuilder.createTypeInfoSet(RuntimeModelBuilder.java:133)
    at com.sun.xml.bind.v2.model.impl.RuntimeModelBuilder.createTypeInfoSet(RuntimeModelBuilder.java:85)
    at com.sun.xml.bind.v2.model.impl.ModelBuilder.<init>(ModelBuilder.java:156)
    at com.sun.xml.bind.v2.model.impl.RuntimeModelBuilder.<init>(RuntimeModelBuilder.java:93)
    at com.sun.xml.bind.v2.runtime.JAXBContextImpl.getTypeInfoSet(JAXBContextImpl.java:473)
    at com.sun.xml.bind.v2.runtime.JAXBContextImpl.<init>(JAXBContextImpl.java:319)
    at com.sun.xml.bind.v2.runtime.JAXBContextImpl$JAXBContextBuilder.build(JAXBContextImpl.java:1170)
    at com.sun.xml.bind.v2.ContextFactory.createContext(ContextFactory.java:145)
    at com.sun.xml.bind.v2.ContextFactory.createContext(ContextFactory.java:236)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:566)
    at javax.xml.bind.ContextFinder.newInstance(ContextFinder.java:186)
    at javax.xml.bind.ContextFinder.newInstance(ContextFinder.java:146)
    at javax.xml.bind.ContextFinder.find(ContextFinder.java:350)
    at javax.xml.bind.JAXBContext.newInstance(JAXBContext.java:446)
    at javax.xml.bind.JAXBContext.newInstance(JAXBContext.java:409)
    at com.sun.jersey.server.impl.wadl.WadlApplicationContextImpl.<init>(WadlApplicationContextImpl.java:103)
    at com.sun.jersey.server.impl.wadl.WadlFactory.init(WadlFactory.java:100)
    at com.sun.jersey.server.impl.application.RootResourceUriRules.initWadl(RootResourceUriRules.java:169)
    at com.sun.jersey.server.impl.application.RootResourceUriRules.<init>(RootResourceUriRules.java:106)
    at com.sun.jersey.server.impl.application.WebApplicationImpl._initiate(WebApplicationImpl.java:1359)
    at com.sun.jersey.server.impl.application.WebApplicationImpl.access$700(WebApplicationImpl.java:180)
    at com.sun.jersey.server.impl.application.WebApplicationImpl$13.f(WebApplicationImpl.java:799)
    at com.sun.jersey.server.impl.application.WebApplicationImpl$13.f(WebApplicationImpl.java:795)
    at com.sun.jersey.spi.inject.Errors.processWithErrors(Errors.java:193)
    at com.sun.jersey.server.impl.application.WebApplicationImpl.initiate(WebApplicationImpl.java:795)
    at com.sun.jersey.server.impl.application.WebApplicationImpl.initiate(WebApplicationImpl.java:790)
    at com.sun.jersey.spi.container.servlet.ServletContainer.initiate(ServletContainer.java:509)
    at com.sun.jersey.spi.container.servlet.ServletContainer$InternalWebComponent.initiate(ServletContainer.java:339)
    at com.sun.jersey.spi.container.servlet.WebComponent.load(WebComponent.java:605)
    at com.sun.jersey.spi.container.servlet.WebComponent.init(WebComponent.java:207)
    at com.sun.jersey.spi.container.servlet.ServletContainer.init(ServletContainer.java:394)
    at com.sun.jersey.spi.container.servlet.ServletContainer.init(ServletContainer.java:577)
    at javax.servlet.GenericServlet.init(GenericServlet.java:244)
    at org.eclipse.jetty.servlet.ServletHolder.initServlet(ServletHolder.java:643)
    at org.eclipse.jetty.servlet.ServletHolder.getServlet(ServletHolder.java:499)
    at org.eclipse.jetty.servlet.ServletHolder.ensureInstance(ServletHolder.java:791)
    at org.eclipse.jetty.servlet.ServletHolder.prepare(ServletHolder.java:776)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:579)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:226)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1180)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:512)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1112)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:119)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)
    at org.eclipse.jetty.server.Server.handle(Server.java:539)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:333)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:251)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:283)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:108)
    at org.eclipse.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:93)
    at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.executeProduceConsume(ExecuteProduceConsume.java:303)
    at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.produceConsume(ExecuteProduceConsume.java:148)
    at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:136)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:671)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:589)
    at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: java.lang.ClassNotFoundException: javax.activation.DataSource
    at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:583)
    at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
    at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)
    ... 65 more
2019-06-18 15:35:01,950 WARN org.eclipse.jetty.servlet.ServletHandler: /webhdfs/v1/
java.lang.NullPointerException
    at com.sun.jersey.spi.container.ContainerRequest.<init>(ContainerRequest.java:189)
    at com.sun.jersey.spi.container.servlet.WebComponent.createRequest(WebComponent.java:446)
    at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:373)
    at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
    at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:848)
    at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1772)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:644)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:592)
    at org.apache.hadoop.hdfs.web.AuthFilter.doFilter(AuthFilter.java:90)
    at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759)
    at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1609)
    at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759)
    at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
    at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:582)
    [... remaining Jetty server frames identical to the trace above, down to java.base/java.lang.Thread.run(Thread.java:834) ...]
2019-06-18 15:39:17,698 INFO org.apache.hadoop.hdfs.server.namenode.FSEditLog: Number of transactions: 3 Total time for transactions(ms): 56 Number of transactions batched in Syncs: 0 Number of syncs: 2 SyncTimes(ms): 22
2019-06-18 15:39:25,202 WARN org.eclipse.jetty.servlet.ServletHandler: /webhdfs/v1/
java.lang.NullPointerException
    [... identical stack trace repeated ...]
2019-06-18 15:39:45,858 WARN org.eclipse.jetty.servlet.ServletHandler: /webhdfs/v1/
java.lang.NullPointerException
    [... identical stack trace repeated ...]
```

Datanode log attached:
```
2019-06-18 14:52:36,785 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG:   host = gx-virtual-machine/127.0.1.1
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 3.2.0
STARTUP_MSG:   classpath = /usr/local/hadoop/etc/hadoop:[... several hundred jars under /usr/local/hadoop/share/hadoop/{common,hdfs,mapreduce,yarn} omitted ...]
STARTUP_MSG:   build = https://github.com/apache/hadoop.git -r e97acb3bd8f3befd27418996fa5d4b50bf2e17bf; compiled by 'sunilg' on 2019-01-08T06:08Z
STARTUP_MSG:   java = 11.0.3
************************************************************/
2019-06-18 14:52:36,863 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT]
2019-06-18 14:52:41,503 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/usr/local/hadoop/tmp/dfs/data
2019-06-18 14:52:42,424 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: Loaded properties from hadoop-metrics2.properties
2019-06-18 14:52:44,123 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s).
2019-06-18 14:52:44,123 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started
2019-06-18 14:52:46,504 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2019-06-18 14:52:46,511 INFO org.apache.hadoop.hdfs.server.datanode.BlockScanner: Initialized block scanner with targetBytesPerSec 1048576
2019-06-18 14:52:46,566 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is gx-virtual-machine
2019-06-18 14:52:46,567 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2019-06-18 14:52:46,592 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting DataNode with maxLockedMemory = 0
2019-06-18 14:52:46,798 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened streaming server at /0.0.0.0:9866
2019-06-18 14:52:46,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwidth is 10485760 bytes/s
2019-06-18 14:52:46,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50
2019-06-18 14:52:47,198 INFO org.eclipse.jetty.util.log: Logging initialized @15269ms
2019-06-18 14:52:48,022 INFO org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets.
2019-06-18 14:52:48,062 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.datanode is not defined
2019-06-18 14:52:48,161 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2019-06-18 14:52:48,174 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context datanode
2019-06-18 14:52:48,174 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
2019-06-18 14:52:48,174 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context logs
2019-06-18 14:52:48,556 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 44121
2019-06-18 14:52:48,580 INFO org.eclipse.jetty.server.Server: jetty-9.3.24.v20180605, build timestamp: 2018-06-06T01:11:56+08:00, git hash: 84205aa28f11a4f31f2a3b86d1bba2cc8ab69827
2019-06-18 14:52:49,011 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@7876d598{/logs,file:///usr/local/hadoop/logs/,AVAILABLE}
2019-06-18 14:52:49,018 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@5af28b27{/static,file:///usr/local/hadoop/share/hadoop/hdfs/webapps/static/,AVAILABLE}
2019-06-18 14:52:50,151 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.w.WebAppContext@547e29a4{/,file:///usr/local/hadoop/share/hadoop/hdfs/webapps/datanode/,AVAILABLE}{/datanode}
2019-06-18 14:52:50,242 INFO org.eclipse.jetty.server.AbstractConnector: Started ServerConnector@6f45a1a0{HTTP/1.1,[http/1.1]}{localhost:44121}
2019-06-18 14:52:50,243 INFO org.eclipse.jetty.server.Server: Started @18329ms
2019-06-18 14:52:52,165 INFO org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer: Listening HTTP traffic on /0.0.0.0:9864
2019-06-18 14:52:52,273 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: dnUserName = hadoop
2019-06-18 14:52:52,273 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: supergroup = supergroup
2019-06-18 14:52:52,242 INFO org.apache.hadoop.util.JvmPauseMonitor: Starting JVM pause monitor
2019-06-18 14:52:52,720 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue, queueCapacity: 1000, scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler, ipcBackoff: false.
2019-06-18 14:52:52,880 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 9867
2019-06-18 14:52:54,839 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened IPC server at /0.0.0.0:9867
2019-06-18 14:52:55,160 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Refresh request received for nameservices: null
2019-06-18 14:52:55,365 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting BPOfferServices for nameservices: <default>
2019-06-18 14:52:55,418 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool <registering> (Datanode Uuid unassigned) service to localhost/127.0.0.1:9000 starting to offer service
2019-06-18 14:52:55,532 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting
2019-06-18 14:52:55,561 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 9867: starting
2019-06-18 14:52:58,314 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Acknowledging ACTIVE Namenode during handshakeBlock pool <registering> (Datanode Uuid unassigned) service to localhost/127.0.0.1:9000
2019-06-18 14:52:58,329 INFO org.apache.hadoop.hdfs.server.common.Storage: Using 1 threads to upgrade data directories (dfs.datanode.parallel.volumes.load.threads.num=1, dataDirs=1)
2019-06-18 14:52:58,458 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /usr/local/hadoop/tmp/dfs/data/in_use.lock acquired by nodename 55815@gx-virtual-machine
2019-06-18 14:52:58,478 INFO org.apache.hadoop.hdfs.server.common.Storage: Storage directory with location [DISK]file:/usr/local/hadoop/tmp/dfs/data is not formatted for namespace 317473294. Formatting...
2019-06-18 14:52:58,479 INFO org.apache.hadoop.hdfs.server.common.Storage: Generated new storageID DS-8b3e1e6d-135a-433a-93bb-3e62598daf5e for directory /usr/local/hadoop/tmp/dfs/data
2019-06-18 14:52:58,749 INFO org.apache.hadoop.hdfs.server.common.Storage: Analyzing storage directories for bpid BP-200946205-127.0.1.1-1560840480894
2019-06-18 14:52:58,750 INFO org.apache.hadoop.hdfs.server.common.Storage: Locking is disabled for /usr/local/hadoop/tmp/dfs/data/current/BP-200946205-127.0.1.1-1560840480894
2019-06-18 14:52:58,753 INFO org.apache.hadoop.hdfs.server.common.Storage: Block pool storage directory for location [DISK]file:/usr/local/hadoop/tmp/dfs/data and block pool id BP-200946205-127.0.1.1-1560840480894 is not formatted. Formatting ...
```
2019-06-18 14:52:58,753 INFO org.apache.hadoop.hdfs.server.common.Storage: Formatting block pool BP-200946205-127.0.1.1-1560840480894 directory /usr/local/hadoop/tmp/dfs/data/current/BP-200946205-127.0.1.1-1560840480894/current 2019-06-18 14:52:58,772 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Setting up storage: nsid=317473294;bpid=BP-200946205-127.0.1.1-1560840480894;lv=-57;nsInfo=lv=-65;cid=CID-eb45654d-0bc6-4348-b02f-e03603e1ae37;nsid=317473294;c=1560840480894;bpid=BP-200946205-127.0.1.1-1560840480894;dnuuid=null 2019-06-18 14:52:58,776 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Generated and persisted new Datanode UUID 6a2049c6-1a18-437a-97bd-51c5bb65a639 2019-06-18 14:52:59,549 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added new volume: DS-8b3e1e6d-135a-433a-93bb-3e62598daf5e 2019-06-18 14:52:59,553 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Added volume - [DISK]file:/usr/local/hadoop/tmp/dfs/data, StorageType: DISK 2019-06-18 14:52:59,615 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Registered FSDatasetState MBean 2019-06-18 14:52:59,680 INFO org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker: Scheduling a check for /usr/local/hadoop/tmp/dfs/data 2019-06-18 14:52:59,801 INFO org.apache.hadoop.hdfs.server.datanode.checker.DatasetVolumeChecker: Scheduled health check for volume /usr/local/hadoop/tmp/dfs/data 2019-06-18 14:52:59,809 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Adding block pool BP-200946205-127.0.1.1-1560840480894 2019-06-18 14:52:59,839 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Scanning block pool BP-200946205-127.0.1.1-1560840480894 on volume /usr/local/hadoop/tmp/dfs/data... 2019-06-18 14:53:00,166 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time taken to scan block pool BP-200946205-127.0.1.1-1560840480894 on /usr/local/hadoop/tmp/dfs/data: 327ms 2019-06-18 14:53:00,168 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Total time to scan all replicas for block pool BP-200946205-127.0.1.1-1560840480894: 359ms 2019-06-18 14:53:00,181 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Adding replicas to map for block pool BP-200946205-127.0.1.1-1560840480894 on volume /usr/local/hadoop/tmp/dfs/data... 
2019-06-18 14:53:00,181 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.BlockPoolSlice: Replica Cache file: /usr/local/hadoop/tmp/dfs/data/current/BP-200946205-127.0.1.1-1560840480894/current/replicas doesn't exist 2019-06-18 14:53:00,198 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Time to add replicas to map for block pool BP-200946205-127.0.1.1-1560840480894 on volume /usr/local/hadoop/tmp/dfs/data: 17ms 2019-06-18 14:53:00,198 INFO org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl: Total time to add all replicas to map for block pool BP-200946205-127.0.1.1-1560840480894: 27ms 2019-06-18 14:53:00,208 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: Now scanning bpid BP-200946205-127.0.1.1-1560840480894 on volume /usr/local/hadoop/tmp/dfs/data 2019-06-18 14:53:00,221 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: VolumeScanner(/usr/local/hadoop/tmp/dfs/data, DS-8b3e1e6d-135a-433a-93bb-3e62598daf5e): finished scanning block pool BP-200946205-127.0.1.1-1560840480894 2019-06-18 14:53:00,401 INFO org.apache.hadoop.hdfs.server.datanode.VolumeScanner: VolumeScanner(/usr/local/hadoop/tmp/dfs/data, DS-8b3e1e6d-135a-433a-93bb-3e62598daf5e): no suitable block pools found to scan. Waiting 1814399799 ms. 2019-06-18 14:53:00,418 INFO org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: Periodic Directory Tree Verification scan starting at 2019/6/18 下午8:05 with interval of 21600000ms 2019-06-18 14:53:00,463 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool BP-200946205-127.0.1.1-1560840480894 (Datanode Uuid 6a2049c6-1a18-437a-97bd-51c5bb65a639) service to localhost/127.0.0.1:9000 beginning handshake with NN 2019-06-18 14:53:00,825 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool Block pool BP-200946205-127.0.1.1-1560840480894 (Datanode Uuid 6a2049c6-1a18-437a-97bd-51c5bb65a639) service to localhost/127.0.0.1:9000 successfully registered with NN 2019-06-18 14:53:00,825 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: For namenode localhost/127.0.0.1:9000 using BLOCKREPORT_INTERVAL of 21600000msec CACHEREPORT_INTERVAL of 10000msec Initial delay: 0msec; heartBeatInterval=3000 2019-06-18 14:53:01,524 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Successfully sent block report 0xb210af820fa10abf, containing 1 storage report(s), of which we sent 1. The reports had 0 total blocks and used 1 RPC(s). This took 19 msec to generate and 231 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5. 
2019-06-18 14:53:01,525 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Got finalize command for block pool BP-200946205-127.0.1.1-1560840480894 2019-06-18 15:44:37,567 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741825_1001 src: /127.0.0.1:34774 dest: /127.0.0.1:9866 2019-06-18 15:44:37,733 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:34774, dest: /127.0.0.1:9866, bytes: 8260, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1864191814_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741825_1001, duration(ns): 75831098 2019-06-18 15:44:37,737 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741825_1001, type=LAST_IN_PIPELINE terminating 2019-06-18 15:44:38,256 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741826_1002 src: /127.0.0.1:34776 dest: /127.0.0.1:9866 2019-06-18 15:44:38,266 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:34776, dest: /127.0.0.1:9866, bytes: 953, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1864191814_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741826_1002, duration(ns): 5252820 2019-06-18 15:44:38,266 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741826_1002, type=LAST_IN_PIPELINE terminating 2019-06-18 15:44:38,340 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741827_1003 src: /127.0.0.1:34778 dest: /127.0.0.1:9866 2019-06-18 15:44:38,365 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:34778, dest: /127.0.0.1:9866, bytes: 11392, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1864191814_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741827_1003, duration(ns): 19816531 2019-06-18 15:44:38,372 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741827_1003, type=LAST_IN_PIPELINE terminating 2019-06-18 15:44:38,428 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741828_1004 src: /127.0.0.1:34780 dest: /127.0.0.1:9866 2019-06-18 15:44:38,455 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:34780, dest: /127.0.0.1:9866, bytes: 1061, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1864191814_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741828_1004, duration(ns): 9820674 2019-06-18 15:44:38,464 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741828_1004, type=LAST_IN_PIPELINE terminating 2019-06-18 15:44:38,517 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741829_1005 src: /127.0.0.1:34782 dest: /127.0.0.1:9866 2019-06-18 15:44:38,537 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:34782, dest: /127.0.0.1:9866, bytes: 620, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1864191814_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: 
BP-200946205-127.0.1.1-1560840480894:blk_1073741829_1005, duration(ns): 9424051 2019-06-18 15:44:38,537 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741829_1005, type=LAST_IN_PIPELINE terminating 2019-06-18 15:44:38,569 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741830_1006 src: /127.0.0.1:34784 dest: /127.0.0.1:9866 2019-06-18 15:44:38,579 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:34784, dest: /127.0.0.1:9866, bytes: 3518, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1864191814_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741830_1006, duration(ns): 6662498 2019-06-18 15:44:38,579 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741830_1006, type=LAST_IN_PIPELINE terminating 2019-06-18 15:44:38,642 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741831_1007 src: /127.0.0.1:34786 dest: /127.0.0.1:9866 2019-06-18 15:44:38,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:34786, dest: /127.0.0.1:9866, bytes: 682, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1864191814_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741831_1007, duration(ns): 5047916 2019-06-18 15:44:38,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741831_1007, type=LAST_IN_PIPELINE terminating 2019-06-18 15:44:38,713 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741832_1008 src: /127.0.0.1:34788 dest: /127.0.0.1:9866 2019-06-18 15:44:38,726 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:34788, dest: /127.0.0.1:9866, bytes: 758, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1864191814_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741832_1008, duration(ns): 8532382 2019-06-18 15:44:38,727 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741832_1008, type=LAST_IN_PIPELINE terminating 2019-06-18 15:44:38,789 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741833_1009 src: /127.0.0.1:34790 dest: /127.0.0.1:9866 2019-06-18 15:44:38,807 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:34790, dest: /127.0.0.1:9866, bytes: 690, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1864191814_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741833_1009, duration(ns): 5589094 2019-06-18 15:44:38,813 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741833_1009, type=LAST_IN_PIPELINE terminating 2019-06-19 09:54:01,961 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741834_1010 src: /127.0.0.1:36578 dest: /127.0.0.1:9866 2019-06-19 09:54:02,003 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:36578, dest: /127.0.0.1:9866, bytes: 8260, op: HDFS_WRITE, cliID: 
DFSClient_NONMAPREDUCE_853656761_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741834_1010, duration(ns): 32739756 2019-06-19 09:54:02,011 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741834_1010, type=LAST_IN_PIPELINE terminating 2019-06-19 09:54:02,125 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741835_1011 src: /127.0.0.1:36580 dest: /127.0.0.1:9866 2019-06-19 09:54:02,154 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:36580, dest: /127.0.0.1:9866, bytes: 953, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_853656761_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741835_1011, duration(ns): 12137675 2019-06-19 09:54:02,154 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741835_1011, type=LAST_IN_PIPELINE terminating 2019-06-19 09:54:02,235 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741836_1012 src: /127.0.0.1:36582 dest: /127.0.0.1:9866 2019-06-19 09:54:02,249 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:36582, dest: /127.0.0.1:9866, bytes: 11392, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_853656761_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741836_1012, duration(ns): 8740891 2019-06-19 09:54:02,249 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741836_1012, type=LAST_IN_PIPELINE terminating 2019-06-19 09:54:02,307 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741837_1013 src: /127.0.0.1:36584 dest: /127.0.0.1:9866 2019-06-19 09:54:02,322 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:36584, dest: /127.0.0.1:9866, bytes: 1061, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_853656761_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741837_1013, duration(ns): 8680367 2019-06-19 09:54:02,323 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741837_1013, type=LAST_IN_PIPELINE terminating 2019-06-19 09:54:02,399 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741838_1014 src: /127.0.0.1:36586 dest: /127.0.0.1:9866 2019-06-19 09:54:02,413 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:36586, dest: /127.0.0.1:9866, bytes: 620, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_853656761_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741838_1014, duration(ns): 8474258 2019-06-19 09:54:02,413 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741838_1014, type=LAST_IN_PIPELINE terminating 2019-06-19 09:54:02,491 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741839_1015 src: /127.0.0.1:36588 dest: /127.0.0.1:9866 2019-06-19 09:54:02,502 INFO 
org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:36588, dest: /127.0.0.1:9866, bytes: 3518, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_853656761_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741839_1015, duration(ns): 6946259 2019-06-19 09:54:02,503 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741839_1015, type=LAST_IN_PIPELINE terminating 2019-06-19 09:54:02,560 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741840_1016 src: /127.0.0.1:36590 dest: /127.0.0.1:9866 2019-06-19 09:54:02,571 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:36590, dest: /127.0.0.1:9866, bytes: 682, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_853656761_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741840_1016, duration(ns): 6602106 2019-06-19 09:54:02,571 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder: BP-200946205-127.0.1.1-1560840480894:blk_1073741840_1016, type=LAST_IN_PIPELINE terminating 2019-06-19 09:54:02,635 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-200946205-127.0.1.1-1560840480894:blk_1073741841_1017 src: /127.0.0.1:36592 dest: /127.0.0.1:9866 2019-06-19 09:54:02,650 INFO org.apache.hadoop.hdfs.server.datanode.DataNode.clienttrace: src: /127.0.0.1:36592, dest: /127.0.0.1:9866, bytes: 758, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_853656761_1, offset: 0, srvID: 6a2049c6-1a18-437a-97bd-51c5bb65a639, blockid: BP-200946205-127.0.1.1-1560840480894:blk_1073741841_1017, duration(ns): 9690339 2019-06-19 09:54:02,654 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: PacketResponder:
Guys, I'm at my wit's end: getting an image from a JSON image URL into an ImageView
First, here's my code. The strings come through fine; it's only the image that never loads. What should I do? Please take a look and give me some advice, I'd be very grateful.

package textview.exam;

import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.os.Bundle;
import android.support.annotation.Nullable;
import android.support.v4.app.Fragment;
import android.util.Log;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;
import android.widget.ListView;
import android.widget.SimpleAdapter;
import com.baidu.apistore.sdk.ApiCallBack;
import com.baidu.apistore.sdk.ApiStoreSDK;
import com.baidu.apistore.sdk.network.Parameters;
import org.json.JSONArray;
import org.json.JSONException;
import org.json.JSONObject;
import java.io.IOException;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class FragmentMainchatnews extends Fragment {
    private String[] title;     // headlines
    private String[] abs;       // news summaries
    private String[] url;       // detail-page URLs
    private String[] datatime;  // publish times
    private String[] img_url;   // thumbnail URLs
    Bitmap image;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
    }

    @Override
    public View onCreateView(LayoutInflater inflater, @Nullable ViewGroup container,
                             @Nullable Bundle savedInstanceState) {
        View view = inflater.inflate(R.layout.main_chat_news, null);
        apiTest(view);
        return view;
    }

    private void apiTest(final View view) {
        Parameters para = new Parameters();
        // para.put("keyword", "娱乐");
        ApiStoreSDK.execute("http://apis.baidu.com/songshuxiansheng/news/news",
                ApiStoreSDK.GET, para, new ApiCallBack() {
            @Override
            public void onSuccess(int status, String responseString) {
                Log.i("连接状态:", "连接成功");
                try {
                    JSONObject dataJson = new JSONObject(responseString);
                    JSONArray data = dataJson.getJSONArray("retData");
                    // a List of Maps, one Map per ListView row
                    List<Map<String, Object>> listItems = new ArrayList<Map<String, Object>>();
                    // for (int i = 0; i < 5; i++) {
                    JSONObject info = data.getJSONObject(0);
                    image = getBitmap("http://p1.pstatp.com/list/9831/218724483");
                    // Bitmap bit = BitmapFactory.decodeFile(info.getString("image_url"));
                    // bit.compress(Bitmap.CompressFormat.JPEG, 100, stream);
                    Map<String, Object> listItem = new HashMap<String, Object>();
                    listItem.put("title", info.getString("title"));
                    listItem.put("image", image);
                    listItem.put("abs", info.getString("abstract"));
                    listItems.add(listItem);
                    // }
                    // create a SimpleAdapter
                    SimpleAdapter simpleAdapter = new SimpleAdapter(getActivity(), listItems,
                            R.layout.news_simple_item,
                            new String[]{"image", "title", "abs"},
                            new int[]{R.id.image, R.id.title, R.id.abs});
                    ListView list = (ListView) view.findViewById(R.id.chat_news);
                    // attach the adapter to the ListView
                    list.setAdapter(simpleAdapter);
                } catch (JSONException e) {
                    Log.i("获取到数据:", responseString);
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }

            @Override
            public void onComplete() {
                Log.i("sdkdemo", "onComplete");
            }

            @Override
            public void onError(int status, String responseString, Exception e) {
                Log.i("sdkdemo", "onError, status: " + status);
                Log.i("sdkdemo", "errMsg: " + (e == null ? "" : e.getMessage()));
            }
        });
    }

    public Bitmap getBitmap(String path) throws IOException {
        URL url = new URL(path);
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setConnectTimeout(5000);
        conn.setRequestMethod("GET");
        if (conn.getResponseCode() == 200) {
            // read the image bytes straight from the response stream
            InputStream inputStream = conn.getInputStream();
            Bitmap bitmap = BitmapFactory.decodeStream(inputStream);
            return bitmap;
        }
        return null;
    }
}

Also, I don't really understand threads. Could someone walk me through how to use them? Taking my example as a starting point, how should the thread be written so that the UI doesn't block?