Why does using SparkContext inside a map operator throw java.io.NotSerializableException: org.apache.spark.SparkContext?

val receiverStream: ReceiverInputDStream[String] =
  RabbitMQUtils.createStream[String](/* ssc and RabbitMQ params elided in the original post */)
receiverStream.print()

receiverStream.map(value => {
  //@transient val sc = spark.sparkContext
  val jsonS = JSON.parseFull(value)
  val mapjson: Map[String, String] = regJson(jsonS)
  // Each Option is unwrapped by string-replacing "Some(...)"; mapjson.get(key).getOrElse("") would be cleaner
  val alarmContent = mapjson.get("alarmContent").toString.replace("Some(", "").replace(")", "")
  val alarmEventId = mapjson.get("alarmEventId").toString.replace("Some(", "").replace(")", "")
  val alarmLevel = mapjson.get("alarmLevel").toString.replace("Some(", "").replace(")", "")
  val alarmType = mapjson.get("alarmType").toString.replace("Some(", "").replace(")", "")
  val buildingId = mapjson.get("buildingId").toString.replace("Some(", "").replace(")", "")
  val chargesCode = mapjson.get("chargesCode").toString.replace("Some(", "").replace(")", "")
  val createDate = mapjson.get("createDate").toString.replace("Some(", "").replace(")", "").toDouble
  val delFlag = mapjson.get("delFlag").toString.replace("Some(", "").replace(")", "")
  val deviceId = mapjson.get("deviceId").toString.replace("Some(", "").replace(")", "")
  val happenTime = mapjson.get("happenTime").toString.replace("Some(", "").replace(")", "").toDouble
  val isNewRecord = mapjson.get("isNewRecord").toString.replace("Some(", "").replace(")", "").toBoolean
  val page = mapjson.get("page").toString.replace("Some(", "").replace(")", "")
  val producerCode = mapjson.get("producerCode").toString.replace("Some(", "").replace(")", "")
  val sqlMap = mapjson.get("sqlMap").toString.replace("Some(", "").replace(")", "")
  println(alarmEventId)
  val strings: Apple = Apple(alarmContent, alarmEventId, alarmLevel,
    alarmType, buildingId, chargesCode, createDate, delFlag,
    deviceId, happenTime, isNewRecord, page, producerCode, sqlMap)
  val apples: Seq[Apple] = Seq(strings)
  //println("got here!")
  println("logs:" + apples)
  // val appRdd: RDD[Apple] = sc.makeRDD(apples)
  /* value1.foreachPartition(iter => {
    import spark.implicits._
    val frameDF: DataFrame = value1.toDF()
    frameDF.createTempView("t_1")
    frameDF.show()
  }) */
  val value1: RDD[Apple] = sc.parallelize(apples) // sc is the driver's SparkContext; capturing it here makes the closure unserializable
  import spark.implicits._
  val frameDF: DataFrame = value1.toDF()
  frameDF.createTempView("t_1")
  frameDF.show()
}).print()

1 Answer

Error message: Exception in thread "main" org.apache.spark.SparkException: Task not serializable
at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:298)
at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:288)
at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:108)
at org.apache.spark.SparkContext.clean(SparkContext.scala:2039)
at org.apache.spark.streaming.dstream.DStream$$anonfun$map$1.apply(DStream.scala:546)
at org.apache.spark.streaming.dstream.DStream$$anonfun$map$1.apply(DStream.scala:546)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.SparkContext.withScope(SparkContext.scala:679)
at org.apache.spark.streaming.StreamingContext.withScope(StreamingContext.scala:264)
at org.apache.spark.streaming.dstream.DStream.map(DStream.scala:545)
at example.RabbitMQ2Spark$.main(RabbitMQ2Spark.scala:54)
at example.RabbitMQ2Spark.main(RabbitMQ2Spark.scala)
Caused by: java.io.NotSerializableException: org.apache.spark.SparkContext
Serialization stack:
- object not serializable (class: org.apache.spark.SparkContext, value: org.apache.spark.SparkContext@185f7840)
- field (class: example.RabbitMQ2Spark$$anonfun$main$1, name: sc$1, type: class org.apache.spark.SparkContext)
- object (class example.RabbitMQ2Spark$$anonfun$main$1, )
at org.apache.spark.serializer.SerializationDebugger$.improveException(SerializationDebugger.scala:40)
at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:46)
at org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:100)
at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:295)
... 12 more
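
The serialization stack pinpoints the cause: the anonymous function passed to map captured the SparkContext (field sc$1), and SparkContext is never serializable, so Spark cannot ship the closure to the executors. Driver-side handles (SparkContext, SparkSession, StreamingContext) must stay out of record-level transformations such as map; the DataFrame work belongs in foreachRDD, whose body runs on the driver once per micro-batch. Below is a minimal sketch of that rearrangement, assuming a SparkSession named spark, the same Apple case class, and a hypothetical parseApple helper holding the JSON-to-Apple logic from the map body:

  // Sketch only: parse on the executors with a pure function,
  // then build the DataFrame on the driver inside foreachRDD.
  val appleStream: DStream[Apple] = receiverStream.map(parseApple) // parseApple: String => Apple (hypothetical)

  appleStream.foreachRDD { rdd =>
    if (!rdd.isEmpty()) {                    // skip empty micro-batches
      import spark.implicits._
      val frameDF: DataFrame = rdd.toDF()
      frameDF.createOrReplaceTempView("t_1") // createTempView would fail on the second batch: the view already exists
      frameDF.show()
    }
  }

Two smaller points: mapjson.get(key) returns an Option, so mapjson.get("alarmLevel").getOrElse("") is the idiomatic way to unwrap it instead of string-replacing "Some(" and ")"; and calling sc.parallelize on a one-element Seq per record is unnecessary anyway, because the DStream already hands you an RDD per batch.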

Csdn user default icon
上传中...
上传图片
插入图片
抄袭、复制答案,以达到刷声望分或其他目的的行为,在CSDN问答是严格禁止的,一经发现立刻封号。是时候展现真正的技术了!
其他相关推荐
idea编译出错 java.io.IOException: Cannot create empty file:

Error:Internal error: (java.io.IOException) Cannot create empty file: C:\Users\鏋楁案鍩�\.IntelliJIdea2019.1\system\compile-server\schoolrollsystem_d8001572\timestamps\data java.io.IOException: Cannot create empty file: C:\Users\鏋楁案鍩?\.IntelliJIdea2019.1\system\compile-server\schoolrollsystem_d8001572\timestamps\data at com.intellij.util.io.PersistentEnumeratorBase.<init>(PersistentEnumeratorBase.java:175) at com.intellij.util.io.PersistentBTreeEnumerator.<init>(PersistentBTreeEnumerator.java:73) at com.intellij.util.io.PersistentEnumeratorDelegate.<init>(PersistentEnumeratorDelegate.java:47) at com.intellij.util.io.PersistentHashMap.<init>(PersistentHashMap.java:163) at com.intellij.util.io.PersistentHashMap.<init>(PersistentHashMap.java:152) at com.intellij.util.io.PersistentHashMap.<init>(PersistentHashMap.java:143) at com.intellij.util.io.PersistentHashMap.<init>(PersistentHashMap.java:135) at com.intellij.util.io.PersistentHashMap.<init>(PersistentHashMap.java:128) at org.jetbrains.jps.incremental.storage.AbstractStateStorage.createMap(AbstractStateStorage.java:124) at org.jetbrains.jps.incremental.storage.AbstractStateStorage.<init>(AbstractStateStorage.java:27) at org.jetbrains.jps.incremental.storage.TimestampStorage.<init>(TimestampStorage.java:21) at org.jetbrains.jps.incremental.storage.ProjectTimestamps.<init>(ProjectTimestamps.java:35) at org.jetbrains.jps.cmdline.BuildRunner.load(BuildRunner.java:111) at org.jetbrains.jps.cmdline.BuildSession.runBuild(BuildSession.java:279) at org.jetbrains.jps.cmdline.BuildSession.run(BuildSession.java:135) at org.jetbrains.jps.cmdline.BuildMain$MyMessageHandler.lambda$channelRead0$0(BuildMain.java:228) at org.jetbrains.jps.service.impl.SharedThreadPoolImpl.lambda$executeOnPooledThread$0(SharedThreadPoolImpl.java:42) at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) at java.util.concurrent.FutureTask.run(FutureTask.java:266) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at java.lang.Thread.run(Thread.java:748) Please perform full project rebuild (Build | Rebuild Project)

执行jar报错 Hadoop java.io.IOException

[img=http://img.bbs.csdn.net/upload/201703/15/1489518401_142809.png][/img] Error: java.io.IOException: Initialization of all the collectors failed. Error in last collector was :interface javax.xml.soap.Text hadoop jar Hadoop_Demo1.jar /user/myData/ /user/out/ 执行简单jar包 17/03/15 02:52:37 INFO client.RMProxy: Connecting to ResourceManager at s0/192.168.253.130:8032 17/03/15 02:52:37 WARN mapreduce.JobResourceUploader: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this. 17/03/15 02:52:38 INFO input.FileInputFormat: Total input paths to process : 2 17/03/15 02:52:38 INFO mapreduce.JobSubmitter: number of splits:2 17/03/15 02:52:38 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1489512856623_0004 17/03/15 02:52:39 INFO impl.YarnClientImpl: Submitted application application_1489512856623_0004 17/03/15 02:52:39 INFO mapreduce.Job: The url to track the job: http://s0:8088/proxy/application_1489512856623_0004/ 17/03/15 02:52:39 INFO mapreduce.Job: Running job: job_1489512856623_0004 17/03/15 02:52:50 INFO mapreduce.Job: Job job_1489512856623_0004 running in uber mode : false 17/03/15 02:52:50 INFO mapreduce.Job: map 0% reduce 0% 17/03/15 02:55:18 INFO mapreduce.Job: map 50% reduce 0% 17/03/15 02:55:18 INFO mapreduce.Job: Task Id : attempt_1489512856623_0004_m_000001_0, Status : FAILED Error: java.io.IOException: Initialization of all the collectors failed. Error in last collector was :interface javax.xml.soap.Text at org.apache.hadoop.mapred.MapTask.createSortingCollector(MapTask.java:414) at org.apache.hadoop.mapred.MapTask.access$100(MapTask.java:81) at org.apache.hadoop.mapred.MapTask$NewOutputCollector.<init>(MapTask.java:698) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:770) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) Caused by: java.lang.ClassCastException: interface javax.xml.soap.Text at java.lang.Class.asSubclass(Class.java:3404) at org.apache.hadoop.mapred.JobConf.getOutputKeyComparator(JobConf.java:887) at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.init(MapTask.java:1004) at org.apache.hadoop.mapred.MapTask.createSortingCollector(MapTask.java:402) ... 9 more Container killed by the ApplicationMaster. 17/03/15 02:55:18 INFO mapreduce.Job: Task Id : attempt_1489512856623_0004_m_000000_0, Status : FAILED Error: java.io.IOException: Initialization of all the collectors failed. 
Error in last collector was :interface javax.xml.soap.Text at org.apache.hadoop.mapred.MapTask.createSortingCollector(MapTask.java:414) at org.apache.hadoop.mapred.MapTask.access$100(MapTask.java:81) at org.apache.hadoop.mapred.MapTask$NewOutputCollector.<init>(MapTask.java:698) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:770) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) Caused by: java.lang.ClassCastException: interface javax.xml.soap.Text at java.lang.Class.asSubclass(Class.java:3404) at org.apache.hadoop.mapred.JobConf.getOutputKeyComparator(JobConf.java:887) at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.init(MapTask.java:1004) at org.apache.hadoop.mapred.MapTask.createSortingCollector(MapTask.java:402) ... 9 more 17/03/15 02:55:19 INFO mapreduce.Job: map 0% reduce 0% 17/03/15 02:55:31 INFO mapreduce.Job: Task Id : attempt_1489512856623_0004_m_000000_1, Status : FAILED Error: java.io.IOException: Initialization of all the collectors failed. Error in last collector was :interface javax.xml.soap.Text at org.apache.hadoop.mapred.MapTask.createSortingCollector(MapTask.java:414) at org.apache.hadoop.mapred.MapTask.access$100(MapTask.java:81) at org.apache.hadoop.mapred.MapTask$NewOutputCollector.<init>(MapTask.java:698) at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:770) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341) at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698) at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)

手机端请求后台下载更新,出现错误java.io.IOException: 远程主机强迫关闭了一个现有的连接。

报错代码段 OutputStream toClient = new BufferedOutputStream(response.getOutputStream()); response.setContentType("application/octet-stream"); toClient.write(buffer);//这一行开始报错 toClient.flush(); toClient.close(); org.apache.catalina.connector.ClientAbortException: java.io.IOException: 远程主机强迫关闭了一个现有的连接。 at org.apache.catalina.connector.OutputBuffer.realWriteBytes(OutputBuffer.java:356) at org.apache.catalina.connector.OutputBuffer.appendByteArray(OutputBuffer.java:795) at org.apache.catalina.connector.OutputBuffer.append(OutputBuffer.java:724) at org.apache.catalina.connector.OutputBuffer.writeBytes(OutputBuffer.java:391) at org.apache.catalina.connector.OutputBuffer.write(OutputBuffer.java:369) at org.apache.catalina.connector.CoyoteOutputStream.write(CoyoteOutputStream.java:96) at java.io.BufferedOutputStream.write(BufferedOutputStream.java:122) at java.io.FilterOutputStream.write(FilterOutputStream.java:97) at com.inspur.mobile.controller.MobileController.download(MobileController.java:954) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.springframework.web.bind.annotation.support.HandlerMethodInvoker.invokeHandlerMethod(HandlerMethodInvoker.java:176) at org.springframework.web.servlet.mvc.annotation.AnnotationMethodHandlerAdapter.invokeHandlerMethod(AnnotationMethodHandlerAdapter.java:440) at org.springframework.web.servlet.mvc.annotation.AnnotationMethodHandlerAdapter.handle(AnnotationMethodHandlerAdapter.java:428) at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:933) at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:867) at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:951) at org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:842) at javax.servlet.http.HttpServlet.service(HttpServlet.java:635) at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:827) at javax.servlet.http.HttpServlet.service(HttpServlet.java:742) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:231) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166) at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166) at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:198) at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:96) at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:493) at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:140) at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:81) at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:650) at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:87) at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:342) at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:800) at 
org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:66) at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:800) at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1471) at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61) at java.lang.Thread.run(Thread.java:745) Caused by: java.io.IOException: 远程主机强迫关闭了一个现有的连接。 at sun.nio.ch.SocketDispatcher.write0(Native Method) at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:51) at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) at sun.nio.ch.IOUtil.write(IOUtil.java:65) at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) at org.apache.tomcat.util.net.NioChannel.write(NioChannel.java:134) at org.apache.tomcat.util.net.NioBlockingSelector.write(NioBlockingSelector.java:101) at org.apache.tomcat.util.net.NioSelectorPool.write(NioSelectorPool.java:157) at org.apache.tomcat.util.net.NioEndpoint$NioSocketWrapper.doWrite(NioEndpoint.java:1279) at org.apache.tomcat.util.net.SocketWrapperBase.doWrite(SocketWrapperBase.java:670) at org.apache.tomcat.util.net.SocketWrapperBase.writeBlocking(SocketWrapperBase.java:450) at org.apache.tomcat.util.net.SocketWrapperBase.write(SocketWrapperBase.java:388) at org.apache.coyote.http11.Http11OutputBuffer$SocketOutputBuffer.doWrite(Http11OutputBuffer.java:623) at org.apache.coyote.http11.filters.IdentityOutputFilter.doWrite(IdentityOutputFilter.java:116) at org.apache.coyote.http11.Http11OutputBuffer.doWrite(Http11OutputBuffer.java:225) at org.apache.coyote.Response.doWrite(Response.java:541) at org.apache.catalina.connector.OutputBuffer.realWriteBytes(OutputBuffer.java:351) ... 44 more

spark shell在存运算结果到hdfs时报java.io.IOException: Not a file: hdfs://mini1:9000/spark/res

scala> sc.textFile("hdfs://mini1:9000/spark").flatMap(_.split(" ")).map((_,1)).reduceByKey(_+_).saveAsTextFile("hdfs://mini1:9000/spark/res2") 执行上面的代码出错,这个目录在hdfs下是有的,而且就算没有也会创建。还有就是我运行的代码中是保存到res2目录 ,这里为什么报没有res目录 18/11/05 19:06:44 WARN SizeEstimator: Failed to check whether UseCompressedOops is set; assuming yes java.io.IOException: Not a file: hdfs://mini1:9000/spark/res at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:320) at org.apache.spark.rdd.HadoopRDD.getPartitions(HadoopRDD.scala:199) at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239) at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237) at scala.Option.getOrElse(Option.scala:120) at org.apache.spark.rdd.RDD.partitions(RDD.scala:237) at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35) at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239) at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237) at scala.Option.getOrElse(Option.scala:120) at org.apache.spark.rdd.RDD.partitions(RDD.scala:237) at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35) at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239) at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237) at scala.Option.getOrElse(Option.scala:120) at org.apache.spark.rdd.RDD.partitions(RDD.scala:237) at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35) at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239) at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237) at scala.Option.getOrElse(Option.scala:120) at org.apache.spark.rdd.RDD.partitions(RDD.scala:237) at org.apache.spark.Partitioner$.defaultPartitioner(Partitioner.scala:65) at org.apache.spark.rdd.PairRDDFunctions$$anonfun$reduceByKey$3.apply(PairRDDFunctions.scala:331) at org.apache.spark.rdd.PairRDDFunctions$$anonfun$reduceByKey$3.apply(PairRDDFunctions.scala:331) at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150) at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111) at org.apache.spark.rdd.RDD.withScope(RDD.scala:316) at org.apache.spark.rdd.PairRDDFunctions.reduceByKey(PairRDDFunctions.scala:330) at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:28) at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:33) at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:35) at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:37) at $iwC$$iwC$$iwC$$iwC.<init>(<console>:39) at $iwC$$iwC$$iwC.<init>(<console>:41) at $iwC$$iwC.<init>(<console>:43) at $iwC.<init>(<console>:45) at <init>(<console>:47) at .<init>(<console>:51) at .<clinit>(<console>) at .<init>(<console>:7) at .<clinit>(<console>) at $print(<console>) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:606) at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065) at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346) at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840) at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871) at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819) at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857) at 
org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902) at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814) at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657) at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665) at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670) at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:997) at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945) at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945) at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135) at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945) at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059) at org.apache.spark.repl.Main$.main(Main.scala:31) at org.apache.spark.repl.Main.main(Main.scala) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:606) at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731) at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181) at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206) at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121) at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

此处为啥一直报找不到文件的异常:java.io.FileNotFoundException: Template "contextxml.ftl" not found.

public class ExportToXmlUtil { private Configuration configuration = null; public ExportToXmlUtil(){ configuration = new Configuration(); // configuration.setDefaultEncoding("UTF-8"); } public static void main(String[] args) { ExportToXmlUtil test = new ExportToXmlUtil(); test.createContextXml(); } public void createContextXml(){ Map<String,String> dataMap=new HashMap<>(); getData(dataMap); configuration.setClassForTemplateLoading(this.getClass(), "classpath:/template"); //FTL文件所存在的位置,放在与java相同的包下 Template t=null; try { t = configuration.getTemplate("contextxml.ftl"); //文件名 } catch (IOException e) { e.printStackTrace(); } File outFile = new File("D:/forK8sOutFile/context.xml"); //生成文件的路径 Writer out = null; try { out = new BufferedWriter(new OutputStreamWriter(new FileOutputStream(outFile))); } catch (FileNotFoundException e1) { e1.printStackTrace(); } try { t.process(dataMap, out); } catch (TemplateException e) { e.printStackTrace(); } catch (IOException e) { e.printStackTrace(); } } //这里赋值的时候需要注意,xml中需要的数据你必须提供给它,不然会报找不到某元素错的. private Map<String,String> getData(Map<String, String> dataMap) { dataMap.put("name", "testName"); dataMap.put("auth", "testAuth"); return dataMap; } } 我的项目结构图随后附上

spark sparkcontext 初始化失败

环境 Ubuntu 16.04 hadoop 2.7.3 scala 2.11.8 spark 2.1.0 已经安装好了hadoop scala,之后配置了下 spark 运行 spark-shell 就爆出来下面的错误 ``` 18/05/22 15:43:30 ERROR spark.SparkContext: Error initializing SparkContext. java.lang.IllegalArgumentException: For input string: "true #是否记录Spark事件,用于应用程序在完成后重构webUI" at scala.collection.immutable.StringLike$class.parseBoolean(StringLike.scala:290) at scala.collection.immutable.StringLike$class.toBoolean(StringLike.scala:260) at scala.collection.immutable.StringOps.toBoolean(StringOps.scala:29) at org.apache.spark.SparkConf$$anonfun$getBoolean$2.apply(SparkConf.scala:407) at org.apache.spark.SparkConf$$anonfun$getBoolean$2.apply(SparkConf.scala:407) at scala.Option.map(Option.scala:146) at org.apache.spark.SparkConf.getBoolean(SparkConf.scala:407) at org.apache.spark.SparkContext.isEventLogEnabled(SparkContext.scala:238) at org.apache.spark.SparkContext.<init>(SparkContext.scala:407) at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2313) at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:868) at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:860) at scala.Option.getOrElse(Option.scala:121) at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860) at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95) at $line3.$read$$iw$$iw.<init>(<console>:15) at $line3.$read$$iw.<init>(<console>:42) at $line3.$read.<init>(<console>:44) at $line3.$read$.<init>(<console>:48) at $line3.$read$.<clinit>(<console>) at $line3.$eval$.$print$lzycompute(<console>:7) at $line3.$eval$.$print(<console>:6) at $line3.$eval.$print(<console>) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786) at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047) at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638) at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637) at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31) at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19) at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637) at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569) at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565) at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807) at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681) at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395) at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38) at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37) at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37) at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214) at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37) at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105) at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920) at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909) at 
scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909) at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97) at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909) at org.apache.spark.repl.Main$.doMain(Main.scala:68) at org.apache.spark.repl.Main$.main(Main.scala:51) at org.apache.spark.repl.Main.main(Main.scala) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738) at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187) at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212) at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126) at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) java.lang.IllegalArgumentException: For input string: "true #是否记录Spark事件,用于应用程序在完成后重构webUI" at scala.collection.immutable.StringLike$class.parseBoolean(StringLike.scala:290) at scala.collection.immutable.StringLike$class.toBoolean(StringLike.scala:260) at scala.collection.immutable.StringOps.toBoolean(StringOps.scala:29) at org.apache.spark.SparkConf$$anonfun$getBoolean$2.apply(SparkConf.scala:407) at org.apache.spark.SparkConf$$anonfun$getBoolean$2.apply(SparkConf.scala:407) at scala.Option.map(Option.scala:146) at org.apache.spark.SparkConf.getBoolean(SparkConf.scala:407) at org.apache.spark.SparkContext.isEventLogEnabled(SparkContext.scala:238) at org.apache.spark.SparkContext.<init>(SparkContext.scala:407) at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2313) at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:868) at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:860) at scala.Option.getOrElse(Option.scala:121) at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860) at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95) ... 47 elided <console>:14: error: not found: value spark import spark.implicits._ ^ <console>:14: error: not found: value spark import spark.sql ```

Mapper 中的 resultType 使用 pojo 跟使用 java.util.Map 的区别?

``` <select id="" resultType="pojo类"> // sql代码 。。。 </select> <select id="" resultType="java.util.Map"> // sql代码 。。。 </select> ``` 虽然我知道既然有 pojo 这个概念就不可能纯用 map 去取代它, 但我只知其然不知其所以然,我想知道: 1. 为什么不能纯使用 map? 2. 而在什么情况下,又应该使用 map ? 求大神解答,这个问题困惑我好久了,写着代码都不安心?

Could not find parameter map java.util.Map 错误

错误: org.mybatis.spring.MyBatisSystemException: nested exception is org.apache.ibatis.builder.xml.IncompleteStatementException: Could not find parameter map java.util.Map 报错的地方: this.sqlSessionTxTemplate.selectOne("UserManageMapper.addUser", ht); “UserManageMapper.addUser”: <resultMap id="map" type="java.util.HashMap"> </resultMap> <select id="getUserById" parameterType="java.util.Map" resultMap="map"> select * from subscriber where provider_id=#{providerId} order by updated_on desc </select> 之前一直都很正常,现在莫名其妙的报错,而且所有的查询,更新都是报这个错误,检查了很多地方都没发现问题。 请大家帮帮忙

急急急请教各位大神,dubbo的HessianProtocolException

调用同事的dubbo接口所有的配置都检查过没有问题,可以注入当代码走到调用注入的接口时报异常: java.lang.reflect.InvocationTargetException 接口那边的同事说请求可以过去 但是在return的时候报异常: [com.alibaba.dubbo.rpc.protocol.dubbo.DecodeableRpcInvocation] - [DUBBO] Decode rpc invocation failed: expected map/object at java.lang.String (Ljava/lang/String;Lorg/bigdata/framework/pay/model/WebDownResourceParam;), dubbo version: 2.8.4, current host: 10.10.184.116 com.alibaba.com.caucho.hessian.io.HessianProtocolException: expected map/object at java.lang.String (Ljava/lang/String;Lorg/bigdata/framework/pay/model/WebDownResourceParam;) at com.alibaba.com.caucho.hessian.io.AbstractDeserializer.error(AbstractDeserializer.java:108) at com.alibaba.com.caucho.hessian.io.AbstractMapDeserializer.readObject(AbstractMapDeserializer.java:70) at com.alibaba.com.caucho.hessian.io.Hessian2Input.readObject(Hessian2Input.java:1696) at com.alibaba.dubbo.common.serialize.support.hessian.Hessian2ObjectInput.readObject(Hessian2ObjectInput.java:94) at com.alibaba.dubbo.rpc.protocol.dubbo.DecodeableRpcInvocation.decode(DecodeableRpcInvocation.java:150) at com.alibaba.dubbo.rpc.protocol.dubbo.DecodeableRpcInvocation.decode(DecodeableRpcInvocation.java:74) at com.alibaba.dubbo.rpc.protocol.dubbo.DubboCodec.decodeBody(DubboCodec.java:138) at com.alibaba.dubbo.remoting.exchange.codec.ExchangeCodec.decode(ExchangeCodec.java:134) at com.alibaba.dubbo.remoting.exchange.codec.ExchangeCodec.decode(ExchangeCodec.java:95) at com.alibaba.dubbo.rpc.protocol.dubbo.DubboCountCodec.decode(DubboCountCodec.java:46) at com.alibaba.dubbo.remoting.transport.netty.NettyCodecAdapter$InternalDecoder.messageReceived(NettyCodecAdapter.java:134) at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70) at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559) at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:268) at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:255) at org.jboss.netty.channel.socket.nio.NioWorker.read(NioWorker.java:88) at org.jboss.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:109) at org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:312) at org.jboss.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:90) at org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178) at org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108) at org.jboss.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) at java.lang.Thread.run(Thread.java:745) [com.alibaba.dubbo.remoting.exchange.codec.ExchangeCodec] - [DUBBO] Skip input stream 330, dubbo version: 2.8.4, current host: 10.10.184.116 [com.alibaba.dubbo.rpc.protocol.dubbo.DubboProtocol] - [DUBBO] disconected from 
/10.20.21.130:52655,url:dubbo://10.10.184.116:9977/org.bigdata.framework.security.iservice.SecurityFacadeService?anyhost=true&application=dubbo-webservice-provider&channel.readonly.sent=true&codec=dubbo&default.delay=-1&default.retries=0&default.timeout=100000&delay=-1&dubbo=2.8.4&generic=false&heartbeat=60000&interface=org.bigdata.framework.security.iservice.SecurityFacadeService&methods=intercept,encrypt&organization=dubbox&owner=programmer&pid=28695&side=provider&timestamp=1499949941270, dubbo version: 2.8.4, current host: 10.10.184.116

java.lang.ClassCastException

public class ipSort { public static class Map extends Mapper<LongWritable, IntWritable, IntWritable, Text>{ //将输入文件转换成<ipNum,ipAdd>的形式 private final static IntWritable ipNum = new IntWritable(); private Text ipAdd = new Text(); public void map(LongWritable key, IntWritable value, Context context) throws IOException, InterruptedException{ //把每一行转成字符串 String line = value.toString(); // 分割每一行 StringTokenizer token = new StringTokenizer(line); //solve every line while(token.hasMoreElements()){ //divided by blank StringTokenizer tokenLine = new StringTokenizer(token.nextToken()); ipAdd.set(token.nextToken().trim()); ipNum.set(Integer.valueOf(token.nextToken().trim())); context.write(ipNum,new Text(ipAdd)); } } } public static class Reduce extends Reducer<IntWritable, Text, Text, IntWritable>{ //把Map阶段的输出结果颠倒; private Text result = new Text(); public void reduce(IntWritable key,Iterable<Text> values, Context context) throws IOException, InterruptedException{ for(Text val : values){ result.set(val.toString()); context.write(new Text(result),key); } } } public static class IntKeyDescComparator extends WritableComparator{ protected IntKeyDescComparator(){ super(IntWritable.class,true); } public int compare(WritableComparable a, WritableComparable b){ return super.compare(a, b); } } public static void main(String args[]) throws IOException, ClassNotFoundException, InterruptedException{ System.setProperty("hadoop.home.dir", "C:\\Users\\lenovo\\Desktop\\hadoop-2.6.0\\hadoop-2.6.0"); Configuration conf = new Configuration(); conf.set("mapred.job.tracker", "192.168.142.138"); Job job = new Job(conf,"ipSort"); job.setJarByClass(ipSort.class); job.setSortComparatorClass(IntKeyDescComparator.class); job.setMapperClass(Map.class); job.setReducerClass(Reduce.class); job.setOutputKeyClass(IntWritable.class); job.setOutputValueClass(Text.class); FileInputFormat.addInputPath(job, new Path("hdfs://10.170.54.193:9000/input")); FileOutputFormat.setOutputPath(job, new Path("hdfs://10.170.54.193:9000/output")); System.exit(job.waitForCompletion(true)?0:1); } 运行时出现问题Caused by: java.lang.ClassCastException: org.apache.hadoop.io.Text cannot be cast to org.apache.hadoop.io.IntWritable,但是找不到哪里类型转换错误了

Exception in thread "Thread-12" java.lang.RuntimeException: java.lang.NullPointerException

第一次着手项目,springboot+mybatis,项目搭建是跟着教程做的。项目搭建好之后项目能正常运行,但是启动后报: Exception in thread "Thread-12" java.lang.RuntimeException: java.lang.NullPointerException at com.mysql.jdbc.JDBC4Connection$1$1.run(JDBC4Connection.java:106) Caused by: java.lang.NullPointerException at com.mysql.jdbc.ConnectionImpl.abortInternal(ConnectionImpl.java:1240) at com.mysql.jdbc.JDBC4Connection$1$1.run(JDBC4Connection.java:104) 启动Console: 2019-08-05 10:29:12.921 INFO 7416 --- [ restartedMain] com.xintujiuzhang.aimanagement.App : Starting App on DESKTOP-J57CTN6 with PID 7416 (D:\JAVA\eclipse\TaxHandwritingRecognition\target\classes started by Administrator in D:\JAVA\eclipse\TaxHandwritingRecognition) 2019-08-05 10:29:12.921 INFO 7416 --- [ restartedMain] com.xintujiuzhang.aimanagement.App : No active profile set, falling back to default profiles: default 2019-08-05 10:29:12.958 INFO 7416 --- [ restartedMain] ConfigServletWebServerApplicationContext : Refreshing org.springframework.boot.web.servlet.context.AnnotationConfigServletWebServerApplicationContext@6070b0bb: startup date [Mon Aug 05 10:29:12 CST 2019]; root of context hierarchy 2019-08-05 10:29:13.450 INFO 7416 --- [ restartedMain] o.s.b.f.xml.XmlBeanDefinitionReader : Loading XML bean definitions from class path resource [mybatis-config.xml] 2019-08-05 10:29:13.503 INFO 7416 --- [ restartedMain] o.s.b.f.s.DefaultListableBeanFactory : Overriding bean definition for bean 'sqlSessionFactory' with a different definition: replacing [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=app; factoryMethodName=sqlSessionFactoryBean; initMethodName=null; destroyMethodName=(inferred); defined in com.xintujiuzhang.aimanagement.App] with [Generic bean: class [org.mybatis.spring.SqlSessionFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null; defined in class path resource [mybatis-config.xml]] 2019-08-05 10:29:13.599 WARN 7416 --- [ restartedMain] o.m.s.mapper.ClassPathMapperScanner : No MyBatis mapper was found in '[com.xintujiuzhang.aimanagement]' package. Please check your configuration. 2019-08-05 10:29:13.958 INFO 7416 --- [ restartedMain] trationDelegate$BeanPostProcessorChecker : Bean 'org.springframework.transaction.annotation.ProxyTransactionManagementConfiguration' of type [org.springframework.transaction.annotation.ProxyTransactionManagementConfiguration$$EnhancerBySpringCGLIB$$b535b905] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) 2019-08-05 10:29:14.332 INFO 7416 --- [ restartedMain] o.s.b.w.embedded.tomcat.TomcatWebServer : Tomcat initialized with port(s): 8080 (http) 2019-08-05 10:29:14.348 INFO 7416 --- [ restartedMain] o.apache.catalina.core.StandardService : Starting service [Tomcat] 2019-08-05 10:29:14.349 INFO 7416 --- [ restartedMain] org.apache.catalina.core.StandardEngine : Starting Servlet Engine: Apache Tomcat/8.5.28 2019-08-05 10:29:14.355 INFO 7416 --- [ost-startStop-1] o.a.catalina.core.AprLifecycleListener : Loaded APR based Apache Tomcat Native library [1.2.21] using APR version [1.6.5]. 2019-08-05 10:29:14.355 INFO 7416 --- [ost-startStop-1] o.a.catalina.core.AprLifecycleListener : APR capabilities: IPv6 [true], sendfile [true], accept filters [false], random [true]. 
2019-08-05 10:29:14.355 INFO 7416 --- [ost-startStop-1] o.a.catalina.core.AprLifecycleListener : APR/OpenSSL configuration: useAprConnector [false], useOpenSSL [true] 2019-08-05 10:29:14.357 INFO 7416 --- [ost-startStop-1] o.a.catalina.core.AprLifecycleListener : OpenSSL successfully initialized [OpenSSL 1.1.1a 20 Nov 2018] 2019-08-05 10:29:14.418 INFO 7416 --- [ost-startStop-1] o.a.c.c.C.[Tomcat].[localhost].[/] : Initializing Spring embedded WebApplicationContext 2019-08-05 10:29:14.418 INFO 7416 --- [ost-startStop-1] o.s.web.context.ContextLoader : Root WebApplicationContext: initialization completed in 1462 ms 2019-08-05 10:29:14.514 INFO 7416 --- [ost-startStop-1] o.s.b.w.servlet.ServletRegistrationBean : Servlet dispatcherServlet mapped to [/] 2019-08-05 10:29:14.517 INFO 7416 --- [ost-startStop-1] o.s.b.w.servlet.FilterRegistrationBean : Mapping filter: 'characterEncodingFilter' to: [/*] 2019-08-05 10:29:14.517 INFO 7416 --- [ost-startStop-1] o.s.b.w.servlet.FilterRegistrationBean : Mapping filter: 'hiddenHttpMethodFilter' to: [/*] 2019-08-05 10:29:14.517 INFO 7416 --- [ost-startStop-1] o.s.b.w.servlet.FilterRegistrationBean : Mapping filter: 'httpPutFormContentFilter' to: [/*] 2019-08-05 10:29:14.517 INFO 7416 --- [ost-startStop-1] o.s.b.w.servlet.FilterRegistrationBean : Mapping filter: 'requestContextFilter' to: [/*] 2019-08-05 10:29:14.631 INFO 7416 --- [ restartedMain] com.zaxxer.hikari.HikariDataSource : HikariPool-1 - Starting... 2019-08-05 10:29:19.625 INFO 7416 --- [ restartedMain] com.zaxxer.hikari.pool.PoolBase : HikariPool-1 - Driver does not support get/set network timeout for connections. (com.mysql.jdbc.JDBC4Connection.getNetworkTimeout()I) 2019-08-05 10:29:19.629 INFO 7416 --- [ restartedMain] com.zaxxer.hikari.HikariDataSource : HikariPool-1 - Start completed. 2019-08-05 10:29:19.670 INFO 7416 --- [ restartedMain] j.LocalContainerEntityManagerFactoryBean : Building JPA container EntityManagerFactory for persistence unit 'default' 2019-08-05 10:29:19.684 INFO 7416 --- [ restartedMain] o.hibernate.jpa.internal.util.LogHelper : HHH000204: Processing PersistenceUnitInfo [ name: default ...] 
2019-08-05 10:29:19.738 INFO 7416 --- [ restartedMain] org.hibernate.Version : HHH000412: Hibernate Core {5.2.10.Final} 2019-08-05 10:29:19.739 INFO 7416 --- [ restartedMain] org.hibernate.cfg.Environment : HHH000206: hibernate.properties not found 2019-08-05 10:29:19.770 INFO 7416 --- [ restartedMain] o.hibernate.annotations.common.Version : HCANN000001: Hibernate Commons Annotations {5.0.1.Final} 2019-08-05 10:29:19.874 INFO 7416 --- [ restartedMain] org.hibernate.dialect.Dialect : HHH000400: Using dialect: org.hibernate.dialect.MySQL5Dialect 2019-08-05 10:29:19.897 INFO 7416 --- [ restartedMain] o.h.e.j.e.i.LobCreatorBuilderImpl : HHH000423: Disabling contextual LOB creation as JDBC driver reported JDBC version [3] less than 4 2019-08-05 10:29:20.037 INFO 7416 --- [ restartedMain] j.LocalContainerEntityManagerFactoryBean : Initialized JPA EntityManagerFactory for persistence unit 'default' 10:29:20,543 INFO App:32-org.springframework.orm.jpa.JpaTransactionManager 2019-08-05 10:29:20.796 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerAdapter : Looking for @ControllerAdvice: org.springframework.boot.web.servlet.context.AnnotationConfigServletWebServerApplicationContext@6070b0bb: startup date [Mon Aug 05 10:29:12 CST 2019]; root of context hierarchy 2019-08-05 10:29:20.824 WARN 7416 --- [ restartedMain] aWebConfiguration$JpaWebMvcConfiguration : spring.jpa.open-in-view is enabled by default. Therefore, database queries may be performed during view rendering. Explicitly configure spring.jpa.open-in-view to disable this warning 2019-08-05 10:29:20.846 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/application/handwriting_authorization],methods=[GET]}" onto public java.lang.String com.xintujiuzhang.aimanagement.controller.ApplicationAuthorizationController.joinHWA(java.util.Map<java.lang.String, java.lang.Object>) 2019-08-05 10:29:20.847 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/application/handwriting_information],methods=[GET]}" onto public java.lang.String com.xintujiuzhang.aimanagement.controller.ApplicationInformationController.joinHWI(java.util.Map<java.lang.String, java.lang.Object>) 2019-08-05 10:29:20.847 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/log/background_management_login],methods=[GET]}" onto public java.lang.String com.xintujiuzhang.aimanagement.controller.BackgroundManagementLoginController.joinHWI(java.util.Map<java.lang.String, java.lang.Object>) 2019-08-05 10:29:20.847 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/log/computer_login],methods=[GET]}" onto public java.lang.String com.xintujiuzhang.aimanagement.controller.ComputerLoginController.jumpDEM(java.util.Map<java.lang.String, java.lang.Object>) 2019-08-05 10:29:20.847 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/authority/department_management],methods=[GET]}" onto public java.lang.String com.xintujiuzhang.aimanagement.controller.DepartmentManagementController.jumpDEM(java.util.Map<java.lang.String, java.lang.Object>) 2019-08-05 10:29:20.848 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/authority/dictionary_management],methods=[GET]}" onto public java.lang.String com.xintujiuzhang.aimanagement.controller.DictionaryManagementController.jumpDM(java.util.Map<java.lang.String, java.lang.Object>) 2019-08-05 10:29:20.848 INFO 7416 --- [ restartedMain] 
s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/equipment/hardware_devices],methods=[GET]}" onto public java.lang.String com.xintujiuzhang.aimanagement.controller.EquipmentManagementController.jumpDM(java.util.Map<java.lang.String, java.lang.Object>) 2019-08-05 10:29:20.848 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/queryGetServerStatusInterface],methods=[GET]}" onto public java.util.Map<java.lang.String, java.lang.Object> com.xintujiuzhang.aimanagement.controller.GetServerStatusInterfaceController.queryGetServerStatus(java.lang.String,java.lang.String) 2019-08-05 10:29:20.848 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/queryHandwritingInterface],methods=[GET]}" onto public java.util.Map<java.lang.String, java.lang.Object> com.xintujiuzhang.aimanagement.controller.HandwritingInterfaceController.queryHandwriting(java.lang.String,java.lang.String,java.lang.Integer,java.util.Date) 2019-08-05 10:29:20.849 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/identity/identity_management],methods=[GET]}" onto public java.lang.String com.xintujiuzhang.aimanagement.controller.IdentityManagementController.jumpDM(java.util.Map<java.lang.String, java.lang.Object>) 2019-08-05 10:29:20.849 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/queryInstallationRequest],methods=[GET]}" onto public java.util.Map<java.lang.String, java.lang.Object> com.xintujiuzhang.aimanagement.controller.InstallationRequestInterfaceController.queryInstallationRequest(java.lang.String,java.lang.String,java.lang.String) 2019-08-05 10:29:20.849 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/queryLicenseInterface],methods=[GET]}" onto public java.util.Map<java.lang.String, java.lang.Object> com.xintujiuzhang.aimanagement.controller.LicenseInterfaceController.queryLicense(java.lang.String,java.lang.String,java.lang.String,java.lang.String,java.lang.String,java.util.Date) 2019-08-05 10:29:20.851 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/]}" onto public java.lang.String com.xintujiuzhang.aimanagement.controller.MainController.main(java.util.Map<java.lang.String, java.lang.Object>) 2019-08-05 10:29:20.851 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/login],methods=[GET]}" onto public org.springframework.web.servlet.ModelAndView com.xintujiuzhang.aimanagement.controller.MainController.login() 2019-08-05 10:29:20.851 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/login],methods=[POST]}" onto public java.lang.String com.xintujiuzhang.aimanagement.controller.MainController.login(java.lang.String,java.lang.String,java.util.Map<java.lang.String, java.lang.Object>) 2019-08-05 10:29:20.851 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/main],methods=[GET]}" onto public java.lang.String com.xintujiuzhang.aimanagement.controller.MainController.jumpmain(java.util.Map<java.lang.String, java.lang.Object>) 2019-08-05 10:29:20.852 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/authority/menu_management],methods=[GET]}" onto public java.lang.String com.xintujiuzhang.aimanagement.controller.MenuManagementController.jumpMM(java.util.Map<java.lang.String, java.lang.Object>) 2019-08-05 10:29:20.852 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping 
: Mapped "{[/log/module_browsing],methods=[GET]}" onto public java.lang.String com.xintujiuzhang.aimanagement.controller.ModuleBrowsingController.jumpTM(java.util.Map<java.lang.String, java.lang.Object>) 2019-08-05 10:29:20.852 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/log/module_operation],methods=[GET]}" onto public java.lang.String com.xintujiuzhang.aimanagement.controller.ModuleOperationController.jumpMM(java.util.Map<java.lang.String, java.lang.Object>) 2019-08-05 10:29:20.852 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/deletehandwriting],methods=[POST]}" onto public void com.xintujiuzhang.aimanagement.controller.NotesController.deletehandwriting(java.lang.String) 2019-08-05 10:29:20.852 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/updatehandwriting],methods=[POST]}" onto public void com.xintujiuzhang.aimanagement.controller.NotesController.updatehandwriting(com.xintujiuzhang.aimanagement.pojo.Handwriting) 2019-08-05 10:29:20.852 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/addhandwriting],methods=[POST]}" onto public void com.xintujiuzhang.aimanagement.controller.NotesController.addhandwriting(com.xintujiuzhang.aimanagement.pojo.Handwriting) 2019-08-05 10:29:20.853 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/querynotes],methods=[GET]}" onto public com.xintujiuzhang.aimanagement.pojo.Handwriting com.xintujiuzhang.aimanagement.controller.NotesController.querynotes(java.lang.String) 2019-08-05 10:29:20.853 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/authority/parameter_settings],methods=[GET]}" onto public java.lang.String com.xintujiuzhang.aimanagement.controller.parameterSettingsController.jumpPS(java.util.Map<java.lang.String, java.lang.Object>) 2019-08-05 10:29:20.853 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/authority/role_authorization],methods=[GET]}" onto public java.lang.String com.xintujiuzhang.aimanagement.controller.RoleAuthorizationController.jumpRA(java.util.Map<java.lang.String, java.lang.Object>) 2019-08-05 10:29:20.853 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/log/service_call],methods=[GET]}" onto public java.lang.String com.xintujiuzhang.aimanagement.controller.ServiceCallController.jumpRA(java.util.Map<java.lang.String, java.lang.Object>) 2019-08-05 10:29:20.853 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/authority/task_management],methods=[GET]}" onto public java.lang.String com.xintujiuzhang.aimanagement.controller.TaskManagementController.jumpTM(java.util.Map<java.lang.String, java.lang.Object>) 2019-08-05 10:29:20.854 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/authority/user_management]}" onto public java.lang.String com.xintujiuzhang.aimanagement.controller.UserController.jnmpUM(java.util.Map<java.lang.String, java.lang.Object>) 2019-08-05 10:29:20.854 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/queryUserInterface],methods=[GET]}" onto public java.util.Map<java.lang.String, java.lang.Object> com.xintujiuzhang.aimanagement.controller.UserInterfaceController.queryLicense(java.lang.String,java.lang.String,java.lang.Integer,java.util.Date) 2019-08-05 10:29:20.854 INFO 7416 --- [ restartedMain] 
s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/queryVersionCheckingInterface],methods=[GET]}" onto public java.util.Map<java.lang.String, java.lang.Object> com.xintujiuzhang.aimanagement.controller.VersionCheckingInterfaceController.queryVersionChecking(java.lang.String,java.lang.String,java.lang.String,java.lang.String,java.lang.String,java.lang.Integer,java.util.Date) 2019-08-05 10:29:20.854 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/queryVersionupdateInterface],methods=[GET]}" onto public java.util.Map<java.lang.String, java.lang.Object> com.xintujiuzhang.aimanagement.controller.VersionUpdateInterfaceController.queryVersionupdate(java.lang.String,java.lang.String,java.lang.String,java.util.Date) 2019-08-05 10:29:20.857 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/error]}" onto public org.springframework.http.ResponseEntity<java.util.Map<java.lang.String, java.lang.Object>> org.springframework.boot.autoconfigure.web.servlet.error.BasicErrorController.error(javax.servlet.http.HttpServletRequest) 2019-08-05 10:29:20.857 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/error],produces=[text/html]}" onto public org.springframework.web.servlet.ModelAndView org.springframework.boot.autoconfigure.web.servlet.error.BasicErrorController.errorHtml(javax.servlet.http.HttpServletRequest,javax.servlet.http.HttpServletResponse) 2019-08-05 10:29:20.881 INFO 7416 --- [ restartedMain] o.s.w.s.handler.SimpleUrlHandlerMapping : Mapped URL path [/webjars/**] onto handler of type [class org.springframework.web.servlet.resource.ResourceHttpRequestHandler] 2019-08-05 10:29:20.881 INFO 7416 --- [ restartedMain] o.s.w.s.handler.SimpleUrlHandlerMapping : Mapped URL path [/**] onto handler of type [class org.springframework.web.servlet.resource.ResourceHttpRequestHandler] 2019-08-05 10:29:20.906 INFO 7416 --- [ restartedMain] o.s.w.s.handler.SimpleUrlHandlerMapping : Mapped URL path [/**/favicon.ico] onto handler of type [class org.springframework.web.servlet.resource.ResourceHttpRequestHandler] 2019-08-05 10:29:21.115 INFO 7416 --- [ restartedMain] o.s.b.d.a.OptionalLiveReloadServer : LiveReload server is running on port 35729 2019-08-05 10:29:21.164 INFO 7416 --- [ restartedMain] o.s.j.e.a.AnnotationMBeanExporter : Registering beans for JMX exposure on startup 2019-08-05 10:29:21.165 INFO 7416 --- [ restartedMain] o.s.j.e.a.AnnotationMBeanExporter : Bean with name 'dataSource' has been autodetected for JMX exposure 2019-08-05 10:29:21.170 INFO 7416 --- [ restartedMain] o.s.j.e.a.AnnotationMBeanExporter : Located MBean 'dataSource': registering with JMX server as MBean [com.zaxxer.hikari:name=dataSource,type=HikariDataSource] 2019-08-05 10:29:21.196 INFO 7416 --- [ restartedMain] o.s.b.w.embedded.tomcat.TomcatWebServer : Tomcat started on port(s): 8080 (http) with context path '' 2019-08-05 10:29:21.198 INFO 7416 --- [ restartedMain] com.xintujiuzhang.aimanagement.App : Started App in 8.458 seconds (JVM running for 8.828) Exception in thread "Thread-12" java.lang.RuntimeException: java.lang.NullPointerException at com.mysql.jdbc.JDBC4Connection$1$1.run(JDBC4Connection.java:106) Caused by: java.lang.NullPointerException at com.mysql.jdbc.ConnectionImpl.abortInternal(ConnectionImpl.java:1240) at com.mysql.jdbc.JDBC4Connection$1$1.run(JDBC4Connection.java:104) **在执行一些方法后“Thread-12”会不断增加** **ps:“Thread-668"**

ClassCastException错误 想要实现下载功能

数据库字段id为varchar2类型 name同上 path同上 sizes为bumber type为varchar2类型 dates为date类型 java.lang.ClassCastException: java.io.FileInputStream cannot be cast to java.lang.String at com.opensymphony.xwork2.DefaultActionInvocation.saveResult(DefaultActionInvocation.java:502) ~[xwork-core-2.3.28.1.jar:2.3.28.1] at com.opensymphony.xwork2.DefaultActionInvocation.invokeAction(DefaultActionInvocation.java:465) ~[xwork-core-2.3.28.1.jar:2.3.28.1] package com.javakc.action; import java.io.File; import java.io.FileInputStream; import java.io.FileNotFoundException; import java.io.InputStream; import java.io.UnsupportedEncodingException; import com.javakc.dao.FileDaoimpl; import com.javakc.entity.FileEntity; import com.opensymphony.xwork2.ActionSupport; public class DownloadAction extends ActionSupport { private String id; private String fileName; public String execute() throws Exception { System.out.println("开始执行文件下载操作!"); return SUCCESS; } public InputStream getDownloadFile() { //根据主键id获取当前记录 FileEntity file=FileDaoimpl.load(id); fileName=file.getFileName(); try { fileName=new String(fileName.getBytes("GBK"),"ISO8859-1"); } catch (UnsupportedEncodingException e1) { // TODO Auto-generated catch block e1.printStackTrace(); } //根据对象中路径信息获取文件 File serverFile=new File(file.getFilePath()); //将文件转为输入流 InputStream input=null; try { input=new FileInputStream(serverFile); } catch (FileNotFoundException e) { // TODO Auto-generated catch block e.printStackTrace(); } return input; } public String getId() { return id; } public void setId(String id) { this.id = id; } public String getFileName() { return fileName; } public void setLoadFileFileName(String fileName) { this.fileName = fileName; } }

Error converting a list of maps: java.lang.String cannot be cast to java.util.Map

I want to convert a list fetched from the database, which looks like `[{name=小明, age=18}, {name=小红, age=20}, {name=大熊, age=17}]`, into `[小明, 18, 小红, 20, 大熊, 17]` (a list of Strings). The project currently implements it like this:

```java
public List executeSQL(String sql) {
    log.debug(sql);
    List<String> aList = new ArrayList();
    List jlist = jdbcTemplate.queryForList(sql);
    for (int i = 0; i < jlist.size(); i++) {
        System.out.println(jlist.get(i).getClass());
    }
    Iterator ite = jlist.iterator();
    while (ite.hasNext()) {
        Map map = (Map) ite.next();
        for (Object o : map.keySet()) {
            if (map.get(o.toString()) == null) {
                aList.add("");
            } else {
                aList.add(map.get(o.toString()).toString());
            }
        }
    }
    return aList;
}
```

I wrote a small demo myself to try the same thing:

```java
public class mapDemo {
    public static void main(String[] args) {
        List<String> a = new ArrayList();
        List b = new ArrayList<>();
        String c1 = new String("{name=小明, age=18}");
        String c2 = new String("{name=小红, age=20}");
        String c3 = new String("{name=大熊, age=17}");
        b.add(c1);
        b.add(c2);
        b.add(c3);
        Iterator it = b.iterator();
        while (it.hasNext()) {
            Map map = (Map) it.next();
            for (Object o : map.keySet()) {
                if (map.get(o.toString()) == null) {
                    a.add("");
                } else {
                    a.add(map.get(o.toString()).toString());
                }
            }
        }
    }
}
```

But it throws: `java.lang.String cannot be cast to java.util.Map`. What is different between the list fetched from the database and the one I `add()` into myself? Where does my demo go wrong?
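The difference is the element type: `jdbcTemplate.queryForList(sql)` returns a `List<Map<String, Object>>`, where each element is a real `Map` that merely *prints* as `{name=..., age=...}`, whereas the demo adds `String` literals that only look like map dumps, so `(Map) it.next()` fails. A minimal corrected sketch of the demo, building actual maps (the `row` helper is mine, for illustration):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class MapDemo {
    public static void main(String[] args) {
        List<String> a = new ArrayList<>();
        // Build real Map elements, mirroring what queryForList returns
        List<Map<String, Object>> b = new ArrayList<>();
        b.add(row("小明", 18));
        b.add(row("小红", 20));
        b.add(row("大熊", 17));
        for (Map<String, Object> map : b) {
            for (Map.Entry<String, Object> e : map.entrySet()) {
                a.add(e.getValue() == null ? "" : e.getValue().toString());
            }
        }
        System.out.println(a); // [小明, 18, 小红, 20, 大熊, 17]
    }

    private static Map<String, Object> row(String name, int age) {
        Map<String, Object> m = new LinkedHashMap<>(); // keeps column order
        m.put("name", name);
        m.put("age", age);
        return m;
    }
}
```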

In Java, what causes a java.lang.ArrayStoreException: java.lang.Integer exception?

[code="java"]
package pack.java.demo;

import java.util.HashMap;
import java.util.Map;

public class Test {

    /**
     * @param args
     */
    public static void main(String[] args) {
        Map<String, Object> map = new HashMap<String, Object>();
        map.put("A", 12);
        map.put("B", "SAP");
        map.put("C", '中');

        String[] keyArr = map.keySet().toArray(new String[map.size()]);
        for (int i = 0; i < keyArr.length; i++) {
            System.out.println(keyArr[i]);
        }
        System.out.println("");

        Object[] valueArr = map.values().toArray(new String[map.size()]);
        for (int i = 0; i < valueArr.length; i++) {
            System.out.println(valueArr[i]);
        }
    }
}
[/code]

Running the code above directly produces the following error. What is the reason?

```
A
C
B

Exception in thread "main" java.lang.ArrayStoreException: java.lang.Integer
    at java.util.AbstractCollection.toArray(AbstractCollection.java:176)
    at pack.java.demo.Test.main(Test.java:25)
```
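The keys are all Strings, so `keySet().toArray(new String[...])` succeeds, but the values are of mixed types (`Integer`, `String`, `Character`); storing the first non-String value into a `String[]` is exactly what raises `ArrayStoreException`. A minimal fix sketch for the values line:

```java
// The values have mixed runtime types, so they cannot live in a String[].
// Ask for an Object[] instead:
Object[] valueArr = map.values().toArray(new Object[0]);
// or simply:
Object[] valueArr2 = map.values().toArray();
```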

[Ljava.lang.Object; cannot be cast to java.util.Map

```java
String sql = "SELECT DISTINCT(qrcode_id) as qrcode_id ,scan_area as scan_area from ts_scan_recode where qrcode_id in ( SELECT id from ts_qrcode WHERE batch_id in (select p.id from ts_batch p INNER JOIN ts_product_batch z on p.id=z.batch_id and z.product_id in (:productId) OR z.product_id in (SELECT z.product_id from ts_batch p INNER join ts_product_batch z on p.id =z.batch_id and p.dealer_id =:daelerId))) and (date_time BETWEEN :beginDate and :endDate)" + unitSql;

SQLQueryConfig SQLQueryConfig = new SQLQueryConfig();
List<ScalarConfig> scalarConfigList = new ArrayList<ScalarConfig>();
scalarConfigList.add(new ScalarConfig("qrcode_id", StringType.INSTANCE));
scalarConfigList.add(new ScalarConfig("scan_area", StringType.INSTANCE));
SQLQueryConfig.setScalarConfigList(scalarConfigList);
SQLQueryConfig.setScalarResultTransformer(Transformers.ALIAS_TO_ENTITY_MAP);
```

The last line of configuration has no effect!! The forced cast at return time throws the exception: `return (List<Map<String, Object>>) list.getResultList();`. If I drop that setting and take `List<?> resultList = list.getResultList();` myself, how should I convert it into a `List<Map<String, Object>>`?
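If the transformer really isn't applied, each element of the result list comes back as an `Object[]` holding the selected columns in order, which also matches the `[Ljava.lang.Object;` in the title. A hedged conversion sketch under that assumption:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Assumes each element of getResultList() is an Object[] whose entries
// follow the select order: {qrcode_id, scan_area}.
public static List<Map<String, Object>> toRowMaps(List<?> resultList) {
    List<Map<String, Object>> rows = new ArrayList<Map<String, Object>>();
    for (Object row : resultList) {
        Object[] cols = (Object[]) row;
        Map<String, Object> m = new HashMap<String, Object>();
        m.put("qrcode_id", cols[0]);
        m.put("scan_area", cols[1]);
        rows.add(m);
    }
    return rows;
}
```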

sqlMap: no parameter map named java.util.Map

```
com.ibatis.sqlmap.client.SqlMapException: There is no parameter map named java.util.Map in this SqlMap.
```

```xml
<!-- Multi-table join: the newly queried columns are the rejection reason
     and the auditor from the audit-failure-reason table -->
<select id="queryCompanyInfoBycompanyId" parameterClass="java.util.Map" resultMap="companyInfoResult">
    select
    <include refid="companyInfo.infoColumn" />
    , ca.gmt_last_login as gmt_last_login, cfl.auditor, cfl.fail_reason_search
    from company_info ci
    left join company_account ca on ca.company_id=ci.id
    left join certification_fail_log cfl on cfl.company_id=ci.id;
    <include refid="companyInfo.whereCaulseAdmin" />
    <include refid="common.pageOrderBy" />
    <include refid="common.pageLimit" />
</select>
```
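Two things worth checking, offered as hedged guesses rather than a confirmed diagnosis: iBATIS 2 ships a built-in type alias `map` for `java.util.Map`, so `parameterClass="map"` is the conventional spelling; and the stray semicolon after `cfl.company_id=ci.id` terminates the SQL before the included where/order/limit fragments. A sketch of the calling side (statement name from the question; the parameter key is hypothetical):

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Assumes a configured com.ibatis.sqlmap.client.SqlMapClient instance.
Map<String, Object> param = new HashMap<String, Object>();
param.put("companyId", companyId);   // hypothetical key used by the where clause
List<?> rows = sqlMapClient.queryForList("queryCompanyInfoBycompanyId", param);
```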

Dubbo cannot serialize HttpServletRequest

A Dubbo service interface declares the method `public Map<String, String> uploadIdImage(HttpServletRequest request)`. After starting the project and invoking this interface, it fails with:

```
nested exception is com.alibaba.dubbo.rpc.RpcException: Failed to invoke remote method: uploadIdImage, java.lang.IllegalStateException: Serialized class org.springframework.web.multipart.support.DefaultMultipartHttpServletRequest must implement java.io.Serializable
```

Clearly the request object cannot be serialized. How can I get hold of the HttpServletRequest object in the interface implementation class and invoke the method correctly? Any advice is much appreciated.
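A servlet request is bound to the local container and cannot cross an RPC boundary, so the usual approach is to extract what the remote side needs on the web layer and pass a serializable DTO instead. A hedged sketch; the DTO name, its fields, and the `idImage` form field are assumptions:

```java
import java.io.Serializable;

// Serializable carrier for the uploaded image, replacing HttpServletRequest
// in the Dubbo method signature: uploadIdImage(IdImageUpload upload)
public class IdImageUpload implements Serializable {
    private static final long serialVersionUID = 1L;
    private String fileName;
    private String contentType;
    private byte[] content;
    // getters/setters omitted for brevity
}
```

On the web side, where the real request lives, something like `MultipartFile file = ((MultipartHttpServletRequest) request).getFile("idImage")` can populate the DTO via `file.getOriginalFilename()`, `file.getContentType()`, and `file.getBytes()` before making the Dubbo call.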

Newbie question: java.lang.NoClassDefFoundError after importing an external JAR in Java

The code is as follows:

```java
package test.copy;

import com.jd.open.api.sdk.DefaultJdClient;
import com.jd.open.api.sdk.JdClient;
import com.jd.open.api.sdk.JdException;
import com.jd.open.api.sdk.request.mall.WarePriceGetRequest;
import com.jd.open.api.sdk.response.mall.WarePriceGetResponse;

public class Tes {
    public static void main(String[] args) {
        String SERVER_URL = "https://api.jd.com/routerjson?";
        String appKey = "bd……8";
        String appSecret = "a7……da";
        String accessToken = "";
        JdClient client = new DefaultJdClient(SERVER_URL, accessToken, appKey, appSecret);
        WarePriceGetRequest request = new WarePriceGetRequest();
        request.setSkuId("4……2");
        try {
            WarePriceGetResponse response = client.execute(request);
            System.out.println(response);
        } catch (JdException e) {
            e.printStackTrace();
        }
    }
}
```

Running it produces the following error:

```
Exception in thread "main" java.lang.NoClassDefFoundError: org/codehaus/jackson/map/ObjectMapper
    at com.jd.open.api.sdk.internal.util.JsonUtil.<clinit>(JsonUtil.java:18)
    at com.jd.open.api.sdk.request.mall.WarePriceGetRequest.getAppJsonParams(WarePriceGetRequest.java:35)
    at com.jd.open.api.sdk.DefaultJdClient.buildUrl(DefaultJdClient.java:131)
    at com.jd.open.api.sdk.DefaultJdClient.execute(DefaultJdClient.java:96)
    at test.copy.Tes.main(Tes.java:22)
Caused by: java.lang.ClassNotFoundException: org.codehaus.jackson.map.ObjectMapper
    at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:583)
    at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
    at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)
    ... 5 more
```

Does anyone know what causes this?
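`NoClassDefFoundError: org/codehaus/jackson/map/ObjectMapper` means the JD SDK found its own classes but not the Jackson 1.x library it depends on, so that jar is missing from the classpath. A hedged sketch of the Maven dependency that provides the class, assuming a Maven build (with a plain classpath, add the equivalent jar to the build path manually):

```xml
<!-- org.codehaus.jackson.map.ObjectMapper lives in Jackson 1.x;
     1.9.13 was the final 1.x release -->
<dependency>
    <groupId>org.codehaus.jackson</groupId>
    <artifactId>jackson-mapper-asl</artifactId>
    <version>1.9.13</version>
</dependency>
```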

Hadoop serialization problem: WritableComparable implementation, readFields throws EOFException

```java
public class MyKey implements WritableComparable<MyKey> {

    // flag == 1 : user
    // flag == 0 : shopping
    private Integer flag;
    private Integer u_id;
    private Integer s_id;
    private Integer s_u_id;
    private String u_info;
    private String s_info;

    @Override
    public int compareTo(MyKey o) {
        if (flag.equals(1)) {
            // user
            return u_id - o.u_id;
        } else {
            // shopping
            return s_id - o.s_id;
        }
    }

    @Override
    public void write(DataOutput out) throws IOException {
        out.writeInt(flag);
        out.writeInt(u_id);
        out.writeInt(s_id);
        out.writeInt(s_u_id);
        out.writeUTF(u_info);
        out.writeUTF(s_info);
    }

    @Override
    public void readFields(DataInput in) throws IOException {
        flag = in.readInt();
        u_id = in.readInt();
        s_id = in.readInt();
        s_u_id = in.readInt();
        u_info = in.readUTF();
        s_info = in.readUTF();
    }
}
```

The exception:

```
2018-10-08 19:55:15,246 INFO Configuration.deprecation: mapred.skip.on is deprecated. Instead, use mapreduce.job.skiprecords
2018-10-08 19:55:15,250 INFO mapred.LocalJobRunner: reduce task executor complete.
2018-10-08 19:55:15,253 WARN mapred.LocalJobRunner: job_local85671337_0001
java.lang.Exception: java.lang.RuntimeException: java.io.EOFException
    at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:492)
    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:559)
Caused by: java.lang.RuntimeException: java.io.EOFException
    at org.apache.hadoop.io.WritableComparator.compare(WritableComparator.java:165)
    at org.apache.hadoop.mapreduce.task.ReduceContextImpl.nextKeyValue(ReduceContextImpl.java:158)
    at org.apache.hadoop.mapreduce.task.ReduceContextImpl.nextKey(ReduceContextImpl.java:121)
    at org.apache.hadoop.mapreduce.lib.reduce.WrappedReducer$Context.nextKey(WrappedReducer.java:302)
    at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:170)
    at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:628)
    at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:390)
    at org.apache.hadoop.mapred.LocalJobRunner$Job$ReduceTaskRunnable.run(LocalJobRunner.java:347)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.EOFException
    at java.io.DataInputStream.readInt(DataInputStream.java:392)
    at sortjoin.MyKey.readFields(MyKey.java:43)
    at org.apache.hadoop.io.WritableComparator.compare(WritableComparator.java:158)
    ... 12 more
2018-10-08 19:55:15,962 INFO mapreduce.Job: Job job_local85671337_0001 running in uber mode : false
2018-10-08 19:55:15,964 INFO mapreduce.Job: map 100% reduce 0%
```
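One hedged observation, not a confirmed diagnosis: the default `WritableComparator.compare` re-deserializes every key with `readFields()`, so `write()` must emit exactly the bytes `readFields()` later consumes, for every single record. With boxed `Integer`/`String` fields, any record where a field is still null makes serialization misbehave, and the reduce-side comparator can then hit a truncated stream. A defensive sketch of `write()` that keeps the byte layout fixed regardless of which side (user/shopping) populated the key:

```java
// Defensive variant: default null fields so every key serializes to the
// same sequence of 4 ints + 2 UTF strings that readFields() expects.
@Override
public void write(DataOutput out) throws IOException {
    out.writeInt(flag == null ? 0 : flag);
    out.writeInt(u_id == null ? 0 : u_id);
    out.writeInt(s_id == null ? 0 : s_id);
    out.writeInt(s_u_id == null ? 0 : s_u_id);
    out.writeUTF(u_info == null ? "" : u_info);
    out.writeUTF(s_info == null ? "" : s_info);
}
```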
