Why does using SparkContext inside a map operator throw java.io.NotSerializableException: org.apache.spark.SparkContext?

val receiverStream: ReceiverInputDStream[String] = RabbitMQUtils.createStream[String](...) // RabbitMQ connection arguments elided
receiverStream.print()

receiverStream.map(value => {
 //@transient val sc = spark.sparkContext
  val jsonS = JSON.parseFull(value)
  val mapjson: Map[String, String] = regJson(jsonS)
  val alarmContent = mapjson.get("alarmContent").toString.replace("Some(", "").replace(")", "")
  val alarmEventId = mapjson.get("alarmEventId").toString.replace("Some(", "").replace(")", "")
  val alarmLevel = mapjson.get("alarmLevel").toString.replace("Some(", "").replace(")", "")
  val alarmType = mapjson.get("alarmType").toString.replace("Some(", "").replace(")", "")
  val buildingId = mapjson.get("buildingId").toString.replace("Some(", "").replace(")", "")
  val chargesCode = mapjson.get("chargesCode").toString.replace("Some(", "").replace(")", "")
  val createDate = mapjson.get("createDate").toString.replace("Some(", "").replace(")", "").toDouble
  val delFlag = mapjson.get("delFlag").toString.replace("Some(", "").replace(")", "")
  val deviceId = mapjson.get("deviceId").toString.replace("Some(", "").replace(")", "")
  val happenTime = mapjson.get("happenTime").toString.replace("Some(", "").replace(")", "").toDouble
  val isNewRecord = mapjson.get("isNewRecord").toString.replace("Some(", "").replace(")", "").toBoolean
  val page = mapjson.get("page").toString.replace("Some(", "").replace(")", "")
  val producerCode = mapjson.get("producerCode").toString.replace("Some(", "").replace(")", "")
  val sqlMap = mapjson.get("sqlMap").toString.replace("Some(", "").replace(")", "")
  println(alarmEventId)
  val strings: Apple = Apple(alarmContent, alarmEventId, alarmLevel,
    alarmType, buildingId, chargesCode, createDate, delFlag,
    deviceId, happenTime, isNewRecord, page, producerCode, sqlMap)
  val apples: Seq[Apple] = Seq(strings)
  //println("Reached this point!")
  println("logs:" + apples)
  // val appRdd: RDD[Apple] = sc.makeRDD(apples)
  /* value1.foreachPartition(iter => {
    import spark.implicits._
    val frameDF: DataFrame = value1.toDF()
    frameDF.createTempView("t_1")
    frameDF.show()
  }) */
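  // sc below refers to the SparkContext captured from the driver; using it inside this map closure is what triggers the serialization error shown below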
  val value1: RDD[Apple] = sc.parallelize(apples)
  import spark.implicits._
  val frameDF: DataFrame = value1.toDF()
  frameDF.createTempView("t_1")
  frameDF.show()
}).print()
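
Side note: mapjson.get(...) returns an Option, so the repeated .toString.replace("Some(", "").replace(")", "") calls above could be avoided by unwrapping the Option directly. A minimal sketch, assuming regJson returns a Map[String, String] as above:

  val alarmContent = mapjson.getOrElse("alarmContent", "")
  val createDate   = mapjson.getOrElse("createDate", "0").toDouble
  val isNewRecord  = mapjson.getOrElse("isNewRecord", "false").toBoolean
  // ...the remaining fields can be extracted the same way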

1 Answer

Error message: Exception in thread "main" org.apache.spark.SparkException: Task not serializable
at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:298)
at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:288)
at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:108)
at org.apache.spark.SparkContext.clean(SparkContext.scala:2039)
at org.apache.spark.streaming.dstream.DStream$$anonfun$map$1.apply(DStream.scala:546)
at org.apache.spark.streaming.dstream.DStream$$anonfun$map$1.apply(DStream.scala:546)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.SparkContext.withScope(SparkContext.scala:679)
at org.apache.spark.streaming.StreamingContext.withScope(StreamingContext.scala:264)
at org.apache.spark.streaming.dstream.DStream.map(DStream.scala:545)
at example.RabbitMQ2Spark$.main(RabbitMQ2Spark.scala:54)
at example.RabbitMQ2Spark.main(RabbitMQ2Spark.scala)
Caused by: java.io.NotSerializableException: org.apache.spark.SparkContext
Serialization stack:
- object not serializable (class: org.apache.spark.SparkContext, value: org.apache.spark.SparkContext@185f7840)
- field (class: example.RabbitMQ2Spark$$anonfun$main$1, name: sc$1, type: class org.apache.spark.SparkContext)
- object (class example.RabbitMQ2Spark$$anonfun$main$1, )
at org.apache.spark.serializer.SerializationDebugger$.improveException(SerializationDebugger.scala:40)
at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:46)
at org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:100)
at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:295)
... 12 more
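
Why this happens: the function passed to map() is checked and serialized by Spark's ClosureCleaner (visible in the stack trace) before being shipped to executors. Because that closure references sc, the SparkContext would have to be serialized along with it; SparkContext exists only on the driver and is not serializable, so the job aborts with "Task not serializable". The usual restructuring is to keep only per-record work (JSON parsing, building Apple) inside map(), and to create RDDs/DataFrames in foreachRDD, which runs on the driver. A minimal sketch, assuming Apple is a serializable case class, parseApple is a hypothetical helper holding the JSON-to-Apple logic from the map above, and spark is the driver-side SparkSession:

val appleStream: DStream[Apple] = receiverStream.map(parseApple) // executor side: per-record parsing only, no driver objects captured

appleStream.foreachRDD { rdd =>
  // foreachRDD runs on the driver, so the SparkSession/SparkContext can be used here
  import spark.implicits._
  val frameDF: DataFrame = rdd.toDF()
  frameDF.createOrReplaceTempView("t_1") // createTempView would throw once the view already exists across batches
  frameDF.show()
}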

: Mapped "{[/log/module_browsing],methods=[GET]}" onto public java.lang.String com.xintujiuzhang.aimanagement.controller.ModuleBrowsingController.jumpTM(java.util.Map<java.lang.String, java.lang.Object>) 2019-08-05 10:29:20.852 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/log/module_operation],methods=[GET]}" onto public java.lang.String com.xintujiuzhang.aimanagement.controller.ModuleOperationController.jumpMM(java.util.Map<java.lang.String, java.lang.Object>) 2019-08-05 10:29:20.852 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/deletehandwriting],methods=[POST]}" onto public void com.xintujiuzhang.aimanagement.controller.NotesController.deletehandwriting(java.lang.String) 2019-08-05 10:29:20.852 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/updatehandwriting],methods=[POST]}" onto public void com.xintujiuzhang.aimanagement.controller.NotesController.updatehandwriting(com.xintujiuzhang.aimanagement.pojo.Handwriting) 2019-08-05 10:29:20.852 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/addhandwriting],methods=[POST]}" onto public void com.xintujiuzhang.aimanagement.controller.NotesController.addhandwriting(com.xintujiuzhang.aimanagement.pojo.Handwriting) 2019-08-05 10:29:20.853 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/querynotes],methods=[GET]}" onto public com.xintujiuzhang.aimanagement.pojo.Handwriting com.xintujiuzhang.aimanagement.controller.NotesController.querynotes(java.lang.String) 2019-08-05 10:29:20.853 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/authority/parameter_settings],methods=[GET]}" onto public java.lang.String com.xintujiuzhang.aimanagement.controller.parameterSettingsController.jumpPS(java.util.Map<java.lang.String, java.lang.Object>) 2019-08-05 10:29:20.853 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/authority/role_authorization],methods=[GET]}" onto public java.lang.String com.xintujiuzhang.aimanagement.controller.RoleAuthorizationController.jumpRA(java.util.Map<java.lang.String, java.lang.Object>) 2019-08-05 10:29:20.853 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/log/service_call],methods=[GET]}" onto public java.lang.String com.xintujiuzhang.aimanagement.controller.ServiceCallController.jumpRA(java.util.Map<java.lang.String, java.lang.Object>) 2019-08-05 10:29:20.853 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/authority/task_management],methods=[GET]}" onto public java.lang.String com.xintujiuzhang.aimanagement.controller.TaskManagementController.jumpTM(java.util.Map<java.lang.String, java.lang.Object>) 2019-08-05 10:29:20.854 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/authority/user_management]}" onto public java.lang.String com.xintujiuzhang.aimanagement.controller.UserController.jnmpUM(java.util.Map<java.lang.String, java.lang.Object>) 2019-08-05 10:29:20.854 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/queryUserInterface],methods=[GET]}" onto public java.util.Map<java.lang.String, java.lang.Object> com.xintujiuzhang.aimanagement.controller.UserInterfaceController.queryLicense(java.lang.String,java.lang.String,java.lang.Integer,java.util.Date) 2019-08-05 10:29:20.854 INFO 7416 --- [ restartedMain] 
s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/queryVersionCheckingInterface],methods=[GET]}" onto public java.util.Map<java.lang.String, java.lang.Object> com.xintujiuzhang.aimanagement.controller.VersionCheckingInterfaceController.queryVersionChecking(java.lang.String,java.lang.String,java.lang.String,java.lang.String,java.lang.String,java.lang.Integer,java.util.Date) 2019-08-05 10:29:20.854 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/queryVersionupdateInterface],methods=[GET]}" onto public java.util.Map<java.lang.String, java.lang.Object> com.xintujiuzhang.aimanagement.controller.VersionUpdateInterfaceController.queryVersionupdate(java.lang.String,java.lang.String,java.lang.String,java.util.Date) 2019-08-05 10:29:20.857 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/error]}" onto public org.springframework.http.ResponseEntity<java.util.Map<java.lang.String, java.lang.Object>> org.springframework.boot.autoconfigure.web.servlet.error.BasicErrorController.error(javax.servlet.http.HttpServletRequest) 2019-08-05 10:29:20.857 INFO 7416 --- [ restartedMain] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/error],produces=[text/html]}" onto public org.springframework.web.servlet.ModelAndView org.springframework.boot.autoconfigure.web.servlet.error.BasicErrorController.errorHtml(javax.servlet.http.HttpServletRequest,javax.servlet.http.HttpServletResponse) 2019-08-05 10:29:20.881 INFO 7416 --- [ restartedMain] o.s.w.s.handler.SimpleUrlHandlerMapping : Mapped URL path [/webjars/**] onto handler of type [class org.springframework.web.servlet.resource.ResourceHttpRequestHandler] 2019-08-05 10:29:20.881 INFO 7416 --- [ restartedMain] o.s.w.s.handler.SimpleUrlHandlerMapping : Mapped URL path [/**] onto handler of type [class org.springframework.web.servlet.resource.ResourceHttpRequestHandler] 2019-08-05 10:29:20.906 INFO 7416 --- [ restartedMain] o.s.w.s.handler.SimpleUrlHandlerMapping : Mapped URL path [/**/favicon.ico] onto handler of type [class org.springframework.web.servlet.resource.ResourceHttpRequestHandler] 2019-08-05 10:29:21.115 INFO 7416 --- [ restartedMain] o.s.b.d.a.OptionalLiveReloadServer : LiveReload server is running on port 35729 2019-08-05 10:29:21.164 INFO 7416 --- [ restartedMain] o.s.j.e.a.AnnotationMBeanExporter : Registering beans for JMX exposure on startup 2019-08-05 10:29:21.165 INFO 7416 --- [ restartedMain] o.s.j.e.a.AnnotationMBeanExporter : Bean with name 'dataSource' has been autodetected for JMX exposure 2019-08-05 10:29:21.170 INFO 7416 --- [ restartedMain] o.s.j.e.a.AnnotationMBeanExporter : Located MBean 'dataSource': registering with JMX server as MBean [com.zaxxer.hikari:name=dataSource,type=HikariDataSource] 2019-08-05 10:29:21.196 INFO 7416 --- [ restartedMain] o.s.b.w.embedded.tomcat.TomcatWebServer : Tomcat started on port(s): 8080 (http) with context path '' 2019-08-05 10:29:21.198 INFO 7416 --- [ restartedMain] com.xintujiuzhang.aimanagement.App : Started App in 8.458 seconds (JVM running for 8.828) Exception in thread "Thread-12" java.lang.RuntimeException: java.lang.NullPointerException at com.mysql.jdbc.JDBC4Connection$1$1.run(JDBC4Connection.java:106) Caused by: java.lang.NullPointerException at com.mysql.jdbc.ConnectionImpl.abortInternal(ConnectionImpl.java:1240) at com.mysql.jdbc.JDBC4Connection$1$1.run(JDBC4Connection.java:104) **在执行一些方法后“Thread-12”会不断增加** **ps:“Thread-668"**
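The growing "Thread-N" threads come from Connector/J's connection-abort path shown in the stack trace (JDBC4Connection$1$1.run -> ConnectionImpl.abortInternal), and the earlier pool warning ("Driver does not support get/set network timeout") indicates an old mysql-connector-java build. A commonly suggested mitigation, offered here only as a hedged sketch and not as a confirmed fix for this project, is to upgrade the MySQL driver and bound connection lifetime in HikariCP so connections are retired cleanly before the server drops them. Every value, URL, and credential below is an illustrative placeholder:

import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;

public class DataSourceFactory {
    // Hedged sketch: illustrative HikariCP settings, not taken from the original project.
    public static HikariDataSource create() {
        HikariConfig cfg = new HikariConfig();
        cfg.setJdbcUrl("jdbc:mysql://localhost:3306/appdb?useSSL=false");
        cfg.setUsername("app");
        cfg.setPassword("secret");
        cfg.setMaximumPoolSize(10);
        // Retire connections before MySQL's wait_timeout closes them server-side,
        // so the driver's abort/cleanup threads are not left dangling.
        cfg.setMaxLifetime(30 * 60 * 1000L);   // 30 minutes
        cfg.setIdleTimeout(10 * 60 * 1000L);   // 10 minutes
        return new HikariDataSource(cfg);
    }
}

In a Spring Boot project the same settings would normally live in application.properties; the Java form above is just the most compact way to show them.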

A Dubbo service call fails with the error below — does anyone know what this error means?

Caused by: com.alibaba.com.caucho.hessian.io.HessianProtocolException: 'org.springframework.beans.factory.NoUniqueBeanDefinitionException' could not be instantiated
    at com.alibaba.com.caucho.hessian.io.JavaDeserializer.instantiate(JavaDeserializer.java:275) ~[dubbo-2.8.4.jar:2.8.4]
    at com.alibaba.com.caucho.hessian.io.JavaDeserializer.readObject(JavaDeserializer.java:155) ~[dubbo-2.8.4.jar:2.8.4]
    at com.alibaba.com.caucho.hessian.io.Hessian2Input.readObjectInstance(Hessian2Input.java:2067) ~[dubbo-2.8.4.jar:2.8.4]
    at com.alibaba.com.caucho.hessian.io.Hessian2Input.readObject(Hessian2Input.java:1592) ~[dubbo-2.8.4.jar:2.8.4]
    at com.alibaba.com.caucho.hessian.io.Hessian2Input.readObject(Hessian2Input.java:1576) ~[dubbo-2.8.4.jar:2.8.4]
    at com.alibaba.com.caucho.hessian.io.JavaDeserializer$ObjectFieldDeserializer.deserialize(JavaDeserializer.java:396) ~[dubbo-2.8.4.jar:2.8.4]
    ... 27 common frames omitted
Caused by: java.lang.reflect.InvocationTargetException: null
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[na:1.8.0_191]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[na:1.8.0_191]
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[na:1.8.0_191]
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[na:1.8.0_191]
    at com.alibaba.com.caucho.hessian.io.JavaDeserializer.instantiate(JavaDeserializer.java:271) ~[dubbo-2.8.4.jar:2.8.4]
    ... 32 common frames omitted
Caused by: java.lang.NullPointerException: null
    at java.util.Objects.requireNonNull(Objects.java:203) ~[na:1.8.0_191]
    at java.util.Arrays$ArrayList.<init>(Arrays.java:3813) ~[na:1.8.0_191]
    at java.util.Arrays.asList(Arrays.java:3800) ~[na:1.8.0_191]
    at org.springframework.beans.factory.NoUniqueBeanDefinitionException.<init>(NoUniqueBeanDefinitionException.java:65) ~[spring-beans-4.2.5.RELEASE.jar:4.2.5.RELEASE]
    ... 37 common frames omitted
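Reading the innermost frames: the consumer side is trying to rebuild a NoUniqueBeanDefinitionException that the provider threw, and that exception's constructor hits Arrays.asList(null), so the Spring framework exception does not survive Hessian deserialization. The underlying problem is therefore on the provider (two beans of the same type being looked up); resolving the duplicate bean, e.g. with @Primary or @Qualifier, removes the error at its source. As a hedged sketch of a secondary safeguard — class and method names below are hypothetical, not from the original project — the provider can wrap framework exceptions in a plain, portable exception before they cross the RPC boundary:

import java.util.function.Supplier;
import org.springframework.beans.factory.NoUniqueBeanDefinitionException;

// Hedged sketch (hypothetical helper): keep Spring framework exceptions from
// crossing the Dubbo/Hessian boundary, where the consumer cannot re-instantiate them.
public class SafeProviderDelegate {
    public String handle(Supplier<String> providerCall) {
        try {
            return providerCall.get();
        } catch (NoUniqueBeanDefinitionException e) {
            // Keep only the message; the original exception type is not portable over Hessian.
            throw new IllegalStateException("provider-side bean lookup failed: " + e.getMessage());
        }
    }
}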

A question for the Java experts here

Could the Java experts here tell me what is wrong with this source code?

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.Date;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Iterator;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.StringTokenizer;

/*** @author Robbin Fan ***/
public class IPBanner {

    public static final String NETSTAT = "netstat -nt";
    public static final String IP_INSERT = "iptables -I INPUT -i eth0 -j DROP -p tcp --dport 80 -s ";
    public static final String IP_DEL = "iptables -D INPUT -i eth0 -j DROP -p tcp --dport 80 -s ";
    public static final String HOST_IP = "61.129.70.239:80";
    public static final long BAN_TIMEOUT = 30 * 60 * 1000L;
    public static final long BAN_INTERVAL = 30 * 1000L;
    public static final int CONCURRENT = 80;
    public static final int SYN_CONCURRENT = 8;
    public static final Map banMap = new HashMap();

    public static void ban() {
        Set banList = dynamicBanIP();
        System.out.println();
        System.out.println("Time: " + new Date());
        Runtime runtime = Runtime.getRuntime();

        // unban IPs whose ban has expired
        List expiredIPList = new ArrayList();
        for (Iterator iter = banMap.entrySet().iterator(); iter.hasNext();) {
            Map.Entry entry = (Map.Entry) iter.next();
            if ((System.currentTimeMillis() - ((Long) entry.getValue()).longValue()) > BAN_TIMEOUT) {
                expiredIPList.add(entry.getKey());
            }
        }
        for (int i = 0; i < expiredIPList.size(); i++) {
            try { runtime.exec(IP_DEL + expiredIPList.get(i)); } catch (IOException e) { e.printStackTrace(); }
            System.out.println("DEL IP: " + expiredIPList.get(i));
            banMap.remove(expiredIPList.get(i));
        }

        // ban newly detected IPs
        for (Iterator iter = banList.iterator(); iter.hasNext();) {
            String ip = (String) iter.next();
            if (!banMap.containsKey(ip)) {
                try { runtime.exec(IP_INSERT + ip); } catch (IOException e) { e.printStackTrace(); }
                banMap.put(ip, new Long(System.currentTimeMillis()));
                System.out.println("BAN IP:" + ip);
            }
        }

        System.out.println("---ban ip list---");
        for (Iterator iter = banMap.keySet().iterator(); iter.hasNext();) {
            String ip = (String) iter.next();
            System.out.println(ip);
        }
    }

    public static Set dynamicBanIP() {
        String ipstat = null;
        Set banList = new HashSet();
        List ipList = new ArrayList();
        List countList = new ArrayList();
        List synCountList = new ArrayList();
        List finCountList = new ArrayList();

        Runtime runtime = Runtime.getRuntime();
        Process process = null;
        try { process = runtime.exec(NETSTAT); } catch (IOException e) { e.printStackTrace(); }
        InputStream input = process.getInputStream();
        InputStreamReader inputReader = new InputStreamReader(input);
        BufferedReader reader = new BufferedReader(inputReader);
        // skip the two netstat header lines
        try { reader.readLine(); } catch (IOException e) { e.printStackTrace(); }
        try { reader.readLine(); } catch (IOException e) { e.printStackTrace(); }
        try {
            while ((ipstat = reader.readLine()) != null) {
                StringTokenizer token = new StringTokenizer(ipstat);
                while (token.hasMoreTokens()) {
                    token.nextToken();
                    token.nextToken();
                    token.nextToken();
                    String originalIP = token.nextToken();
                    String ip = token.nextToken().split(":")[0];
                    String status = token.nextToken();
                    if (HOST_IP.equals(originalIP)) {
                        if (!ipList.contains(ip)) {
                            ipList.add(ip);
                            countList.add(new Integer(1));
                            if ("SYN_RECV".equals(status)) { synCountList.add(new Integer(1)); } else { synCountList.add(new Integer(0)); }
                            if ("FIN_WAIT1".equals(status)) { finCountList.add(new Integer(1)); } else { finCountList.add(new Integer(0)); }
                        } else {
                            int index = ipList.indexOf(ip);
                            countList.set(index, new Integer(((Integer) countList.get(index)).intValue() + 1));
                            if ("SYN_RECV".equals(status)) { synCountList.set(index, new Integer(((Integer) synCountList.get(index)).intValue() + 1)); }
                            if ("FIN_WAIT1".equals(status)) { finCountList.set(index, new Integer(((Integer) finCountList.get(index)).intValue() + 1)); }
                        }
                    }
                }
            }
        } catch (IOException e) { e.printStackTrace(); }
        try { reader.close(); } catch (IOException e) { e.printStackTrace(); }
        try { inputReader.close(); } catch (IOException e) { e.printStackTrace(); }
        try { input.close(); } catch (IOException e) { e.printStackTrace(); }
        process.destroy();

        for (int i = 0; i < ipList.size(); i++) {
            if (((Integer) countList.get(i)).intValue() > CONCURRENT) banList.add(ipList.get(i));
            if (((Integer) synCountList.get(i)).intValue() > SYN_CONCURRENT) banList.add(ipList.get(i));
            if (((Integer) finCountList.get(i)).intValue() > SYN_CONCURRENT) banList.add(ipList.get(i));
        }
        return banList;
    }

    public static void main(String[] args) {
        while (true) {
            ban();
            try { Thread.sleep(BAN_INTERVAL); } catch (InterruptedException e) { e.printStackTrace(); }
        }
    }
}

![图片说明](https://img-ask.csdn.net/upload/201712/02/1512217142_180962.png)
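Without the text from the screenshot it is hard to be certain what error was hit, but one likely failure mode in this code: Runtime.exec(NETSTAT) is only wrapped in a try/catch that prints the error, so on a machine where netstat -nt or iptables is unavailable (Windows, or a non-root user), process stays null and the very next line, process.getInputStream(), throws a NullPointerException. A hedged sketch of a more defensive start for dynamicBanIP() — it reuses the class's own fields and imports, and the empty-set early return is my assumption, not part of the original design:

// Hedged sketch: fail fast instead of dereferencing a null Process.
Process process;
try {
    process = Runtime.getRuntime().exec(NETSTAT);   // requires a Linux host with netstat on the PATH
} catch (IOException e) {
    System.err.println("cannot run '" + NETSTAT + "': " + e.getMessage());
    return new HashSet();                           // assumption: empty ban list when netstat is unavailable
}
InputStream input = process.getInputStream();

The same caution applies to the iptables calls in ban(): they silently do nothing unless the program runs as root on Linux.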

Exception in thread "main" java.lang.NoClassDefFoundError:

This is my Java code for a MapReduce job; I package it into a jar and put it on the Linux cluster.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import java.io.IOException;

public class WordCountAPP {

    public static void main(String[] args) throws IOException, ClassNotFoundException, InterruptedException {
        if (args.length == 0) {
            args = new String[]{"/wordcount.txt", "/wordcount-result"};
        }
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, WordCountMap.class.getName());
        // run from a packaged jar
        job.setJarByClass(WordCountMap.class);
        // where the input data comes from
        FileInputFormat.setInputPaths(job, args[0]);
        // which mapper processes the input
        job.setMapperClass(WordCountMap.class);
        // map output key/value types
        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(LongWritable.class);
        // which reducer processes the map output
        job.setReducerClass(WordCountReduce.class);
        // reduce output key/value types
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(LongWritable.class);
        // where the output goes
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        // submit to YARN and wait until the job finishes
        job.waitForCompletion(true);
    }

    public static class WordCountMap extends Mapper<LongWritable, Text, Text, LongWritable> {
        @Override
        public void map(LongWritable key, Text value, Mapper<LongWritable, Text, Text, LongWritable>.Context context)
                throws IOException, InterruptedException {
            String line = value.toString();
            String[] splited = line.split(" ");
            for (String word : splited) {
                context.write(new Text(word), new LongWritable(1));
            }
        }
    }

    public static class WordCountReduce extends Reducer<Text, LongWritable, Text, LongWritable> {
        @Override
        public void reduce(Text key, Iterable<LongWritable> values, Reducer<Text, LongWritable, Text, LongWritable>.Context context)
                throws IOException, InterruptedException {
            long count = 0L;
            for (LongWritable v : values) {
                count += v.get();
            }
            LongWritable v2 = new LongWritable(count);
            context.write(key, v2);
        }
    }
}

This is the error reported when I run it:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
    at WordCountAPP.main(WordCountAPP.java:17)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.conf.Configuration
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 1 more
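Judging from the stack trace, the job never reaches the cluster: the JVM that launches WordCountAPP cannot find org.apache.hadoop.conf.Configuration, so the Hadoop client jars are not on the launch classpath; the Java code itself is not the problem. As a hedged suggestion rather than a confirmed diagnosis, launch the packaged jar through the hadoop wrapper script on the Linux node, which puts the cluster's Hadoop libraries on the classpath automatically (the jar name below is a placeholder):

    hadoop jar wordcount.jar WordCountAPP /wordcount.txt /wordcount-result

Running it with plain java -jar only works if the Hadoop dependencies are bundled into a fat jar or added explicitly with -cp.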
