Error running a Scala hello world

```
Error:scalac: Error: org.jetbrains.jps.incremental.scala.remote.ServerException
Error compiling sbt component 'compiler-interface-2.10.0-52.0'
    at sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1$$anonfun$apply$2.apply(AnalyzingCompiler.scala:145)
    at sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1$$anonfun$apply$2.apply(AnalyzingCompiler.scala:142)
    at sbt.IO$.withTemporaryDirectory(IO.scala:291)
    at sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1.apply(AnalyzingCompiler.scala:142)
    at sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1.apply(AnalyzingCompiler.scala:139)
    at sbt.IO$.withTemporaryDirectory(IO.scala:291)
    at sbt.compiler.AnalyzingCompiler$.compileSources(AnalyzingCompiler.scala:139)
    at sbt.compiler.IC$.compileInterfaceJar(IncrementalCompiler.scala:52)
    at org.jetbrains.jps.incremental.scala.local.CompilerFactoryImpl$.getOrCompileInterfaceJar(CompilerFactoryImpl.scala:96)
    at org.jetbrains.jps.incremental.scala.local.CompilerFactoryImpl$$anonfun$getScalac$1.apply(CompilerFactoryImpl.scala:50)
    at org.jetbrains.jps.incremental.scala.local.CompilerFactoryImpl$$anonfun$getScalac$1.apply(CompilerFactoryImpl.scala:49)
    at scala.Option.map(Option.scala:146)
    at org.jetbrains.jps.incremental.scala.local.CompilerFactoryImpl.getScalac(CompilerFactoryImpl.scala:49)
    at org.jetbrains.jps.incremental.scala.local.CompilerFactoryImpl.createCompiler(CompilerFactoryImpl.scala:22)
    at org.jetbrains.jps.incremental.scala.local.CachingFactory$$anonfun$createCompiler$1.apply(CachingFactory.scala:24)
    at org.jetbrains.jps.incremental.scala.local.CachingFactory$$anonfun$createCompiler$1.apply(CachingFactory.scala:24)
    at org.jetbrains.jps.incremental.scala.local.Cache$$anonfun$getOrUpdate$2.apply(Cache.scala:20)
    at scala.Option.getOrElse(Option.scala:121)
    at org.jetbrains.jps.incremental.scala.local.Cache.getOrUpdate(Cache.scala:19)
    at org.jetbrains.jps.incremental.scala.local.CachingFactory.createCompiler(CachingFactory.scala:23)
    at org.jetbrains.jps.incremental.scala.local.LocalServer.compile(LocalServer.scala:22)
    at org.jetbrains.jps.incremental.scala.remote.Main$.make(Main.scala:68)
    at org.jetbrains.jps.incremental.scala.remote.Main$.nailMain(Main.scala:25)
    at org.jetbrains.jps.incremental.scala.remote.Main.nailMain(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at com.martiansoftware.nailgun.NGSession.run(NGSession.java:319)
```

2 answers

You need to update your scala-sdk to 2.10.3 or newer.
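If the project builds with sbt, the version bump is a one-line change; a minimal sketch, assuming an otherwise standard build.sbt:

```scala
// build.sbt -- use a Scala release the IDEA/JPS compile server can build
// its compiler interface against (2.10.3 or newer in the 2.10.x line).
scalaVersion := "2.10.3"
```

For a plain IDEA project, the equivalent is replacing the scala-sdk-2.10.0 library with a newer SDK under Project Structure | Libraries.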

Probably a version problem. Uninstall and reinstall.

Other related questions
What is the difference between compiling a Spark program with scalac and running it with scala, versus packaging a jar and submitting it with spark-submit?
Approach 1: compile with the scalac command, then run with the scala command. Approach 2: package with sbt, then submit with spark-submit. What is the difference between these two approaches, and what are the pros and cons of each? Thanks in advance!
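For reference, here is a minimal job that could be launched either way (a sketch; `WordCount`, the master URL handling, and the input path are placeholders, not taken from the question):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object WordCount {
  def main(args: Array[String]): Unit = {
    // Plain `scala`: you must put the Spark/Hadoop jars on the classpath
    // yourself and supply a master, so it only really suits single-JVM
    // local experiments.
    // `spark-submit`: the launcher ships this jar to the cluster, sets up
    // the executor classpath, and lets --master/--deploy-mode be chosen
    // at submit time, which is why it is the standard path for clusters.
    val conf = new SparkConf().setAppName("WordCount")
    val sc = new SparkContext(conf)
    sc.textFile(args(0))
      .flatMap(_.split("\\s+"))
      .map((_, 1))
      .reduceByKey(_ + _)
      .take(10)
      .foreach(println)
    sc.stop()
  }
}
```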
Scala environment variable configuration error
Posts online say the cause is a space in the install path, but I've already confirmed the install path is fine. ![screenshot](https://img-ask.csdn.net/upload/201611/23/1479867508_97527.jpg) Install path: D:\scala PATH: ![screenshot](https://img-ask.csdn.net/upload/201611/23/1479867500_811727.jpg)
Spark Scala run error: .test$$anonfun$1
The same logic throws an error when executed from Scala, yet it runs fine from Java. The error output is below:
```
helloOffSLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/C:/Users/zsts/.m2/repository/org/apache/logging/log4j/log4j-slf4j-impl/2.5/log4j-slf4j-impl-2.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/C:/Users/zsts/.m2/repository/org/slf4j/slf4j-log4j12/1.7.10/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
19:25:11.875 [main] ERROR org.apache.hadoop.util.Shell - Failed to locate the winutils binary in the hadoop binary path
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
    at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:355) ~[hadoop-common-2.6.4.jar:?]
    at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:370) [hadoop-common-2.6.4.jar:?]
    at org.apache.hadoop.util.Shell.<clinit>(Shell.java:363) [hadoop-common-2.6.4.jar:?]
    at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:79) [hadoop-common-2.6.4.jar:?]
    at org.apache.hadoop.security.Groups.parseStaticMapping(Groups.java:116) [hadoop-common-2.6.4.jar:?]
    at org.apache.hadoop.security.Groups.<init>(Groups.java:93) [hadoop-common-2.6.4.jar:?]
    at org.apache.hadoop.security.Groups.<init>(Groups.java:73) [hadoop-common-2.6.4.jar:?]
    at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:293) [hadoop-common-2.6.4.jar:?]
    at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:283) [hadoop-common-2.6.4.jar:?]
    at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:260) [hadoop-common-2.6.4.jar:?]
    at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:789) [hadoop-common-2.6.4.jar:?]
    at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:774) [hadoop-common-2.6.4.jar:?]
    at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:647) [hadoop-common-2.6.4.jar:?]
    at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2198) [spark-core_2.10-1.6.2.jar:1.6.2]
    at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2198) [spark-core_2.10-1.6.2.jar:1.6.2]
    at scala.Option.getOrElse(Option.scala:120) [?:?]
    at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2198) [spark-core_2.10-1.6.2.jar:1.6.2]
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:322) [spark-core_2.10-1.6.2.jar:1.6.2]
    at tarot.test$.main(test.scala:19) [bin/:?]
    at tarot.test.main(test.scala) [bin/:?]

Exception in thread "main" org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: hdfs://tarot1:9000/sparkTest/hello
    at org.apache.hadoop.mapred.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:285)
    at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:228)
    at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:313)
    at org.apache.spark.rdd.HadoopRDD.getPartitions(HadoopRDD.scala:199)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
    at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
    at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
    at org.apache.spark.rdd.RDD$$anonfun$take$1.apply(RDD.scala:1307)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
    at org.apache.spark.rdd.RDD.take(RDD.scala:1302)
    at tarot.test$.main(test.scala:26)
    at tarot.test.main(test.scala)
```
Scala code:
```scala
package tarot

import scala.tools.nsc.doc.model.Val
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.api.java.JavaSparkContext

object test {
  def main(hello: Array[String]) {
    print("helloOff")
    val conf = new SparkConf()
    conf.setMaster("spark://tarot1:7077")
      .setAppName("hello_off")
      .set("spark.executor.memory", "4g")
      .set("spark.executor.cores", "4")
      .set("spark.cores.max", "4")
      .set("spark.sql.crossJoin.enabled", "true")
    val sc = new SparkContext(conf)
    sc.setLogLevel("ERROR")
    val file = sc.textFile("hdfs://tarot1:9000/sparkTest/hello")
    val filterRDD = file.filter { (ss: String) => ss.contains("hello") }
    val f = filterRDD.cache()
    // println(f)
    // filterRDD.count()
    for (x <- f.take(100)) {
      println(x)
    }
  }

  def helloingTest(jsc: SparkContext) {
    val sc = jsc
    val file = sc.textFile("hdfs://tarot1:9000/sparkTest/hello")
    val filterRDD = file.filter((ss: String) => ss.contains("hello"))
    val f = filterRDD.cache()
    println(f)
    val i = filterRDD.count()
    println(i)
  }

  // val seehello =
  def helloingTest(jsc: JavaSparkContext) {
    val sc = jsc
    val file = sc.textFile("hdfs://tarot1:9000/sparkTest/hello")
    val filterRDD = file.filter((ss: String) => ss.contains("hello"))
    val f = filterRDD.cache()
    println(f)
    val i = filterRDD.count()
    println(i)
  }
}
```
Java code:
```java
package com.tarot.sparkToHdfsTest;

import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.rdd.RDD;
//import scalaModule.hello;

public class App {
    public static void main(String[] args) {
        SparkConf sparkConf = new SparkConf();
        sparkConf.setMaster("spark://tarot1:7077")
                .setAppName("hello off")
                .set("spark.executor.memory", "4g")
                .set("spark.executor.cores", "4")
                .set("spark.cores.max", "4")
                .set("spark.sql.crossJoin.enabled", "true");
        JavaSparkContext jsc = new JavaSparkContext(sparkConf);
        jsc.setLogLevel("ERROR");
        text(jsc);
        // test.helloingTest(jsc);
    }

    /**
     * test
     * @param jsc
     */
    private static void text(JavaSparkContext jsc) {
        // jsc.textFile("hdfs://tarot1:9000/sparkTest/hello");
        JavaRDD<String> jr = jsc.textFile("hdfs://tarot1:9000/sparkTest/hello", 1);
        jr.cache();
        // test t = new test();
        jr.filter(f);
        for (String string : jr.take(100)) {
            System.out.println(string);
        }
        System.out.println("hello off");
    }

    public static Function<String, Boolean> f = new Function<String, Boolean>() {
        public Boolean call(String s) {
            return s.contains("hello");
        }
    };
}
```
I've been stuck on this for ages. The solutions online either tell me to run it from the shell or to upload a jar, but Java doesn't need that — both go through the JVM, right? Please help!
Could someone take a look at this Scala error?
```
Exception in thread "main" java.lang.VerifyError: class scala.collection.mutable.WrappedArray overrides final method toBuffer.()Lscala/collection/mutable/Buffer;
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.apache.spark.SparkConf.loadFromSystemProperties(SparkConf.scala:73)
    at org.apache.spark.SparkConf.<init>(SparkConf.scala:68)
    at org.apache.spark.SparkConf.<init>(SparkConf.scala:55)
    at SessionStat$.main(SessionStat.scala:21)
    at SessionStat.main(SessionStat.scala)
```
```scala
object SessionStat {

  def main(args: Array[String]): Unit = {
    // Get the filter conditions
    val jsonStr = ConfigurationManager.config.getString(Constants.TASK_PARAMS)
    // Get the JSONObject for the filter conditions
    val taskParam = JSONObject.fromObject(jsonStr)
    // Create a globally unique primary key
    val taskUUID = UUID.randomUUID().toString
    // Create the sparkSession
    val sparkConf = new SparkConf().setAppName("session").setMaster("local[*]")
    // Create the sparkSession (contains the SparkContext)
    val sparkSession = SparkSession.builder().config(sparkConf).enableHiveSupport().getOrCreate()
    // Get the raw action-table data
    // actionRDD: RDD[UserVisit]
    val actionRDD = getOriActionRDD(sparkSession, taskParam)
    actionRDD.foreach(println(_))
  }

  def getOriActionRDD(sparkSession: SparkSession, taskParam: JSONObject) = {
    // Start of the query window
    val startDate = ParamUtils.getParam(taskParam, Constants.PARAM_START_DATE)
    // End of the query window
    val endDate = ParamUtils.getParam(taskParam, Constants.PARAM_END_DATE)
    // Query the data
    val sql = "select * from user_visit_action where date >='" + startDate + "' and date <='" + endDate + "'"
    import sparkSession.implicits._
    sparkSession.sql(sql).as[UserVisitAction].rdd
  }
}
```
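Not having seen the build file, this VerifyError is the classic symptom of mixing Scala binary versions on one classpath (for example, a Spark built for 2.11 together with a 2.12 scala-library — an assumption, since the dependencies aren't shown). A sketch of keeping them aligned in sbt; the Spark version is only an example:

```scala
// build.sbt -- one Scala binary version everywhere: %% appends the
// matching _2.11 suffix to the artifact name automatically.
scalaVersion := "2.11.12"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.5" % "provided"
```

If IDEA is involved, the project's Scala SDK has to match the same binary version as well.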
Can Scala be integrated into a Java project?
I need to build an analytics system that analyzes our company's business data. Can I keep the SSM framework's controller, service, and dao layers, have Scala implement the Java-facing interfaces, and integrate Scala with Spark SQL to drive the front end?
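Yes — Scala compiles to ordinary JVM bytecode, so an SSM service layer can call it directly. A sketch under stated assumptions (the `SalesReport` object, table name, and columns are invented for illustration):

```scala
import java.util.{List => JList}
import org.apache.spark.sql.SparkSession
import scala.collection.JavaConverters._

// A Scala facade the Java layers can call; methods on an `object` get
// static forwarders, so Java sees SalesReport.topProducts(spark, 10).
object SalesReport {
  def topProducts(spark: SparkSession, n: Int): JList[String] =
    spark.sql(s"SELECT name FROM sales ORDER BY amount DESC LIMIT $n")
      .collect()
      .map(_.getString(0))
      .toList
      .asJava // hand Java a java.util.List, not a Scala collection
}
```

The practical points are returning Java collection types at the boundary and building both languages in the same module so they compile together.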
Java exception when running Scala on Linux
Trying to run Scala on Fedora with the environment variables fully set, but the following error appears:
```
Exception in thread "main" java.lang.NoClassDefFoundError: javax/script/Compilable
    at scala.tools.nsc.interpreter.ILoop.createInterpreter(ILoop.scala:126)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:908)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:906)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:906)
    at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
    at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:906)
    at scala.tools.nsc.MainGenericRunner.runTarget$1(MainGenericRunner.scala:74)
    at scala.tools.nsc.MainGenericRunner.run$1(MainGenericRunner.scala:87)
    at scala.tools.nsc.MainGenericRunner.process(MainGenericRunner.scala:98)
    at scala.tools.nsc.MainGenericRunner$.main(MainGenericRunner.scala:103)
    at scala.tools.nsc.MainGenericRunner.main(MainGenericRunner.scala)
```
Kafka fails to start: java.lang.NoSuchMethodError: org.apache.zookeeper.ZooKeeper.multi(Ljava/lang/Iterable;Lorg/apache/zookeeper/AsyncCallback$MultiCallback;Ljava/lang/Object;)V
I've tried a lot of things: downgrading ZooKeeper so it matches the version Kafka depends on (ZK 3.4.14, Kafka 2.3), and removing the Scala environment variable — still no luck. There is only one JAVA_HOME.
```
java.lang.NoSuchMethodError: org.apache.zookeeper.ZooKeeper.multi(Ljava/lang/Iterable;Lorg/apache/zookeeper/AsyncCallback$MultiCallback;Ljava/lang/Object;)V
    at kafka.zookeeper.ZooKeeperClient.send(ZooKeeperClient.scala:238)
    at kafka.zookeeper.ZooKeeperClient.$anonfun$handleRequests$2(ZooKeeperClient.scala:160)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at kafka.utils.CoreUtils$.inLock(CoreUtils.scala:253)
    at kafka.utils.CoreUtils$.inReadLock(CoreUtils.scala:259)
    at kafka.zookeeper.ZooKeeperClient.$anonfun$handleRequests$1(ZooKeeperClient.scala:160)
    at kafka.zookeeper.ZooKeeperClient.$anonfun$handleRequests$1$adapted(ZooKeeperClient.scala:156)
    at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
    at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
    at kafka.zookeeper.ZooKeeperClient.handleRequests(ZooKeeperClient.scala:156)
    at kafka.zk.KafkaZkClient.retryRequestsUntilConnected(KafkaZkClient.scala:1660)
    at kafka.zk.KafkaZkClient.retryRequestsUntilConnected(KafkaZkClient.scala:1647)
    at kafka.zk.KafkaZkClient.retryRequestUntilConnected(KafkaZkClient.scala:1642)
    at kafka.zk.KafkaZkClient$CheckedEphemeral.create(KafkaZkClient.scala:1712)
    at kafka.zk.KafkaZkClient.checkedEphemeralCreate(KafkaZkClient.scala:1689)
    at kafka.zk.KafkaZkClient.registerBroker(KafkaZkClient.scala:97)
    at kafka.server.KafkaServer.startup(KafkaServer.scala:262)
    at kafka.server.KafkaServerStartable.startup(KafkaServerStartable.scala:38)
    at kafka.Kafka$.main(Kafka.scala:84)
    at kafka.Kafka.main(Kafka.scala)
```
Error when compiling and packaging a Scala project with Gradle:
1. Versions: jdk1.7.004, gradle 2.1, scala 2.11.2

2. Code layout: src/main/scala/HelloWorld.scala
```scala
package main.scala

object HelloWorld {
  def main(args: Array[String]): Unit = {
    println("hello world")
  }
}
```
3. The build.gradle file:
```
apply plugin: 'scala'

repositories {
    maven { url "http://10.177.60.141:8888/nexus/conte..." }
}

dependencies {
    compile 'org.scala-lang:scala-library:2.11.2'
    compile 'org.scala-lang:scala-compiler:2.11.2'
}
```
4. Result of running `gradle build`:
```
D:\Workfiles\Scala\gradleproject>gradle build -stacktrace
:compileJava UP-TO-DATE
:compileScala FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':compileScala'.
> scala/runtime/Nothing$

* Try:
Run with --info or --debug option to get more log output.

* Exception is:
org.gradle.api.tasks.TaskExecutionException: Execution failed for task ':compileScala'.
    at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:69)
    ...........................................................
    at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:61)
    ... 44 more
Caused by: java.lang.ClassNotFoundException: scala.runtime.Nothing$
    ... 76 more

BUILD FAILED

Total time: 12.063 secs
```
How to use VBA to check whether one Excel file contains content from another, and fill in the missing data
How to use VBA to: ① check whether an Excel sheet contains the relevant content from another Excel file; ② if it does not, write that other file's data into the corresponding position in this sheet.
Azkaban job-scheduling failures — how should they be handled?
When Azkaban runs a packaged set of jobs with dependencies between them, how do you handle a job failing partway through? The official docs only say, in general terms, that on failure you can choose whether to cancel everything or let jobs whose preconditions are met run automatically. Can anyone be more concrete — how is this common failure case handled in production?
IDEA 2018.2, Gradle-built Java + Scala project: compilation fails and it won't run
**The build exception log is below:**
```
org.gradle.api.tasks.TaskExecutionException: Execution failed for task ':common:compileScala'.
    at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:103)
    at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:73)
    at org.gradle.api.internal.tasks.execution.OutputDirectoryCreatingTaskExecuter.execute(OutputDirectoryCreatingTaskExecuter.java:51)
    at org.gradle.api.internal.tasks.execution.SkipUpToDateTaskExecuter.execute(SkipUpToDateTaskExecuter.java:59)
    at org.gradle.api.internal.tasks.execution.ResolveTaskOutputCachingStateExecuter.execute(ResolveTaskOutputCachingStateExecuter.java:54)
    at org.gradle.api.internal.tasks.execution.ValidatingTaskExecuter.execute(ValidatingTaskExecuter.java:59)
    at org.gradle.api.internal.tasks.execution.SkipEmptySourceFilesTaskExecuter.execute(SkipEmptySourceFilesTaskExecuter.java:101)
    at org.gradle.api.internal.tasks.execution.FinalizeInputFilePropertiesTaskExecuter.execute(FinalizeInputFilePropertiesTaskExecuter.java:44)
    at org.gradle.api.internal.tasks.execution.CleanupStaleOutputsExecuter.execute(CleanupStaleOutputsExecuter.java:91)
    at org.gradle.api.internal.tasks.execution.ResolveTaskArtifactStateTaskExecuter.execute(ResolveTaskArtifactStateTaskExecuter.java:62)
    at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:59)
    at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:54)
    at org.gradle.api.internal.tasks.execution.ExecuteAtMostOnceTaskExecuter.execute(ExecuteAtMostOnceTaskExecuter.java:43)
    at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:34)
    at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker$1.run(DefaultTaskGraphExecuter.java:256)
    at org.gradle.internal.progress.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:336)
    at org.gradle.internal.progress.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:328)
    at org.gradle.internal.progress.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:199)
    at org.gradle.internal.progress.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:110)
    at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker.execute(DefaultTaskGraphExecuter.java:249)
    at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker.execute(DefaultTaskGraphExecuter.java:238)
    at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker.processTask(DefaultTaskPlanExecutor.java:123)
    at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker.access$200(DefaultTaskPlanExecutor.java:79)
    at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:104)
    at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:98)
    at org.gradle.execution.taskgraph.DefaultTaskExecutionPlan.execute(DefaultTaskExecutionPlan.java:663)
    at org.gradle.execution.taskgraph.DefaultTaskExecutionPlan.executeWithTask(DefaultTaskExecutionPlan.java:597)
    at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker.run(DefaultTaskPlanExecutor.java:98)
    at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
    at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:46)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
    at java.lang.Thread.run(Thread.java:745)
Caused by: org.gradle.api.internal.tasks.compile.CompilationFailedException: javac returned nonzero exit code
    at org.gradle.api.internal.tasks.scala.ZincScalaCompiler$Compiler.execute(ZincScalaCompiler.java:83)
    at org.gradle.api.internal.tasks.scala.ZincScalaCompiler.execute(ZincScalaCompiler.java:52)
    at org.gradle.api.internal.tasks.scala.ZincScalaCompiler.execute(ZincScalaCompiler.java:38)
    at org.gradle.api.internal.tasks.compile.daemon.AbstractDaemonCompiler$CompilerRunnable.run(AbstractDaemonCompiler.java:87)
    at org.gradle.workers.internal.DefaultWorkerServer.execute(DefaultWorkerServer.java:36)
    at org.gradle.workers.internal.WorkerDaemonServer.execute(WorkerDaemonServer.java:46)
    at org.gradle.workers.internal.WorkerDaemonServer.execute(WorkerDaemonServer.java:30)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.gradle.process.internal.worker.request.WorkerAction.run(WorkerAction.java:100)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:35)
    at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:146)
    at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:128)
    at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:404)
    ... 6 more
Caused by: javac returned nonzero exit code
    at sbt.compiler.JavaCompiler$JavaTool0.compile(JavaCompiler.scala:97)
    at sbt.compiler.JavaTool$class.apply(JavaCompiler.scala:54)
    at sbt.compiler.JavaCompiler$JavaTool0.apply(JavaCompiler.scala:84)
    at sbt.compiler.JavaCompiler$class.compile(JavaCompiler.scala:35)
    at sbt.compiler.JavaCompiler$JavaTool0.compile(JavaCompiler.scala:84)
    at sbt.compiler.JavaCompiler$class.compileWithReporter(JavaCompiler.scala:40)
    at sbt.compiler.JavaCompiler$JavaTool0.compileWithReporter(JavaCompiler.scala:84)
    at sbt.compiler.AggressiveCompile$$anonfun$3$$anonfun$compileJava$1$1.apply$mcV$sp(AggressiveCompile.scala:121)
    at sbt.compiler.AggressiveCompile$$anonfun$3$$anonfun$compileJava$1$1.apply(AggressiveCompile.scala:121)
    at sbt.compiler.AggressiveCompile$$anonfun$3$$anonfun$compileJava$1$1.apply(AggressiveCompile.scala:121)
    at sbt.compiler.AggressiveCompile.sbt$compiler$AggressiveCompile$$timed(AggressiveCompile.scala:168)
    at sbt.compiler.AggressiveCompile$$anonfun$3.compileJava$1(AggressiveCompile.scala:120)
    at sbt.compiler.AggressiveCompile$$anonfun$3.apply(AggressiveCompile.scala:142)
    at sbt.compiler.AggressiveCompile$$anonfun$3.apply(AggressiveCompile.scala:84)
    at sbt.inc.IncrementalCompile$$anonfun$doCompile$1.apply(Compile.scala:66)
    at sbt.inc.IncrementalCompile$$anonfun$doCompile$1.apply(Compile.scala:64)
    at sbt.inc.IncrementalCommon.cycle(IncrementalCommon.scala:32)
    at sbt.inc.Incremental$$anonfun$1.apply(Incremental.scala:72)
    at sbt.inc.Incremental$$anonfun$1.apply(Incremental.scala:71)
    at sbt.inc.Incremental$.manageClassfiles(Incremental.scala:99)
    at sbt.inc.Incremental$.compile(Incremental.scala:71)
    at sbt.inc.IncrementalCompile$.apply(Compile.scala:54)
    at sbt.compiler.AggressiveCompile.compile2(AggressiveCompile.scala:159)
    at sbt.compiler.AggressiveCompile.compile1(AggressiveCompile.scala:68)
    at com.typesafe.zinc.Compiler.compile(Compiler.scala:207)
    at com.typesafe.zinc.Compiler.compile(Compiler.scala:189)
    at com.typesafe.zinc.Compiler.compile(Compiler.scala:180)
    at com.typesafe.zinc.Compiler.compile(Compiler.scala:171)
    at org.gradle.api.internal.tasks.scala.ZincScalaCompiler$Compiler.execute(ZincScalaCompiler.java:81)
    ... 26 more
```
IDEA Scala plugin version: 2018.2.11. Gradle version: 5.1. JDK version: 1.8.0_45. Scala dependency version: 2.11.8.
IntelliJ: no Scala option when creating a Scala project
Why, in the New Project dialog, after selecting Scala on the left, is there no plain Scala option on the right — only SBT, Activator, and IDEA? ![screenshot](https://img-ask.csdn.net/upload/201709/25/1506321594_567550.jpg)
IDEA fails to import sbt dependencies — three days of trying without success, please help!
"dump project structure from sbt" never succeeds. The sbt shell shows: [info] Loading settings for project global-plugins from idea.sbt. Scala 2.11.8, sbt 1.3.4. How can I fix this?
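One common cause — an assumption here, since the log shows no explicit error — is that the default resolvers are unreachable or very slow from the asker's network, so the structure dump stalls. A sketch of pointing sbt at a closer mirror first (the Aliyun URL is just one example of a reachable mirror):

```scala
// build.sbt -- try a nearby mirror before the default resolvers.
resolvers := ("aliyun" at "https://maven.aliyun.com/repository/public") +: resolvers.value
```

Running `sbt update` from a plain terminal first also helps separate network problems from IDEA import problems.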
kafka ArrayIndexOutOfBoundsException: 18
I hit the problem below with Kafka. Everything online says it happens when a newer Kafka client sends requests to an older broker — brokers older than 0.10 don't support the ApiVersions request (key 18). But my producers, consumers, and the broker machine all use client 0.9.0.1, so this shouldn't happen. Why does it? Any pointers appreciated.
```
[2018-10-25 10:03:17,919] INFO [Kafka Server 0], started (kafka.server.KafkaServer)
[2018-10-25 10:03:18,080] INFO [ReplicaFetcherManager on broker 0] Removed fetcher for partitions [topic-test,0] (kafka.server.ReplicaFetcherManager)
[2018-10-25 10:03:18,099] INFO [ReplicaFetcherManager on broker 0] Removed fetcher for partitions [topic-test,0] (kafka.server.ReplicaFetcherManager)
[2018-10-25 10:03:48,864] ERROR Processor got uncaught exception. (kafka.network.Processor)
java.lang.ArrayIndexOutOfBoundsException: 18
    at org.apache.kafka.common.protocol.ApiKeys.forId(ApiKeys.java:68)
    at org.apache.kafka.common.requests.AbstractRequest.getRequest(AbstractRequest.java:39)
    at kafka.network.RequestChannel$Request.<init>(RequestChannel.scala:79)
    at kafka.network.Processor$$anonfun$run$11.apply(SocketServer.scala:426)
    at kafka.network.Processor$$anonfun$run$11.apply(SocketServer.scala:421)
    at scala.collection.Iterator$class.foreach(Iterator.scala:742)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1194)
    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
    at kafka.network.Processor.run(SocketServer.scala:421)
    at java.lang.Thread.run(Thread.java:748)
```
Quartus II: the IP core license is patched, but "megacore function generation" fails
![The IP core was patched following the relevant blog posts](https://img-ask.csdn.net/upload/201912/03/1575368261_622176.png) ![But this problem appears at generation time — where did it go wrong?](https://img-ask.csdn.net/upload/201912/03/1575368268_575442.png) The IP core was patched following the blog posts, but this error comes up during generation. Where is the problem? Any guidance appreciated.
Why does a MySQL statement fail in Spark SQL?
```scala
spark.sql("select qj,sd,cp from (select cp,sd,qj,row_number() over(partition by qj order by sd desc) rk from car) where rk>=1 and rk<=3").show()
println("对比!!!!!!!!!")
spark.sql("select * from car c where 3 > (select count(*) from car where qj = c.qj and sd > c.sd) order by c.qj,c.sd desc").show()
```
The second statement runs fine in MySQL but fails when run through Spark SQL from Scala. ![error screenshot](https://img-ask.csdn.net/upload/201911/27/1574855788_949345.png) In MySQL the result looks like this: ![MySQL result](https://img-ask.csdn.net/upload/201911/27/1574855798_673837.png)
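The likely reason (hedged, since the error screenshot isn't readable here): Spark SQL's correlated-subquery support is far narrower than MySQL's — broadly, correlation predicates must be equalities, so the `sd > c.sd` correlation inside the `count(*)` subquery is rejected. The window-function form is the usual rewrite; a sketch with an explicit alias on the derived table:

```scala
// Top-3 sd per qj via row_number(), the Spark-friendly equivalent of
// the MySQL correlated-count query.
spark.sql(
  """SELECT qj, sd, cp
    |FROM (SELECT cp, sd, qj,
    |             row_number() OVER (PARTITION BY qj ORDER BY sd DESC) AS rk
    |      FROM car) t
    |WHERE rk <= 3""".stripMargin).show()
```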
What is this "Terminal" program? It can't be uninstalled even after quitting, and it asks for a password on launch — see the screenshots
![screenshot](https://img-ask.csdn.net/upload/202001/07/1578388448_196033.png) ![screenshot](https://img-ask.csdn.net/upload/202001/07/1578388455_366377.png)
Flink local mode: running the bundled example jar keeps failing
In Flink local mode, running the jar that ships with Flink keeps failing. Start Flink: ![screenshot](https://img-ask.csdn.net/upload/201911/22/1574391817_772547.png) Run: `bin/flink run examples/streaming/SocketWindowWordCount.jar --port 8888`, then simulate the socket input with `nc -lk 8888` — it errors out and the web UI becomes unreachable. The UI shows: ![screenshot](https://img-ask.csdn.net/upload/201911/22/1574392093_133203.png) The backend error is:
```
org.apache.flink.client.program.ProgramInvocationException: Could not retrieve the execution result. (JobID: f2ee89e0ed991f22ed9eaec00edfd789)
    at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:261)
    at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:486)
    at org.apache.flink.streaming.api.environment.StreamContextEnvironment.execute(StreamContextEnvironment.java:66)
    at org.apache.flink.streaming.examples.socket.SocketWindowWordCount.main(SocketWindowWordCount.java:92)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:529)
    at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:421)
    at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:426)
    at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:816)
    at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:290)
    at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:216)
    at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1053)
    at org.apache.flink.client.cli.CliFrontend.lambda$main$11(CliFrontend.java:1129)
    at org.apache.flink.client.cli.CliFrontend$$Lambda$1/1977310713.call(Unknown Source)
    at org.apache.flink.runtime.security.HadoopSecurityContext$$Lambda$2/1169474473.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1754)
    at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
    at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1129)
Caused by: org.apache.flink.runtime.client.JobSubmissionException: Failed to submit JobGraph.
    at org.apache.flink.client.program.rest.RestClusterClient.lambda$submitJob$8(RestClusterClient.java:380)
    at org.apache.flink.client.program.rest.RestClusterClient$$Lambda$11/256346753.apply(Unknown Source)
    at java.util.concurrent.CompletableFuture$ExceptionCompletion.run(CompletableFuture.java:1246)
    at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:193)
    at java.util.concurrent.CompletableFuture.internalComplete(CompletableFuture.java:210)
    at java.util.concurrent.CompletableFuture$ThenApply.run(CompletableFuture.java:723)
    at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:193)
    at java.util.concurrent.CompletableFuture.internalComplete(CompletableFuture.java:210)
    at java.util.concurrent.CompletableFuture$ThenCopy.run(CompletableFuture.java:1333)
    at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:193)
    at java.util.concurrent.CompletableFuture.completeExceptionally(CompletableFuture.java:2361)
    at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$5(FutureUtils.java:203)
    at org.apache.flink.runtime.concurrent.FutureUtils$$Lambda$21/100819684.accept(Unknown Source)
    at java.util.concurrent.CompletableFuture$WhenCompleteCompletion.run(CompletableFuture.java:1298)
    at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:193)
    at java.util.concurrent.CompletableFuture.internalComplete(CompletableFuture.java:210)
    at java.util.concurrent.CompletableFuture$ThenCopy.run(CompletableFuture.java:1333)
    at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:193)
    at java.util.concurrent.CompletableFuture.internalComplete(CompletableFuture.java:210)
    at java.util.concurrent.CompletableFuture$AsyncCompose.exec(CompletableFuture.java:626)
    at java.util.concurrent.CompletableFuture$Async.run(CompletableFuture.java:428)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.flink.runtime.rest.util.RestClientException: [Internal server error., <Exception on server side:
akka.pattern.AskTimeoutException: Ask timed out on [Actor[akka://flink/user/dispatcher#1680779493]] after [12000 ms]. Sender[null] sent message of type "org.apache.flink.runtime.rpc.messages.LocalFencedMessage".
    at akka.pattern.PromiseActorRef$$anonfun$1.apply$mcV$sp(AskSupport.scala:604)
    at akka.actor.Scheduler$$anon$4.run(Scheduler.scala:126)
    at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:601)
    at scala.concurrent.BatchingExecutor$class.execute(BatchingExecutor.scala:109)
    at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:599)
    at akka.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(LightArrayRevolverScheduler.scala:329)
    at akka.actor.LightArrayRevolverScheduler$$anon$4.executeBucket$1(LightArrayRevolverScheduler.scala:280)
    at akka.actor.LightArrayRevolverScheduler$$anon$4.nextTick(LightArrayRevolverScheduler.scala:284)
    at akka.actor.LightArrayRevolverScheduler$$anon$4.run(LightArrayRevolverScheduler.scala:236)
    at java.lang.Thread.run(Thread.java:745)

End of exception on server side>]
    at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:350)
    at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:334)
    at org.apache.flink.runtime.rest.RestClient$$Lambda$31/1522310222.apply(Unknown Source)
    at java.util.concurrent.CompletableFuture$AsyncCompose.exec(CompletableFuture.java:604)
    ... 4 more
```
I searched online and added these settings, but the problem persists:
```
akka.ask.timeout: 60 s
web.timeout: 12000
taskmanager.host: localhost
```
Has anyone run into this?
sbt error: object scala.runtime in compiler mirror not found
When compiling the kafka-manager project with sbt, it fails with the following:
```
[info] Loading project definition from E:\workspace\idea\kafka-manager-master\project
error: error while loading <root>, error in opening zip file
scala.reflect.internal.MissingRequirementError: object scala.runtime in compiler mirror not found.
    at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:16)
    at scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:17)
    at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:48)
    at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:40)
    at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:61)
    at scala.reflect.internal.Mirrors$RootsBase.getPackage(Mirrors.scala:172)
    at scala.reflect.internal.Mirrors$RootsBase.getRequiredPackage(Mirrors.scala:175)
    at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackage$lzycompute(Definitions.scala:183)
    at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackage(Definitions.scala:183)
    at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackageClass$lzycompute(Definitions.scala:184)
    at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackageClass(Definitions.scala:184)
    at scala.reflect.internal.Definitions$DefinitionsClass.AnnotationDefaultAttr$lzycompute(Definitions.scala:1024)
    at scala.reflect.internal.Definitions$DefinitionsClass.AnnotationDefaultAttr(Definitions.scala:1023)
    at scala.reflect.internal.Definitions$DefinitionsClass.syntheticCoreClasses$lzycompute(Definitions.scala:1153)
    at scala.reflect.internal.Definitions$DefinitionsClass.syntheticCoreClasses(Definitions.scala:1152)
    at scala.reflect.internal.Definitions$DefinitionsClass.symbolsNotPresentInBytecode$lzycompute(Definitions.scala:1196)
    at scala.reflect.internal.Definitions$DefinitionsClass.symbolsNotPresentInBytecode(Definitions.scala:1196)
    at scala.reflect.internal.Definitions$DefinitionsClass.init(Definitions.scala:1261)
    at scala.tools.nsc.Global$Run.<init>(Global.scala:1290)
    at sbt.compiler.Eval$$anon$1.<init>(Eval.scala:141)
    at sbt.compiler.Eval.run$lzycompute$1(Eval.scala:141)
    at sbt.compiler.Eval.run$1(Eval.scala:141)
    at sbt.compiler.Eval.unlinkAll$1(Eval.scala:144)
    at sbt.compiler.Eval.evalCommon(Eval.scala:153)
    at sbt.compiler.Eval.evalDefinitions(Eval.scala:122)
    at sbt.EvaluateConfigurations$.evaluateDefinitions(EvaluateConfigurations.scala:271)
    at sbt.EvaluateConfigurations$.evaluateSbtFile(EvaluateConfigurations.scala:109)
    at sbt.Load$.sbt$Load$$loadSettingsFile$1(Load.scala:775)
    at sbt.Load$$anonfun$sbt$Load$$memoLoadSettingsFile$1$1.apply(Load.scala:781)
    at sbt.Load$$anonfun$sbt$Load$$memoLoadSettingsFile$1$1.apply(Load.scala:780)
    at scala.collection.MapLike$class.getOrElse(MapLike.scala:128)
    at scala.collection.AbstractMap.getOrElse(Map.scala:58)
    at sbt.Load$.sbt$Load$$memoLoadSettingsFile$1(Load.scala:780)
    at sbt.Load$$anonfun$loadFiles$1$2.apply(Load.scala:788)
    at sbt.Load$$anonfun$loadFiles$1$2.apply(Load.scala:788)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
    at scala.collection.AbstractTraversable.map(Traversable.scala:105)
    at sbt.Load$.loadFiles$1(Load.scala:788)
    at sbt.Load$.discoverProjects(Load.scala:799)
    at sbt.Load$.discover$1(Load.scala:585)
    at sbt.Load$.sbt$Load$$loadTransitive(Load.scala:633)
    at sbt.Load$$anonfun$loadUnit$1.sbt$Load$$anonfun$$loadProjects$1(Load.scala:482)
    at sbt.Load$$anonfun$loadUnit$1$$anonfun$40.apply(Load.scala:485)
    at sbt.Load$$anonfun$loadUnit$1$$anonfun$40.apply(Load.scala:485)
    at sbt.Load$.timed(Load.scala:1025)
    at sbt.Load$$anonfun$loadUnit$1.apply(Load.scala:485)
    at sbt.Load$$anonfun$loadUnit$1.apply(Load.scala:459)
    at sbt.Load$.timed(Load.scala:1025)
    at sbt.Load$.loadUnit(Load.scala:459)
    at sbt.Load$$anonfun$25$$anonfun$apply$14.apply(Load.scala:311)
    at sbt.Load$$anonfun$25$$anonfun$apply$14.apply(Load.scala:310)
    at sbt.BuildLoader$$anonfun$componentLoader$1$$anonfun$apply$4$$anonfun$apply$5$$anonfun$apply$6.apply(BuildLoader.scala:91)
    at sbt.BuildLoader$$anonfun$componentLoader$1$$anonfun$apply$4$$anonfun$apply$5$$anonfun$apply$6.apply(BuildLoader.scala:90)
    at sbt.BuildLoader.apply(BuildLoader.scala:140)
    at sbt.Load$.loadAll(Load.scala:365)
    at sbt.Load$.loadURI(Load.scala:320)
    at sbt.Load$.load(Load.scala:316)
    at sbt.Load$.load(Load.scala:305)
    at sbt.Load$$anonfun$4.apply(Load.scala:146)
    at sbt.Load$$anonfun$4.apply(Load.scala:146)
    at sbt.Load$.timed(Load.scala:1025)
    at sbt.Load$.apply(Load.scala:146)
    at sbt.Load$.defaultLoad(Load.scala:39)
    at sbt.BuiltinCommands$.liftedTree1$1(Main.scala:496)
    at sbt.BuiltinCommands$.doLoadProject(Main.scala:496)
    at sbt.BuiltinCommands$$anonfun$loadProjectImpl$2.apply(Main.scala:488)
    at sbt.BuiltinCommands$$anonfun$loadProjectImpl$2.apply(Main.scala:488)
    at sbt.Command$$anonfun$applyEffect$1$$anonfun$apply$2.apply(Command.scala:59)
    at sbt.Command$$anonfun$applyEffect$1$$anonfun$apply$2.apply(Command.scala:59)
    at sbt.Command$$anonfun$applyEffect$2$$anonfun$apply$3.apply(Command.scala:61)
    at sbt.Command$$anonfun$applyEffect$2$$anonfun$apply$3.apply(Command.scala:61)
    at sbt.Command$.process(Command.scala:93)
    at sbt.MainLoop$$anonfun$1$$anonfun$apply$1.apply(MainLoop.scala:96)
    at sbt.MainLoop$$anonfun$1$$anonfun$apply$1.apply(MainLoop.scala:96)
    at sbt.State$$anon$1.process(State.scala:184)
    at sbt.MainLoop$$anonfun$1.apply(MainLoop.scala:96)
    at sbt.MainLoop$$anonfun$1.apply(MainLoop.scala:96)
    at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
    at sbt.MainLoop$.next(MainLoop.scala:96)
    at sbt.MainLoop$.run(MainLoop.scala:89)
    at sbt.MainLoop$$anonfun$runWithNewLog$1.apply(MainLoop.scala:68)
    at sbt.MainLoop$$anonfun$runWithNewLog$1.apply(MainLoop.scala:63)
    at sbt.Using.apply(Using.scala:24)
    at sbt.MainLoop$.runWithNewLog(MainLoop.scala:63)
    at sbt.MainLoop$.runAndClearLast(MainLoop.scala:46)
    at sbt.MainLoop$.runLoggedLoop(MainLoop.scala:30)
    at sbt.MainLoop$.runLogged(MainLoop.scala:22)
    at sbt.StandardMain$.runManaged(Main.scala:57)
    at sbt.xMain.run(Main.scala:29)
    at xsbt.boot.Launch$$anonfun$run$1.apply(Launch.scala:109)
    at xsbt.boot.Launch$.withContextLoader(Launch.scala:128)
    at xsbt.boot.Launch$.run(Launch.scala:109)
    at xsbt.boot.Launch$$anonfun$apply$1.apply(Launch.scala:35)
    at xsbt.boot.Launch$.launch(Launch.scala:117)
    at xsbt.boot.Launch$.apply(Launch.scala:18)
    at xsbt.boot.Boot$.runImpl(Boot.scala:41)
    at xsbt.boot.Boot$.main(Boot.scala:17)
    at xsbt.boot.Boot.main(Boot.scala)
[error] scala.reflect.internal.MissingRequirementError: object scala.runtime in compiler mirror not found.
[error] Use 'last' for the full log.
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768M; support was removed in 8.0
```