I'm on Ubuntu 20.04, using IDEA with a Maven project to test Spark. There is no Spark cluster yet; eventually I'm required to build one with docker-compose, since I only have a single machine. My advisor sent me this project and told me to test it myself (local-mode testing first, no cluster needed), but I'm a complete beginner: do I create a Maven project, add a Scala source folder, and write the code there? I also don't really know how to configure the pom, and I'm not sure what "local-mode testing" should even cover.
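For the pom, a minimal setup needs the Scala standard library and `spark-core` on the classpath, plus a plugin that compiles Scala sources (Maven only compiles Java by default). The sketch below is one such setup; the versions are assumptions, so match them to whatever Scala/Spark you actually have (Spark 3.x prebuilt distributions are compiled against Scala 2.12, hence the `_2.12` suffix):

```xml
<dependencies>
  <dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.12.15</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.12</artifactId>
    <version>3.2.1</version>
  </dependency>
</dependencies>

<build>
  <plugins>
    <!-- scala-maven-plugin picks up sources under src/main/scala -->
    <plugin>
      <groupId>net.alchim31.maven</groupId>
      <artifactId>scala-maven-plugin</artifactId>
      <version>4.5.6</version>
      <executions>
        <execution>
          <goals>
            <goal>compile</goal>
            <goal>testCompile</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
```

With this in place, "local-mode testing" just means running the `main` method from IDEA: `setMaster("local[*]")` starts Spark inside the JVM, so no cluster or docker-compose is needed yet.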
import org.apache.spark.rdd.RDD
import org.apache.spark.{SparkConf, SparkContext}

object ScalaWordCount {
  def main(args: Array[String]): Unit = {
    val list = List(
      "An old woman had a cat",
      "The cat was very old she could not run quickly and she could not bite because she was so old",
      "'Do not be unkind to the old but remember what good work the old did when they were young")

    // local[*] runs Spark inside this JVM on all available cores -- no cluster needed
    val conf = new SparkConf().setAppName("Wordcount").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // Parallelize the in-memory list into an RDD, split lines into words,
    // pair each word with 1, then sum the 1s per word
    val lines: RDD[String] = sc.parallelize(list)
    val words: RDD[String] = lines.flatMap(_.split(" "))
    val wordAndOne: RDD[(String, Int)] = words.map((_, 1))
    val wordAndNum: RDD[(String, Int)] = wordAndOne.reduceByKey(_ + _)

    // Sort by count, descending, and collect the result back to the driver
    val ret = wordAndNum.sortBy(_._2, ascending = false)
    println(ret.collect().mkString(","))
    //ret.saveAsTextFile(args(0))

    sc.stop()
  }
}
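Once the local run works, the later cluster requirement can still be satisfied on one machine with docker-compose: one master container and one (or more) worker containers. A sketch assuming the Bitnami Spark image (the image name, tag, ports, and single worker are assumptions; pick the tag matching your Spark dependency):

```yaml
version: "3"
services:
  spark-master:
    image: bitnami/spark:3.2.1
    environment:
      - SPARK_MODE=master
    ports:
      - "8080:8080"   # master web UI
      - "7077:7077"   # spark:// port that jobs submit to
  spark-worker:
    image: bitnami/spark:3.2.1
    environment:
      - SPARK_MODE=worker
      - SPARK_MASTER_URL=spark://spark-master:7077
    depends_on:
      - spark-master
```

To point the job at this mini-cluster you would change `setMaster("local[*]")` to the master URL (e.g. `spark://localhost:7077`), or leave the code as-is and pass `--master` to `spark-submit`; until then, `local[*]` is all the single-machine test needs.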