[Posted]: 2014-07-19 13:54:13
[Problem description]:
I'm trying to run the standalone Scala app from http://spark.apache.org/docs/latest/quick-start.html#a-standalone-app-in-scala from source.
This line:
val wordCounts = textFile.flatMap(line => line.split(" ")).map(word => (word, 1)).reduceByKey((a, b) => a + b)
is throwing the error:
value reduceByKey is not a member of org.apache.spark.rdd.RDD[(String, Int)]
val wordCounts = logData.flatMap(line => line.split(" ")).map(word => (word, 1)).reduceByKey((a, b) => a + b)
logData.flatMap(line => line.split(" ")).map(word => (word, 1)) returns a MappedRDD, but I can't find that type in http://spark.apache.org/docs/0.9.1/api/core/index.html#org.apache.spark.rdd.RDD
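For reference, the types in the pipeline can be checked without Spark at all: the same flatMap/map stages on a plain Scala collection produce `(String, Int)` pairs, and the effect of `reduceByKey` can be mimicked with `groupBy` plus a sum. This is a minimal sketch using hypothetical sample data, not the Spark code itself:

```scala
// Word count on plain Scala collections, mirroring the RDD pipeline's types.
// flatMap + map yield a Seq[(String, Int)]; the analogue of reduceByKey
// here is groupBy on the key followed by summing the values.
object WordCountSketch extends App {
  val logData = Seq("a b a", "b c") // hypothetical sample lines
  val pairs: Seq[(String, Int)] =
    logData.flatMap(line => line.split(" ")).map(word => (word, 1))
  val wordCounts: Map[String, Int] =
    pairs.groupBy(_._1).map { case (w, ps) => (w, ps.map(_._2).sum) }
  println(wordCounts) // counts per word, e.g. a -> 2, b -> 2, c -> 1
}
```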
I'm running this code from the Spark source tree, so could it be a classpath problem? The required dependencies are on my classpath, though.
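One likely cause, assuming Spark 0.9.x: `reduceByKey` is not defined on `RDD` itself but on `PairRDDFunctions`, which is made available on an `RDD[(K, V)]` through implicit conversions defined on the `SparkContext` companion object. Without the import below, the compiler cannot resolve the method. A minimal sketch of the standalone app with the import added (file path and app name are placeholders; this needs a Spark installation to run):

```scala
// The key line: brings rddToPairRDDFunctions (and friends) into scope,
// so RDD[(String, Int)] gains reduceByKey via PairRDDFunctions.
import org.apache.spark.SparkContext._
import org.apache.spark.SparkContext

object SimpleApp {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext("local", "Simple App")
    val logData = sc.textFile("README.md") // placeholder input file
    val wordCounts = logData
      .flatMap(line => line.split(" "))
      .map(word => (word, 1))
      .reduceByKey((a, b) => a + b) // now resolves through the implicit conversion
    wordCounts.collect().foreach(println)
    sc.stop()
  }
}
```

(In later Spark versions the conversion was moved into the `RDD` companion object, so the extra import is no longer needed there.)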
[Discussion]:
Tags: scala apache-spark rdd