【Posted】: 2017-01-28 15:57:24
【Problem Description】:
I am getting an error when calling aggregateByKey in the spark-scala-shell.
The code I am trying to execute in the Scala shell is the following:
val orderItemsMapJoinOrdersMapMapAgg = orderItemsMapJoinOrdersMapMap
  .aggregateByKey(0.0, 0)(
    (a, b) => (a._1 + b, a._2 + 1),
    (a, b) => (a._1 + b._1, a._2 + b._2)
  )
But I get the following error:
<console>:39: error: value _1 is not a member of Double
val orderItemsMapJoinOrdersMapMapAgg = orderItemsMapJoinOrdersMapMap.aggregateByKey( 0.0,0)( (a,b) => (a._1 + b , a._2 +1), (a,b) => (a._1 + b._1 , a._2 + b._2 ))
For reference, inspecting the input RDD in the shell shows its type:
scala> orderItemsMapJoinOrdersMapMap
res8: org.apache.spark.rdd.RDD[(String, Float)] = MapPartitionsRDD[16] at map at <console>:37
Could someone help me understand the Double vs. Float logic here and how to fix the error?
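For what it's worth, the error is consistent with aggregateByKey(0.0, 0) resolving to Spark's (zeroValue, numPartitions) overload, so the accumulator is a bare Double with no _1 member rather than a (Double, Int) tuple. Below is a minimal sketch of one likely fix, wrapping the zero value in its own parentheses; the sample data, app name, and local master are illustrative assumptions, not from the original post:

import org.apache.spark.{SparkConf, SparkContext}

// In spark-shell `sc` already exists; this setup is only for a standalone run.
val conf = new SparkConf().setAppName("AggregateByKeySketch").setMaster("local[*]")
val sc = new SparkContext(conf)

// Illustrative sample data shaped like the question's RDD[(String, Float)].
val orderItemsMapJoinOrdersMapMap = sc.parallelize(Seq(
  ("2013-07-25", 299.98f),
  ("2013-07-25", 199.99f),
  ("2013-07-26", 129.99f)
))

// The zero value must be a single (Double, Int) tuple, hence the extra
// parentheses; aggregateByKey(0.0, 0) instead picks the
// (zeroValue, numPartitions) overload, making the accumulator a bare Double.
val orderItemsMapJoinOrdersMapMapAgg = orderItemsMapJoinOrdersMapMap
  .aggregateByKey((0.0, 0))(
    (acc, price) => (acc._1 + price, acc._2 + 1), // seqOp: the Float widens to Double
    (a, b) => (a._1 + b._1, a._2 + b._2)          // combOp: merge partition accumulators
  )

orderItemsMapJoinOrdersMapMapAgg.collect().foreach(println)

With the accumulator as a (sum, count) pair, a per-key average would then be one mapValues away.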
Tags: scala apache-spark