[Posted]: 2019-01-12 05:29:35
[Question]:
When I try to use subtract in Spark Scala, I get the following error:
<console>:29: error: value subtract is not a member of org.apache.spark.sql.DataFrame
But from the links below I can see that it exists in Python:
https://forums.databricks.com/questions/7505/comparing-two-dataframes.html https://spark.apache.org/docs/1.3.0/api/python/pyspark.sql.html?highlight=dataframe#pyspark.sql.DataFrame.subtract
Do we have subtract in Spark Scala? If not, what is its alternative?
My sample code looks like this:
scala> val myDf1 = sc.parallelize(Seq(1,2,2)).toDF
myDf1: org.apache.spark.sql.DataFrame = [value: int]
scala> val myDf2 = sc.parallelize(Seq(1,2)).toDF
myDf2: org.apache.spark.sql.DataFrame = [value: int]
scala> val result = myDf1.subtract(myDf2)
<console>:28: error: value subtract is not a member of org.apache.spark.sql.DataFrame
val result = myDf1.subtract(myDf2)
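For reference, in the Scala/Java Dataset API the set-difference operation is exposed as `except` (equivalent to SQL's EXCEPT DISTINCT), while `subtract` is the name used on RDDs and in PySpark. A minimal sketch of both approaches, assuming a local `SparkSession`:

```scala
import org.apache.spark.sql.SparkSession

object SubtractDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("subtract-demo")
      .getOrCreate()
    import spark.implicits._

    val myDf1 = Seq(1, 2, 2).toDF("value")
    val myDf2 = Seq(1, 2).toDF("value")

    // DataFrame/Dataset equivalent of PySpark's subtract:
    // rows of myDf1 that do not appear in myDf2 (duplicates removed).
    // Here both 1 and 2 occur in myDf2, so the result is empty.
    val result = myDf1.except(myDf2)
    result.show()

    // At the RDD level, subtract does exist:
    val rddResult = myDf1.rdd.subtract(myDf2.rdd)

    spark.stop()
  }
}
```

Note that `except` is distinct-based; if duplicate rows must be preserved in the difference, newer Spark versions (2.4+) also provide `exceptAll`.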
[Comments]:
-
Which version of Spark are you using?
Tags: scala apache-spark apache-spark-sql