[Posted]: 2020-03-01 19:49:20
[Problem description]:
I am trying to compute the difference between two timestamp columns. I have tried several of the approaches available in Spark to get the same result, and both Spark SQL and a plain Python function work fine. However, when I register the same logic as a UDF, it starts throwing an error.
Data:
id|end_date|start_date|location
1|2015-10-14 00:00:00|2015-09-14 00:00:00|CA-SF
2|2015-10-15 01:00:20|2015-08-14 00:00:00|CA-SD
3|2015-10-16 02:30:00|2015-01-14 00:00:00|NY-NY
4|2015-10-17 03:00:20|2015-02-14 00:00:00|NY-NY
5|2015-10-18 04:30:00|2014-04-14 00:00:00|CA-SD
Using Spark SQL: works fine!!
data.createOrReplaceTempView("data_tbl")
query = "SELECT id, end_date, start_date, " \
        "datediff(end_date, start_date) AS dtdiff FROM data_tbl"
spark.sql(query).show()
Using a Python function: works fine!!
from pyspark.sql.functions import datediff

def get_diff(x, y):
    # datediff builds a Column expression, so this works without a UDF
    result = datediff(x, y)
    return result

data.withColumn('differ', get_diff('end_date', 'start_date')).show()
Result in both cases:
+---+-------------------+-------------------+--------+------+
| id| end_date| start_date|location|differ|
+---+-------------------+-------------------+--------+------+
| 1|2015-10-14 00:00:00|2015-09-14 00:00:00| CA-SF| 30|
| 2|2015-10-15 01:00:20|2015-08-14 00:00:00| CA-SD| 62|
| 3|2015-10-16 02:30:00|2015-01-14 00:00:00| NY-NY| 275|
| 4|2015-10-17 03:00:20|2015-02-14 00:00:00| NY-NY| 245|
| 5|2015-10-18 04:30:00|2014-04-14 00:00:00| CA-SD| 552|
+---+-------------------+-------------------+--------+------+
Registering the function as a UDF: does not work!!
from pyspark.sql.functions import udf, datediff
get_diff_udf = udf(lambda x, y: datediff(x,y))
data.withColumn('differ',get_diff_udf('end_date','start_date')).show()
Error:
Py4JJavaError: An error occurred while calling o934.showString.
: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 18.0 failed 1 times, most recent failure: Lost task 0.0 in stage 18.0 (TID 18, localhost, executor driver): org.apache.spark.SparkException: Python worker failed to connect back.
[Comments]:
- Spark DataFrame API functions (such as datediff) are not allowed inside a UDF: a UDF body runs as plain Python on the executors and receives plain Python values, not Column expressions.
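Following up on that comment: the plain-function version works only because get_diff returns a Column expression that Spark evaluates at plan time; once wrapped in udf, the same datediff call is executed as ordinary Python against string values and fails. A minimal sketch of a pure-Python replacement that is safe to wrap in a UDF (the helper name date_diff_days and the IntegerType return type are my assumptions, not from the original post):

```python
from datetime import datetime

def date_diff_days(end_s, start_s):
    """Whole-day difference between two 'YYYY-MM-DD HH:MM:SS' strings.

    Pure Python with no Spark Column expressions, so it can run inside
    a UDF on the executors. Like Spark's datediff, the time-of-day part
    is discarded before subtracting.
    """
    fmt = "%Y-%m-%d %H:%M:%S"
    end = datetime.strptime(end_s, fmt)
    start = datetime.strptime(start_s, fmt)
    return (end.date() - start.date()).days

# Registering it as a UDF (assumes the SparkSession `spark` and
# DataFrame `data` from the question):
# from pyspark.sql.functions import udf
# from pyspark.sql.types import IntegerType
# get_diff_udf = udf(date_diff_days, IntegerType())
# data.withColumn('differ', get_diff_udf('end_date', 'start_date')).show()
```

For this particular task the built-in datediff (as in the Spark SQL version above) is still preferable: built-in functions stay inside the JVM and avoid the Python serialization cost that a UDF incurs.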
Tags: apache-spark pyspark apache-spark-sql pyspark-sql