【Posted】: 2021-03-02 16:38:46
【Problem description】:
I have a pyspark dataframe with a column parsed_date (dtype: date) and a column id (dtype: bigint), which looks like this:
+-------+-----------+
|     id|parsed_date|
+-------+-----------+
|1471783| 2017-12-18|
|1471885| 2017-12-18|
|1472928| 2017-12-19|
|1476917| 2017-12-19|
|1477469| 2017-12-21|
|1478190| 2017-12-21|
|1478570| 2017-12-19|
|1481415| 2017-12-21|
|1472592| 2017-12-20|
|1474023| 2017-12-22|
|1474029| 2017-12-22|
+-------+-----------+
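For reproducibility, a minimal sketch that rebuilds this sample dataframe locally (the local SparkSession setup is an assumption for illustration, not part of my real job):

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.master("local[*]").getOrCreate()

# sample rows copied from the table above
rows = [
    (1471783, "2017-12-18"), (1471885, "2017-12-18"),
    (1472928, "2017-12-19"), (1476917, "2017-12-19"),
    (1477469, "2017-12-21"), (1478190, "2017-12-21"),
    (1478570, "2017-12-19"), (1481415, "2017-12-21"),
    (1472592, "2017-12-20"), (1474023, "2017-12-22"),
    (1474029, "2017-12-22"),
]

# Python ints become bigint; to_date turns the strings into the date dtype
df = (spark.createDataFrame(rows, ["id", "parsed_date"])
           .withColumn("parsed_date", F.to_date("parsed_date")))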
I have the function shown below. The idea is to pass a date (day) and t (a number of days). In df1, ids should be counted within the range (day - t, day); in df2, ids should be counted within the range (day, day + t).
from pyspark.sql import functions as F, Window

def hypo_1(df, day, t):
    # count ids per day in the window before `day`
    df1 = (df.filter(f"parsed_date between '{day}' - interval {t} days and '{day}'")
             .withColumn('count_before', F.count('id').over(Window.partitionBy('parsed_date')))
             .orderBy('parsed_date'))
    # count ids per day in the window after `day`
    df2 = (df.filter(f"parsed_date between '{day}' and '{day}' + interval {t} days")
             .withColumn('count_after', F.count('id').over(Window.partitionBy('parsed_date')))
             .orderBy('parsed_date'))
    return [df1, df2]
With this code, the function returns two dataframes:
Example: hypo_1(df, '2017-12-20', 2)
df1
+-----------+-------+------------+
|parsed_date|     id|count_before|
+-----------+-------+------------+
| 2017-12-20|1471783|           1|
+-----------+-------+------------+
df2
+-----------+-------+-----------+
|parsed_date|     id|count_after|
+-----------+-------+-----------+
| 2017-12-20|1472592|          1|
| 2017-12-21|1477469|          3|
| 2017-12-22|1474029|          2|
+-----------+-------+-----------+
Problems:
- The date interval applied to df1 does not look right.
- Ids on the date I pass in (2017-12-20) should not be counted at all, but they are counted in both df1 and df2 ->
  +-----------+-------+-----------+
  |parsed_date|     id|count_after|
  +-----------+-------+-----------+
  | 2017-12-20|1472592|          1|
  +-----------+-------+-----------+
Expected output:
Example: hypo_1(df, '2017-12-20', 2)
df1:
+-------+-----------+------------+
|     id|parsed_date|count_before|
+-------+-----------+------------+
|1471783| 2017-12-18|           2|
|1478570| 2017-12-19|           3|
+-------+-----------+------------+
df2:
+-------+-----------+------------+
|     id|parsed_date| count_after|
+-------+-----------+------------+
|1477469| 2017-12-21|           3|
|1474023| 2017-12-22|           2|
+-------+-----------+------------+
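For reference, an untested sketch of the filters I think would give this expected output, using Spark SQL's date_sub/date_add (note that the expected df1 keeps day - t itself, 2017-12-18, so the lower bound appears to be inclusive; hypo_1_fixed is just a placeholder name):

from pyspark.sql import functions as F, Window

def hypo_1_fixed(df, day, t):
    # [day - t, day): up to t days back, excluding `day` itself
    df1 = (df.filter(f"parsed_date >= date_sub('{day}', {t}) and parsed_date < '{day}'")
             .withColumn('count_before', F.count('id').over(Window.partitionBy('parsed_date')))
             .orderBy('parsed_date'))
    # (day, day + t]: up to t days forward, excluding `day` itself
    df2 = (df.filter(f"parsed_date > '{day}' and parsed_date <= date_add('{day}', {t})")
             .withColumn('count_after', F.count('id').over(Window.partitionBy('parsed_date')))
             .orderBy('parsed_date'))
    return [df1, df2]

This still returns one row per id; the expected tables above show a single row per date, so something like .dropDuplicates(['parsed_date']) would be needed on top if that single-row-per-date shape is really what I want.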
Please help.
【Discussion】:
Tags: python apache-spark date pyspark apache-spark-sql