Posted: 2022-01-19 08:14:04
Problem description:
I have two queries with multiple CASE statements, and I need to implement the same logic in PySpark. I have tried, but I am running into difficulty in several places. Any help would be appreciated.
First query:
case
    when appointment_date is null then 0
    when resolution_desc in ('CSTXCL - OK BY PHONE')
         or resolution_desc ilike '%NO VAN ROLL%' then 0
    when status in ('PENDING', 'CANCELLED') then 0
    when ticket_type = 'install'
         and appointment_required is true then 1
end as truck_roll
Second query:
case when status = 'COMPLETED'
      and resolution not in ('CANCELLING ORDER', 'CANCEL ORDER')
     then 1 else 0 end as completed,
case when status = 'CANCELLED'
      or (status in ('COMPLETED', 'PENDING')
          and resolution_desc in ('CANCELLING ORDER', 'CANCEL ORDER'))
     then 1 else 0 end as cancelled
I tried the following code for the second query, but it doesn't work:
sparkdf.withColumn(
    'completed',
    f.when(
        (sparkdf.ticket_status == 'COMPLETED')
        & (~sparkdf.resolution_description.isin('CANCELLING ORDER', 'CANCEL ORDER', 'CLOSE SRO')),
        1,
    ).otherwise(0),
).withColumn(
    'cancelled',
    f.when(
        (sparkdf.ticket_status == 'CANCELLED')
        | (sparkdf.ticket_status.isin('COMPLETED', 'PENDING'))
        & (sparkdf.resolution_description.isin('CANCELLING ORDER', 'CANCEL ORDER', 'CLOSE SRO')),
        1,
    ).otherwise(0),
)
Discussion:
-
It may be easier to register the DataFrame as a SQL view and run the same, slightly adjusted SQL against that view. Register it with:
df.createOrReplaceTempView('df_view'), then query it with: df = spark.sql('''your_sql_query from df_view''')
Tags: sql database apache-spark pyspark apache-spark-sql