【Question Title】: PySpark: cast "string-integer" column to IntegerType
【Posted】: 2023-03-06 16:55:01
【Question】:

I have a column whose contents are datetime.datetime objects. I'm trying to use the pyspark.sql.Window functionality, which requires a numeric type, not datetime or string. So my plan is to convert the datetime.datetime objects to UNIX timestamps:

Setup:

>>> import datetime; df = sqlContext.createDataFrame(
... [(datetime.datetime(2018, 1, 17, 19, 0, 15),),
... (datetime.datetime(2018, 1, 17, 19, 0, 16),)], ['dt'])
>>> df
DataFrame[dt: timestamp]
>>> df.dtypes
[('dt', 'timestamp')]
>>> df.show(5, False)
+---------------------+
|dt                   |
+---------------------+
|2018-01-17 19:00:15.0|
|2018-01-17 19:00:16.0|
+---------------------+

Define a function that calls the timestamp() method of the datetime.datetime object:

import pyspark.sql.functions as func  # provides func.udf below

def dt_to_timestamp():
    def _dt_to_timestamp(dt):
        # Milliseconds since the UNIX epoch
        return int(dt.timestamp() * 1000)
    return func.udf(_dt_to_timestamp)

Apply the function:

>>> df = df.withColumn('dt_ts', dt_to_timestamp()(func.col('dt')))
>>> df.show(5, False)
+---------------------+-------------+
|dt                   |dt_ts        |
+---------------------+-------------+
|2018-01-17 19:00:15.0|1516237215000|
|2018-01-17 19:00:16.0|1516237216000|
+---------------------+-------------+

>>> df.dtypes
[('dt', 'timestamp'), ('dt_ts', 'string')]

I'm not sure why this column defaults to string when the inner _dt_to_timestamp function returns an int, but let's try casting these "string-integers" to IntegerType:

>>> df = df.withColumn('dt_ts', func.col('dt_ts').cast(IntegerType()))
>>> df.show(5, False)
+---------------------+-----+
|dt                   |dt_ts|
+---------------------+-----+
|2018-01-17 19:00:15.0|null |
|2018-01-17 19:00:16.0|null |
+---------------------+-----+

>>> df.dtypes
[('dt', 'timestamp'), ('dt_ts', 'int')]

This seems to be a problem specific to the IntegerType coercion. The cast works fine for DoubleType, but I'd prefer integers...

>>> df = df.withColumn('dt_ts', dt_to_timestamp()(func.col('dt')))
>>> df = df.withColumn('dt_ts', func.col('dt_ts').cast(DoubleType()))
>>> df.show(5, False)
+---------------------+--------------+
|dt                   |dt_ts         |
+---------------------+--------------+
|2018-01-17 19:00:15.0|1.516237215E12|
|2018-01-17 19:00:16.0|1.516237216E12|
+---------------------+--------------+
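A side note on why the DoubleType cast succeeds: an IEEE-754 double represents every integer with magnitude up to 2**53 exactly, and epoch milliseconds are far below that bound. A quick Spark-independent check with the values above:

```python
# Why casting to DoubleType works: doubles are exact for integers below 2**53,
# and epoch-millisecond values are far smaller than that.
ts_ms = 1516237216000
print(float(ts_ms) == ts_ms)   # True: no precision loss at this magnitude
print(ts_ms < 2**53)           # True
```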

【Question Comments】:

    Tags: python datetime types pyspark


    【Solution 1】:

    This is because IntegerType cannot store numbers as large as the ones you are trying to cast. Use the bigint/long type instead:

    >>> df = df.withColumn('dt_ts', dt_to_timestamp()(func.col('dt')))
    >>> df.show()
    +--------------------+-------------+
    |                  dt|        dt_ts|
    +--------------------+-------------+
    |2018-01-17 19:00:...|1516237215000|
    |2018-01-17 19:00:...|1516237216000|
    +--------------------+-------------+
    
    >>> df = df.withColumn('dt_ts', func.col('dt_ts').cast('long'))
    >>> df.show()
    +--------------------+-------------+
    |                  dt|        dt_ts|
    +--------------------+-------------+
    |2018-01-17 19:00:...|1516237215000|
    |2018-01-17 19:00:...|1516237216000|
    +--------------------+-------------+
    
    >>> df.dtypes
    [('dt', 'timestamp'), ('dt_ts', 'bigint')]
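    The overflow is easy to verify in plain Python: Spark's IntegerType is a 32-bit signed integer, and the epoch-millisecond values are far beyond its range, which is why the cast silently produced null.

    ```python
    # Spark's IntegerType is a 32-bit signed integer; epoch milliseconds
    # overflow it, which is why the IntegerType cast produced null.
    INT32_MAX = 2**31 - 1        # 2147483647, the largest IntegerType value
    LONG_MAX = 2**63 - 1         # the largest LongType (bigint) value
    ts_ms = 1516237215000        # 2018-01-17 19:00:15 UTC in epoch milliseconds

    print(ts_ms > INT32_MAX)     # True: does not fit in IntegerType
    print(ts_ms <= LONG_MAX)     # True: fits comfortably in LongType
    ```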
    
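    As for why the UDF column came back as string: pyspark.sql.functions.udf defaults to StringType when no return type is given, so declaring LongType up front yields a bigint column directly and skips the string detour. A minimal sketch (assuming func is pyspark.sql.functions, as in the question):

    ```python
    from pyspark.sql import functions as func
    from pyspark.sql.types import LongType

    def _dt_to_timestamp(dt):
        # Milliseconds since the UNIX epoch -- too large for a 32-bit int.
        return int(dt.timestamp() * 1000)

    # Passing LongType() overrides udf's default StringType return type,
    # so the resulting column is bigint rather than string.
    dt_to_timestamp_long = func.udf(_dt_to_timestamp, LongType())
    ```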

    【Comments】:
