【Posted】: 2018-02-01 18:49:51
【Question】:
I'm trying to use the DataFrame API to return the rows that fall between two timestamps.
Sample code:
val df = Seq(
  ("red", "2016-11-29 07:10:10.234"),
  ("green", "2016-11-29 07:10:10.234"),
  ("blue", "2016-11-29 07:10:10.234")).toDF("color", "date")

df.where(unix_timestamp($"date", "yyyy-MM-dd HH:mm:ss.S")
  .cast("timestamp")
  .between(LocalDateTime.now(), LocalDateTime.now().minusHours(1)))
  .show()
But it throws an `Unsupported literal type class java.time.LocalDateTime` error:
Exception in thread "main" java.lang.RuntimeException: Unsupported literal type class java.time.LocalDateTime 2016-11-29T07:32:12.084
at org.apache.spark.sql.catalyst.expressions.Literal$.apply(literals.scala:57)
at org.apache.spark.sql.functions$.lit(functions.scala:101)
at org.apache.spark.sql.Column.$greater$eq(Column.scala:438)
at org.apache.spark.sql.Column.between(Column.scala:542)
at com.sankar.SparkSQLTimestampDifference$.delayedEndpoint$com$sankar$SparkSQLTimestampDifference$1(SparkSQLTimestampDifference.scala:23)
at com.sankar.SparkSQLTimestampDifference$delayedInit$body.apply(SparkSQLTimestampDifference.scala:7)
at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.collection.immutable.List.foreach(List.scala:381)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
at scala.App$class.main(App.scala:76)
at com.sankar.SparkSQLTimestampDifference$.main(SparkSQLTimestampDifference.scala:7)
at com.sankar.SparkSQLTimestampDifference.main(SparkSQLTimestampDifference.scala)
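The trace shows the failure happens when `between` tries to turn its bounds into Catalyst literals (`Literal$.apply` via `functions.lit`): Spark 2.x can encode `java.sql.Timestamp` as a literal, but not `java.time.LocalDateTime`. One possible fix (a sketch, not from the original post) is to convert the bounds with `Timestamp.valueOf` before calling `between`; note also that `between(lowerBound, upperBound)` expects the lower bound first, whereas the original call passes `now()` before `now().minusHours(1)`:

```scala
import java.sql.Timestamp
import java.time.LocalDateTime

import org.apache.spark.sql.functions.unix_timestamp

// Convert the java.time values into java.sql.Timestamp, which Spark's
// Literal does support; pass the lower bound first.
val lower = Timestamp.valueOf(LocalDateTime.now().minusHours(1))
val upper = Timestamp.valueOf(LocalDateTime.now())

df.where(
    unix_timestamp($"date", "yyyy-MM-dd HH:mm:ss.S")
      .cast("timestamp")
      .between(lower, upper))
  .show()
```

With the sample data above (dates from 2016) this filter would still return no rows, since those timestamps fall outside the last hour; the point of the sketch is only how to pass bound values that Spark accepts.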
【Discussion】:
Tags: apache-spark apache-spark-sql spark-dataframe