【Posted】: 2016-05-24 06:33:00
【Question】:
I am trying to save my DataFrame to S3 like this:
myDF.write.format("com.databricks.spark.csv").options(codec="org.apache.hadoop.io.compress.GzipCodec").save("s3n://myPath/myData.csv")
and then I get this error:
<console>:132: error: overloaded method value options with alternatives:
(options: java.util.Map[String,String])org.apache.spark.sql.DataFrameWriter <and>
(options: scala.collection.Map[String,String])org.apache.spark.sql.DataFrameWriter
cannot be applied to (codec: String)
Does anyone know what I am missing? Thanks!
【Discussion】:
Tags: scala apache-spark dataframe
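The error message itself points at the cause: in Scala, `DataFrameWriter.options` takes a `Map[String,String]`, so the Python-style named argument `options(codec = ...)` does not compile. A minimal sketch of the Map-based call (assuming `myDF` is an existing DataFrame and a spark-csv version that supports the `codec` option):

```scala
// options() expects a Map in Scala, not named arguments.
myDF.write
  .format("com.databricks.spark.csv")
  .options(Map("codec" -> "org.apache.hadoop.io.compress.GzipCodec"))
  .save("s3n://myPath/myData.csv")

// Equivalently, set a single key with option():
myDF.write
  .format("com.databricks.spark.csv")
  .option("codec", "org.apache.hadoop.io.compress.GzipCodec")
  .save("s3n://myPath/myData.csv")
```

Either form passes the codec through to the data source; the single-entry `option("key", "value")` method is usually the more idiomatic choice when only one setting is needed.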