[Posted]: 2021-12-08 23:52:25
[Problem description]:
I tried this in Java:
Dataset<Row> df1 = spark.read()
.format("avro")
.load("mysource_path");
Row[] rows = df1.selectExpr("sum(distance) as total").take(1);
The compiler reports:
error: incompatible types: Object cannot be converted to Row[]
Row[] rows = df1.selectExpr("sum(distance) as total").take(1);
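The likely cause: in Scala, `Dataset<T>.take(n)` returns `T[]`, but generic arrays erase to `Object` in the Java view of the API, so Java needs an explicit cast (`(Row[]) df1.take(1)`) or a List-returning method such as `Dataset.takeAsList(n)`. Below is a minimal plain-Java sketch of the same erasure problem; `FakeDataset` is a hypothetical stand-in, not the Spark class:

```java
import java.util.Arrays;
import java.util.List;

// Hypothetical stand-in that mirrors the erased signatures Java sees
// for Spark's Dataset: take(n) comes through as returning Object.
class FakeDataset<T> {
    private final List<T> rows;
    FakeDataset(List<T> rows) { this.rows = rows; }

    // Mirrors the erased Java view of Dataset.take(n): returns Object.
    Object take(int n) { return rows.subList(0, n).toArray(); }

    // Mirrors Dataset.takeAsList(n), which keeps its element type.
    List<T> takeAsList(int n) { return rows.subList(0, n); }
}

public class TakeDemo {
    public static void main(String[] args) {
        FakeDataset<String> df = new FakeDataset<>(Arrays.asList("a", "b"));
        // Object[] first = df.take(1);           // same "incompatible types" error
        Object[] first = (Object[]) df.take(1);   // compiles with an explicit cast
        List<String> firstAsList = df.takeAsList(1); // no cast needed
        System.out.println(first.length + " " + firstAsList.get(0));
    }
}
```

With the real Spark API the same two options should apply: cast the result (`Row[] rows = (Row[]) df1.selectExpr("sum(distance) as total").take(1);`) or use `takeAsList(1)` to get a `List<Row>` without a cast.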
[Discussion]:
Tags: java apache-spark apache-spark-sql