【Title】: Using Spark JDBC and Avatica to read records from a table in Apache Druid
【Posted】: 2021-01-31 14:28:36
【Question】:

I am trying to create a DataFrame in Spark that will contain all the records from a table in Apache Druid, and I am using JDBC to do this. Druid appears to use the Calcite Avatica JDBC driver (mentioned here).

    df = spark.read.format('jdbc') \
        .option('url', 'jdbc:avatica:remote:url=http://172.31.5.20:8082/druid/v2/sql/avatica/') \
        .option('driver', 'org.apache.calcite.avatica.remote.Driver') \
        .option('dbtable', 'mytable') \
        .load()

But I get the following error:

Py4JJavaError: An error occurred while calling o456.load.
: java.sql.SQLException: While closing connection
    at org.apache.calcite.avatica.Helper.createException(Helper.java:39)
    at org.apache.calcite.avatica.AvaticaConnection.close(AvaticaConnection.java:156)
    at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:70)
    at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation.<init>(JDBCRelation.scala:115)
    at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:52)
    at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:341)
    at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:239)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:227)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:164)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
    at py4j.Gateway.invoke(Gateway.java:282)
    at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
    at py4j.commands.CallCommand.execute(CallCommand.java:79)
    at py4j.GatewayConnection.run(GatewayConnection.java:238)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.RuntimeException: com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException: Unrecognized field "rpcMetadata" (class org.apache.calcite.avatica.remote.Service$CloseConnectionResponse), not marked as ignorable (0 known properties: ])
 at [Source: (String)"{"response":"closeConnection","rpcMetadata":{"response":"rpcMetadata","serverAddress":"ip-172-31-5-234.ap-southeast-1.compute.internal:8082"}}
"; line: 1, column: 46] (through reference chain: org.apache.calcite.avatica.remote.Service$CloseConnectionResponse["rpcMetadata"])
    at org.apache.calcite.avatica.remote.JsonService.handle(JsonService.java:142)
    at org.apache.calcite.avatica.remote.JsonService.apply(JsonService.java:229)
    at org.apache.calcite.avatica.remote.RemoteMeta.closeConnection(RemoteMeta.java:78)
    at org.apache.calcite.avatica.AvaticaConnection.close(AvaticaConnection.java:153)
    ... 18 more
Caused by: com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException: Unrecognized field "rpcMetadata" (class org.apache.calcite.avatica.remote.Service$CloseConnectionResponse), not marked as ignorable (0 known properties: ])
 at [Source: (String)"{"response":"closeConnection","rpcMetadata":{"response":"rpcMetadata","serverAddress":"ip-172-31-5-234.ap-southeast-1.compute.internal:8082"}}
"; line: 1, column: 46] (through reference chain: org.apache.calcite.avatica.remote.Service$CloseConnectionResponse["rpcMetadata"])
    at com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException.from(UnrecognizedPropertyException.java:61)
    at com.fasterxml.jackson.databind.DeserializationContext.handleUnknownProperty(DeserializationContext.java:823)
    at com.fasterxml.jackson.databind.deser.std.StdDeserializer.handleUnknownProperty(StdDeserializer.java:1153)
    at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.handleUnknownProperty(BeanDeserializerBase.java:1589)
    at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.handleUnknownVanilla(BeanDeserializerBase.java:1567)
    at com.fasterxml.jackson.databind.deser.BeanDeserializer.vanillaDeserialize(BeanDeserializer.java:294)
    at com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeOther(BeanDeserializer.java:189)
    at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:161)
    at com.fasterxml.jackson.databind.jsontype.impl.AsPropertyTypeDeserializer._deserializeTypedForId(AsPropertyTypeDeserializer.java:130)
    at com.fasterxml.jackson.databind.jsontype.impl.AsPropertyTypeDeserializer.deserializeTypedFromObject(AsPropertyTypeDeserializer.java:97)
    at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeWithType(BeanDeserializerBase.java:1178)
    at com.fasterxml.jackson.databind.deser.impl.TypeWrappedDeserializer.deserialize(TypeWrappedDeserializer.java:68)
    at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:4014)
    at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:3005)
    at org.apache.calcite.avatica.remote.JsonService.decode(JsonService.java:131)
    at org.apache.calcite.avatica.remote.JsonService.apply(JsonService.java:227)
    ... 20 more

Does anyone know what could be causing this, and how I can fix it? It appears to be an issue with the Avatica driver receiving a JSON object that contains an unrecognized field.

I am using the driver org.apache.calcite.avatica:avatica-core:1.17.0, with the jar file added to my spark.jars property. I am running Druid 0.19.0 and Spark 2.

Edit: I checked the source code of the Avatica JDBC framework, and the constructor annotated with @JsonCreator does expect a property named rpcMetadata in the JSON object being deserialized. The source code is here.
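A quick local check (a sketch using only the Python standard library; the JSON string is copied verbatim from the stack trace above) shows that the server's response does contain the rpcMetadata property, which points to the failure being on the client's deserialization side, for example a Jackson version conflict on the Spark classpath, rather than a malformed response:

```python
import json

# Raw response body copied verbatim from the stack trace above.
raw = ('{"response":"closeConnection","rpcMetadata":{"response":"rpcMetadata",'
       '"serverAddress":"ip-172-31-5-234.ap-southeast-1.compute.internal:8082"}}')

resp = json.loads(raw)
# The field Jackson reports as "unrecognized" is present and well-formed,
# so the payload itself is fine; note the error also says the target class
# CloseConnectionResponse has "0 known properties", i.e. the client-side
# deserializer sees none of the fields it was compiled with.
print("rpcMetadata" in resp)
print(resp["rpcMetadata"]["serverAddress"])
```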

【Comments】:

    Tags: apache-spark pyspark druid apache-calcite


    【Solution 1】:

    Use the library below and retry.

    "org.apache.calcite.avatica" % "avatica" % "1.8.0"

    【Discussion】:

    • Thanks for the suggestion, but that did not help. I get exactly the same error message.
    • OK, I created a build.sbt and verified that it works in Scala. How are you importing this library in Python?
    • I downloaded the jar from Maven Central and added it to the spark.jars configuration parameter. I then ran that line in a Zeppelin notebook and got the error. I looked at the source code and saw that the @JsonCreator constructor expects a field named rpcMetadata, which is why I don't understand why this error is thrown.
    • @thisisshantzz I'm facing the same problem; any suggestions here would be a big help. Thanks in advance.