[Question Title]: Kafka Connect BigQuery Sink Connector requests non-existing key-subject names from the Schema Registry
[Posted]: 2021-07-26 23:13:03
[Question]:

This is a follow-up to this question: Kafka Connect BigQuery Sink Connector requests incorrect subject names from the Schema Registry

When trying to use confluentinc/kafka-connect-bigquery with our Kafka (Avro) events, I run into the following error:

org.apache.kafka.connect.errors.DataException: Failed to deserialize data for topic domain.rating.annotated to Avro: 
    at io.confluent.connect.avro.AvroConverter.toConnectData(AvroConverter.java:125)
[...]
Caused by: org.apache.kafka.common.errors.SerializationException: Error retrieving Avro key schema version for id 619
Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Subject 'domain.rating.annotated-com.acme.message_schema.type.domain.key.DefaultKey' not found.; error code: 40401
    at io.confluent.kafka.schemaregistry.client.rest.RestService.sendHttpRequest(RestService.java:295)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.httpRequest(RestService.java:355)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.lookUpSubjectVersion(RestService.java:440)
[...]

The subjects that do exist are:

curl --silent -X GET http://avro-schema-registry.core-kafka.svc.cluster.local:8081/subjects | jq .
[...]
  "domain.rating.annotated-com.acme.message_schema.domain.rating.annotated.Key",
  "domain.rating.annotated-com.acme.message_schema.domain.rating.annotated.RatingTranslated",
[...]

Why is it looking for ...DefaultKey, and how can I make it do the right thing?
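For context, here is a minimal sketch of how the three Confluent subject name strategies derive the subject for a schema lookup. The strategy class names in the config are real; the Python below merely mirrors their documented behavior, with the topic and record name taken from the error message above:

```python
# Sketch of Confluent subject name strategies (the actual logic lives in
# io.confluent.kafka.serializers.subject.*; this only mirrors the documented rules).

def topic_name_strategy(topic, record_fullname, is_key):
    # TopicNameStrategy (the default): "<topic>-key" / "<topic>-value"
    return f"{topic}-{'key' if is_key else 'value'}"

def record_name_strategy(topic, record_fullname, is_key):
    # RecordNameStrategy: just the record's fully-qualified name
    return record_fullname

def topic_record_name_strategy(topic, record_fullname, is_key):
    # TopicRecordNameStrategy: "<topic>-<record fully-qualified name>"
    return f"{topic}-{record_fullname}"

topic = "domain.rating.annotated"
key_record = "com.acme.message_schema.type.domain.key.DefaultKey"

print(topic_record_name_strategy(topic, key_record, is_key=True))
# -> domain.rating.annotated-com.acme.message_schema.type.domain.key.DefaultKey
```

This reproduces exactly the subject from the 40401 error: with TopicRecordNameStrategy configured for the key converter, the connector asks for the topic name joined to the key record's full name, which was never registered under that subject.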

My properties/standalone.properties (I am using the quickstart folder) looks as follows:

bootstrap.servers=kafka.core-kafka.svc.cluster.local:9092
key.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=http://avro-schema-registry.core-kafka.svc.cluster.local:8081
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://avro-schema-registry.core-kafka.svc.cluster.local:8081
internal.key.converter=org.apache.kafka.connect.json.JsonConverter
internal.value.converter=org.apache.kafka.connect.json.JsonConverter
internal.key.converter.schemas.enable=false
internal.value.converter.schemas.enable=false
offset.storage.file.filename=/tmp/connect.offsets
key.converter.key.subject.name.strategy=io.confluent.kafka.serializers.subject.TopicRecordNameStrategy
value.converter.value.subject.name.strategy=io.confluent.kafka.serializers.subject.TopicRecordNameStrategy

[Comments]:

    Tags: apache-kafka google-bigquery avro apache-kafka-connect confluent-schema-registry


    [Solution 1]:

    Ah, I needed the following:

    key.converter.key.subject.name.strategy=io.confluent.kafka.serializers.subject.RecordNameStrategy
    value.converter.value.subject.name.strategy=io.confluent.kafka.serializers.subject.TopicRecordNameStrategy
    

    With this, it works. :)

    I had a really hard time navigating the documentation. Sorry for the redundant question.
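    A sketch of the subjects the connector requests under the corrected settings (record names taken from the question; this only mirrors the documented strategy rules): RecordNameStrategy for the key drops the topic prefix, while TopicRecordNameStrategy for the value keeps it, matching the subjects that actually exist in the registry.

    ```python
    topic = "domain.rating.annotated"
    key_record = "com.acme.message_schema.type.domain.key.DefaultKey"
    value_record = "com.acme.message_schema.domain.rating.annotated.RatingTranslated"

    # RecordNameStrategy: subject is just the record's fully-qualified name
    key_subject = key_record

    # TopicRecordNameStrategy: subject is "<topic>-<record fully-qualified name>"
    value_subject = f"{topic}-{value_record}"

    print(key_subject)
    print(value_subject)
    ```

    Whether this resolves the lookup still depends on which subjects the producer registered; checking with `curl $REGISTRY/subjects` as above is the quickest way to confirm.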

    [Discussion]:
