【Posted】:2023-04-05 08:09:02
【Problem Description】:
I am using Spring Cloud Stream with the Kafka binder. When my message is too large, I get this error:
ERROR o.s.k.s.LoggingProducerListener - Exception thrown when sending a message with key='null' and payload='{123, 34, 105, 100, 34, 58, 34, 115, 105, 110, 103, 97, 112, 111, 114, 101, 104, 101, 114, 97, 108, ...' to topic page:
org.apache.kafka.common.errors.RecordTooLargeException: The message is 4711755 bytes when serialized which is larger than the maximum request size you have configured with the max.request.size configuration.
Below is the Spring Boot code I use to send the message:
private BinderAwareChannelResolver resolver;
boolean isSent = this.resolver.resolveDestination(this.topic)
.send(message);
Since the error is logged, I would expect to be able to catch the RecordTooLargeException in my Spring Boot code. However, it is not caught and execution continues. isSent also comes back as "true". Shouldn't it be false? How can I catch this error and handle it? Thanks.
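(For context, a hedged sketch of why the exception is invisible to the caller: the Kafka binder sends asynchronously by default, so send() returns true as soon as the message is handed off to the producer, and the broker-side failure only reaches the async LoggingProducerListener. One documented way to make the failure surface in the calling thread is to enable synchronous sends on the producer binding. The binding name "output" below is an assumption; substitute your own binding name.)

```yaml
# application.yml (sketch, assuming a producer binding named "output")
spring:
  cloud:
    stream:
      kafka:
        bindings:
          output:
            producer:
              sync: true   # block until the broker acks, so failures throw in the caller
```

With sync sends, the send(message) call should throw a MessagingException whose cause is the RecordTooLargeException, which can then be caught with an ordinary try/catch. Alternatively, if large messages are expected, the broker-client limit can be raised by passing max.request.size through the binder configuration (spring.cloud.stream.kafka.binder.configuration), provided the broker and topic limits are raised to match.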
【Discussion】:
Tags: spring-boot apache-kafka spring-cloud