A business system pushes very large JSON messages to Kafka: a single JSON payload is about 6 MB, roughly 360,000 lines, with four levels of nesting. We need to consume this data from Kafka and parse it.
During testing we had to produce this JSON string into the test topic ourselves, and a string that large cannot be pasted into the console producer. So we wrote a Java producer that reads the file and sends its contents to the topic. The send raised no error, yet the consumer received nothing.
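The producer we used looked roughly like the sketch below. The class name, topic name, and broker address are placeholders; it assumes kafka-clients on the classpath, so the Kafka-specific calls are shown as comments and the runnable part is only the file read and configuration:

```java
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Properties;

public class BigJsonProducer {

    // Producer settings for an oversized JSON payload; localhost:9092 is an assumption.
    static Properties producerProps() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        // The default max.request.size is 1 MB -- too small for a ~6 MB message.
        props.put("max.request.size", String.valueOf(10 * 1024 * 1024));
        return props;
    }

    public static void main(String[] args) throws Exception {
        // Read the whole JSON file into one string (the ~6 MB payload).
        String json = Files.readString(Path.of(args[0]));

        // With kafka-clients on the classpath:
        // try (KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps())) {
        //     producer.send(new ProducerRecord<>("test-topic", json)).get(); // block until acked
        // }
        System.out.println("payload bytes: " + json.getBytes(StandardCharsets.UTF_8).length);
    }
}
```

Note that even with this producer setting, the send still fails silently unless the broker and consumer limits described below are raised as well.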

In this situation, several Kafka parameters need to be configured. The relevant ones are listed below.

Consumer side : fetch.message.max.bytes

  • This determines the largest message the consumer can fetch. (This is the legacy consumer setting; the Java consumer uses max.partition.fetch.bytes.)

Broker side : replica.fetch.max.bytes

  • This allows the replicas within the cluster to exchange messages and ensures they are replicated correctly. If it is too small, the message will never be fully replicated and therefore never committed, so the consumer will never see it.

Broker side : message.max.bytes

  • This is the largest message the broker will accept from a producer.

Broker side (per topic) : max.message.bytes

  • This is the largest message the broker will allow to be appended to the topic. This size is validated pre-compression. (Defaults to the broker's message.max.bytes.)
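Concretely, the broker-wide settings above go into server.properties, and the per-topic limit can be set with the stock kafka-configs.sh tool. The 10 MB value below is an example ceiling for our ~6 MB payload, not a recommendation; the topic name and broker address are placeholders:

```shell
# server.properties (requires a broker restart):
# message.max.bytes=10485760
# replica.fetch.max.bytes=10485760

# Per-topic override, applied without a restart:
kafka-configs.sh --bootstrap-server localhost:9092 \
  --entity-type topics --entity-name test-topic \
  --alter --add-config max.message.bytes=10485760
```

Keep replica.fetch.max.bytes at least as large as message.max.bytes, otherwise an accepted message can still fail to replicate and never be committed.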

Producer side : max.request.size

  • Increase this so the producer can send the larger message.

Consumer side : max.partition.fetch.bytes

  • Increase this so the consumer can receive larger messages.
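On the consumer side, the fetch ceilings can be raised together. A minimal sketch assuming the Java consumer, with a placeholder group id and broker address; as above, the kafka-clients calls are left as comments:

```java
import java.util.Properties;

public class LargeMessageConsumerConfig {

    // Consumer settings sized for ~6 MB messages; localhost:9092 is an assumption.
    static Properties consumerProps() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "big-json-test");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        // Per-partition fetch ceiling (default 1 MB) must exceed the largest message.
        props.put("max.partition.fetch.bytes", String.valueOf(10 * 1024 * 1024));
        // fetch.max.bytes caps a whole fetch response; raise it to match.
        props.put("fetch.max.bytes", String.valueOf(10 * 1024 * 1024));
        return props;
    }

    public static void main(String[] args) {
        // With kafka-clients on the classpath:
        // KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps());
        // consumer.subscribe(List.of("test-topic"));
        // then poll() and parse each record's JSON value.
        System.out.println(consumerProps().getProperty("max.partition.fetch.bytes"));
    }
}
```

With the producer, broker, topic, and consumer limits all above the payload size, the 6 MB message is produced, committed, and consumed normally.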
