The code is as follows:

    import java.util.Properties;
    import java.util.concurrent.ExecutionException;
    import java.util.concurrent.Future;

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.Producer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.clients.producer.RecordMetadata;

    public static void producer1() throws ExecutionException, InterruptedException {
        Properties props = new Properties();
        props.put("bootstrap.servers", "xxx:9092,xxx:9092,xxx:9092");
        // Not needed by the new producer API; KafkaProducer only uses bootstrap.servers
        props.put("zookeeper.connect", "xxx:2181,xxx:2181,xxx:2181");
        props.put("acks", "all");            // wait for all in-sync replicas to acknowledge
        props.put("retries", 0);
        props.put("batch.size", 16384);
        props.put("linger.ms", 1);
        props.put("buffer.memory", 33554432);
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        Producer<String, String> producer = new KafkaProducer<String, String>(props);
        for (int i = 0; i < 100; i++) {
            System.out.println(i);
            // send() is asynchronous; future.get() blocks until the broker acknowledges the record
            Future<RecordMetadata> future = producer.send(
                    new ProducerRecord<String, String>("topic_test_1219", Integer.toString(i), Integer.toString(i)));
            RecordMetadata metadata = future.get();
            System.out.println("partition:" + metadata.partition() + ",offset:" + metadata.offset() + ",isDone:" + future.isDone());
        }
        producer.close();
    }

 

The error is as follows:

Exception in thread "main" java.util.concurrent.ExecutionException: org.apache.kafka.common.errors.TimeoutException: Expiring 1 record(s) for topic_test_1219-2: 30010 ms has passed since batch creation plus linger time
	at org.apache.kafka.clients.producer.internals.FutureRecordMetadata.valueOrError(FutureRecordMetadata.java:94)
	at org.apache.kafka.clients.producer.internals.FutureRecordMetadata.get(FutureRecordMetadata.java:64)
	at org.apache.kafka.clients.producer.internals.FutureRecordMetadata.get(FutureRecordMetadata.java:29)
	at com.zhen.kafka.ProducerTest.producer1(ProducerTest.java:38)
	at com.zhen.kafka.ProducerTest.main(ProducerTest.java:18)
Caused by: org.apache.kafka.common.errors.TimeoutException: Expiring 1 record(s) for topic_test_1219-2: 30010 ms has passed since batch creation plus linger time

  After a lot of troubleshooting, it turned out to be a hostname resolution problem: the producer reaches the bootstrap servers, but it then connects to the brokers by the hostnames they advertise in the cluster metadata, and if those hostnames cannot be resolved locally the batches eventually expire with this timeout. Remember to add the IP addresses of the machines running the Kafka brokers to your local hosts file.
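
As a minimal sketch of the fix (the hostnames and IPs below are placeholders for illustration; use your actual broker hostnames and IPs), add entries like these to the local hosts file (/etc/hosts on Linux/macOS, C:\Windows\System32\drivers\etc\hosts on Windows):

    # placeholder broker hostnames -> IPs of the Kafka machines
    192.168.1.101  kafka-broker-1
    192.168.1.102  kafka-broker-2
    192.168.1.103  kafka-broker-3

After saving the file, rerun the producer; once the advertised hostnames resolve, the send no longer times out.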

 

Reposted from: https://www.cnblogs.com/EnzoDin/p/10143513.html
