Create topic maxmessagetest

./kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 2 --partitions 8 --topic maxmessagetest
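To confirm the topic was created, the same script can list all topics (a quick check, assuming the same ZooKeeper address as above):

```shell
# List all topics known to this ZooKeeper ensemble
./kafka-topics.sh --list --zookeeper localhost:2181
```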

Create a topic on a clustered ZooKeeper ensemble

./kafka-topics.sh --zookeeper zk-server-01,zk-server-02,zk-server-03 --create --topic test-pro-lww --partitions 8 --replication-factor 2



Send 500,000 messages, 3000 bytes each

./kafka-producer-perf-test.sh --num-records 500000 --record-size 3000 --topic test-rep-lww  --producer-props bootstrap.servers=kafka-01:19022,kafka-02:19023,kafka-03:19024

The --throughput option caps the send rate in records per second (pass -1 for no limit); the run below is throttled to 1000 records/sec

./kafka-producer-perf-test.sh --num-records 500000 --record-size 3000 --topic test-rep-lww --throughput 1000 --producer-props bootstrap.servers=kafka-01:19022,kafka-02:19023,kafka-03:19024
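To spot-check what the perf producer wrote, a console consumer can read a few records back; a sketch, assuming the same broker addresses and that the topic exists:

```shell
# Read the first 10 records from the beginning of the topic, then exit
./kafka-console-consumer.sh --bootstrap-server kafka-01:19022,kafka-02:19023,kafka-03:19024 \
  --topic test-rep-lww --from-beginning --max-messages 10
```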

Consume messages (consumer perf test)

./kafka-consumer-perf-test.sh --zookeeper zk-server-01,zk-server-02,zk-server-03 --messages 50000 --topic test-rep-lww --threads 1 --broker-list kafka-01:19022,kafka-02:19023,kafka-03:19024 --batch-size 2000

Delete a topic

./kafka-topics.sh --zookeeper zk-server-01,zk-server-02,zk-server-03 --topic test-rep-lww --delete


Describe a topic

./kafka-topics.sh --describe --zookeeper zk-server-01,zk-server-02,zk-server-03 --topic test-rep-lww

Set the topic property max.message.bytes

./kafka-topics.sh --zookeeper zk-server-01,zk-server-02,zk-server-03 --alter --topic maxmessagetest --config max.message.bytes=30000000

./kafka-configs.sh --zookeeper zk-server-01,zk-server-02,zk-server-03 --alter --add-config 'max.message.bytes=30000000' --entity-name maxmessagetest --entity-type topics
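To verify the override took effect, kafka-configs.sh can describe the topic's config on its own (--describe is used without --add-config):

```shell
# Show per-topic config overrides for maxmessagetest
./kafka-configs.sh --zookeeper zk-server-01,zk-server-02,zk-server-03 --describe \
  --entity-name maxmessagetest --entity-type topics
```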


Check a consumer group's offsets (kafka-consumer-offset-checker.sh is deprecated in newer Kafka releases; --broker-info is a flag and takes no argument)

./kafka-consumer-offset-checker.sh --zookeeper zk-server-01,zk-server-02,zk-server-03 --topic mq-to-appserver-topic-3_fkMglEp3Na3IMmN2qUZ9 --broker-info --group mq-dispatcher-consumer

The newer kafka-consumer-groups.sh talks to the brokers directly, so --zookeeper is not needed, and --describe does not take --topic

./kafka-consumer-groups.sh --bootstrap-server kafka-01:19022,kafka-02:19023,kafka-03:19024 --group mq-dispatcher-consumer --describe
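On Kafka 0.11+, the same tool can also rewind a group's offsets; a sketch that previews the change first (assumes the group has no active consumers while resetting):

```shell
# Preview resetting the group to the earliest offsets (no changes applied);
# replace --dry-run with --execute to apply
./kafka-consumer-groups.sh --bootstrap-server kafka-01:19022,kafka-02:19023,kafka-03:19024 \
  --group mq-dispatcher-consumer --topic mq-to-appserver-topic-3_fkMglEp3Na3IMmN2qUZ9 \
  --reset-offsets --to-earliest --dry-run
```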

