Fetching N Messages from Kafka
Solution up front: ./kafka-console-consumer.sh --topic xxxxxx --bootstrap-server localhost:9092 --max-messages 10 --from-beginning

Our team uses Kafka, and sometimes I want to pull a bit of data out of it for testing. But --from-beginning alone dumps far too much and floods the screen, which isn't great; something like SQL's LIMIT keyword would be ideal. So I started by running ./kafka-console-consumer.sh with no arguments to see its options:
# ./kafka-console-consumer.sh
The console consumer is a tool that reads data from Kafka and outputs it to standard output.
Option Description
------ -----------
--blacklist <blacklist> Blacklist of topics to exclude from
consumption.
--bootstrap-server <server to connect REQUIRED (unless old consumer is
to> used): The server to connect to.
--consumer-property <consumer_prop> A mechanism to pass user-defined
properties in the form key=value to
the consumer.
--consumer.config <config file> Consumer config properties file. Note
that [consumer-property] takes
precedence over this config.
--csv-reporter-enabled If set, the CSV metrics reporter will
be enabled
--delete-consumer-offsets If specified, the consumer path in
zookeeper is deleted when starting up
--enable-systest-events Log lifecycle events of the consumer
in addition to logging consumed
messages. (This is specific for
system tests.)
--formatter <class> The name of a class to use for
formatting kafka messages for
display. (default: kafka.tools.
DefaultMessageFormatter)
--from-beginning If the consumer does not already have
an established offset to consume
from, start with the earliest
message present in the log rather
than the latest message.
--key-deserializer <deserializer for
key>
--max-messages <Integer: num_messages> The maximum number of messages to
consume before exiting. If not set,
consumption is continual.
--metrics-dir <metrics directory> If csv-reporter-enable is set, and
this parameter isset, the csv
metrics will be outputed here
--new-consumer Use the new consumer implementation.
This is the default.
--offset <consume offset> The offset id to consume from (a non-
negative number), or 'earliest'
which means from beginning, or
'latest' which means from end
(default: latest)
--partition <Integer: partition> The partition to consume from.
--property <prop> The properties to initialize the
message formatter.
--skip-message-on-error If there is an error when processing a
message, skip it instead of halt.
--timeout-ms <Integer: timeout_ms> If specified, exit if no message is
available for consumption for the
specified interval.
--topic <topic> The topic id to consume on.
--value-deserializer <deserializer for
values>
--whitelist <whitelist> Whitelist of topics to include for
consumption.
--zookeeper <urls> REQUIRED (only when using old
consumer): The connection string for
the zookeeper connection in the form
host:port. Multiple URLS can be
given to allow fail-over.
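Besides --max-messages, the help text above also shows --partition and --offset, which let you read a bounded slice starting from a specific position. A hedged sketch (the topic name test-topic, partition 0, and offset 100 are placeholder values; assumes a broker on localhost:9092; note that --offset only works together with --partition):

```shell
# Read 5 messages from partition 0 of test-topic, starting at offset 100,
# then exit. --offset requires --partition to be specified.
./kafka-console-consumer.sh \
  --topic test-topic \
  --bootstrap-server localhost:9092 \
  --partition 0 \
  --offset 100 \
  --max-messages 5
```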
Note the --max-messages option: it makes the consumer exit after consuming N messages. Combined with --from-beginning, it consumes the earliest N messages in the topic. After that you can play around with the data happily.

If instead you want to exit after consuming the next N messages that arrive, simply leave off --from-beginning.
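A minimal sketch of both modes (the topic name test-topic is a placeholder; assumes a broker on localhost:9092):

```shell
# Earliest 10 messages in the topic, then exit:
./kafka-console-consumer.sh --topic test-topic \
  --bootstrap-server localhost:9092 \
  --from-beginning --max-messages 10

# The next 10 messages that arrive after startup, then exit;
# adding --timeout-ms makes it give up if no message arrives for 30 s:
./kafka-console-consumer.sh --topic test-topic \
  --bootstrap-server localhost:9092 \
  --max-messages 10 --timeout-ms 30000
```

The --timeout-ms option (also listed in the help output above) is useful in scripts so the consumer doesn't hang forever on an idle topic.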
Long is the road and far the journey; keep at it, one step at a time! A good memory is no match for a well-worn keyboard!