Configuring a Kafka Source & Sink for Flink SQL in a Kerberos Environment
1. Edit the flink-conf.yaml configuration file and enable (uncomment) the Kerberos-related settings:
security.kerberos.login.use-ticket-cache: true
security.kerberos.login.keytab: /path/to/kerberos/keytab
security.kerberos.login.principal: flink-user
# The configuration below defines which JAAS login contexts
security.kerberos.login.contexts: Client,KafkaClient
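With `security.kerberos.login.contexts: Client,KafkaClient` set, Flink installs a JAAS login module for the listed contexts at runtime, so no static jaas.conf is required. For reference, the effect for the Kafka client is roughly equivalent to the following static JAAS entry (a sketch; the keytab path and principal are the placeholder values from the settings above):

```
KafkaClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=true
  keyTab="/path/to/kerberos/keytab"
  principal="flink-user";
};
```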
2. Kafka Flink SQL table configuration
Add the following three properties to the table's WITH clause:
'connector.properties.security.protocol' = 'SASL_PLAINTEXT',
'connector.properties.sasl.mechanism' = 'GSSAPI',
'connector.properties.sasl.kerberos.service.name' = 'kafka',
CREATE TABLE person (
    address VARCHAR COMMENT 'home address',
    age VARCHAR COMMENT 'age',
    city VARCHAR COMMENT 'city',
    name VARCHAR COMMENT 'name'
)
WITH (
'connector.type' = 'kafka',
'connector.topic' = 'test1',
'connector.version' = 'universal',
'connector.startup-mode' = 'earliest-offset',
'connector.properties.zookeeper.connect' = 'cdh1:2181,cdh2:2181,cdh3:2181',
'connector.properties.bootstrap.servers' = 'cdh1:9092,cdh2:9092,cdh3:9092',
'connector.properties.security.protocol' = 'SASL_PLAINTEXT',
'connector.properties.sasl.mechanism' = 'GSSAPI',
'connector.properties.sasl.kerberos.service.name' = 'kafka',
'update-mode' = 'append',
'format.type' = 'csv',
'format.derive-schema' = 'true'
);
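Since the same table definition serves for both the source and the sink (the Kerberos-related properties apply to reads and writes alike), writing to the secured topic is a plain INSERT. A minimal sketch, assuming a hypothetical table `person_source` with the same four VARCHAR columns:

```
-- Write into the Kerberos-secured Kafka topic 'test1' via the table above.
-- 'person_source' is a hypothetical input table with the same schema.
INSERT INTO person
SELECT address, age, city, name
FROM person_source;
```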