How do I connect Kafka to Elasticsearch and feed Kafka data into Elasticsearch?

You can use Kafka Connect; there is an open-source connector for Elasticsearch. Kafka Connect introduction: Introduction to Kafka Connect. GitHub address: GitHub - hannesstockner/kafka-connect-elasticsearch
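For illustration, a sink connector of this kind is configured with a small properties file and run with Kafka's standalone Connect worker. The property names below follow the widely used Confluent Elasticsearch sink connector (not necessarily the linked repo); the connector name, topic, and URL are placeholders:

```properties
# elasticsearch-sink.properties — example config for an Elasticsearch sink
# connector (property names per the Confluent connector; values are placeholders)
name=elasticsearch-sink
connector.class=io.confluent.connect.elasticsearch.ElasticsearchSinkConnector
tasks.max=1
topics=my-topic
connection.url=http://localhost:9200
key.ignore=true
schema.ignore=true
```

It would then be started with the standalone worker shipped with Kafka, e.g. `bin/connect-standalone.sh config/connect-standalone.properties elasticsearch-sink.properties`.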
■ Reply
Kafka acts as a buffering middleware. You can use a stream-processing component such as Spark Streaming to consume the data in Kafka and write it to ES, or you can use Flume. It depends on your actual needs!

■ Reply
You may have solved this already... here is our Logstash configuration:
input {
  kafka {
    zk_connect => "192.168.80.XX:2181,192.168.80.YY:2181,192.168.80.ZZ:2181,192.168.80.XX:2181,192.168.80.YY:2181,192.168.80.ZZ:2181"
    decoder_class => "cn.test.kafka.v08.RequestDecoder"
    codec => "json"
    group_id => "yunweielk3"
    consumer_id => "219"
    topic_id => "DP10000025F901"
    auto_offset_reset => "largest"
    type => "timeconsuming-tpeg-74"
  }
}
filter {
  BALABALA...
}
output {
  elasticsearch {
    hosts => "192.168.15.XX:9201"
    index => "logstash-YY-ZZ-%{+YYYY.MM.dd}"
  }
}
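Note that the `zk_connect` and `decoder_class` options above belong to the old (Kafka 0.8-era) Logstash input plugin; newer versions of the Kafka input connect to the brokers directly. A minimal sketch of the equivalent modern config (broker addresses, topic, group, and index name are placeholders):

input {
  kafka {
    bootstrap_servers => "broker1:9092,broker2:9092"
    topics => ["my-topic"]
    group_id => "es-indexer"
    codec => "json"
    auto_offset_reset => "latest"
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "logstash-%{+YYYY.MM.dd}"
  }
}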

■ Reply
Kafka has a built-in framework called Kafka Connect for writing sources and sinks that either continuously ingest data into Kafka or continuously ingest data from Kafka into external systems. The connectors themselves for different applications or data systems are federated and maintained separately from the main code base. See: Elasticsearch Connector.

