Using KafkaSink to deliver data collected by Flume to Kafka fails

Source: 11-5 Using KafkaSink to deliver data collected by Flume to Kafka

hwbaker

2018-02-18

After starting the program in IDEA and watching the Flume output, it reports the error shown in the screenshot below:

http://img.mukewang.com/szimg/5a89889d00019b6b16000761.jpg


Flume agent configuration file streaming2.conf:

agent1.sources = avro-source
agent1.channels = logger-channel
agent1.sinks = kafka-sink

# define source
agent1.sources.avro-source.type = avro
agent1.sources.avro-source.bind = 127.0.0.1
agent1.sources.avro-source.port = 41414

# define channel
agent1.channels.logger-channel.type = memory

# define sink
agent1.sinks.kafka-sink.type = org.apache.flume.sink.kafka.KafkaSink
agent1.sinks.kafka-sink.topic = streamingtopic
agent1.sinks.kafka-sink.brokerList = localhost:9092
agent1.sinks.kafka-sink.requiredAcks = 1
agent1.sinks.kafka-sink.batchSize = 20

# bind source and sink to the channel
agent1.sources.avro-source.channels = logger-channel
agent1.sinks.kafka-sink.channel = logger-channel
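For reference, an agent with this configuration is typically launched with the flume-ng command below (a sketch assuming Flume is installed under $FLUME_HOME and streaming2.conf lives in its conf directory; adjust paths to your setup):

```shell
# Start agent1 with console logging so KafkaSink errors are visible
flume-ng agent \
  --name agent1 \
  --conf $FLUME_HOME/conf \
  --conf-file $FLUME_HOME/conf/streaming2.conf \
  -Dflume.root.logger=INFO,console
```

The --name value must match the prefix used in the configuration file (agent1), otherwise the agent starts with no sources or sinks.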


2 answers

hwbaker

(original poster)

2018-02-19

Changed the Kafka version in pom.xml to 0.9.0.0, as follows:

<kafka.version>0.9.0.0</kafka.version>


Then commented out this block:

<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.11</artifactId>
    <version>${kafka.version}</version>
</dependency>

After commenting it out the error is gone, though I don't quite understand why.
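Putting both changes together, the relevant pom.xml section would read roughly as below (only the property and the commented-out dependency are from this thread; the surrounding <properties> element is assumed context):

```xml
<properties>
    <kafka.version>0.9.0.0</kafka.version>
</properties>

<!-- Explicit kafka_2.11 dependency removed: the KafkaSink's own
     Kafka client jars already provide these classes, and pulling
     in a second copy can cause version conflicts at runtime. -->
<!--
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.11</artifactId>
    <version>${kafka.version}</version>
</dependency>
-->
```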


Michael_PK replied (2018-02-19): Some of the dependency jars conflict with each other.
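To see which Kafka jars actually end up on the classpath and confirm such a conflict, Maven's dependency tree is a quick check (run in the project root; the -Dincludes filter narrows the output to org.apache.kafka artifacts):

```shell
# List every org.apache.kafka artifact the project resolves,
# including transitive dependencies pulled in by other libraries
mvn dependency:tree -Dincludes=org.apache.kafka
```

If two different Kafka versions appear in the output, one of them usually needs to be excluded or the explicit dependency removed, as in the answer above.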

Michael_PK

2018-02-18

Answered in the group chat.


Course: Spark Streaming Real-Time Stream Processing Project in Action

Flume + Kafka + Spark Streaming: building a general-purpose real-time stream processing platform
