Error when running ImoocStatStreamingApp
Source: 12-11 - Consuming Kafka data with Spark Streaming

weixin_慕哥1181079
2019-11-29
Here is my ImoocStatStreamingApp code:
package com.imooc.spark.project

import org.apache.spark.SparkConf
import org.apache.spark.streaming.kafka.KafkaUtils
import org.apache.spark.streaming.{Seconds, StreamingContext}

/**
 * Use Spark Streaming to process data coming from Kafka.
 */
object ImoocStatStreamingApp {

  def main(args: Array[String]): Unit = {

    if (args.length != 4) {
      println("Usage: ImoocStatStreamingApp <zkQuorum> <group> <topics> <numThreads>")
      System.exit(1)
    }

    val Array(zkQuorum, groupId, topics, numThreads) = args

    val sparkConf = new SparkConf()
      .setAppName("ImoocStatStreamingApp")
      .setMaster("local[2]")
    val ssc = new StreamingContext(sparkConf, Seconds(60))

    val topicMap = topics.split(",").map((_, numThreads.toInt)).toMap
    val messages = KafkaUtils.createStream(ssc, zkQuorum, groupId, topicMap)

    // Test step 1: verify that data is being received
    messages.map(_._2).count().print()

    ssc.start()
    ssc.awaitTermination()
  }
}
The error message is as follows:
19/11/29 11:59:21 INFO spark.SparkContext: Running Spark version 2.3.1
19/11/29 11:59:22 INFO spark.SparkContext: Submitted application: ImoocStatStreamingApp
19/11/29 11:59:23 INFO spark.SecurityManager: Changing view acls to: admin
19/11/29 11:59:23 INFO spark.SecurityManager: Changing modify acls to: admin
19/11/29 11:59:23 INFO spark.SecurityManager: Changing view acls groups to:
19/11/29 11:59:23 INFO spark.SecurityManager: Changing modify acls groups to:
19/11/29 11:59:23 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(admin); groups with view permissions: Set(); users with modify permissions: Set(admin); groups with modify permissions: Set()
Exception in thread "main" java.lang.NoSuchMethodError: io.netty.buffer.PooledByteBufAllocator.metric()Lio/netty/buffer/PooledByteBufAllocatorMetric;
at org.apache.spark.network.util.NettyMemoryMetrics.registerMetrics(NettyMemoryMetrics.java:80)
at org.apache.spark.network.util.NettyMemoryMetrics.<init>(NettyMemoryMetrics.java:76)
at org.apache.spark.network.client.TransportClientFactory.<init>(TransportClientFactory.java:109)
at org.apache.spark.network.TransportContext.createClientFactory(TransportContext.java:99)
at org.apache.spark.rpc.netty.NettyRpcEnv.<init>(NettyRpcEnv.scala:71)
at org.apache.spark.rpc.netty.NettyRpcEnvFactory.create(NettyRpcEnv.scala:461)
at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:57)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:249)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175)
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:256)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:423)
at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:838)
at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:85)
at com.imooc.spark.project.ImoocStatStreamingApp$.main(ImoocStatStreamingApp.scala:22)
at com.imooc.spark.project.ImoocStatStreamingApp.main(ImoocStatStreamingApp.scala)
19/11/29 11:59:24 INFO util.ShutdownHookManager: Shutdown hook called
Process finished with exit code 1
1 Answer
Michael_PK
2019-11-29
There is a netty version conflict. Find the dependency that is currently pulling in netty, exclude netty from it first, and then manually add a suitable netty version to your pom.
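A minimal pom.xml sketch of that approach, under assumptions: the hbase-client dependency below is only a hypothetical example of a module that drags in an old netty, and the version numbers are placeholders. Run mvn dependency:tree -Dincludes=io.netty first to see which dependency actually brings netty into your build, and put the exclusion there instead. The NoSuchMethodError on PooledByteBufAllocator.metric() means the netty on the classpath is older than the 4.1.x line that Spark 2.3.x is built against.

    <!-- Hypothetical example: exclude the stale netty that a transitive
         dependency brings in (replace hbase-client with whatever
         mvn dependency:tree shows in your own project) -->
    <dependency>
        <groupId>org.apache.hbase</groupId>
        <artifactId>hbase-client</artifactId>
        <version>${hbase.version}</version>
        <exclusions>
            <exclusion>
                <groupId>io.netty</groupId>
                <artifactId>netty-all</artifactId>
            </exclusion>
        </exclusions>
    </dependency>

    <!-- Then add a netty 4.1.x version explicitly; it must contain
         PooledByteBufAllocator.metric(), which Spark 2.3.x calls -->
    <dependency>
        <groupId>io.netty</groupId>
        <artifactId>netty-all</artifactId>
        <version>4.1.17.Final</version>
    </dependency>

After changing the pom, re-import the Maven project (or run mvn clean compile) so the old netty jar is no longer on the classpath, then rerun the app.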