hadoop jar hadoop-mapreduce-examples-2.6.0-cdh5.15.1.jar pi 2 3 — cannot connect to the cluster
Source: 9-7 Course Summary
慕虎7937911
2019-09-28
[root@hadoop mapreduce]# hadoop jar hadoop-mapreduce-examples-2.6.0-cdh5.15.1.jar pi 2 3
Number of Maps = 2
Samples per Map = 3
19/09/28 11:29:53 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/09/28 11:29:54 INFO hdfs.DFSClient: Exception in createBlockOutputStream
java.io.IOException: Bad connect ack with firstBadLink as 192.168.182.137:50010
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1770)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1668)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:790)
19/09/28 11:29:54 WARN hdfs.DFSClient: Abandoning BP-942963525-192.168.182.140-1569640810308:blk_1073741827_1003
19/09/28 11:29:54 WARN hdfs.DFSClient: Excluding datanode DatanodeInfoWithStorage[192.168.182.137:50010,DS-c10f0aa7-5134-4481-99ff-f5a8e6849741,DISK]
Wrote input for Map #0
19/09/28 11:29:54 INFO hdfs.DFSClient: Exception in createBlockOutputStream
java.io.IOException: Bad connect ack with firstBadLink as 192.168.182.137:50010
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1770)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1668)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:790)
19/09/28 11:29:54 WARN hdfs.DFSClient: Abandoning BP-942963525-192.168.182.140-1569640810308:blk_1073741829_1005
19/09/28 11:29:54 WARN hdfs.DFSClient: Excluding datanode DatanodeInfoWithStorage[192.168.182.137:50010,DS-c10f0aa7-5134-4481-99ff-f5a8e6849741,DISK]
Wrote input for Map #1
Starting Job
19/09/28 11:29:54 INFO client.RMProxy: Connecting to ResourceManager at hadoop/192.168.182.140:8032
19/09/28 11:29:55 INFO hdfs.DFSClient: Exception in createBlockOutputStream
java.io.IOException: Bad connect ack with firstBadLink as 192.168.182.137:50010
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1770)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1668)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:790)
19/09/28 11:29:55 WARN hdfs.DFSClient: Abandoning BP-942963525-192.168.182.140-1569640810308:blk_1073741831_1007
19/09/28 11:29:55 WARN hdfs.DFSClient: Excluding datanode DatanodeInfoWithStorage[192.168.182.137:50010,DS-c10f0aa7-5134-4481-99ff-f5a8e6849741,DISK]
19/09/28 11:29:55 INFO input.FileInputFormat: Total input paths to process : 2
19/09/28 11:29:55 INFO hdfs.DFSClient: Exception in createBlockOutputStream
java.io.IOException: Bad connect ack with firstBadLink as 192.168.182.137:50010
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1770)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1668)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:790)
19/09/28 11:29:55 WARN hdfs.DFSClient: Abandoning BP-942963525-192.168.182.140-1569640810308:blk_1073741833_1009
19/09/28 11:29:55 WARN hdfs.DFSClient: Excluding datanode DatanodeInfoWithStorage[192.168.182.137:50010,DS-c10f0aa7-5134-4481-99ff-f5a8e6849741,DISK]
19/09/28 11:29:55 INFO hdfs.DFSClient: Exception in createBlockOutputStream
java.io.IOException: Bad connect ack with firstBadLink as 192.168.182.137:50010
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1770)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1668)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:790)
19/09/28 11:29:55 WARN hdfs.DFSClient: Abandoning BP-942963525-192.168.182.140-1569640810308:blk_1073741835_1011
19/09/28 11:29:55 WARN hdfs.DFSClient: Excluding datanode DatanodeInfoWithStorage[192.168.182.137:50010,DS-c10f0aa7-5134-4481-99ff-f5a8e6849741,DISK]
19/09/28 11:29:55 INFO mapreduce.JobSubmitter: number of splits:2
19/09/28 11:29:55 INFO hdfs.DFSClient: Exception in createBlockOutputStream
java.io.IOException: Bad connect ack with firstBadLink as 192.168.182.137:50010
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1770)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1668)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:790)
19/09/28 11:29:55 WARN hdfs.DFSClient: Abandoning BP-942963525-192.168.182.140-1569640810308:blk_1073741837_1013
19/09/28 11:29:55 WARN hdfs.DFSClient: Excluding datanode DatanodeInfoWithStorage[192.168.182.137:50010,DS-c10f0aa7-5134-4481-99ff-f5a8e6849741,DISK]
19/09/28 11:29:55 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1569641046069_0001
19/09/28 11:29:56 INFO impl.YarnClientImpl: Submitted application application_1569641046069_0001
19/09/28 11:29:56 INFO mapreduce.Job: The url to track the job: http://hadoop:8088/proxy/application_1569641046069_0001/
19/09/28 11:29:56 INFO mapreduce.Job: Running job: job_1569641046069_0001
19/09/28 11:30:14 INFO mapreduce.Job: Job job_1569641046069_0001 running in uber mode : false
19/09/28 11:30:14 INFO mapreduce.Job: map 0% reduce 0%
19/09/28 11:30:21 INFO mapreduce.Job: map 50% reduce 0%
1 Answer
Michael_PK
2019-09-28
There is a problem with your Hadoop environment.
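The repeated "Bad connect ack with firstBadLink as 192.168.182.137:50010" warnings in the log mean the HDFS client could not open a data-transfer connection to the DataNode at 192.168.182.137 on port 50010, so that node was excluded from the write pipeline. The most common cause on a fresh VM cluster is a firewall blocking the port. A minimal reachability check is sketched below; the host and port come from the log above, and the firewalld commands are an assumption (they apply to CentOS 7-style systems only):

```shell
# Probe whether a TCP connection to a host:port succeeds within 2 seconds.
# Uses bash's /dev/tcp pseudo-device, so no extra tools are required.
port_open() {
  timeout 2 bash -c "cat < /dev/null > /dev/tcp/$1/$2" 2>/dev/null
}

# Host and port taken from the "firstBadLink" line in the log.
if port_open 192.168.182.137 50010; then
  echo "DataNode data-transfer port is reachable"
else
  echo "DataNode port NOT reachable - check the firewall on 192.168.182.137"
  # On that host (assuming CentOS 7 with firewalld; for a lab setup only):
  #   systemctl stop firewalld
  #   systemctl disable firewalld
  # Or open just the HDFS data-transfer port:
  #   firewall-cmd --permanent --add-port=50010/tcp && firewall-cmd --reload
fi
```

If the port is reachable, the next things to check are that the DataNode process is actually running on that host (`jps`) and that `/etc/hosts` resolves every cluster hostname consistently on all nodes.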