DataNode won't start; here is the log output

Source: 3-8 Setting Up an HDFS Pseudo-Distributed Environment

被吊打的学渣

2018-01-13

STARTUP_MSG:   java = 1.8.0_151
************************************************************/
2018-01-13 16:35:33,416 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT]
2018-01-13 16:35:34,730 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
2018-01-13 16:35:34,845 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s).
2018-01-13 16:35:34,845 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started
2018-01-13 16:35:34,855 INFO org.apache.hadoop.hdfs.server.datanode.BlockScanner: Initialized block scanner with targetBytesPerSec 1048576
2018-01-13 16:35:34,855 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is localhost
2018-01-13 16:35:34,858 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting DataNode with maxLockedMemory = 0
2018-01-13 16:35:34,884 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened streaming server at /0.0.0.0:50010
2018-01-13 16:35:34,886 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwith is 10485760 bytes/s
2018-01-13 16:35:34,886 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50
2018-01-13 16:35:35,152 INFO org.mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
2018-01-13 16:35:35,160 INFO org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets.
2018-01-13 16:35:35,185 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.datanode is not defined
2018-01-13 16:35:35,190 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2018-01-13 16:35:35,191 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context datanode
2018-01-13 16:35:35,191 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context logs
2018-01-13 16:35:35,191 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
2018-01-13 16:35:35,225 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 40295
2018-01-13 16:35:35,225 INFO org.mortbay.log: jetty-6.1.26
2018-01-13 16:35:35,485 INFO org.mortbay.log: Started HttpServer2$SelectChannelConnectorWithSafeStartup@localhost:40295
2018-01-13 16:35:35,800 INFO org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer: Listening HTTP traffic on /0.0.0.0:50075
2018-01-13 16:35:35,835 INFO org.apache.hadoop.util.JvmPauseMonitor: Starting JVM pause monitor
2018-01-13 16:35:36,227 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: dnUserName = root
2018-01-13 16:35:36,227 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: supergroup = supergroup
2018-01-13 16:35:36,340 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue queueCapacity: 1000 scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler
2018-01-13 16:35:36,389 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 50020
2018-01-13 16:35:36,469 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened IPC server at /0.0.0.0:50020
2018-01-13 16:35:36,486 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Refresh request received for nameservices: null
2018-01-13 16:35:36,490 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting BPOfferServices for nameservices: <default>
2018-01-13 16:35:36,521 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Block pool <registering> (Datanode Uuid unassigned) service to localhost/127.0.0.1:9000 starting to offer service
2018-01-13 16:35:36,658 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting
2018-01-13 16:35:36,850 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 50020: starting
2018-01-13 16:35:37,300 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Acknowledging ACTIVE Namenode during handshakeBlock pool <registering> (Datanode Uuid unassigned) service to localhost/127.0.0.1:9000
2018-01-13 16:35:37,306 INFO org.apache.hadoop.hdfs.server.common.Storage: Using 1 threads to upgrade data directories (dfs.datanode.parallel.volumes.load.threads.num=1, dataDirs=1)
2018-01-13 16:35:37,348 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /opt/hadoop-2.8.1/tmp/dfs/data/in_use.lock acquired by nodename 2964@localhost
2018-01-13 16:35:37,365 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage directory [DISK]file:/opt/hadoop-2.8.1/tmp/dfs/data/
java.io.IOException: Incompatible clusterIDs in /opt/hadoop-2.8.1/tmp/dfs/data: namenode clusterID = CID-24ec2789-02ba-44c2-ae15-590a7d19d470; datanode clusterID = CID-45789740-00c7-4574-8518-6afaa58f522c
        at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:760)
        at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:293)
        at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:409)
        at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:388)
        at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:556)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1566)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1527)
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:327)
        at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:266)
        at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:746)
        at java.lang.Thread.run(Thread.java:748)
2018-01-13 16:35:37,367 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: Initialization failed for Block pool <registering> (Datanode Uuid ad41268d-4b55-4196-b087-9ff064f7b5c3) service to localhost/127.0.0.1:9000. Exiting. 
java.io.IOException: All specified directories are failed to load.
        at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:557)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1566)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1527)
        at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:327)
        at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:266)
        at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:746)
        at java.lang.Thread.run(Thread.java:748)
2018-01-13 16:35:37,368 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Ending block pool service for: Block pool <registering> (Datanode Uuid ad41268d-4b55-4196-b087-9ff064f7b5c3) service to localhost/127.0.0.1:9000
2018-01-13 16:35:37,492 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Removed Block pool <registering> (Datanode Uuid ad41268d-4b55-4196-b087-9ff064f7b5c3)
2018-01-13 16:35:39,499 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Exiting Datanode
2018-01-13 16:35:39,501 INFO org.apache.hadoop.util.ExitUtil: Exiting with status 0
2018-01-13 16:35:39,573 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG: 
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at localhost/127.0.0.1
************************************************************/



1 Answer

Michael_PK

2018-01-13

The message already tells you the cluster IDs are incompatible. Try searching for a fix yourself first.

被吊打的学渣
Clearing out the DataNode's data directory fixed it; just found the solution.
2018-01-13
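The reply above (wiping the DataNode's data directory) works on a throwaway pseudo-distributed setup, but it destroys any blocks stored there. A less destructive alternative is to copy the NameNode's clusterID into the DataNode's VERSION file so the two match again. The sketch below demonstrates that idea against fake VERSION files in a temporary directory, so it is safe to run anywhere; on a real cluster the files live under dfs.namenode.name.dir and dfs.datanode.data.dir (in this log, presumably /opt/hadoop-2.8.1/tmp/dfs/name/current/VERSION and /opt/hadoop-2.8.1/tmp/dfs/data/current/VERSION, but verify your own configuration first). The demo paths and file contents are assumptions for illustration.

```shell
# Demo setup: fake NameNode/DataNode storage dirs with the mismatched
# clusterIDs from the log, created in a temp directory.
demo=$(mktemp -d)
mkdir -p "$demo/name/current" "$demo/data/current"
echo "clusterID=CID-24ec2789-02ba-44c2-ae15-590a7d19d470" > "$demo/name/current/VERSION"
echo "clusterID=CID-45789740-00c7-4574-8518-6afaa58f522c" > "$demo/data/current/VERSION"

# Read the NameNode's clusterID ...
nn_cid=$(grep '^clusterID=' "$demo/name/current/VERSION" | cut -d= -f2)

# ... and write it into the DataNode's VERSION file in place.
sed -i "s/^clusterID=.*/clusterID=${nn_cid}/" "$demo/data/current/VERSION"

# The DataNode's clusterID now matches the NameNode's.
grep clusterID "$demo/data/current/VERSION"
```

With the IDs in sync, restarting the DataNode should let it register. If the data is disposable anyway (as in this course exercise), the reply's simpler route of stopping the DataNode, clearing the data directory, and restarting it is equally valid; the mismatch typically appears after reformatting the NameNode (`hdfs namenode -format`) without clearing the old DataNode storage.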
