No DataNode after a successful format
Source: 2-7 Hadoop Format & Start/Stop

npcxh
2021-08-07
[hadoop@hadoop000 sbin]$ jps
8513 MainGenericRunner
10978 Jps
[hadoop@hadoop000 sbin]$ ./hadoop-daemon.sh start namenode
starting namenode, logging to /home/hadoop/app/hadoop-2.6.0-cdh5.16.2/logs/hadoop-hadoop-namenode-hadoop000.out
[hadoop@hadoop000 sbin]$ jps
8513 MainGenericRunner
11013 NameNode
11087 Jps
[hadoop@hadoop000 sbin]$ ./hadoop-daemon.sh start datanode
starting datanode, logging to /home/hadoop/app/hadoop-2.6.0-cdh5.16.2/logs/hadoop-hadoop-datanode-hadoop000.out
[hadoop@hadoop000 sbin]$ jps
8513 MainGenericRunner
11013 NameNode
11167 Jps
[hadoop@hadoop000 sbin]$ tail -200f /home/hadoop/app/hadoop-2.6.0-cdh5.16.2/logs/hadoop-hadoop-datanode-hadoop000.log
Log:
————————————————————————————————————————
2021-08-07 20:04:00,327 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT]
2021-08-07 20:04:00,500 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2021-08-07 20:04:00,695 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
2021-08-07 20:04:00,755 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
2021-08-07 20:04:00,755 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started
2021-08-07 20:04:00,764 INFO org.apache.hadoop.hdfs.server.datanode.BlockScanner: Initialized block scanner with targetBytesPerSec 1048576
2021-08-07 20:04:00,766 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is hadoop000
2021-08-07 20:04:00,770 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting DataNode with maxLockedMemory = 0
2021-08-07 20:04:00,790 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened streaming server at /0.0.0.0:50010
2021-08-07 20:04:00,792 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwith is 10485760 bytes/s
2021-08-07 20:04:00,792 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Number threads for balancing is 50
2021-08-07 20:04:00,833 INFO org.mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
2021-08-07 20:04:00,836 INFO org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets.
2021-08-07 20:04:00,846 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.datanode is not defined
2021-08-07 20:04:00,852 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2021-08-07 20:04:00,853 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context datanode
2021-08-07 20:04:00,854 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
2021-08-07 20:04:00,854 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context logs
2021-08-07 20:04:00,864 INFO org.apache.hadoop.http.HttpServer2: HttpServer.start() threw a non Bind IOException
java.net.BindException: Port in use: localhost:0
at org.apache.hadoop.http.HttpServer2.openListeners(HttpServer2.java:963)
at org.apache.hadoop.http.HttpServer2.start(HttpServer2.java:899)
at org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer.&lt;init&gt;(DatanodeHttpServer.java:105)
at org.apache.hadoop.hdfs.server.datanode.DataNode.startInfoServer(DataNode.java:897)
at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:1312)
at org.apache.hadoop.hdfs.server.datanode.DataNode.&lt;init&gt;(DataNode.java:465)
at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2592)
at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2479)
at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2526)
at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2708)
at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2732)
Caused by: java.net.BindException: Cannot assign requested address
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:433)
at sun.nio.ch.Net.bind(Net.java:425)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at org.mortbay.jetty.nio.SelectChannelConnector.open(SelectChannelConnector.java:216)
at org.apache.hadoop.http.HttpServer2.openListeners(HttpServer2.java:958)
... 10 more
2021-08-07 20:04:00,866 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Shutdown complete.
2021-08-07 20:04:00,866 FATAL org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in secureMain
java.net.BindException: Port in use: localhost:0
at org.apache.hadoop.http.HttpServer2.openListeners(HttpServer2.java:963)
at org.apache.hadoop.http.HttpServer2.start(HttpServer2.java:899)
at org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer.&lt;init&gt;(DatanodeHttpServer.java:105)
at org.apache.hadoop.hdfs.server.datanode.DataNode.startInfoServer(DataNode.java:897)
at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:1312)
at org.apache.hadoop.hdfs.server.datanode.DataNode.&lt;init&gt;(DataNode.java:465)
at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2592)
at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2479)
at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2526)
at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2708)
at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2732)
Caused by: java.net.BindException: Cannot assign requested address
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:433)
at sun.nio.ch.Net.bind(Net.java:425)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at org.mortbay.jetty.nio.SelectChannelConnector.open(SelectChannelConnector.java:216)
at org.apache.hadoop.http.HttpServer2.openListeners(HttpServer2.java:958)
... 10 more
2021-08-07 20:04:00,872 INFO org.apache.hadoop.util.ExitUtil: Exiting with status 1
2021-08-07 20:04:00,873 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at hadoop000/192.168.101.3
************************************************************/
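The fatal line is `java.net.BindException: Port in use: localhost:0`, caused by `Cannot assign requested address`. Port 0 means "any free port", so the port itself cannot be in use; the bind fails because the address `localhost` does not resolve to a local loopback address on this machine. A minimal sketch of the check, assuming the usual `/etc/hosts` layout (the file contents below are illustrative, not taken from the question):

```shell
#!/bin/sh
# Sketch: verify that "localhost" maps to 127.0.0.1, which the DataNode's
# embedded HTTP server needs in order to bind. Here we simulate the check
# on a sample hosts file; on a real machine you would grep /etc/hosts.
hosts_content="127.0.0.1   localhost
192.168.101.3 hadoop000"

if printf '%s\n' "$hosts_content" | grep -Eq '^127\.0\.0\.1[[:space:]]+localhost'; then
  echo "localhost mapping OK"
else
  echo "localhost mapping missing - DataNode HTTP server cannot bind"
fi
```

If the `127.0.0.1 localhost` line is missing (or `localhost` points at a non-local address), restore it in `/etc/hosts` and start the DataNode again with `./hadoop-daemon.sh start datanode`.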
1 Answer
Michael_PK
2021-08-08