The copyBytes method never finishes executing

Source: 3-19 HDFS API Programming: Viewing the Contents of an HDFS File

慕少7351152

2022-05-01

Hi teacher, on my end testing mkdir works fine, but when testing viewing file contents, the copyBytes method never finishes executing.

Client: IDEA (macOS 10.15.4)
Hadoop cluster: Alibaba Cloud host

The client accesses the cluster directly through the cloud host's public IP. Could you take a look at where the problem is? Thanks~
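For reference, the read being tested presumably looks something like the pattern below. This is a sketch following the course's testText style, not the poster's exact code; the NameNode address placeholder and class name are assumptions.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

import java.net.URI;

public class HDFSReadSketch {
    public static void main(String[] args) throws Exception {
        // Connect to the NameNode via the cloud host's public IP
        // (placeholder address; "hadoop" user is an assumption).
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(new URI("hdfs://<public-ip>:8020"), conf, "hadoop");

        // Open the file and stream its contents to stdout.
        // copyBytes blocks until the whole stream is read, so if the client
        // cannot reach the DataNode holding the block, the call appears to
        // hang and eventually fails with BlockMissingException.
        FSDataInputStream in = fs.open(new Path("/slaves"));
        try {
            IOUtils.copyBytes(in, System.out, 1024, false);
        } finally {
            IOUtils.closeStream(in);
            fs.close();
        }
    }
}
```

Note that mkdir only talks to the NameNode, while reading file contents additionally requires a connection to a DataNode, which is why the two tests can behave differently.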


The final error:
org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-1050021096-172.17.42.138-1649771531449:blk_1073741829_1005 file=/slaves

at org.apache.hadoop.hdfs.DFSInputStream.refetchLocations(DFSInputStream.java:1049)
at org.apache.hadoop.hdfs.DFSInputStream.chooseDataNode(DFSInputStream.java:1032)
at org.apache.hadoop.hdfs.DFSInputStream.chooseDataNode(DFSInputStream.java:1011)
at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:650)
at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:904)
at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:963)
at java.io.DataInputStream.read(DataInputStream.java:100)
at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:87)
at com.imooc.bigdata.hadoop.hdfs.HDFSApp.testText(HDFSApp.java:41)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:68)
at com.intellij.rt.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:33)
at com.intellij.rt.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:230)
at com.intellij.rt.junit.JUnitStarter.main(JUnitStarter.java:58)

Disconnected from the target VM, address: '127.0.0.1:0', transport: 'socket'

Process finished with exit code 255


1 Answer

Michael_PK

2022-05-02

This question has come up in the Q&A section before; search for "cloud host". You need to add a parameter in the configuration file and then restart HDFS.

The test client also needs one parameter set, and that's it.

Also: the cloud host's security group needs to be opened up.
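For later readers: the parameters the teacher is most likely referring to are `dfs.datanode.use.datanode.hostname` (server side) and `dfs.client.use.datanode.hostname` (client side). This is an inference from the symptom, not stated in the answer: the client can reach the NameNode over the public IP, but the NameNode then hands back the DataNodes' internal addresses, which are unreachable from outside the cloud network. A sketch of the server-side `hdfs-site.xml` addition:

```xml
<!-- hdfs-site.xml on the cluster: have DataNodes register and be served
     by hostname, so external clients are not handed unreachable internal
     IPs. Restart HDFS after adding this. -->
<property>
  <name>dfs.datanode.use.datanode.hostname</name>
  <value>true</value>
</property>
```

On the client side, the matching setting can be applied in code with `conf.set("dfs.client.use.datanode.hostname", "true")` before calling `FileSystem.get(...)`. The DataNode data-transfer port (50010 in Hadoop 2.x, 9866 in 3.x) must also be opened in the cloud host's security group, which matches the cause the poster ultimately found below.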


慕少7351152
Yep, found the cause here: the DataNode's port wasn't opened. Thanks, teacher~
2022-05-03

Hadoop 系统入门+核心精讲 (Hadoop Systems: Introduction + Core Essentials)

Start from Hadoop's core technologies, master ETL in data processing, and move into big data with ease

2413 learners · 909 questions