Creating a file fails with org.apache.hadoop.ipc.RpcException: RPC response exceeds maximum data length

Source: 3-17 HDFS API Programming: Developing the First Application

慕容3565349

2022-06-22

@Test
public void test01() throws Exception {
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(new URI("hdfs://192.168.72.129:9870"), conf, "hadoop000");
    System.out.println(fs);
    System.out.println(fs.getDefaultBlockSize());
    System.out.println(fs.getDefaultReplication());

    boolean exists = fs.exists(new Path("/input"));
    System.out.println(exists);

    fs.create(new Path("/log"));
}
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/C:/Users/wusx/.m2/repository/org/slf4j/slf4j-reload4j/1.7.36/slf4j-reload4j-1.7.36.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/C:/Users/wusx/.m2/repository/org/slf4j/slf4j-log4j12/1.7.25/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Reload4jLoggerFactory]
DFS[DFSClient[clientName=DFSClient_NONMAPREDUCE_1277951121_1, ugi=hadoop000 (auth:SIMPLE)]]
134217728
3

org.apache.hadoop.ipc.RpcException: RPC response exceeds maximum data length

	at org.apache.hadoop.ipc.Client$IpcStreams.readResponse(Client.java:1936)
	at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1238)
	at org.apache.hadoop.ipc.Client$Connection.run(Client.java:1134)

The FileSystem object is obtained successfully, and calls like getDefaultBlockSize() and getDefaultReplication() return the correct values, but any operation that actually creates a file fails with the error above.
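One common cause of this exact exception, offered here as a hypothesis to check rather than a confirmed diagnosis for this cluster: the client URI points at the NameNode's web UI port. In Hadoop 3.x, 9870 is the default HTTP port (dfs.namenode.http-address), while the filesystem RPC port defaults to 8020. When the IPC client reads an HTTP response where it expects RPC framing, it reports "RPC response exceeds maximum data length". This would also explain why getDefaultBlockSize() and getDefaultReplication() still print values: those defaults are resolved from client-side configuration without a NameNode round trip, whereas exists() and create() must talk to the NameNode. The authoritative RPC address is whatever fs.defaultFS is set to in core-site.xml on the server, e.g.:

```xml
<!-- core-site.xml on the NameNode host: fs.defaultFS is the RPC address
     clients must connect to. 8020 is only the Hadoop 3.x default;
     the actual port on this cluster may differ, so check the real file. -->
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://192.168.72.129:8020</value>
</property>
```

If that is the case, the URI passed to FileSystem.get(...) would use that RPC port instead of 9870.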


1 Answer

Michael_PK

2022-06-23

Is there any further stack trace below that? If so, please paste it here so we can take a look.

