Error when starting the beeline service

Source: 6-3 Using thriftserver & beeline

慕容3565349

2021-06-24


[hadoop@hadoop000 bin]$ ./beeline -u jdbc:hive2://hadoop000:10000

Connecting to jdbc:hive2://hadoop000:10000

21/06/24 12:38:42 INFO jdbc.Utils: Supplied authorities: hadoop000:10000

21/06/24 12:38:42 INFO jdbc.Utils: Resolved authority: hadoop000:10000

21/06/24 12:38:42 DEBUG auth.HiveAuthFactory: Cannot find private method "getKeytab" in class:org.apache.hadoop.security.UserGroupInformation

java.lang.NoSuchMethodException: org.apache.hadoop.security.UserGroupInformation.getKeytab()

at java.lang.Class.getDeclaredMethod(Class.java:2130)

at org.apache.hive.service.auth.HiveAuthFactory.<clinit>(HiveAuthFactory.java:112)

at org.apache.hive.jdbc.HiveConnection.createBinaryTransport(HiveConnection.java:478)

at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:201)

at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:176)

at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)

at java.sql.DriverManager.getConnection(DriverManager.java:664)

at java.sql.DriverManager.getConnection(DriverManager.java:208)

at org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:142)

at org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:207)

at org.apache.hive.beeline.Commands.connect(Commands.java:1149)

at org.apache.hive.beeline.Commands.connect(Commands.java:1070)

at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)

at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

at java.lang.reflect.Method.invoke(Method.java:498)

at org.apache.hive.beeline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:52)

at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:970)

at org.apache.hive.beeline.BeeLine.initArgs(BeeLine.java:707)

at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:757)

at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:484)

at org.apache.hive.beeline.BeeLine.main(BeeLine.java:467)

21/06/24 12:38:42 INFO jdbc.HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://hadoop000:10000

21/06/24 12:38:42 DEBUG transport.TSaslTransport: opening transport org.apache.thrift.transport.TSaslClientTransport@30a3107a

21/06/24 12:38:43 DEBUG transport.TSaslClientTransport: Sending mechanism name PLAIN and initial response of length 20

21/06/24 12:38:43 DEBUG transport.TSaslTransport: CLIENT: Writing message with status START and payload length 5

21/06/24 12:38:43 DEBUG transport.TSaslTransport: CLIENT: Writing message with status COMPLETE and payload length 20

21/06/24 12:38:43 DEBUG transport.TSaslTransport: CLIENT: Start message handled

21/06/24 12:38:43 DEBUG transport.TSaslTransport: CLIENT: Main negotiation loop complete

21/06/24 12:38:43 DEBUG transport.TSaslTransport: CLIENT: SASL Client receiving last message

21/06/24 12:38:43 DEBUG transport.TSaslTransport: CLIENT: Received message with status COMPLETE and payload length 0

21/06/24 12:38:43 DEBUG transport.TSaslTransport: writing data length: 71

21/06/24 12:38:43 DEBUG transport.TSaslTransport: CLIENT: reading data length: 17841

Error: Failed to open new session: java.lang.RuntimeException: java.lang.RuntimeException: org.apache.hadoop.security.AccessControlException: Permission denied: user=anonymous, access=EXECUTE, inode="/tmp":hadoop:supergroup:drwx------

at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:279)

at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:260)

at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkTraverse(DefaultAuthorizationProvider.java:201)

at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:154)

at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)

at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3885)

at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6855)

at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:4455)

at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:912)

at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getFileInfo(AuthorizationProviderProxyClientProtocol.java:533)

at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:862)

at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)

at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)

at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)

at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2281)

at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2277)

at java.security.AccessController.doPrivileged(Native Method)

at javax.security.auth.Subject.doAs(Subject.java:422)

at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1924)

at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2275) (state=,code=0)

Beeline version 1.2.1.spark2 by Apache Hive

0: jdbc:hive2://hadoop000:10000 (closed)>

http://img.mukewang.com/szimg/60d40d1709a9133106820453.jpg

I already went into $SPARK_HOME/bin and ran: ./beeline -u jdbc:hive2://hadoop000:10000, but it still doesn't work.

From the logs it looks like HiveServer2 was not started.

Starting hiveserver2 reports an error:

s: cannot access /home/hadoop/app/spark-2.4.3-bin-2.6.0-cdh5.15.1/lib/spark-assembly-*.jar: No such file or directory

which: no hbase in (/home/hadoop/app/spark-2.4.3-bin-2.6.0-cdh5.15.1/bin:/home/hadoop/app/hive-1.1.0-cdh5.15.1/bin:/home/hadoop/app/hadoop-2.6.0-cdh5.15.1/bin:/home/hadoop/app/jdk1.8.0_91/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/home/hadoop/.local/bin:/home/hadoop/bin



1 Answer

Michael_PK

2021-06-25

1) Is this beeline connecting to Hive or to Spark? If it is Hive's, you must run it from Hive's bin directory; if it is Spark's, you must run it from Spark's bin directory.
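As a sketch of the distinction above (the install paths below are taken from the PATH shown in the question's log and may differ on your machine), the two beeline scripts live in different directories:

```shell
# Spark's beeline -- use this one to connect to the Spark ThriftServer:
/home/hadoop/app/spark-2.4.3-bin-2.6.0-cdh5.15.1/bin/beeline -u jdbc:hive2://hadoop000:10000

# Hive's beeline -- use this one to connect to HiveServer2:
/home/hadoop/app/hive-1.1.0-cdh5.15.1/bin/beeline -u jdbc:hive2://hadoop000:10000
```

Both clients speak the same HiveServer2 (Thrift) protocol, which is why mixing them up often "half works" and then fails with confusing errors.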

2) If it is Hive's, have you started hiveserver2? If it is Spark's, have you started the thriftserver? The server must be started first; only then can you connect with beeline.
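A minimal sketch of starting either server, assuming $SPARK_HOME and $HIVE_HOME point at the installs shown in this thread:

```shell
# For Spark: start the ThriftServer (Spark's JDBC/ODBC server):
$SPARK_HOME/sbin/start-thriftserver.sh --master local[2]

# For Hive: start HiveServer2 in the background, logging to a file:
nohup $HIVE_HOME/bin/hiveserver2 > hiveserver2.log 2>&1 &

# Either way, confirm something is actually listening on port 10000
# before attempting a beeline connection:
netstat -tlnp | grep 10000
```

If the netstat check shows nothing on 10000, beeline will fail no matter which client you use.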

3) org.apache.hadoop.security.AccessControlException: Permission denied: user=anonymous, access=EXECUTE, inode="/tmp":hadoop:supergroup:drwx------ — the permissions on your HDFS /tmp directory are also wrong; you can fix them manually.
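Two hedged ways to fix the permission error above (run as the hadoop user, which owns /tmp according to the log):

```shell
# Option 1: open up HDFS /tmp with the conventional sticky-bit,
# world-writable mode used for shared temp directories:
hdfs dfs -chmod -R 1777 /tmp

# Option 2: avoid connecting as 'anonymous' in the first place by
# passing a username to beeline with -n:
./beeline -u jdbc:hive2://hadoop000:10000 -n hadoop
```

Option 2 is usually preferable on a learning cluster, since it keeps /tmp's permissions intact and simply authenticates as a user that already has access.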

4) I recommend testing with the same Hive and Spark versions used in the course first; once that works, try other versions. Version compatibility is something you always have to watch.

