HiveException: Unable to move source

Source: 5-6 Using thriftserver & beeline

OjQuery

2017-09-29

Exception in thread "main" java.sql.SQLException: org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to move source hdfs://namenode:8020/user/hive/warehouse/staging/_hive_2017-09-29_15-56-24_475_7647432284127920875-3/-ext-10000/part-00000-f3018378-999a-4d3c-92ee-66e747bfa7de-c000 to destination hdfs://namenode:8020/user/hive/warehouse/staging/mydb.db/mytable/part-00000-f3018378-999a-4d3c-92ee-66e747bfa7de-c000;
    at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:296)
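An "Unable to move source ... to destination" error generally means the final rename from Hive's temporary staging directory into the table directory failed on HDFS. Common causes are an ownership/permission mismatch between the user HiveServer2/the Thrift server writes as and the warehouse directories, or mismatched Hadoop/Hive client versions. A minimal diagnostic sketch (not from the thread) that prints the owner and permissions of the two directories taken from the error message above, assuming the Hadoop client jars and cluster configuration are on the classpath:

import java.net.URI
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

object CheckWarehousePerms {
  def main(args: Array[String]): Unit = {
    val fs = FileSystem.get(new URI("hdfs://namenode:8020"), new Configuration())
    // Directories taken from the error message above.
    val paths = Seq(
      new Path("/user/hive/warehouse/staging"),
      new Path("/user/hive/warehouse/staging/mydb.db/mytable"))
    paths.foreach { p =>
      val st = fs.getFileStatus(p)
      // The owner should match the user the Thrift server session writes as (here: hadoop).
      println(s"$p owner=${st.getOwner} group=${st.getGroup} perm=${st.getPermission}")
    }
    fs.close()
  }
}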


4 answers

OjQuery

Original poster

2017-09-29

No rows selected (0.011 seconds)

0: jdbc:hive2://localhost:10000> INSERT into TABLE mytable VALUES (1, '1');

Error: org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to move source hdfs://namenode:8020/user/hive/warehouse/staging/_hive_2017-09-29_21-53-33_467_5456078821325813240-4/-ext-10000/part-00000-5be07e5b-fdad-454f-8015-04eeec527f5a-c000 to destination hdfs://namenode:8020/user/hive/warehouse/staging/mydb.db/mytable/part-00000-5be07e5b-fdad-454f-8015-04eeec527f5a-c000; (state=,code=0)

0: jdbc:hive2://localhost:10000> 


Michael_PK

2017-09-29

There is something wrong with your environment itself.

Michael_PK

2017-09-29

First check whether HiveServer2 works properly with beeline.
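Besides running beeline by hand, a small JDBC smoke test can confirm that the Thrift server answers queries before any INSERT touches HDFS. A sketch (not from the thread), assuming the same namenode:10000 endpoint and hadoop user as in the code below:

import java.sql.DriverManager

object ThriftServerSmokeTest {
  def main(args: Array[String]): Unit = {
    Class.forName("org.apache.hive.jdbc.HiveDriver")
    val conn = DriverManager.getConnection("jdbc:hive2://namenode:10000/mydb", "hadoop", "")
    try {
      val stmt = conn.createStatement()
      // A read-only query: if this also fails, the problem is connectivity, not the INSERT.
      val rs = stmt.executeQuery("SHOW TABLES")
      while (rs.next()) println("table: " + rs.getString(1))
      rs.close()
      stmt.close()
    } finally {
      conn.close()
    }
  }
}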


OjQuery

Original poster

2017-09-29

package com.imooc.spark

import java.sql.DriverManager

/**
 * Access the Spark SQL Thrift server via JDBC.
 */
object SparkSQLThriftServerApp {

  def main(args: Array[String]): Unit = {

    // Register the Hive JDBC driver.
    Class.forName("org.apache.hive.jdbc.HiveDriver")

    val conn = DriverManager.getConnection("jdbc:hive2://namenode:10000/mydb", "hadoop", "")
    // Earlier attempt with a PreparedStatement against another table:
    // val pstmt = conn.prepareStatement("INSERT INTO TABLE address VALUES (1, 1, 1, '测试省', 11, '测试市', 111, '测试区', '2037021', '联系人', '手机号', '详细地址', '2017-9-1 1:1:1', '2017-9-1 1:1:1', '2017-9-1 1:1:1')")
    val stmt = conn.createStatement()
    // executeUpdate returns the affected row count, not a ResultSet,
    // so there is nothing to iterate over here.
    stmt.executeUpdate("INSERT INTO TABLE mytable VALUES (1, '1')")

    stmt.close()
    conn.close()
  }
}
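If the goal is eventually to insert parameterized rows (as the commented-out prepareStatement line suggests), a hypothetical parameterized variant could look like the sketch below; note that it will keep failing with the same HiveException until the underlying HDFS move problem is resolved:

import java.sql.DriverManager

object ParameterizedInsertApp {
  def main(args: Array[String]): Unit = {
    Class.forName("org.apache.hive.jdbc.HiveDriver")
    val conn = DriverManager.getConnection("jdbc:hive2://namenode:10000/mydb", "hadoop", "")
    try {
      // Hive's JDBC driver substitutes the parameters on the client side.
      val pstmt = conn.prepareStatement("INSERT INTO TABLE mytable VALUES (?, ?)")
      pstmt.setInt(1, 1)
      pstmt.setString(2, "1")
      pstmt.executeUpdate()
      pstmt.close()
    } finally {
      conn.close()
    }
  }
}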


Michael_PK

2017-09-29

What operation is this?

