spark-sql> create table t(key string, value string); fails with an error
Source: 3-9 spark-sql usage in detail, with an explanation of Catalyst's execution process

慕田峪0177977
2020-07-21
The error output is as follows:
20/07/21 00:23:46 INFO HiveMetaStore: 0: get_database: default
20/07/21 00:23:46 INFO audit: ugi=hadoop ip=unknown-ip-addr cmd=get_database: default
20/07/21 00:23:46 INFO HiveMetaStore: 0: get_table : db=default tbl=t
20/07/21 00:23:46 INFO audit: ugi=hadoop ip=unknown-ip-addr cmd=get_table : db=default tbl=t
20/07/21 00:23:46 INFO HiveMetaStore: 0: get_database: default
20/07/21 00:23:46 INFO audit: ugi=hadoop ip=unknown-ip-addr cmd=get_database: default
20/07/21 00:23:46 INFO HiveMetaStore: 0: get_database: default
20/07/21 00:23:46 INFO audit: ugi=hadoop ip=unknown-ip-addr cmd=get_database: default
20/07/21 00:23:46 INFO HiveMetaStore: 0: get_database: default
20/07/21 00:23:46 INFO audit: ugi=hadoop ip=unknown-ip-addr cmd=get_database: default
20/07/21 00:23:46 INFO HiveMetaStore: 0: get_table : db=default tbl=t
20/07/21 00:23:46 INFO audit: ugi=hadoop ip=unknown-ip-addr cmd=get_table : db=default tbl=t
20/07/21 00:23:46 INFO HiveMetaStore: 0: get_database: default
20/07/21 00:23:46 INFO audit: ugi=hadoop ip=unknown-ip-addr cmd=get_database: default
20/07/21 00:23:46 INFO HiveMetaStore: 0: create_table: Table(tableName:t, dbName:default, owner:hadoop, createTime:1595262226, lastAccessTime:0, retention:0, sd:StorageDescriptor(cols:[FieldSchema(name:key, type:string, comment:null), FieldSchema(name:value, type:string, comment:null)], location:file:/user/hive/warehouse/t, inputFormat:org.apache.hadoop.mapred.TextInputFormat, outputFormat:org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat, compressed:false, numBuckets:-1, serdeInfo:SerDeInfo(name:null, serializationLib:org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, parameters:{serialization.format=1}), bucketCols:[], sortCols:[], parameters:{}, skewedInfo:SkewedInfo(skewedColNames:[], skewedColValues:[], skewedColValueLocationMaps:{})), partitionKeys:[], parameters:{spark.sql.sources.schema.part.0={"type":"struct","fields":[{"name":"key","type":"string","nullable":true,"metadata":{}},{"name":"value","type":"string","nullable":true,"metadata":{}}]}, spark.sql.sources.schema.numParts=1, spark.sql.create.version=2.4.3}, viewOriginalText:null, viewExpandedText:null, tableType:MANAGED_TABLE, privileges:PrincipalPrivilegeSet(userPrivileges:{}, groupPrivileges:null, rolePrivileges:null))
20/07/21 00:23:46 INFO audit: ugi=hadoop ip=unknown-ip-addr cmd=create_table: Table(tableName:t, dbName:default, owner:hadoop, createTime:1595262226, lastAccessTime:0, retention:0, sd:StorageDescriptor(cols:[FieldSchema(name:key, type:string, comment:null), FieldSchema(name:value, type:string, comment:null)], location:file:/user/hive/warehouse/t, inputFormat:org.apache.hadoop.mapred.TextInputFormat, outputFormat:org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat, compressed:false, numBuckets:-1, serdeInfo:SerDeInfo(name:null, serializationLib:org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, parameters:{serialization.format=1}), bucketCols:[], sortCols:[], parameters:{}, skewedInfo:SkewedInfo(skewedColNames:[], skewedColValues:[], skewedColValueLocationMaps:{})), partitionKeys:[], parameters:{spark.sql.sources.schema.part.0={"type":"struct","fields":[{"name":"key","type":"string","nullable":true,"metadata":{}},{"name":"value","type":"string","nullable":true,"metadata":{}}]}, spark.sql.sources.schema.numParts=1, spark.sql.create.version=2.4.3}, viewOriginalText:null, viewExpandedText:null, tableType:MANAGED_TABLE, privileges:PrincipalPrivilegeSet(userPrivileges:{}, groupPrivileges:null, rolePrivileges:null))
20/07/21 00:23:46 WARN HiveMetaStore: Location: file:/user/hive/warehouse/t specified for non-external table:t
20/07/21 00:23:46 INFO FileUtils: Creating directory if it doesn't exist: file:/user/hive/warehouse/t
Error in query: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:file:/user/hive/warehouse/t is not a directory or unable to create one);
1 Answer
Michael_PK
2020-07-21
file:/user/hive/warehouse/t is not a directory or unable to create one);
Look at this message: Hive data is supposed to live on HDFS, but here the table location resolves to the local filesystem... Are you sure your HDFS is running properly?
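If HDFS is in fact running, the usual cause of a `file:/user/hive/warehouse/...` location is that Spark SQL never picked up a `fs.defaultFS` pointing at the NameNode, so paths default to the local filesystem. A minimal sketch of the relevant setting, assuming a NameNode at `hadoop000:8020` (the hostname and port are illustrative assumptions, not taken from this thread):

```xml
<!-- core-site.xml (must be visible to spark-sql, e.g. via HADOOP_CONF_DIR
     or $SPARK_HOME/conf): make unqualified paths resolve to HDFS, not file:/ -->
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://hadoop000:8020</value> <!-- replace with your NameNode address -->
</property>
```

After that, confirm the NameNode is up and the warehouse directory exists and is writable on HDFS (e.g. `hdfs dfs -mkdir -p /user/hive/warehouse`), then retry the `create table` in spark-sql.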