Storm-Hive integration: data is successfully written into the corresponding buckets on HDFS, but the written data cannot be queried from the corresponding Hive table
Source: 7-4 - Detailed explanation of the parameters on the Storm job UI page
mybatis
2018-10-19
344942 [hive-bolt-0] INFO o.a.o.i.PhysicalFsWriter - ORC writer created for path: hdfs://namenode.hadoop:8020/warehouse/tablespace/managed/hive/jsw.db/test/city=sunnyvale/state=ca/delta_0089191_0089200/bucket_00002 with stripeSize: 8388608 blockSize: 268435456 compression: NONE bufferSize: 32768
344950 [hive-bolt-0] INFO o.a.o.i.WriterImpl - ORC writer created for path: hdfs://namenode.hadoop:8020/warehouse/tablespace/managed/hive/jsw.db/test/city=sunnyvale/state=ca/delta_0089191_0089200/bucket_00002 with stripeSize: 8388608 blockSize: 268435456 compression: NONE bufferSize: 32768
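The two log lines above show the HiveBolt's ORC writer creating delta files (`delta_0089191_0089200/bucket_00002`) under the partition directory. With the Hive streaming ingest path that storm-hive uses, rows land in such `delta_*` directories and only become queryable once the streaming transactions are committed and the partition is known to the metastore. As a hedged sketch of how to check both conditions (table and partition names taken from the log path in this thread):

```sql
-- Is the partition the bolt wrote to registered in the metastore?
SHOW PARTITIONS test;

-- Are there open or aborted streaming transactions still holding the delta files?
-- (requires Hive's ACID transaction manager to be enabled)
SHOW TRANSACTIONS;
```

If `SHOW PARTITIONS` does not list `city=sunnyvale/state=ca` even though the directory exists on HDFS, the data and the metadata are out of sync.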
2 user2 123456 street2 sunnyvale ca
5 answers
mybatis
Asker
2018-10-19
Excuse me, how do I refresh it?
Michael_PK
2018-10-19
Try refreshing the Hive metadata and see whether the data and the metadata are out of sync.
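A hedged sketch of the usual ways to resync Hive's metastore with what is already on HDFS (the table and partition names are taken from the log path earlier in this thread):

```sql
-- Register any partitions that exist on HDFS but are missing from the metastore
MSCK REPAIR TABLE test;

-- Or add the specific partition seen in the ORC writer log explicitly
ALTER TABLE test ADD IF NOT EXISTS PARTITION (city='sunnyvale', state='ca');
```

If the rows still do not appear after the partition is registered, the problem is more likely uncommitted or un-compacted streaming transactions than missing partition metadata.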
mybatis
Asker
2018-10-19
Create-table statement:
create table test( id INT, name STRING, phone STRING, street STRING) partitioned by (city STRING, state STRING) CLUSTERED BY (id) INTO 4 BUCKETS stored as orc tblproperties ("orc.compress"="NONE");
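One thing worth checking, as an assumption rather than a confirmed diagnosis: Hive's streaming ingest (which storm-hive's HiveBolt relies on) requires an ACID transactional table, and the DDL above does not set the transactional property. On Hive 3 a managed ORC table may be transactional by default, but on older versions it must be declared explicitly. A hedged variant of the same DDL:

```sql
-- Sketch: same table, with the ACID property Hive streaming ingest needs.
-- "transactional"="true" is the addition; everything else matches the original DDL.
create table test (
  id INT, name STRING, phone STRING, street STRING
)
partitioned by (city STRING, state STRING)
clustered by (id) into 4 buckets
stored as orc
tblproperties ("orc.compress"="NONE", "transactional"="true");
```

The HiveServer side also needs the ACID transaction manager enabled (`hive.txn.manager=org.apache.hadoop.hive.ql.lockmgr.DbTxnManager`, `hive.support.concurrency=true`, and the compactor turned on) for streamed delta files to become visible to queries.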
mybatis
Asker
2018-10-19
Very strange, and no error is reported at all.
mybatis
Asker
2018-10-19
Michael_PK, could you please help analyze what the cause is?