Apache Iceberg version
1.6.1 (latest release)
Query engine
Hive
Please describe the bug 🐞
The insert works fine: the .parquet file is generated in HDFS, and when I decode that file with parquet-tools I can see the data written by the insert query. But when I run a select, no data is returned. Here is the output:
Time taken: 0.001 seconds
+----------------------------+
| excercise_hive_eceberg1.i |
+----------------------------+
+----------------------------+
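For reference, the setup was roughly as follows (the exact statements were not captured above, so this is an approximate reproduction, not the verbatim session):

-- create the Iceberg table (approximate; exact DDL assumed)
CREATE TABLE somesh_dev.excercise_hive_eceberg1 (i int)
STORED BY 'org.apache.iceberg.mr.hive.HiveIcebergStorageHandler';

-- insert a row; the .parquet data file appears under the warehouse location
INSERT INTO somesh_dev.excercise_hive_eceberg1 VALUES (1);

-- returns no rows, even though the data file exists
SELECT i FROM somesh_dev.excercise_hive_eceberg1;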
Here is the output of DESC FORMATTED:
+-------------------------------+----------------------------------------------------+----------------------------------------------------+
| col_name | data_type | comment |
+-------------------------------+----------------------------------------------------+----------------------------------------------------+
| i | int | |
| | NULL | NULL |
| # Detailed Table Information | NULL | NULL |
| Database: | somesh_dev | NULL |
| OwnerType: | USER | NULL |
| Owner: | hive | NULL |
| CreateTime: | Thu Sep 19 08:40:33 UTC 2024 | NULL |
| LastAccessTime: | UNKNOWN | NULL |
| Retention: | 0 | NULL |
| Location: | s3a://com.somesh/opt/hive/data/warehouse/somesh_dev.db/excercise_hive_eceberg1 | NULL |
| Table Type: | EXTERNAL_TABLE | NULL |
| Table Parameters: | NULL | NULL |
| | EXTERNAL | TRUE |
| | TRANSLATED_TO_EXTERNAL | TRUE |
| | bucketing_version | 2 |
| | current-schema | {"type":"struct","schema-id":0,"fields":[{"id":1,"name":"i","required":false,"type":"int"}]} |
| | external.table.purge | TRUE |
| | format-version | 2 |
| | iceberg.orc.files.only | false |
| | metadata_location | s3a://com.somesh/opt/hive/data/warehouse/somesh_dev.db/excercise_hive_eceberg1/metadata/00000-eb95ea40-7408-4368-8204-baaf3960fd41.metadata.json |
| | numFiles | 0 |
| | numRows | 0 |
| | parquet.compression | zstd |
| | rawDataSize | 0 |
| | serialization.format | 1 |
| | snapshot-count | 0 |
| | storage_handler | org.apache.iceberg.mr.hive.HiveIcebergStorageHandler |
| | table_type | ICEBERG |
| | totalSize | 0 |
| | transient_lastDdlTime | 1726737447 |
| | uuid | 3a828192-5f28-4d20-9be0-841f078f60e4 |
| | NULL | NULL |
| # Storage Information | NULL | NULL |
| SerDe Library: | org.apache.iceberg.mr.hive.HiveIcebergSerDe | NULL |
| InputFormat: | org.apache.iceberg.mr.hive.HiveIcebergInputFormat | NULL |
| OutputFormat: | org.apache.iceberg.mr.hive.HiveIcebergOutputFormat | NULL |
| Compressed: | No | NULL |
| Sort Columns: | [] | NULL |
+-------------------------------+----------------------------------------------------+----------------------------------------------------+
38 rows selected (1.743 seconds)
0: jdbc:hive2://localhost:10000/>
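Note that in the DESC FORMATTED output above, numFiles, numRows, totalSize, and snapshot-count are all 0, and metadata_location still points at the initial 00000-*.metadata.json, so it appears the insert never committed an Iceberg snapshot to the table metadata even though the Parquet data file was written. That would explain why the select returns nothing.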
Please help if anyone knows the solution.
Willingness to contribute