[Bug report] Table comment doesn't support UTF-8 #6612
Comments
A MySQL table doesn't have this problem, so it seems to be related only to Hive tables.
Yeah, the data in the MySQL database attached to the Hive cluster is as follows: (screenshot omitted)
We need to change the charset of the MySQL cluster, but it did not work when I tried. Let me dive into it.
I also modified the connection string in hive-site.xml to "jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true&useSSL=false&characterEncoding=UTF-8", but that doesn't work either.
This is a limitation of the Hive Metastore at present. For MySQL, the Hive metastore only supports latin1 and does not support UTF-8; see https://issues.apache.org/jira/browse/HIVE-18083
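A workaround that has been discussed around HIVE-18083 is to convert the specific metastore columns that hold comments to utf8 directly in MySQL. The table and column names below follow the common Hive metastore schema, but this is only a hedged sketch; verify the column names and types against your own metastore schema version before running anything:

```sql
-- Sketch of a possible workaround (verify against your metastore schema first):
-- convert the columns that store table/column comments to utf8.
ALTER TABLE COLUMNS_V2 MODIFY COLUMN COMMENT VARCHAR(256) CHARACTER SET utf8;
ALTER TABLE TABLE_PARAMS MODIFY COLUMN PARAM_VALUE MEDIUMTEXT CHARACTER SET utf8;
```

Note that this only affects comments written after the change; comments already stored as "?" are lost.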
This problem lies in the storage itself; the Gravitino server and web UI do not have character-encoding issues.
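The "?" characters can be explained by a lossy latin1 round trip: any character outside latin1 (such as Chinese text) is replaced with "?" when written to a latin1 column. A minimal simulation of that behavior, with a hypothetical comment string:

```python
# Simulate a table comment passing through a latin1 metastore column.
# "用户表" is a hypothetical comment containing Chinese characters.
comment = "用户表"

# Writing to a latin1 column with lossy replacement turns each
# non-latin1 character into "?" ...
stored = comment.encode("latin-1", errors="replace")

# ... and reading it back can only recover the "?" placeholders.
retrieved = stored.decode("latin-1")
print(retrieved)
```

This matches the symptom in the report: the JSON response already carries "?", because the original characters were lost at write time.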
Version
main branch
Describe what's wrong
I have a Hive table, and I entered a comment with some Chinese characters. After saving the table, the UI shows the comment as "?", like this:
I checked the HTTP response; it already carries "?" in the JSON content.
Error message and/or stacktrace
No error.
How to reproduce
Enter some Chinese characters in a Hive table's comment field and you will see it:
Additional context
No response