ambari-issues mailing list archives

From "Nitiraj Singh Rathore (JIRA)" <>
Subject [jira] [Created] (AMBARI-18583) Ambari Hive View 'Upload Table' does not support UTF8 files with BOM
Date Thu, 13 Oct 2016 08:37:20 GMT
Nitiraj Singh Rathore created AMBARI-18583:

             Summary: Ambari Hive View 'Upload Table' does not support UTF8 files with BOM
                 Key: AMBARI-18583
             Project: Ambari
          Issue Type: Bug
          Components: ambari-views
    Affects Versions: 2.2.2
            Reporter: Nitiraj Singh Rathore
            Assignee: Nitiraj Singh Rathore
             Fix For: 2.5.0

PROBLEM: The Ambari Hive View throws an 'E090 HiveClientFormattedException' when trying
to create a table via the 'Upload Table' function if the uploaded file begins with the UTF-8
BOM (Byte Order Mark, 0xEF 0xBB 0xBF). The same file loads without problems when saved
without the BOM (e.g. via Sublime Text). Deleting the contents of the first column heading
in the preview and retyping it also works around the issue.
STEPS TO REPRODUCE: Download the attached CSV file and try to upload it as a table via the Ambari 2.2.2 Hive View.
EXPECTED RESULT: The table previews correctly, so it should be possible to create the Hive table.
ACTUAL RESULT: The Ambari view throws the 'E090 HiveClientFormattedException' error, and hiveserver2.log
records the following error:

2016-08-01 15:29:57,284 INFO  [HiveServer2-Handler-Pool: Thread-32]: parse.ParseDriver (
- Parsing command: create table recordsView2 (�nengetu INT, kanji STRING) STORED AS ORC
2016-08-01 15:29:57,285 ERROR [HiveServer2-Handler-Pool: Thread-32]: ql.Driver (
- FAILED: ParseException line 1:27 character '�' not supported here
org.apache.hadoop.hive.ql.parse.ParseException: line 1:27 character '�' not supported here
	at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(
	at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(
	at org.apache.hadoop.hive.ql.Driver.compile(
	at org.apache.hadoop.hive.ql.Driver.compile(
	at org.apache.hadoop.hive.ql.Driver.compileInternal(
	at org.apache.hadoop.hive.ql.Driver.compileAndRespond(
	at org.apache.hive.service.cli.operation.SQLOperation.prepare(
	at org.apache.hive.service.cli.operation.SQLOperation.runInternal(
	at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(
	at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(
	at java.lang.reflect.Method.invoke(
	at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(
	at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(
	at org.apache.hive.service.cli.session.HiveSessionProxy$
	at Method)
	at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(
	at com.sun.proxy.$Proxy19.executeStatementAsync(Unknown Source)
	at org.apache.hive.service.cli.CLIService.executeStatementAsync(
	at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(
	at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(
	at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(
	at org.apache.thrift.ProcessFunction.process(
	at org.apache.thrift.TBaseProcessor.process(
	at org.apache.hive.service.auth.TSetIpAddressProcessor.process(
	at org.apache.thrift.server.TThreadPoolServer$
	at java.util.concurrent.ThreadPoolExecutor.runWorker(
	at java.util.concurrent.ThreadPoolExecutor$
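The '�' in the generated DDL above is the BOM (decoded as U+FEFF) carried over from the first CSV header cell into the column name, which Hive's parser rejects. Java's UTF-8 decoder does not strip a leading BOM, so one way the upload path could avoid this is to remove it from the decoded header before building the CREATE TABLE statement. A minimal sketch (the class and method names are illustrative, not Ambari's actual code):

```java
import java.nio.charset.StandardCharsets;

public class BomStripper {
    // The UTF-8 BOM bytes 0xEF 0xBB 0xBF decode to this single char
    private static final char UTF8_BOM = '\uFEFF';

    /** Removes a leading UTF-8 BOM from a decoded header cell, if present. */
    public static String stripBom(String s) {
        if (s != null && !s.isEmpty() && s.charAt(0) == UTF8_BOM) {
            return s.substring(1);
        }
        return s;
    }

    public static void main(String[] args) {
        // Simulate the first header cell of a CSV file saved with a BOM
        byte[] raw = {(byte) 0xEF, (byte) 0xBB, (byte) 0xBF,
                      'n', 'e', 'n', 'g', 'e', 't', 'u'};
        String header = new String(raw, StandardCharsets.UTF_8);
        // Without stripping, header.charAt(0) is U+FEFF and would end up
        // in the generated column name, triggering the ParseException
        System.out.println(stripBom(header)); // prints "nengetu"
    }
}
```

Stripping only the first cell of the first row is sufficient, since a BOM is only valid at the very start of the file.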

This message was sent by Atlassian JIRA
