ambari-issues mailing list archives

From "Hadoop QA (JIRA)" <>
Subject [jira] [Commented] (AMBARI-18583) Ambari Hive View 'Upload Table' does not support UTF8 files with BOM
Date Thu, 13 Oct 2016 16:14:20 GMT


Hadoop QA commented on AMBARI-18583:

{color:red}-1 overall{color}.  Here are the results of testing the latest attachment
  against trunk revision .

    {color:green}+1 @author{color}.  The patch does not contain any @author tags.

    {color:red}-1 tests included{color}.  The patch doesn't appear to include any new or modified tests.
                        Please justify why no new tests are needed for this patch.
                        Also please list what manual steps were performed to verify this patch.
                        Also please list what manual steps were performed to verify this patch.

    {color:green}+1 javac{color}.  The applied patch does not increase the total number of
javac compiler warnings.

    {color:green}+1 release audit{color}.  The applied patch does not increase the total number
of release audit warnings.

    {color:red}-1 core tests{color}.  The test build failed in contrib/views/hive-next contrib/views/hive

Test results:
Console output:

This message is automatically generated.

> Ambari Hive View 'Upload Table' does not support UTF8 files with BOM
> --------------------------------------------------------------------
>                 Key: AMBARI-18583
>                 URL:
>             Project: Ambari
>          Issue Type: Bug
>          Components: ambari-views
>    Affects Versions: 2.2.2
>            Reporter: Nitiraj Singh Rathore
>            Assignee: Nitiraj Singh Rathore
>             Fix For: 2.5.0
>         Attachments: AMBARI-18583_branch-2.5.patch
> PROBLEM: Ambari Hive View throws an 'E090 HiveClientFormattedException' when
trying to create a table via the 'Upload Table' function and the uploaded file starts with the
UTF-8 BOM (Byte Order Mark, bytes 0xEF 0xBB 0xBF). The same file loads without a problem when
saved without the BOM (e.g. via Sublime Text). Deleting the contents of the first column heading
and retyping it also works around the issue.
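The symptom in the log below follows from the BOM bytes being decoded into the first column name, so the generated CREATE TABLE statement starts with an invisible '\uFEFF' character that Hive's parser rejects. A minimal sketch of the kind of fix involved is to strip a leading BOM from the decoded header line before using it for column names. The class and method names here are illustrative, not the actual Ambari Hive View code:

```java
// Hypothetical sketch of stripping a UTF-8 BOM from a decoded CSV header line.
// The UTF-8 BOM bytes 0xEF 0xBB 0xBF decode to the single character '\uFEFF'.
public class BomStripper {
    private static final char BOM = '\uFEFF';

    // Remove a leading BOM character from a decoded string, if present.
    public static String stripBom(String s) {
        if (s != null && !s.isEmpty() && s.charAt(0) == BOM) {
            return s.substring(1);
        }
        return s;
    }

    public static void main(String[] args) {
        String header = "\uFEFFnengetu,age";
        // Without stripping, "nengetu" would carry an invisible leading '\uFEFF'
        // and end up in the CREATE TABLE statement, which Hive cannot parse.
        System.out.println(stripBom(header)); // prints "nengetu,age"
    }
}
```

Stripping at decode time (rather than in the SQL generator) keeps the column names clean for the preview step as well, which is where the user first sees the table structure.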
> STEPS TO REPRODUCE: Download the attached CSV file and try to upload it as a table via Ambari 2.2.2
> EXPECTED RESULT: The table previews correctly and the Hive table can be created
> ACTUAL RESULT: The Ambari view throws the 'E090 HiveClientFormattedException' error, and
the hiveserver2.log file shows the following error:
> 2016-08-01 15:29:57,284 INFO  [HiveServer2-Handler-Pool: Thread-32]: parse.ParseDriver
( - Parsing command: create table recordsView2 (�nengetu INT,
> 2016-08-01 15:29:57,285 ERROR [HiveServer2-Handler-Pool: Thread-32]: ql.Driver (
- FAILED: ParseException line 1:27 character '�' not supported here
> org.apache.hadoop.hive.ql.parse.ParseException: line 1:27 character '�' not supported
> 	at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(
> 	at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(
> 	at org.apache.hadoop.hive.ql.Driver.compile(
> 	at org.apache.hadoop.hive.ql.Driver.compile(
> 	at org.apache.hadoop.hive.ql.Driver.compileInternal(
> 	at org.apache.hadoop.hive.ql.Driver.compileAndRespond(
> 	at org.apache.hive.service.cli.operation.SQLOperation.prepare(
> 	at org.apache.hive.service.cli.operation.SQLOperation.runInternal(
> 	at
> 	at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(
> 	at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(
> 	at java.lang.reflect.Method.invoke(
> 	at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(
> 	at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(
> 	at org.apache.hive.service.cli.session.HiveSessionProxy$
> 	at Method)
> 	at
> 	at
> 	at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(
> 	at com.sun.proxy.$Proxy19.executeStatementAsync(Unknown Source)
> 	at org.apache.hive.service.cli.CLIService.executeStatementAsync(
> 	at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(
> 	at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(
> 	at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(
> 	at org.apache.thrift.ProcessFunction.process(
> 	at org.apache.thrift.TBaseProcessor.process(
> 	at org.apache.hive.service.auth.TSetIpAddressProcessor.process(
> 	at org.apache.thrift.server.TThreadPoolServer$
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(
> 	at java.util.concurrent.ThreadPoolExecutor$
> 	at

This message was sent by Atlassian JIRA
