hive-user mailing list archives

From Alex Newman <p...@planet.com>
Subject Re: Moving data from one table with a json column to one with a map<string,string>
Date Thu, 08 Oct 2015 22:46:47 GMT
An easier question: how does one even insert into a table with a
map<string,string> column at all? Running

insert overwrite table hivetablename select map("sdf","sdf") from table limit 1;

gets me

Diagnostic Messages for this Task:
Error: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (tag=0) {"key":{},"value":{"_col0":{"sdf":"sdf"}}}
        at org.apache.hadoop.hive.ql.exec.mr.ExecReducer.reduce(ExecReducer.java:265)
        at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:455)
        at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:397)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:171)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:166)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (tag=0) {"key":{},"value":{"_col0":{"sdf":"sdf"}}}
        at org.apache.hadoop.hive.ql.exec.mr.ExecReducer.reduce(ExecReducer.java:253)
        ... 7 more
Caused by: com.google.gson.JsonSyntaxException: java.lang.IllegalStateException: Expected BEGIN_OBJECT but was STRING at line 1 column 1
        at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter.read(ReflectiveTypeAdapterFactory.java:176)
        at com.google.gson.Gson.fromJson(Gson.java:803)
        at com.google.gson.Gson.fromJson(Gson.java:768)
        at com.google.gson.Gson.fromJson(Gson.java:717)
        at org.apache.hadoop.hive.dynamodb.type.HiveDynamoDBItemType.deserializeAttributeValue(HiveDynamoDBItemType.java:119)
        at org.apache.hadoop.hive.dynamodb.type.HiveDynamoDBItemType.parseDynamoDBData(HiveDynamoDBItemType.java:104)
        at org.apache.hadoop.hive.dynamodb.DynamoDBSerDe.serialize(DynamoDBSerDe.java:106)
        at org.apache.hadoop.hive.ql.exec.FileSinkOperator.processOp(FileSinkOperator.java:692)
        at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:815)
        at org.apache.hadoop.hive.ql.exec.LimitOperator.processOp(LimitOperator.java:51)
        at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:815)
        at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:84)
        at org.apache.hadoop.hive.ql.exec.mr.ExecReducer.reduce(ExecReducer.java:244)
        ... 7 more
Caused by: java.lang.IllegalStateException: Expected BEGIN_OBJECT but was STRING at line 1 column 1
        at com.google.gson.stream.JsonReader.beginObject(JsonReader.java:374)
        at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter.read(ReflectiveTypeAdapterFactory.java:165)
        ... 19 more
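
The tail of the trace looks like the informative part: HiveDynamoDBItemType.deserializeAttributeValue runs each map value through Gson and expects a JSON object, but map("sdf","sdf") hands it the bare string "sdf", hence "Expected BEGIN_OBJECT but was STRING". My reading (inferred from the stack frames, not confirmed against the connector's docs) is that every value in the item map must itself be a DynamoDB attribute value serialized as JSON, e.g. {"s":"..."} for a string. A minimal sketch along those lines, reusing the table names from the failing query above:

insert overwrite table hivetablename
-- assumption: the map value must be DynamoDB-style JSON ({"s":...} = string
-- attribute), so the SerDe's Gson parse sees an object, not a bare string
select map('sdf', '{"s":"sdf"}') from table limit 1;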



On Thu, Oct 8, 2015 at 11:26 AM Alex Newman <posi@planet.com> wrote:

> I have a table which holds highly nested json, which I create with:
> CREATE EXTERNAL TABLE table1 (item map<string,string>) STORED BY
> 'org.apache.hadoop.hive.dynamodb.DynamoDBStorageHandler'
> TBLPROPERTIES ("dynamodb.table.name" = "...");
>
> It has to be a map<string,string> table because the json is highly
> nested.
>
> I have another table which just contains newline-delimited json, which I
> mount with:
> CREATE EXTERNAL TABLE table2 (json_blob string) LOCATION '....';
>
> So I want to load table1 with table2's data:
> INSERT OVERWRITE TABLE table1 SELECT str_to_map(json_blob) FROM table2
> LIMIT 1;
>
> But that maps to
> {"key":{},"value":{"_col0":{random oozes of data
>
> What do I do?
> --
>
> Alex Newman sent this from a small glass rectangle at 4045076749
>
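
Note also, re the quoted attempt: str_to_map doesn't parse JSON at all. It just splits the string on a pair delimiter and a key/value delimiter (',' and ':' by default), which would explain the mangled keys above. A tiny illustration:

-- str_to_map splits on ',' then ':'; it does not understand JSON, so braces
-- and quotes end up inside the keys and values:
select str_to_map('{"a":"b","c":"d"}');
-- yields {'{"a"':'"b"', '"c"':'"d"}'}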
-- 

Alex Newman sent this from a small glass rectangle at 4045076749
