hive-dev mailing list archives

From "Viraj Bhat (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (HIVE-5515) Writing to an HBase table throws IllegalArgumentException, failing job submission
Date Mon, 18 Nov 2013 23:51:22 GMT

    [ https://issues.apache.org/jira/browse/HIVE-5515?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13825963#comment-13825963 ]

Viraj Bhat commented on HIVE-5515:
----------------------------------

Hi Sushanth,
 Thanks for your comments about the unit test case. I think the current test cases do not
exercise the path where the metadata is read from the metastore. As for fixing the patch,
I will use "conf" and limit the individual lines to 80 characters. I will repost it as soon as possible.
Viraj
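
For readers following along, here is a minimal sketch of the kind of null-guarded property copy under discussion. It is illustrative only and is not the attached HIVE-5515.patch; the class and method names are hypothetical, and only Configuration.set() is taken from the stack trace below.

{code:java}
import java.util.Map;
import org.apache.hadoop.conf.Configuration;

public final class TablePropertyCopySketch {

  // Hypothetical helper, loosely mirroring the failing path in
  // Utilities.copyTableJobPropertiesToConf(): copy table properties into the
  // job Configuration, skipping null values, which Configuration.set() rejects.
  public static void copyNonNullProperties(Map<String, String> tableProps,
                                           Configuration conf) {
    for (Map.Entry<String, String> entry : tableProps.entrySet()) {
      String value = entry.getValue();
      if (value != null) {
        conf.set(entry.getKey(), value);
      }
      // A real fix might instead ensure the table metadata is fully populated
      // so the value is never null; the skip here only illustrates the guard.
    }
  }
}
{code}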

> Writing to an HBase table throws IllegalArgumentException, failing job submission
> ---------------------------------------------------------------------------------
>
>                 Key: HIVE-5515
>                 URL: https://issues.apache.org/jira/browse/HIVE-5515
>             Project: Hive
>          Issue Type: Bug
>          Components: HBase Handler
>    Affects Versions: 0.12.0
>         Environment: Hadoop2, Hive 0.12.0, HBase-0.96RC
>            Reporter: Nick Dimiduk
>            Assignee: Viraj Bhat
>              Labels: hbase
>             Fix For: 0.13.0
>
>         Attachments: HIVE-5515.patch
>
>
> Inserting data into HBase table via hive query fails with the following message:
> {noformat}
> $ hive -e "FROM pgc INSERT OVERWRITE TABLE pagecounts_hbase SELECT pgc.* WHERE rowkey LIKE 'en/q%' LIMIT 10;"
> ...
> Total MapReduce jobs = 1
> Launching Job 1 out of 1
> Number of reduce tasks determined at compile time: 1
> In order to change the average load for a reducer (in bytes):
>   set hive.exec.reducers.bytes.per.reducer=<number>
> In order to limit the maximum number of reducers:
>   set hive.exec.reducers.max=<number>
> In order to set a constant number of reducers:
>   set mapred.reduce.tasks=<number>
> java.lang.IllegalArgumentException: Property value must not be null
>         at com.google.common.base.Preconditions.checkArgument(Preconditions.java:88)
>         at org.apache.hadoop.conf.Configuration.set(Configuration.java:810)
>         at org.apache.hadoop.conf.Configuration.set(Configuration.java:792)
>         at org.apache.hadoop.hive.ql.exec.Utilities.copyTableJobPropertiesToConf(Utilities.java:2002)
>         at org.apache.hadoop.hive.ql.exec.FileSinkOperator.checkOutputSpecs(FileSinkOperator.java:947)
>         at org.apache.hadoop.hive.ql.io.HiveOutputFormatImpl.checkOutputSpecs(HiveOutputFormatImpl.java:67)
>         at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:458)
>         at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:342)
>         at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
>         at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:415)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>         at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
>         at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:562)
>         at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:557)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:415)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>         at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:557)
>         at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:548)
>         at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:425)
>         at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:136)
>         at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:151)
>         at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:65)
>         at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1414)
>         at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1192)
>         at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1020)
>         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:888)
>         at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
>         at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
>         at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
>         at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:348)
>         at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:731)
>         at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
>         at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:601)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
> Job Submission failed with exception 'java.lang.IllegalArgumentException(Property value must not be null)'
> FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
> {noformat}
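
For reference, the null rejection at the top of the trace can be reproduced directly against Hadoop 2's Configuration. The standalone class below is an illustration of that behavior only; it is not part of the issue or the patch.

{code:java}
import org.apache.hadoop.conf.Configuration;

public class NullPropertyRepro {
  public static void main(String[] args) {
    Configuration conf = new Configuration(false);
    // In Hadoop 2, Configuration.set() validates its arguments with Guava
    // Preconditions and throws IllegalArgumentException for a null value,
    // which is the same failure surfacing during job submission above.
    conf.set("some.table.property", null);
  }
}
{code}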



--
This message was sent by Atlassian JIRA
(v6.1#6144)
