hadoop-hive-dev mailing list archives

From "Carl Steinbach (JIRA)" <j...@apache.org>
Subject [jira] Commented: (HIVE-1163) Eclipse launchtemplate changes to enable debugging
Date Sat, 13 Feb 2010 00:59:27 GMT

    [ https://issues.apache.org/jira/browse/HIVE-1163?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12833271#action_12833271 ]

Carl Steinbach commented on HIVE-1163:
--------------------------------------

Ning, when I run the TestCliDriver run configuration from within Eclipse, I see
failures in alter3, binary_output_format, binary_sortable_1, and others. alter3 throws
the following error:

{noformat}
insert overwrite table alter3 partition (pCol1='test_part', pcol2='test_part') select col1
from alter3_src 
10/02/12 16:51:02 ERROR SessionState: PREHOOK: query: insert overwrite table alter3 partition
(pCol1='test_part', pcol2='test_part') select col1 from alter3_src
10/02/12 16:51:02 ERROR SessionState: PREHOOK: type: QUERY
10/02/12 16:51:02 ERROR SessionState: PREHOOK: Input: default@alter3_src
10/02/12 16:51:02 ERROR SessionState: PREHOOK: Output: default@alter3@pcol1=test_part/pcol2=test_part
10/02/12 16:51:02 INFO ql.Driver: Total MapReduce jobs = 2
10/02/12 16:51:02 INFO ql.Driver: Launching Job 1 out of 2
10/02/12 16:51:02 INFO exec.MapRedTask: Generating plan file /Users/carl/Projects/hive/svn/hive/build/ql/scratchdir/plan4897842523376893870.xml
10/02/12 16:51:02 INFO exec.MapRedTask: Executing: /Users/carl/Projects/hive/svn/hive/build/hadoopcore/hadoop-0.20.0/bin/hadoop
jar /Users/carl/Projects/hive/svn/hive/build/ql/hive-exec-0.6.0.jar org.apache.hadoop.hive.ql.exec.ExecDriver
 -plan /Users/carl/Projects/hive/svn/hive/build/ql/scratchdir/plan4897842523376893870.xml
 -jobconf hive.exec.script.allow.partial.consumption=false -jobconf hive.query.id=carl_20100212165151
-jobconf hive.hwi.listen.port=9999 -jobconf hive.map.aggr=true -jobconf hive.map.aggr.hash.min.reduction=0.5
-jobconf hive.exec.reducers.bytes.per.reducer=1000000000 -jobconf hive.optimize.cp=true -jobconf
hive.merge.size.smallfiles.avgsize=16000000 -jobconf hive.script.serde=org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
-jobconf hive.optimize.ppd=true -jobconf hive.optimize.groupby=true -jobconf javax.jdo.option.ConnectionUserName=APP
-jobconf hive.mapred.reduce.tasks.speculative.execution=true -jobconf mapred.job.name=insert+overwrite+table+alter3+p...alter3_src%28Stage-1%29
-jobconf javax.jdo.option.DetachAllOnCommit=true -jobconf hive.mapred.local.mem=0 -jobconf
hive.session.id=carl_201002121651 -jobconf hive.script.operator.id.env.var=HIVE_SCRIPT_OPERATOR_ID
-jobconf hadoop.job.ugi=carl%2Cstaff%2Ccom.apple.sharepoint.group.1%2C_lpadmin%2Ccom.apple.sharepoint.group.2%2C_appserveradm%2C_appserverusr%2Cadmin
-jobconf test.src.dir=file%3A%2F%2F%24%7Bbuild.dir%7D%2Fsrc%2Ftest -jobconf hive.udtf.auto.progress=false
-jobconf datanucleus.validateTables=false -jobconf hive.exec.compress.output=false -jobconf
hive.test.mode.prefix=test_ -jobconf test.log.dir=%24%7Bbuild.dir%7D%2Ftest%2Flogs -jobconf
test.data.files=%24%7Buser.dir%7D%2F..%2Fdata%2Ffiles -jobconf datanucleus.validateConstraints=false
-jobconf mapred.reduce.tasks=-1 -jobconf hive.query.string=%0A%0Ainsert+overwrite+table+alter3+partition+%28pCol1%3D%27test_part%27%2C+pcol2%3D%27test_part%27%29+select+col1+from+alter3_src+
-jobconf hive.input.format=org.apache.hadoop.hive.ql.io.HiveInputFormat -jobconf hive.task.progress=false
-jobconf hive.jar.path=%24%7Bbuild.dir.hive%7D%2Fql%2Fhive-exec-%24%7Bversion%7D.jar -jobconf
datanuclues.cache.level2=true -jobconf javax.jdo.option.ConnectionDriverName=org.apache.derby.jdbc.EmbeddedDriver
-jobconf hive.skewjoin.mapjoin.map.tasks=10000 -jobconf hive.mapjoin.maxsize=100000 -jobconf
hive.exec.pre.hooks=org.apache.hadoop.hive.ql.hooks.PreExecutePrinter -jobconf hive.optimize.skewjoin=false
-jobconf hive.groupby.mapaggr.checkinterval=100000 -jobconf hive.test.mode=false -jobconf
hive.exec.parallel=false -jobconf datanuclues.cache.level2.type=SOFT -jobconf hive.default.fileformat=TextFile
-jobconf hive.test.mode.samplefreq=32 -jobconf javax.jdo.option.NonTransactionalRead=true
-jobconf hive.script.auto.progress=false -jobconf hive.merge.mapredfiles=false -jobconf fs.scheme.class=dfs
-jobconf javax.jdo.option.ConnectionURL=jdbc%3Aderby%3A%3BdatabaseName%3D..%2Fbuild%2Ftest%2Fjunit_metastore_db%3Bcreate%3Dtrue
-jobconf hive.exec.compress.intermediate=false -jobconf hive.metastore.rawstore.impl=org.apache.hadoop.hive.metastore.ObjectStore
-jobconf hive.map.aggr.hash.percentmemory=0.5 -jobconf hive.hwi.listen.host=0.0.0.0 -jobconf
hive.merge.size.per.task=256000000 -jobconf datanucleus.autoCreateSchema=true -jobconf hive.exec.post.hooks=org.apache.hadoop.hive.ql.hooks.PostExecutePrinter
-jobconf hive.groupby.skewindata=false -jobconf hive.metastore.local=true -jobconf hive.skewjoin.mapjoin.min.split=33554432
-jobconf hadoop.tmp.dir=%24%7Bbuild.dir.hive%7D%2Ftest%2Fhadoop-%24%7Buser.name%7D -jobconf
hive.mapred.mode=nonstrict -jobconf hive.optimize.pruner=true -jobconf hive.skewjoin.key=100000
-jobconf datanucleus.validateColumns=false -jobconf hive.querylog.location=%24%7Bbuild.dir%7D%2Ftmp
-jobconf datancucleus.transactionIsolation=read-committed -jobconf hive.exec.reducers.max=999
-jobconf javax.jdo.PersistenceManagerFactoryClass=org.datanucleus.jdo.JDOPersistenceManagerFactory
-jobconf hive.heartbeat.interval=1000 -jobconf hive.metastore.warehouse.dir=file%3A%2F%2F%24%7Bbuild.dir%7D%2Ftest%2Fdata%2Fwarehouse%2F
-jobconf datanucleus.autoStartMechanismMode=checked -jobconf javax.jdo.option.ConnectionPassword=mine
-jobconf hive.metastore.connect.retries=5 -jobconf hive.mapjoin.cache.numrows=25000 -jobconf
hive.exec.parallel.thread.number=8 -jobconf datanucleus.storeManagerType=rdbms -jobconf hive.script.recordreader=org.apache.hadoop.hive.ql.exec.TextRecordReader
-jobconf hive.exec.scratchdir=%24%7Bbuild.dir%7D%2Fscratchdir -jobconf hive.metastore.metadb.dir=file%3A%2F%2F%24%7Bbuild.dir%7D%2Ftest%2Fdata%2Fmetadb%2F
-jobconf hive.script.recordwriter=org.apache.hadoop.hive.ql.exec.TextRecordWriter -jobconf
hive.merge.mapfiles=true -jobconf hive.exec.script.maxerrsize=100000 -jobconf test.query.file1=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest%2Forg%2Fapache%2Fhadoop%2Fhive%2Fql%2Finput2.q
-jobconf hive.join.emit.interval=1000 -jobconf hive.added.jars.path= -jobconf mapred.system.dir=%2FUsers%2Fcarl%2FProjects%2Fhive%2Fsvn%2Fhive%2Fbuild%2Ftest%2Fhadoop-carl%2Fmapred%2Fsystem%2F-1496122043
-jobconf mapred.local.dir=%2FUsers%2Fcarl%2FProjects%2Fhive%2Fsvn%2Fhive%2Fbuild%2Ftest%2Fhadoop-carl%2Fmapred%2Flocal%2F-462613887
Error: JAVA_HOME is not set.
10/02/12 16:51:02 ERROR exec.MapRedTask: Execution failed with exit status: 1
10/02/12 16:51:02 ERROR ql.Driver: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MapRedTask
Exception: Client Execution failed with error code = 9
junit.framework.AssertionFailedError: Client Execution failed with error code = 9
	at junit.framework.Assert.fail(Assert.java:47)
	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_alter3(TestCliDriver.java:1146)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:154)
	at junit.framework.TestCase.runBare(TestCase.java:127)
	at junit.framework.TestResult$1.protect(TestResult.java:106)
	at junit.framework.TestResult.runProtected(TestResult.java:124)
	at junit.framework.TestResult.run(TestResult.java:109)
	at junit.framework.TestCase.run(TestCase.java:118)
	at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
	at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:467)
	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:683)
	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:390)
	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:197)
{noformat}

The "Error: JAVA_HOME is not set." line comes from the bin/hadoop script that MapRedTask
forks, so the variable has to be present in the environment of the test JVM. Setting
JAVA_HOME=${system_property:java.home} in the launch configuration fixes this problem.
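
For reference, this is roughly what the resulting entry looks like in the generated
.launch file (a minimal fragment only; the remaining JUnit launch-configuration
attributes are omitted, and the attribute keys shown are the standard Eclipse
debug-plugin ones):

{noformat}
<launchConfiguration type="org.eclipse.jdt.junit.launchconfig">
  <!-- environment passed to the forked test JVM; child processes
       such as bin/hadoop inherit it -->
  <mapAttribute key="org.eclipse.debug.core.environmentVariables">
    <mapEntry key="JAVA_HOME" value="${system_property:java.home}"/>
  </mapAttribute>
</launchConfiguration>
{noformat}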

bq. Also I've been able to import Hive/Hadoop projects using Eclipse after 'ant eclipse-files',
without this manual editing .project file. So I guess that instruction was for some older
version of Eclipse?

Right. Steps 4-7 are unnecessary, and you can skip step 3 if
you run 'ant gen-test' in the ql directory. I think the directions should
instead tell you to do the following:

{noformat}
% ant package        # build all modules and fetch the Hadoop test dependencies
% cd metastore
% ant model-jar      # build the JDO model jar the metastore needs
% cd ../ql
% ant gen-test       # generate the TestCliDriver/TestParse sources
% cd ..
% ant eclipse-files  # generate .project, .classpath, and the launch configurations
{noformat}

# Launch Eclipse.
# Select File->Import->General->Existing Projects into Workspace and point it at the
root of your Hive checkout.
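
As a sanity check after the import, a single failing test can also be rerun from the
command line first; assuming the usual testcase/qfile test properties, something like:

{noformat}
% cd ql
% ant test -Dtestcase=TestCliDriver -Dqfile=alter3.q   # rerun just the alter3 case
{noformat}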


> Eclipse launchtemplate changes to enable debugging
> --------------------------------------------------
>
>                 Key: HIVE-1163
>                 URL: https://issues.apache.org/jira/browse/HIVE-1163
>             Project: Hadoop Hive
>          Issue Type: Bug
>    Affects Versions: 0.6.0
>            Reporter: Ning Zhang
>            Assignee: Ning Zhang
>         Attachments: HIVE-1163.patch, HIVE-1163_2.patch
>
>
> Some recent changes in build.xml and build-common.xml break the debugging functionality
> in Eclipse. Some system-defined properties were missing when running the Eclipse debugger.

