giraph-user mailing list archives

From pradeep kumar <pradeep0...@gmail.com>
Subject Re: Using HiveGiraphRunner with dependencies
Date Fri, 18 Jan 2013 05:53:29 GMT
Hi Avery,

Thanks for the reply and the suggestion.

While running the commands we are getting the exception below, and we are not
sure of the reason; it may be a path or configuration problem. Did you run into
a similar issue while developing, and if so, what could be wrong?

13/01/16 15:01:11 WARN conf.HiveConf: DEPRECATED: Ignoring hive-default.xml found on the CLASSPATH at /etc/hive/conf.dist/hive-default.xml
13/01/16 15:01:16 INFO metastore.HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
13/01/16 15:01:17 INFO metastore.ObjectStore: ObjectStore, initialize called
Exception in thread "main" com.google.common.util.concurrent.UncheckedExecutionException: javax.jdo.JDOFatalInternalException: Unexpected exception caught.
NestedThrowables:
java.lang.reflect.InvocationTargetException
    at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2256)
    at com.google.common.cache.LocalCache.get(LocalCache.java:3980)
    at com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4783)
    at org.apache.hcatalog.common.HiveClientCache.getOrCreate(HiveClientCache.java:166)
    at org.apache.hcatalog.common.HiveClientCache.get(HiveClientCache.java:142)
    at org.apache.hcatalog.common.HCatUtil.getHiveClient(HCatUtil.java:542)
    at org.apache.hcatalog.mapreduce.HCatUtils.getInputJobInfo(HCatUtils.java:75)
    at org.apache.giraph.io.hcatalog.GiraphHCatInputFormat.setVertexInput(GiraphHCatInputFormat.java:81)
    at org.apache.giraph.io.hcatalog.HiveGiraphRunner.run(HiveGiraphRunner.java:174)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
    at org.apache.giraph.io.hcatalog.HiveGiraphRunner.main(HiveGiraphRunner.java:147)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:197)
Caused by: javax.jdo.JDOFatalInternalException: Unexpected exception caught.
NestedThrowables:
java.lang.reflect.InvocationTargetException
    at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1186)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:803)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:698)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:246)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:275)
    at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:208)
    at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:183)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
    at org.apache.hadoop.hive.metastore.RetryingRawStore.<init>(RetryingRawStore.java:62)
    at org.apache.hadoop.hive.metastore.RetryingRawStore.getProxy(RetryingRawStore.java:71)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:346)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:333)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:371)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:278)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:248)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:114)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:98)
    at org.apache.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.<init>(HiveClientCache.java:245)
    at org.apache.hcatalog.common.HiveClientCache$4.call(HiveClientCache.java:169)
    at org.apache.hcatalog.common.HiveClientCache$4.call(HiveClientCache.java:166)
    at com.google.common.cache.LocalCache$LocalManualCache$1.load(LocalCache.java:4786)
    at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3579)
    at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2372)
    at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2335)
    at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2250)
    ... 16 more
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at javax.jdo.JDOHelper$16.run(JDOHelper.java:1958)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.jdo.JDOHelper.invoke(JDOHelper.java:1953)
    at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1159)
    ... 41 more
Caused by: java.lang.NullPointerException
    at org.datanucleus.plugin.NonManagedPluginRegistry.registerBundle(NonManagedPluginRegistry.java:443)
    at org.datanucleus.plugin.NonManagedPluginRegistry.registerBundle(NonManagedPluginRegistry.java:355)
    at org.datanucleus.plugin.NonManagedPluginRegistry.registerExtensions(NonManagedPluginRegistry.java:215)
    at org.datanucleus.plugin.NonManagedPluginRegistry.registerExtensionPoints(NonManagedPluginRegistry.java:156)
    at org.datanucleus.plugin.PluginManager.registerExtensionPoints(PluginManager.java:82)
    at org.datanucleus.OMFContext.<init>(OMFContext.java:156)
    at org.datanucleus.OMFContext.<init>(OMFContext.java:137)
    at org.datanucleus.ObjectManagerFactoryImpl.initialiseOMFContext(ObjectManagerFactoryImpl.java:132)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.initialiseProperties(JDOPersistenceManagerFactory.java:363)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.<init>(JDOPersistenceManagerFactory.java:307)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:255)
    at org.datanucleus.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:182)
    ... 49 more
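
The NullPointerException is raised inside the DataNucleus plugin registry, so one
thing we plan to check on our side (just a guess, not a confirmed cause) is whether
the datanucleus jars and their plugin descriptors ended up merged inside the
jar-with-dependencies we are launching:

# Guess on our side: list the fat jar's contents and look for bundled
# DataNucleus classes and plugin descriptors, since the NPE is thrown
# while the plugin registry tries to register bundles.
jar tf giraph-hcatalog-0.2-SNAPSHOT-jar-with-dependencies.jar | grep -i datanucleus
jar tf giraph-hcatalog-0.2-SNAPSHOT-jar-with-dependencies.jar | grep -i plugin.xml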




Also, have you tried running the shortest-paths example as a Hive job, and if so,
what was the result? We are curious about using Giraph with Hive, so any
suggestion will be very helpful.
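
In the meantime we will also try launching through Hive's jar service as you
suggest (quoted below), so that the job picks up the Hive classpath and
environment. A rough sketch of what we plan to run, with the paths as
placeholders for our local setup rather than a verified invocation:

# Sketch only: run HiveGiraphRunner via "./hive --service jar" instead of
# plain "hadoop jar", following the jar_help usage quoted below.
export HIVE_HOME=/usr/lib/hive    # placeholder for our Hive install
cd "$HIVE_HOME/bin"
./hive --service jar \
    /path/to/giraph-hcatalog-0.2-SNAPSHOT-jar-with-dependencies.jar \
    org.apache.giraph.io.hcatalog.HiveGiraphRunner \
    HIVE_OPTS \
    -db default -vi testinput -o testoutput -w 1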




On Fri, Jan 18, 2013 at 10:42 AM, Avery Ching <aching@apache.org> wrote:

> Yeah, this is where things get a bit tricky.  You'll have to experiment
> with what works for you, but we are using Hive to launch the job with the
> jar.sh script.  This gets the environment straight from the Hive side.
>
> jar_help () {
>   echo "Used for applications that require Hadoop and Hive classpath and
> environment."
>   echo "./hive --service jar <yourjar> <yourclass> HIVE_OPTS <your_args>"
> }
>
> Avery
>
>
> On 1/17/13 4:49 PM, pradeep kumar wrote:
>
>>
>> Hi,
>>
>> Actually we are trying to use Giraph in our project for graph analysis
>> with Hive. So far it has gone well: the build was successful and the
>> shortest-paths example ran fine, but working with Hive has been a real
>> issue. We started with the command line:
>>
>> hadoop jar giraph-hcatalog-0.2-SNAPSHOT-jar-with-dependencies.jar
>> org.apache.giraph.io.hcatalog.HiveGiraphRunner -db default
>> -vertexClass org.apache.giraph.vertex.Vertex
>> -vertexInputFormatClass org.apache.giraph.io.hcatalog.HCatalogVertexInputFormat
>> -vertexOutputFormatClass org.apache.giraph.io.hcatalog.HCatalogVertexOutputFormat
>> -w 1 -vi testinput -o testoutput
>> -hiveconf javax.jdo.option.ConnectionURL=jdbc:mysql://localhost/metastore
>> -hiveconf javax.jdo.option.ConnectionDriverName=com.mysql.jdbc.Driver
>> -hiveconf javax.jdo.option.ConnectionUserName=root
>> -hiveconf javax.jdo.option.ConnectionPassword=root
>> -hiveconf datanucleus.autoCreateSchema=false
>> -hiveconf datanucleus.fixedDatastore=true
>>
>> Is this the wrong way of doing it? We are running into an exception while
>> doing so.
>>
>> If it is wrong, any suggestion on how we can proceed would be a great help.
>>
>> Regards,
>>
>> Pradeep
>>
>>
>


-- 
Pradeep Kumar
