hive-user mailing list archives

From Fastupload <>
Subject Raw MapReduce code read RC file using HCatalog
Date Tue, 17 Jun 2014 09:07:19 GMT

My MapReduce code reads an RC file through HCatalog, with the Hive metastore in a remote Oracle database.
I wrote a demo following the wiki page,
and packaged all dependency jars into one jar.

The job fails while running. Some lines of the failure stack are:
Caused by: org.datanucleus.exceptions.NucleusUserException: Persistence process has been specified
to use a ClassLoaderResolver of name "datanucleus" yet this has not been found by the DataNucleus
plugin mechanism. Please check your CLASSPATH and plugin specification.
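Since the exception points at the plugin mechanism, I wonder whether repackaging everything into one jar is the problem: each datanucleus-*.jar ships its own plugin.xml at the jar root, so unpacking them into a single jar can keep only one copy. A toy illustration of the overwrite, with plain directories standing in for extracted jars (file contents here are made up):

```shell
# Each DataNucleus jar carries a plugin.xml at its root; extracting them
# all into one directory (as an uber-jar build does) keeps only one copy.
mkdir -p demo/a demo/b demo/merged
echo '<plugin id="org.datanucleus"/>' > demo/a/plugin.xml
echo '<plugin id="org.datanucleus.api.jdo"/>' > demo/b/plugin.xml
cp demo/a/plugin.xml demo/merged/   # first jar extracted
cp demo/b/plugin.xml demo/merged/   # second jar silently overwrites the first
cat demo/merged/plugin.xml          # only the last copy survives
```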

It seems that the HCatInputFormat class cannot create a JDOPersistenceManagerFactory object
for the HiveMetaStoreClient object in lines 101 to 106:
if (conf != null) {
        hiveConf = HCatUtil.getHiveConf(conf);
      } else {
        hiveConf = new HiveConf(HCatInputFormat.class);
      }
      client = HCatUtil.getHiveClient(hiveConf);

These lines create a HiveMetaStoreClient object from either the job configuration or the
HCatInputFormat properties. So I added code to load the hive-site.xml file into both the job
configuration and the HCatInputFormat properties:

		// load hive metastore configuration file into both the properties and the job config
		Properties prop = new Properties();
		FileInputStream confStream = new FileInputStream(args[4]);
		prop.load(confStream); // read the configuration entries from the stream
		HCatInputFormat.setInput(job, dbName, tblName).setFilter(filter).setProperties(prop);

But the job still gets the same error. Any ideas?
More of the error log and the code are in the two attachments.
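One more thing I am unsure about: java.util.Properties.load only parses flat key=value lines, so if args[4] points at a Hadoop-style hive-site.xml (XML format), the loaded Properties would come out empty. A standalone check (class name and sample key are my own, not from the job):

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.Properties;

public class PropsFormatCheck {
    public static void main(String[] args) throws IOException {
        // Properties.load understands flat key=value lines...
        Properties flat = new Properties();
        flat.load(new ByteArrayInputStream(
            "hive.metastore.uris=thrift://metastore-host:9083\n"
                .getBytes(StandardCharsets.ISO_8859_1)));
        System.out.println(flat.getProperty("hive.metastore.uris")); // thrift://metastore-host:9083

        // ...but a Hadoop-style <configuration> XML body is not split into
        // name/value pairs, so the expected key is simply absent.
        Properties xml = new Properties();
        xml.load(new ByteArrayInputStream(
            "<configuration><property><name>hive.metastore.uris</name><value>x</value></property></configuration>"
                .getBytes(StandardCharsets.ISO_8859_1)));
        System.out.println(xml.getProperty("hive.metastore.uris")); // null
    }
}
```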

Best Regards,
Link Qian