hadoop-mapreduce-user mailing list archives

From 권병창 <magnu...@navercorp.com>
Subject Re: oozie issue java.lang.UnsupportedOperationException: Not implemented by the TFS FileSystem implementation
Date Wed, 18 Jan 2017 03:35:06 GMT
Hi,
I think there are too many jars in sharelib/spark.
Try deleting every jar in sharelib/spark except oozie-sharelib-spark-*.jar and spark-assembly-*.jar.
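The cleanup above can be sketched with HDFS shell commands. This is a minimal sketch, not a tested procedure: the sharelib path mirrors the one quoted later in this thread and must be adjusted to your cluster, and the deletion loop is left commented out as a dry run.

```shell
# Hypothetical sharelib location; adjust to your cluster's
# oozie.service.WorkflowAppService.system.libpath setting.
SHARELIB_SPARK=/user/oozie/sharelib/sharelib/spark

# Keep only the Oozie Spark glue jar and the Spark assembly jar,
# as suggested above; everything else is a deletion candidate.
keep_jar() {
  case "$(basename "$1")" in
    oozie-sharelib-spark-*.jar|spark-assembly-*.jar) return 0 ;;
    *) return 1 ;;
  esac
}

# Dry run: print what would be deleted. Swap the echo for
# `hdfs dfs -rm "$jar"` once the list looks right.
# hdfs dfs -ls "$SHARELIB_SPARK" | awk '{print $NF}' | while read -r jar; do
#   keep_jar "$jar" || echo "would delete: $jar"
# done
```

The HDFS calls are commented out so the filter can be checked locally before touching the cluster.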

 
-----Original Message-----
From: "Rohit Mishra" <rohitkmishra@mindagroup.com>
To: "권병창" <magnum.c@navercorp.com>; <user@hadoop.apache.org>;

Cc: 
Sent: 2017-01-17 (Tue) 20:38:38
Subject: Re: oozie issue java.lang.UnsupportedOperationException: Not implemented by the TFS
FileSystem implementation
 
Hi there,

Please find below the relevant portion of my oozie-site.xml:

<property>
    <name>oozie.service.HadoopAccessorService.hadoop.configurations</name>
    <value>*=/disk2/oozie/conf/hadoop-conf</value>
    <description>
        Comma separated AUTHORITY=HADOOP_CONF_DIR, where AUTHORITY is the HOST:PORT of
        the Hadoop service (JobTracker, HDFS). The wildcard '*' configuration is
        used when there is no exact match for an authority. The HADOOP_CONF_DIR contains
        the relevant Hadoop *-site.xml files. If the path is relative it is looked up within
        the Oozie configuration directory; though the path can be absolute (i.e. to point
        to Hadoop client conf/ directories in the local filesystem).
    </description>
</property>
<property>
    <name>oozie.service.WorkflowAppService.system.libpath</name>
    <value>/user/oozie/sharelib/sharelib</value>
    <description>
        System library path to use for workflow applications. This path is added
        to workflow applications if their job properties set the property
        'oozie.use.system.libpath' to true.
    </description>
</property>

In HDFS, at location /user/oozie/sharelib/sharelib, I have the following content:
distcp  hcatalog  hive  hive2  mapreduce-streaming  oozie  pig  sharelib.properties  spark  sqoop
In the spark folder I do have spark-assembly-1.5.2-hadoop2.6.0.jar.
Please let me know if this is the required setup; otherwise, what am I missing here?
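The description quoted above only takes effect if the job enables 'oozie.use.system.libpath'. As a point of comparison, a minimal job.properties for a workflow using this sharelib might look like the sketch below; the host names, ports, and workflow path are placeholders, not values from this thread.

```properties
# job.properties (sketch; hostnames, ports, and the workflow path are placeholders)
nameNode=hdfs://namenode-host:8020
jobTracker=resourcemanager-host:8032
oozie.use.system.libpath=true
oozie.wf.application.path=${nameNode}/user/rohit/apps/spark-wf
```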
Regards,
Rohit Mishra

On 16-Jan-2017, at 1:33 pm, 권병창 <magnum.c@navercorp.com> wrote:

Hi,
Try to make sure there is spark-assembly-1.5.2-hadoop2.6.0.jar in the Oozie Spark share lib.
The Spark assembly jar must be located in the Oozie Spark share lib.
-----Original Message-----
From: "Rohit Mishra" <rohitkmishra@mindagroup.com>
To: <user@hadoop.apache.org>;
Cc: 
Sent: 2017-01-16 (Mon) 15:04:26
Subject: oozie issue java.lang.UnsupportedOperationException: Not implemented by the TFS FileSystem
implementation
Hello,

I am new to Hadoop. I am having an issue running a Spark job in Oozie. I am able to run the
Spark job on its own, but with Oozie, after the job is launched I get the following error:

2017-01-12 13:51:57,696 INFO [main] org.apache.hadoop.service.AbstractService: Service
org.apache.hadoop.mapreduce.v2.app.MRAppMaster failed in state INITED; cause: java.lang.UnsupportedOperationException: Not implemented by the TFS FileSystem implementation
java.lang.UnsupportedOperationException: Not implemented by the TFS FileSystem implementation
	at org.apache.hadoop.fs.FileSystem.getScheme(FileSystem.java:216)
	at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:2564)
	at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2574)
	at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2591)
	at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:91)
	at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2630)
	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2612)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:370)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:169)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.getFileSystem(MRAppMaster.java:497)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.serviceInit(MRAppMaster.java:281)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$4.run(MRAppMaster.java:1499)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.initAndStartAppMaster(MRAppMaster.java:1496)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.main(MRAppMaster.java:1429)

Spark version: spark-1.5.2-bin-hadoop2.6
Hadoop: hadoop-2.6.2
HBase: hbase-1.1.5
Oozie: oozie-4.2.0

A snapshot
of my pom.xml is:

<dependency>
   <groupId>org.apache.zookeeper</groupId>
   <artifactId>zookeeper</artifactId>
   <version>3.4.8</version>
   <type>pom</type>
</dependency>
<dependency>
   <groupId>org.apache.hbase</groupId>
   <artifactId>hbase-common</artifactId>
   <version>1.1.5</version>
   <exclusions>
      <exclusion>
         <groupId>org.slf4j</groupId>
         <artifactId>slf4j-log4j12</artifactId>
      </exclusion>
   </exclusions>
</dependency>
<dependency>
   <groupId>org.apache.hbase</groupId>
   <artifactId>hbase-client</artifactId>
   <version>1.1.5</version>
   <exclusions>
      <exclusion>
         <groupId>org.slf4j</groupId>
         <artifactId>slf4j-log4j12</artifactId>
      </exclusion>
   </exclusions>
</dependency>
<dependency>
   <groupId>org.apache.hbase</groupId>
   <artifactId>hbase-server</artifactId>
   <version>1.1.5</version>
   <exclusions>
      <exclusion>
         <groupId>org.slf4j</groupId>
         <artifactId>slf4j-log4j12</artifactId>
      </exclusion>
   </exclusions>
</dependency>
<dependency>
   <groupId>org.apache.hbase</groupId>
   <artifactId>hbase-testing-util</artifactId>
   <version>1.1.5</version>
</dependency>
<dependency>
   <groupId>org.apache.spark</groupId>
   <artifactId>spark-core_2.11</artifactId>
   <version>1.5.2</version>
   <exclusions>
      <exclusion>
         <artifactId>javax.servlet</artifactId>
         <groupId>org.eclipse.jetty.orbit</groupId>
      </exclusion>
   </exclusions>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql_2.10 -->
<dependency>
   <groupId>org.apache.spark</groupId>
   <artifactId>spark-sql_2.11</artifactId>
   <version>1.5.2</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-yarn_2.10 -->
<dependency>
   <groupId>org.apache.spark</groupId>
   <artifactId>spark-yarn_2.11</artifactId>
   <version>1.5.2</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.mongodb.mongo-hadoop/mongo-hadoop-core -->
<dependency>
   <groupId>org.mongodb.mongo-hadoop</groupId>
   <artifactId>mongo-hadoop-core</artifactId>
   <version>1.5.2</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-common -->
<dependency>
   <groupId>org.apache.hadoop</groupId>
   <artifactId>hadoop-common</artifactId>
   <version>2.6.2</version>
   <exclusions>
      <exclusion>
         <artifactId>servlet-api</artifactId>
         <groupId>javax.servlet</groupId>
      </exclusion>
      <exclusion>
         <artifactId>jetty-util</artifactId>
         <groupId>org.mortbay.jetty</groupId>
      </exclusion>
      <exclusion>
         <artifactId>jsp-api</artifactId>
         <groupId>javax.servlet.jsp</groupId>
      </exclusion>
   </exclusions>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-client -->
<dependency>
   <groupId>org.apache.hadoop</groupId>
   <artifactId>hadoop-client</artifactId>
   <version>2.6.2</version>
   <exclusions>
      <exclusion>
         <artifactId>jetty-util</artifactId>
         <groupId>org.mortbay.jetty</groupId>
      </exclusion>
   </exclusions>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-mapreduce-client-core -->
<dependency>
   <groupId>org.apache.hadoop</groupId>
   <artifactId>hadoop-mapreduce-client-core</artifactId>
   <version>2.6.2</version>
</dependency>
<dependency>
   <groupId>org.mongodb</groupId>
   <artifactId>mongo-java-driver</artifactId>
   <version>3.2.1</version>
</dependency>
<!-- hadoop dependency -->
<!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-core -->
<dependency>
   <groupId>org.apache.hadoop</groupId>
   <artifactId>hadoop-core</artifactId>
   <version>1.2.1</version>
   <exclusions>
      <exclusion>
         <artifactId>jetty-util</artifactId>
         <groupId>org.mortbay.jetty</groupId>
      </exclusion>
   </exclusions>
</dependency>

Till now I have searched several blogs. What I understand from reading those blogs is that
there is some issue with the Tachyon jar which is embedded in spark-assembly-1.5.2-hadoop2.6.0.jar.
I tried removing tachyon-0.5.0.jar and tachyon-client-0.5.0.jar from the Oozie shared library
(they were present under the spark library), but then I started getting this error:

Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SparkMain], main() threw exception,
org.apache.spark.util.Utils$.DEFAULT_DRIVER_MEM_MB()I
java.lang.NoSuchMethodError: org.apache.spark.util.Utils$.DEFAULT_DRIVER_MEM_MB()I

Please help me debug and solve it.

Thanks,
Rohit

