From: Chris Mawata <chris.mawata@gmail.com>
To: user@hadoop.apache.org
Date: Fri, 18 Jul 2014 21:05:16 -0400
Subject: Re: Re: HDFS input/output error - fuse mount

Great that you got it sorted out. I'm afraid I don't know if there is a
configuration that would automatically check the versions -- maybe someone
who knows might chime in.
Cheers
Chris

On Jul 18, 2014 3:06 PM, "andrew touchet" wrote:

> Thanks Chris!
>
> The issue was that even though I set jdk-7u21 as my default, it checked
> for /usr/java/jdk-1.6* first, even though it was compiled with 1.7.
>
> Is there any way to generate a proper hadoop-config.sh that reflects the
> minor version hadoop was built with, so that in my case it would check for
> /usr/java/jdk-1.7* instead? I appreciate the help!
>
>
> On Thu, Jul 17, 2014 at 11:11 PM, Chris Mawata wrote:
>
>> Yet another place to check -- in the hadoop-env.sh file there is also a
>> JAVA_HOME setting.
>> Chris
>> On Jul 17, 2014 9:46 PM, "andrew touchet" wrote:
>>
>>> Hi Fireflyhoo,
>>>
>>> Below I follow the symbolic links for jdk-7u21. These links are
>>> changed accordingly as I change between versions. Also, I have 8 datanodes
>>> and 2 other various servers that are capable of mounting /hdfs, so it is
>>> just this server that is an issue.
>>>
>>> $ java -version
>>> java version "1.7.0_21"
>>> Java(TM) SE Runtime Environment (build 1.7.0_21-b11)
>>> Java HotSpot(TM) 64-Bit Server VM (build 23.21-b01, mixed mode)
>>>
>>> java
>>> $ ls -l `which java`
>>> lrwxrwxrwx 1 root root 26 Jul 17 19:50 /usr/bin/java -> /usr/java/default/bin/java
>>> $ ls -l /usr/java/default
>>> lrwxrwxrwx 1 root root 16 Jul 17 19:50 /usr/java/default -> /usr/java/latest
>>> $ ls -l /usr/java/latest
>>> lrwxrwxrwx 1 root root 21 Jul 17 20:29 /usr/java/latest -> /usr/java/jdk1.7.0_21
>>>
>>> jar
>>> $ ls -l `which jar`
>>> lrwxrwxrwx 1 root root 21 Jul 17 20:18 /usr/bin/jar -> /etc/alternatives/jar
>>> $ ls -l /etc/alternatives/jar
>>> lrwxrwxrwx 1 root root 29 Jul 17 20:26 /etc/alternatives/jar -> /usr/java/jdk1.7.0_21/bin/jar
>>>
>>> javac
>>> $ ls -l `which javac`
>>> lrwxrwxrwx 1 root root 23 Jul 17 20:18 /usr/bin/javac -> /etc/alternatives/javac
>>> $ ls -l /etc/alternatives/javac
>>> lrwxrwxrwx 1 root root 31 Jul 17 20:26 /etc/alternatives/javac -> /usr/java/jdk1.7.0_21/bin/javac
>>>
>>> Now that I've tried versions from 6 & 7, I'm really not sure what is
>>> causing this issue.
>>>
>>>
>>> On Thu, Jul 17, 2014 at 8:21 PM, fireflyhoo@gmail.com wrote:
>>>
>>>> I think you should first confirm your local Java version.
>>>> Some Linux distributions come with Java pre-installed, and that version
>>>> can be very old.
>>>>
>>>> ------------------------------
>>>> fireflyhoo@gmail.com
>>>>
>>>> From: andrew touchet
>>>> Date: 2014-07-18 09:06
>>>> To: user
>>>> Subject: Re: HDFS input/output error - fuse mount
>>>>
>>>> Hi Chris,
>>>>
>>>> I tried to mount /hdfs with the java versions below, but there was no
>>>> change in output.
>>>> jre-7u21
>>>> jdk-7u21
>>>> jdk-7u55
>>>> jdk1.6.0_31
>>>> jdk1.6.0_45
>>>>
>>>>
>>>> On Thu, Jul 17, 2014 at 6:56 PM, Chris Mawata wrote:
>>>>
>>>>> Version 51 is Java 7
>>>>> Chris
>>>>> On Jul 17, 2014 7:50 PM, "andrew touchet" wrote:
>>>>>
>>>>>> Hello,
>>>>>>
>>>>>> Hadoop package installed:
>>>>>> hadoop-0.20-0.20.2+737-33.osg.el5.noarch
>>>>>>
>>>>>> Operating System:
>>>>>> CentOS release 5.8 (Final)
>>>>>>
>>>>>> I am mounting HDFS from my namenode to another node with fuse. After
>>>>>> mounting to /hdfs, any attempt to 'ls', 'cd', or use 'hadoop fs' leads
>>>>>> to the output below.
>>>>>>
>>>>>> $ ls /hdfs
>>>>>> ls: /hdfs: Input/output error
>>>>>> $ hadoop fs -ls
>>>>>> Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/hadoop/fs/FsShell : Unsupported major.minor version 51.0
>>>>>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>>>>>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>>>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>>>>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>>>>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>>>> Could not find the main class: org.apache.hadoop.fs.FsShell. Program will exit.
>>>>>>
>>>>>> I have attempted to mount /hdfs manually in debug mode and then
>>>>>> attempted to access /hdfs from a different terminal. This is the output.
>>>>>> The namenode is glados. The server where /hdfs is being mounted is
>>>>>> glados2.
>>>>>>
>>>>>> $ hdfs -oserver=glados,port=9000,rdbuffer=131072,allow_other /hdfs -d
>>>>>> fuse-dfs ignoring option allow_other
>>>>>> ERROR fuse_options.c:162 fuse-dfs didn't recognize /hdfs,-2
>>>>>> fuse-dfs ignoring option -d
>>>>>> unique: 1, opcode: INIT (26), nodeid: 0, insize: 56
>>>>>> INIT: 7.10
>>>>>> flags=0x0000000b
>>>>>> max_readahead=0x00020000
>>>>>> INFO fuse_init.c:115 Mounting glados:9000
>>>>>> Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
>>>>>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>>>>>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>>>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>>>>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>>>>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>>>> Can't construct instance of class org.apache.hadoop.conf.Configuration
>>>>>> ERROR fuse_init.c:127 Unable to establish test connection to server
>>>>>>    INIT: 7.8
>>>>>>    flags=0x00000001
>>>>>>    max_readahead=0x00020000
>>>>>>    max_write=0x00020000
>>>>>>    unique: 1, error: 0 (Success), outsize: 40
>>>>>> unique: 2, opcode: GETATTR (3), nodeid: 1, insize: 56
>>>>>> Exception in thread "Thread-0" java.lang.UnsupportedClassVersionError: org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
>>>>>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>>>>>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>>>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>>>>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>>>>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>>>> Can't construct instance of class org.apache.hadoop.conf.Configuration
>>>>>> ERROR fuse_connect.c:83 Unable to instantiate a filesystem for user027
>>>>>> ERROR fuse_impls_getattr.c:40 Could not connect to glados:9000
>>>>>>    unique: 2, error: -5 (Input/output error), outsize: 16
>>>>>> unique: 3, opcode: GETATTR (3), nodeid: 1, insize: 56
>>>>>>
>>>>>> I adopted this system after it was already set up, so I do not know
>>>>>> which java version was used during install.
>>>>>> Currently I'm using:
>>>>>>
>>>>>> $ java -version
>>>>>> java version "1.6.0_45"
>>>>>> Java(TM) SE Runtime Environment (build 1.6.0_45-b06)
>>>>>> Java HotSpot(TM) 64-Bit Server VM (build 20.45-b01, mixed mode)
>>>>>>
>>>>>> Is my java version really the cause of this issue? What is the
>>>>>> correct java version to use with this version of hadoop? I have also
>>>>>> tried 1.6.0_31 but saw no change.
>>>>>>
>>>>>> If java isn't my issue, then what is?
>>>>>>
>>>>>> Best regards,
>>>>>>
>>>>>> Andrew
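
The "Unsupported major.minor version 51.0" in the traces above comes straight from the class-file header: bytes 6-7 of any .class file encode the bytecode version, big-endian (50 = Java 6, 51 = Java 7). A minimal sketch of reading it with only `od` and `awk`; the synthesized 8-byte header here stands in for a real class, which you would normally extract from the Hadoop jar (e.g. `unzip -p hadoop-core.jar org/apache/hadoop/fs/FsShell.class` -- the jar name is an assumption about your layout):

```shell
# A .class file starts with magic CAFEBABE, then 2 bytes minor version,
# then 2 bytes major version (big-endian). 0x33 = 51 = Java 7.
# We synthesize the header purely for illustration.
printf '\xca\xfe\xba\xbe\x00\x00\x00\x33' > /tmp/class_header
# Skip 6 bytes, read the 2-byte major version, combine big-endian.
major=$(od -An -t u1 -j 6 -N 2 /tmp/class_header | awk '{ print $1 * 256 + $2 }')
echo "compiled for class file major version: $major"
# prints: compiled for class file major version: 51
```

A major version of 51 in the error therefore means the Hadoop classes were compiled for Java 7 and cannot be loaded by a Java 6 runtime, which matches the symptoms in this thread.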
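
On the JAVA_HOME question the thread ends with: hardcoding JAVA_HOME in hadoop-env.sh sidesteps the /usr/java/jdk-1.6* probing that hadoop-config.sh falls back to when no JDK is pinned. A minimal sketch, assuming the jdk1.7.0_21 path shown earlier in the thread (adjust to your machine's layout):

```shell
# conf/hadoop-env.sh (sketch) -- pin the JDK explicitly so the wrapper
# scripts never fall back to autodetecting an old /usr/java/jdk-1.6*.
# The path below is the one from this thread, not a universal default.
export JAVA_HOME=/usr/java/jdk1.7.0_21
```

With JAVA_HOME set here, every Hadoop launcher (including the fuse-dfs wrapper) inherits the same Java 7 runtime the classes were compiled for.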