Subject: NoSuchElementException while running local MapReduce-Job on FreeBSD
From: Malte Maltesmann
To: user@hadoop.apache.org
Date: Wed, 27 Aug 2014 14:05:27 +0200

Hi all,

I tried to run a MapReduce-Job on my two-node FreeBSD cluster with Hadoop 2.4.1 and HBase 0.98.4 and ran into the exception below.
I then tried the example provided here, running a MapReduce-Job without involving HBase or HDFS on a single FreeBSD machine, and got the same exception:

Exception in thread "main" java.util.NoSuchElementException
    at java.util.StringTokenizer.nextToken(StringTokenizer.java:349)
    at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:565)
    at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission(RawLocalFileSystem.java:534)
    at org.apache.hadoop.fs.LocatedFileStatus.<init>(LocatedFileStatus.java:42)
    at org.apache.hadoop.fs.FileSystem$4.next(FileSystem.java:1697)
    at org.apache.hadoop.fs.FileSystem$4.next(FileSystem.java:1679)
    at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:302)
    at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.listStatus(FileInputFormat.java:263)
    at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(FileInputFormat.java:375)
    at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:493)
    at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:510)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:394)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1556)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1303)
    at test.NewMaxTemperature$NewMaxTemperatureMapper.main(NewMaxTemperature.java:39)

I ran the exact same project on a Linux machine and it worked just fine, so this seems to be a FreeBSD-related issue.

I looked into the code and found the following part, where the exception happens:

    /// loads permissions, owner, and group from `ls -ld`
    private void loadPermissionInfo() {
      IOException e = null;
      try {
        String output = FileUtil.execCommand(new File(getPath().toUri()),
            Shell.getGetPermissionCommand());
        StringTokenizer t =
            new StringTokenizer(output, Shell.TOKEN_SEPARATOR_REGEX);
        // expected format
        // -rw-------    1 username groupname ...
        String permission = t.nextToken();

So here the StringTokenizer apparently calls nextToken() on an empty String. How can this happen? I checked that "ls -ld" produces the same output on FreeBSD and Linux.

Does anybody have a suggestion as to what is happening here?

Thanks and regards,
Malte
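P.S. For illustration, here is a minimal standalone sketch of the failure mode as I understand it. Everything below is my own guess rather than the Hadoop source: the class name is made up, StringTokenizer's default whitespace delimiters stand in for Shell.TOKEN_SEPARATOR_REGEX, and I simply assume that execCommand() came back with an empty string. Under that assumption the very first nextToken() fails exactly like the top frame of the stack trace:

    import java.util.StringTokenizer;

    // Hypothetical repro sketch, not Hadoop code.
    public class EmptyOutputRepro {
      public static void main(String[] args) {
        // Assumption: the `ls -ld` output handed to the tokenizer is empty.
        String output = "";
        // Default delimiters (whitespace) stand in for Shell.TOKEN_SEPARATOR_REGEX.
        StringTokenizer t = new StringTokenizer(output);
        // hasMoreTokens() is false here, so nextToken() throws
        // java.util.NoSuchElementException, matching the first stack frame above.
        String permission = t.nextToken();
        System.out.println(permission);
      }
    }

If the tokenizer only ever saw real `ls -ld` output this could not happen, which is why I suspect the command output itself ends up empty on FreeBSD for some reason.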