hadoop-general mailing list archives

From Alberto Luengo Cabanillas <cabi...@gmail.com>
Subject Re: TWO QUESTIONS ABOUT LOGS & DEBUGGING IN HADOOP
Date Tue, 03 Nov 2009 23:33:59 GMT
Hi Harshad, the point is that I want to read the content of a file in HDFS
from inside my Java code and use that information (i.e. I have a file with
one number per line, where the first is the number of iterations, the
second is a loop limit, etc., and I need to read them into variables).
Goutham suggested IOUtils... I'll take a look.
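For reference, a minimal sketch of that kind of read. The parsing part below is runnable as-is against any InputStream; the HDFS-specific open (FileSystem.get / fs.open, which needs a running cluster) is indicated in comments. The class name, file path, and two-parameter layout are illustrative assumptions, not something fixed by this thread:

```java
import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;

public class ReadParams {

    // Parse one integer per line from any InputStream.
    // On a real cluster the stream would come from HDFS, e.g.:
    //   FileSystem fs = FileSystem.get(new Configuration());
    //   InputStream in = fs.open(new Path("/user/alberto/params.txt"));
    static int[] readParams(InputStream in) throws IOException {
        BufferedReader reader = new BufferedReader(new InputStreamReader(in));
        int iterations = Integer.parseInt(reader.readLine().trim());
        int loopLimit  = Integer.parseInt(reader.readLine().trim());
        reader.close();
        return new int[] { iterations, loopLimit };
    }

    public static void main(String[] args) throws IOException {
        // Stand-in for the HDFS file: one number per line.
        InputStream demo = new ByteArrayInputStream("10\n500\n".getBytes());
        int[] p = readParams(demo);
        System.out.println("iterations=" + p[0] + " loopLimit=" + p[1]);
        // prints: iterations=10 loopLimit=500
    }
}
```

Printing the parsed values with System.out.println like this is also a quick way to confirm the file was read correctly.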
Thanks for the reply.

2009/11/1 Harshad Shrikhande <harsh.107@gmail.com>

> Hi,
>     You can read the content of a file in Hadoop Distributed File System
> using
>  bin/hadoop dfs -cat <filename>
>
> What else do you want to do? It's simple.
>
> Bye
> Harshad Shrikhande
>
>
> On Sun, Nov 1, 2009 at 10:25 PM, Alberto Luengo Cabanillas <
> cabiwan@gmail.com> wrote:
>
> > Hi everyone! In a Hadoop (0.20.1) project I'm working on, I need to
> > know whether I'm reading the content of a file located in HDFS
> > correctly. Besides the try...catch block, is there any other option
> > for printing values to the console (similar to what standard Java I/O
> > does, like System.out.println("the variable is " + var))?
> > Another thing: is there a simple way to debug Hadoop applications?
> > (I'm currently working in pseudo-distributed mode, for development
> > purposes.)
> > Thanks a lot in advance.
> >
> > --
> > Alberto
> >
> >
>
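On the debugging question above: one common approach (an assumption on my part, not something confirmed in this thread) is to force the local job runner, so the whole job runs in a single JVM where an IDE debugger and breakpoints in map()/reduce() work normally. The property names below are the 0.20-era ones; the job name is illustrative:

```java
// Sketch: run the job in-process with the LocalJobRunner for debugging.
Configuration conf = new Configuration();
conf.set("mapred.job.tracker", "local");   // use the LocalJobRunner
conf.set("fs.default.name", "file:///");   // read inputs from the local FS
Job job = new Job(conf, "debug-run");
```

In pseudo-distributed mode, task System.out/System.err output ends up in the per-task logs under logs/userlogs/ and can also be browsed through the JobTracker web UI (by default at http://localhost:50030).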



-- 
Alberto
