hadoop-common-dev mailing list archives

From "Arun C Murthy (JIRA)" <j...@apache.org>
Subject [jira] Commented: (HADOOP-483) Document libhdfs and fix some OS specific stuff in Makefile
Date Tue, 29 Aug 2006 19:21:23 GMT
    [ http://issues.apache.org/jira/browse/HADOOP-483?page=comments#action_12431328 ] 
Arun C Murthy commented on HADOOP-483:

 This is due to the fact that libhdfs relies on a running Hadoop setup (with
$CLASSPATH set up correctly to point at the running instance), which isn't the case after 'ant
clean'. Should I make the 'compile-libhdfs' target in build.xml depend on 'package' too?
(However, that still won't solve the problem that Hadoop isn't running when we need to run
test-libhdfs.)
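
A minimal sketch of what such a dependency might look like in build.xml (the target name 'compile-libhdfs' is from this issue; the 'package' dependency is the proposal above, and the property name and make invocation are assumptions for illustration):

```xml
<!-- Hedged sketch: make compile-libhdfs depend on package so the jars
     and configuration it needs exist; libhdfs.src.dir is a hypothetical
     property, and driving the native build via make is an assumption. -->
<target name="compile-libhdfs" depends="package">
  <exec dir="${libhdfs.src.dir}" executable="make" failonerror="true"/>
</target>
```

As noted above, declaring the dependency only guarantees the build artifacts exist; it does not start a Hadoop instance for the tests.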

  All this while we have relied on this being done explicitly, hence the comment in build.xml
noting that test-libhdfs should be run manually.

  I'd appreciate any ideas on how to improve this further, including how I can go about
ensuring that Hadoop is running, setting up the correct classpaths (for JNI), etc.
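
One way to sketch the classpath side of this (the directory layout below — conf/, hadoop-*.jar, lib/*.jar under a Hadoop install root — is an assumption for illustration, not something specified in this issue):

```shell
# Hedged sketch: assemble a CLASSPATH for a libhdfs/JNI client from a
# Hadoop install rooted at $1. The layout (conf/, hadoop-*.jar, lib/*.jar)
# is a hypothetical convention, not taken from the issue.
build_hadoop_classpath() {
  local home=$1
  local cp=$home/conf
  local jar
  # Append every jar that actually exists; skip unmatched globs.
  for jar in "$home"/hadoop-*.jar "$home"/lib/*.jar; do
    [ -e "$jar" ] && cp=$cp:$jar
  done
  printf '%s\n' "$cp"
}
```

A test harness could call this with a scratch install root and export the result as $CLASSPATH before launching the JVM through JNI.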


> Document libhdfs and fix some OS specific stuff in Makefile
> -----------------------------------------------------------
>                 Key: HADOOP-483
>                 URL: http://issues.apache.org/jira/browse/HADOOP-483
>             Project: Hadoop
>          Issue Type: Improvement
>          Components: dfs
>            Reporter: Arun C Murthy
>         Assigned To: Arun C Murthy
>         Attachments: HADOOP-483.patch, libhdfs.patch, libhdfs.second.patch
> Document libhdfs (api docs and user manual) and fix some OS specific stuff (paths to
> utilities like rm) in the Makefile and TestDFSCIO.java

This message is automatically generated by JIRA.
If you think it was sent incorrectly contact one of the administrators: http://issues.apache.org/jira/secure/Administrators.jspa
For more information on JIRA, see: http://www.atlassian.com/software/jira

