hadoop-common-dev mailing list archives

From "Christophe Taton (JIRA)" <j...@apache.org>
Subject [jira] Commented: (HADOOP-3249) Browsing DFS behind gateway
Date Tue, 15 Apr 2008 21:10:23 GMT

    https://issues.apache.org/jira/browse/HADOOP-3249?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12589238#action_12589238

Christophe Taton commented on HADOOP-3249:

If you are able to open one port on your gateway, then you can probably run a SOCKS proxy
server on it (have a look at Dante: http://www.inet.no/dante/), in which case all your users
would only have to set the proxy address in their browsers to point to your gateway.
Concerning the JSPs, you have to rebuild the hadoop-core jar to make your changes available.
The JSPs are statically compiled into servlets during the build process.
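If installing a dedicated SOCKS server such as Dante is not an option, the same effect can often be had with SSH dynamic port forwarding, assuming the gateway already runs sshd. This is a sketch, not part of the original suggestion; the user name, host name, and port 1080 are placeholders:

```shell
# Open a SOCKS proxy through the gateway (placeholder host and port; adjust to your setup).
# -D 1080 : listen locally on port 1080 and act as a SOCKS proxy, tunneling through the gateway
# -N      : do not run a remote command, just forward traffic
ssh -D 1080 -N user@gateway_external_ip
```

Users would then point their browser's SOCKS proxy setting at the forwarded port, so that links to internal node hostnames are resolved and connected on the gateway side rather than on the client.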

> Browsing DFS behind gateway
> ---------------------------
>                 Key: HADOOP-3249
>                 URL: https://issues.apache.org/jira/browse/HADOOP-3249
>             Project: Hadoop Core
>          Issue Type: Wish
>          Components: dfs
>    Affects Versions: 0.16.0, 0.16.1, 0.16.2
>         Environment: Red-Hat cluster
>            Reporter: Martin Boeker
>   Original Estimate: 5h
>  Remaining Estimate: 5h
> Dear Hadoop guys,
> I'm urgently trying to set up a way for users to see the contents of a Hadoop
DFS that sits behind a gateway. I'm using port forwarding on the gateway itself to point to
the DFS web interface, something like this:
> [gateway_external_IP]:50070 >> [node_internal_IP]:50070
> This works fine: if I go to http://gateway_external_ip:50070/ I can view the DFS cluster
HTML page from the outside world. The problem is that if I click on any of the slave node
links, it forwards to http://node_hostname/.., which obviously doesn't work. I really need
to get this working; a couple of projects require it.
> I'm willing to do this any way possible. I don't really need to use the 50070 web interface;
even a simple directory listing would do, but I'm not sure how to implement that either,
because I don't know of a way to make an httpd or ftpd use "bin/hadoop dfs -lsr /" as the
root directory. I'd also be willing to make people use a proxy server if that would fix the
issue somehow.
> If anyone can help, I would greatly appreciate it; like I said, it's kind of urgent and
I'm running out of ideas.
> Thanks a lot in advance,
> -Martin
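For reference, the kind of gateway port forward the reporter describes could be done with iptables NAT rules. This is a hedged sketch only; `NODE_INTERNAL_IP` is a placeholder for the node serving the DFS web interface, and the rules assume the gateway is a Linux box doing the forwarding itself:

```shell
# Allow the gateway kernel to forward packets between interfaces.
sysctl -w net.ipv4.ip_forward=1

# Rewrite inbound connections on port 50070 to the internal NameNode web UI.
iptables -t nat -A PREROUTING -p tcp --dport 50070 \
  -j DNAT --to-destination NODE_INTERNAL_IP:50070

# Rewrite the source address so replies return through the gateway.
iptables -t nat -A POSTROUTING -p tcp -d NODE_INTERNAL_IP --dport 50070 \
  -j MASQUERADE
```

Note that even with such a forward in place, the links embedded in the served pages still point at internal node hostnames, which is exactly the problem reported here; that is why the comment above suggests a SOCKS proxy instead.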

This message is automatically generated by JIRA.
You can reply to this email to add a comment to the issue online.
