hadoop-common-dev mailing list archives

From "Christophe Taton (JIRA)" <j...@apache.org>
Subject [jira] Commented: (HADOOP-3249) Browsing DFS behind gateway
Date Mon, 14 Apr 2008 20:13:09 GMT

    [ https://issues.apache.org/jira/browse/HADOOP-3249?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12588764#action_12588764 ]

Christophe Taton commented on HADOOP-3249:
------------------------------------------

You should be able to do this with a SOCKS proxy:
- Create the proxy over SSH: ssh -D 1080 gateway (some Windows SSH clients, such as PuTTY, can do this too). This starts a proxy listening on localhost:1080 that forwards connections as if they originated from your gateway.
- Tell your browser to use the proxy. In Firefox: open Preferences, Advanced tab, Network tab, Settings button, then enable SOCKS with host localhost and port 1080. There is also a plug-in named FoxyProxy that manages this.
Hope this helps...
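A minimal sketch of the two steps above; the user name, gateway host, and internal NameNode hostname are placeholders for your own setup:

```shell
# Step 1: open a SOCKS proxy on localhost:1080, tunnelled through the gateway.
# -N skips the remote shell, since we only want the port forwarding.
ssh -N -D 1080 user@gateway

# Step 2 (an alternative to configuring the browser): verify the tunnel with
# curl. --socks5-hostname makes DNS resolution happen on the far side of the
# tunnel, so internal-only node hostnames resolve correctly on the gateway.
curl --socks5-hostname localhost:1080 http://namenode_internal_hostname:50070/
```

The remote-side DNS resolution is the key point for this issue: the slave-node links on the DFS page use internal hostnames, which only resolve once the lookup goes through the gateway.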

> Browsing DFS behind gateway
> ---------------------------
>
>                 Key: HADOOP-3249
>                 URL: https://issues.apache.org/jira/browse/HADOOP-3249
>             Project: Hadoop Core
>          Issue Type: Wish
>          Components: dfs
>    Affects Versions: 0.16.0, 0.16.1, 0.16.2
>         Environment: Red-Hat cluster
>            Reporter: Martin Boeker
>   Original Estimate: 5h
>  Remaining Estimate: 5h
>
> Dear Hadoop guys,
> I'm urgently trying to find a way for users to see the contents of a Hadoop
> DFS that sits behind a gateway. I'm using port forwarding on the gateway itself
> to point to the DFS web interface, something like this:
> [gateway_external_IP]:50070 >> [node_internal_IP]:50070
> This works fine: if I go to http://gateway_external_ip:50070/ I can view the DFS cluster
> HTML page from the outside world. The problem is that if I click on any of the slave node
> links, it forwards to http://node_hostname/.., which obviously doesn't work. I really need
> to get this going; a couple of projects require it.
> I'm willing to do this any way possible. I don't really need to use the 50070 web interface;
> even a simple directory listing would do, but I'm not sure how to implement that either,
> because I don't know of a way to make an httpd or ftpd use "bin/hadoop dfs -lsr /" as the
> root directory. I'd also be willing to make people use a proxy server if that would fix my
> issue somehow..
> If anyone can help I would greatly appreciate it; like I said, it's kind of urgent and
> I'm running out of ideas to try..
> Thanks a lot in advance,
> -Martin

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.

