hadoop-common-dev mailing list archives

From "Martin Boeker (JIRA)" <j...@apache.org>
Subject [jira] Commented: (HADOOP-3249) Browsing DFS behind gateway
Date Wed, 16 Apr 2008 23:13:21 GMT

    [ https://issues.apache.org/jira/browse/HADOOP-3249?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12589763#action_12589763 ]

Martin Boeker commented on HADOOP-3249:
---------------------------------------

Update:

I decided to test the setup on a different system, and it works there, so the problem is
something specific to this gateway. I checked /etc/sshd_config and TCP forwarding is
allowed. Does anyone have suggestions for other settings or configuration files I should
look at? The test system where I got the ssh -gD proxy to work is Fedora 8, but the
gateway is RHEL ES4. Could there be some default setting I'm not seeing? I checked the
firewall, and those settings seem to be fine.
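
For anyone retracing this, the checks come down to a couple of commands (a sketch only:
port 1080, the account, and the hostname are placeholders, not values from this setup):

    # 1. On the machine ssh connects to, TCP forwarding must not be disabled
    #    (absent or "yes" is fine; OpenSSH's compiled-in default is "yes"):
    grep -i AllowTcpForwarding /etc/sshd_config

    # 2. "-g" only controls which interface the ssh client binds, so confirm the
    #    SOCKS port ends up listening on all interfaces, not just loopback
    #    (run the netstat from a second terminal while the tunnel is up):
    ssh -gD 1080 user@internal_node
    netstat -lnt | grep 1080    # want 0.0.0.0:1080, not 127.0.0.1:1080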

Thanks,

Martin

> Browsing DFS behind gateway
> ---------------------------
>
>                 Key: HADOOP-3249
>                 URL: https://issues.apache.org/jira/browse/HADOOP-3249
>             Project: Hadoop Core
>          Issue Type: Wish
>          Components: dfs
>    Affects Versions: 0.16.0, 0.16.1, 0.16.2
>         Environment: Red Hat cluster
>            Reporter: Martin Boeker
>   Original Estimate: 5h
>  Remaining Estimate: 5h
>
> Dear Hadoop guys,
> I'm urgently trying to give users a way to see the contents of a Hadoop DFS that is
> behind a gateway. I'm using port forwarding on the gateway itself to point to the DFS
> web interface, something like this:
> [gateway_external_IP]:50070 >> [node_internal_IP]:50070
> This works fine: if I go to http://gateway_external_ip:50070/ I can view the DFS
> cluster HTML page from the outside world. The problem is that if I click on any of the
> slave node links, it forwards to http://node_hostname/.., which obviously doesn't work.
> I really need to get this working; a couple of projects require it.
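
One route around the rewritten slave links (a sketch, not part of the original report;
port 1080 and the login are placeholders) is a SOCKS proxy on the gateway instead of a
per-port forward, so every HTTP request tunnels through and internal hostnames get
resolved on the cluster side:

    # Run on the gateway: -D opens a SOCKS proxy on port 1080, and -g lets
    # outside hosts connect to it.
    ssh -gD 1080 user@internal_node

    # Browsers then use gateway_external_ip:1080 as a SOCKS proxy (with remote
    # DNS enabled, e.g. SOCKS v5) and can follow the slave node links.
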
> I'm willing to do this any way possible. I don't really need to use the 50070 web
> interface; even a simple directory listing would do, but I'm not sure how to implement
> that either, because I don't know of a way to make an httpd or ftpd use
> "bin/hadoop dfs -lsr /" as the root directory. I'd also be willing to make people use
> a proxy server if that would fix my issue somehow.
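
On the plain-listing idea, one low-tech sketch (every path here is an assumption,
including the Hadoop install directory and the httpd docroot) is to publish the
recursive listing as a static page on a schedule:

    # Crontab entry on a node with Hadoop installed: regenerate a text listing
    # of the whole DFS every 10 minutes for httpd to serve as a static file:
    */10 * * * * /opt/hadoop/bin/hadoop dfs -lsr / > /var/www/html/dfs-listing.txt
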
> If anyone can help, I would greatly appreciate it; like I said, it's kind of urgent
> and I'm running out of ideas to try.
> Thanks a lot in advance,
> -Martin

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.

