hadoop-common-user mailing list archives

From jiang licht <licht_ji...@yahoo.com>
Subject Re: user authentication: protect hdfs/job web interface from public
Date Fri, 05 Mar 2010 23:35:39 GMT
Thanks, Jakob.

No surprise it is supported :)

Thanks,
--

Michael

--- On Fri, 3/5/10, Jakob Homan <jhoman@yahoo-inc.com> wrote:

From: Jakob Homan <jhoman@yahoo-inc.com>
Subject: Re: user authentication: protect hdfs/job web interface from public
To: common-user@hadoop.apache.org
Date: Friday, March 5, 2010, 4:38 PM

Jiang-
   Hadoop has support for this via the hadoop.http.filter.initializers property, which
allows you to set the name of a class to add as a standard servlet filter for the public-facing
websites, such as:

<property>
  <name>hadoop.http.filter.initializers</name>
  <value>com.widgetcorp.HadoopFilter</value>
</property>

Each public-facing page will then be routed through this filter, which can reject the request.
This was designed to be pluggable so that it can work with different organizations' authentication schemes.
Other web servers are not currently secured, but they will be secured via Kerberos in the
in-progress security release.
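
A rough sketch of what such a filter class might look like (the package, class name, and
header check here are hypothetical, and it assumes the configured class ultimately installs
a standard javax.servlet.Filter):

package com.widgetcorp;

import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Hypothetical filter: rejects any request that lacks a credential your organization trusts.
public class HadoopFilter implements Filter {

  public void init(FilterConfig config) throws ServletException {
    // Load organization-specific settings here (allowed users, SSO endpoint, etc.).
  }

  public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
      throws IOException, ServletException {
    HttpServletRequest request = (HttpServletRequest) req;
    HttpServletResponse response = (HttpServletResponse) res;

    // Placeholder check: replace with your real authentication scheme
    // (e.g. validating a signed cookie or calling out to an SSO service).
    if (request.getHeader("X-Authenticated-User") == null) {
      response.sendError(HttpServletResponse.SC_UNAUTHORIZED, "Authentication required");
      return;                 // reject the request
    }
    chain.doFilter(req, res); // authenticated: continue to the requested page
  }

  public void destroy() {
  }
}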

-Jakob
Hadoop @ Yahoo!

jiang licht wrote:
> I guess I might need to check the Jetty documentation as well. Anyway, here is my question.
> 
> HDFS and Map/Reduce can be monitored via web interfaces, e.g. on ports 50070, 50030, etc. We
> don't want just anyone to be able to access these pages; only authorized people should be able
> to access them, from anywhere, at any time. One way might be to wrap the JSPs, or rename them
> and put an access page in front (being cautious not to break any existing links). But what is
> the best way to protect these web interfaces by adding some user authentication? Any suggestions?
> 
> Thanks,
> --
> 
> Michael
> 