hadoop-user mailing list archives

From Bertrand Dechoux <decho...@gmail.com>
Subject Re: Impersonating HDFS user
Date Fri, 05 Oct 2012 14:33:27 GMT
Indeed, you are connecting to localhost, but you said the cluster is remote,
so nothing on localhost is relevant for you.
The main idea is that you need to provide the cluster's configuration files;
by default they are read from the classpath. Alternatively, anywhere you have
a Configuration/JobConf you can set the right properties yourself, namely the
location of the HDFS master (and of the JobTracker, if you also want to run
MapReduce jobs).
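
For example, here is a minimal sketch (the namenode address is the one that
shows up in your logs; the class name and the file path are just placeholders
for the example):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class RemoteHdfsCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Point the client at the remote namenode instead of whatever
        // core-site.xml happens to be on the classpath (localhost by default).
        conf.set("fs.default.name", "hdfs://192.168.15.20:54310");

        FileSystem fs = FileSystem.get(conf);
        System.out.println(fs.exists(new Path("/user/hduser/sample.txt")));
    }
}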

Regards

Bertrand

On Fri, Oct 5, 2012 at 4:15 PM, Oleg Zhurakousky <oleg.zhurakousky@gmail.com
> wrote:

> So now I am past that and able to run as 'hduser', but when I attempt to
> read from the FSDataInputStream I see this message in my console:
>
> 10:12:10,065  WARN main hdfs.DFSClient:2106 - Failed to connect to
> /127.0.0.1:50010, add to deadNodes and continue
> java.net.ConnectException: Connection refused
>
> 10:12:10,072  INFO main hdfs.DFSClient:2272 - Could not obtain block
> blk_-4047236896256451627_1003 from any node: java.io.IOException: No live
> nodes contain current block. Will get new block locations from namenode and
> retry...
>
>
> I am obviously missing a configuration setting somewhere... any ideas?
>
> Thanks
>
> Oleg
>
> On Fri, Oct 5, 2012 at 9:37 AM, Oleg Zhurakousky <
> oleg.zhurakousky@gmail.com> wrote:
>
>> After I clicked send I found the same link ;), but thank you anyway.
>>
>> Oleg
>>
>>
>> On Fri, Oct 5, 2012 at 9:34 AM, Bertrand Dechoux <dechouxb@gmail.com> wrote:
>>
>>> Hi,
>>>
>>> You might be looking for something like:
>>> UserGroupInformation.createRemoteUser(user).doAs(
>>>
>>> see
>>>
>>> http://hadoop.apache.org/docs/r1.0.3/api/org/apache/hadoop/security/UserGroupInformation.html
>>>
>>> It is a JAAS wrapper for Hadoop.
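>>>
>>> For instance, a minimal sketch of what the call could look like (the
>>> class name and the file path are just placeholders here):
>>>
>>> import java.security.PrivilegedExceptionAction;
>>> import org.apache.hadoop.conf.Configuration;
>>> import org.apache.hadoop.fs.FSDataInputStream;
>>> import org.apache.hadoop.fs.FileSystem;
>>> import org.apache.hadoop.fs.Path;
>>> import org.apache.hadoop.security.UserGroupInformation;
>>>
>>> public class ReadAsHduser {
>>>     public static void main(String[] args) throws Exception {
>>>         // Run the HDFS calls as 'hduser' instead of the local OS user.
>>>         UserGroupInformation ugi = UserGroupInformation.createRemoteUser("hduser");
>>>         ugi.doAs(new PrivilegedExceptionAction<Void>() {
>>>             public Void run() throws Exception {
>>>                 Configuration conf = new Configuration();
>>>                 conf.set("fs.default.name", "hdfs://192.168.15.20:54310");
>>>                 FileSystem fs = FileSystem.get(conf);
>>>                 FSDataInputStream in = fs.open(new Path("/user/hduser/sample.txt"));
>>>                 // ... read from the stream ...
>>>                 in.close();
>>>                 return null;
>>>             }
>>>         });
>>>     }
>>> }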
>>>
>>> Regards
>>>
>>> Bertrand
>>>
>>>
>>>
>>>
>>> On Fri, Oct 5, 2012 at 3:19 PM, Oleg Zhurakousky <
>>> oleg.zhurakousky@gmail.com> wrote:
>>>
>>>> I am working on some samples where I want to write to HDFS running on
>>>> another machine (different OS, etc.).
>>>> The identity of my client process is just whatever my OS says it is
>>>> (e.g., 'oleg'), hence:
>>>>
>>>> 08:56:49,240 DEBUG IPC Client (47) connection to /192.168.15.20:54310 from oleg ipc.Client:803 - IPC Client (47) connection to /192.168.15.20:54310 from oleg got value #2
>>>>
>>>> But there is no 'oleg' on the machine where Hadoop is running; instead
>>>> there is 'hduser'.
>>>>
>>>> Is there an equivalent of "RunAs" in Hadoop?
>>>>
>>>> Thanks
>>>>  Oleg
>>>>
>>>
>>>
>>>
>>> --
>>> Bertrand Dechoux
>>>
>>
>>
>


-- 
Bertrand Dechoux
