hadoop-common-user mailing list archives

From Susheel Kumar Gadalay <skgada...@gmail.com>
Subject Re: Cannot start DataNode after adding new volume
Date Tue, 16 Sep 2014 10:49:53 GMT
The VERSION file has to be the same across all the data node directories.

So I suggested copying it as-is with an OS command and then starting the data node.
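A minimal sketch of that copy step, assuming the existing data directory is /hadoop/hdfs/data and the new one is /data/hdfs (the paths given later in the thread), and that the DataNode runs as the hdfs user (typical for an Ambari-managed cluster, but an assumption here):

```shell
# Existing and new DataNode directories (paths taken from the thread).
OLD_DIR=/hadoop/hdfs/data
NEW_DIR=/data/hdfs

# Recreate the 'current' sub-directory on the new volume and copy
# the VERSION file across unchanged.
mkdir -p "$NEW_DIR/current"
cp "$OLD_DIR/current/VERSION" "$NEW_DIR/current/VERSION"

# The DataNode process must own the new tree; hdfs:hadoop is an
# assumption -- match whatever owns the existing directory.
chown -R hdfs:hadoop "$NEW_DIR"
```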

On 9/16/14, Charles Robertson <charles.robertson@gmail.com> wrote:
> Hi Susheel,
>
> Thanks for the reply. I'm not entirely sure what you mean.
>
> When I created the new directory on the new volume I simply created an
> empty directory. I see from the existing data node directory that it has a
> sub-directory called current containing a file called VERSION.
>
> Your advice is to create the 'current' sub-directory and copy the VERSION
> file across to it unchanged? I see it contains various GUIDs, and I'm
> worried about them clashing with the VERSION file in the other data
> directory.
>
> Thanks,
> Charles
>
> On 16 September 2014 10:57, Susheel Kumar Gadalay <skgadalay@gmail.com>
> wrote:
>
>> Is it something to do with the current/VERSION file in the data node
>> directory?
>>
>> Just copy it from the existing directory and start the data node.
>>
>> On 9/16/14, Charles Robertson <charles.robertson@gmail.com> wrote:
>> > Hi all,
>> >
>> > I am running out of space on a data node, so I added a new volume to the
>> > host, mounted it, and made sure the permissions were set correctly. Then
>> > I updated the 'DataNode Directories' property in Ambari to include the
>> > new path (comma separated, i.e. '/hadoop/hdfs/data,/data/hdfs'). Next I
>> > restarted the components with stale configs for that host, but the
>> > DataNode wouldn't come back up, reporting 'connection refused'. When I
>> > remove the new data directory path from the property and restart, it
>> > starts fine.
>> >
>> > What am I doing wrong?
>> >
>> > Thanks,
>> > Charles
>> >
>>
>
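For reference, Ambari's 'DataNode Directories' property corresponds to dfs.datanode.data.dir in hdfs-site.xml; a sketch of the equivalent setting, using the comma-separated paths from the thread:

```xml
<property>
  <name>dfs.datanode.data.dir</name>
  <value>/hadoop/hdfs/data,/data/hdfs</value>
</property>
```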
