hadoop-common-user mailing list archives

From Vitalii Tymchyshyn <tiv...@gmail.com>
Subject Re: Which release to use?
Date Tue, 19 Jul 2011 12:10:06 GMT
On 19.07.11 at 14:50, Steve Loughran wrote:
> On 19/07/11 12:44, Rita wrote:
>> Arun,
>> I second Joe's comment.
>> Thanks for giving us a heads up.
>> I will wait patiently until 0.23 is considered stable.
> API-wise, 0.21 is better. I know that as I'm working with 0.20.203 
> right now, and it is a step backwards.
> Regarding future releases, the best way to get them stable is to 
> participate in release testing on your own infrastructure. Nothing 
> else will find the problems unique to your setup of hardware, network 
> and software.

My little Hadoop adoption story (or why I won't test 0.23)
I am among those who think that the latest release is the one that is 
supported, and so we went the 0.21 way.
BTW: I've tried to find a release roadmap, but could not find anything 
up to date.
We are using HDFS without Map/Reduce.
As far as I can see now, 0.21 is nowhere near beta quality, with 
non-working new features like the backup node or append. Also, there is 
no option for unlucky people like us to back off to 0.20 (at least a 
"hadoop downgrade" search does not turn up any good results).
I have already filed 5 tickets in Jira, 3 of them with patches. On two 
there is no activity at all; on the other three, my reply is the latest 
non-autogenerated message (and it is over 3 weeks old).
I also sent a few messages to this list, and one to hdfs-user. No answers.
With this level of project activity, I can't afford to test something 
that has not even reached the 0.21 quality level yet. If I run into any 
problems, I can't afford to wait for months to be heard.
I am more or less stable on my own patched 0.21 for now, and will either 
move forward if I see more project activity, or move somewhere else if 
things become "less stable".

Best regards, Vitalii Tymchyshyn
