hadoop-common-user mailing list archives

From "Ryan LeCompte" <lecom...@gmail.com>
Subject Re: Error while uploading large file to S3 via Hadoop 0.18
Date Mon, 01 Sep 2008 22:49:01 GMT
Thanks, trying it now!

Ryan


On Mon, Sep 1, 2008 at 6:04 PM, Albert Chern <albert@netseer.com> wrote:
> Increase the retry buffer size in jets3t.properties, and maybe up the
> number of retries while you're at it.  If there is no template file
> included in Hadoop's conf dir, you can find one on the JetS3t web site.
> Make sure it's from the same JetS3t version that your copy of Hadoop is
> using.
>
> On Mon, Sep 1, 2008 at 1:32 PM, Ryan LeCompte <lecompte@gmail.com> wrote:
>
>> Hello,
>>
>> I'm trying to upload a fairly large file (18GB or so) to my AWS S3
>> account via bin/hadoop fs -put ... s3://...
>>
>> It copies for a good 15 or 20 minutes and then eventually errors out
>> with a failed retry attempt (saying that it can't retry since it has
>> already written a certain number of bytes; sorry, I don't have the
>> original error message at the moment). Has anyone experienced anything
>> similar? Can anyone suggest a workaround or a way to configure retries?
>> Should I use another tool for uploading large files to S3?
>>
>> Thanks,
>> Ryan
>>
>
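For reference, a minimal jets3t.properties sketch along the lines of
Albert's suggestion. The key names and values below are assumptions based
on the JetS3t 0.6.x-era configuration; verify them against the template
that ships with the JetS3t version your copy of Hadoop actually uses:

    # Buffer this much of an in-flight upload in memory so a failed
    # transmission can be retried; the stock default is far too small
    # relative to a multi-gigabyte put.
    s3service.stream-retry-buffer-size=4194304

    # How many times the HTTP client retries a failed request.
    httpclient.retry-max=10

Drop the file into Hadoop's conf directory (or anywhere else on the client
classpath) so JetS3t picks it up, then rerun the bin/hadoop fs -put.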
