flume-user mailing list archives

From kashif khan <drkashif8...@gmail.com>
Subject Re: Automatically upload files into HDFS
Date Mon, 19 Nov 2012 12:30:10 GMT
Thanks, M. Tariq

I have tried to visit the link, but I think it is not accessible, as it generates
the following error message:

 Whoa there!

The request token for this page is invalid. It may have already been used,
or expired because it is too old. Please go back to the site or application
that sent you here and try again; it was probably just a mistake.

   - Go to Twitter <http://twitter.com/home>.

 You can revoke access to any application at any time from the Applications
tab <http://twitter.com/settings/applications> of your Settings page.

By authorizing an application you continue to operate under Twitter's Terms
of Service <http://twitter.com/tos>. In particular, some usage information
will be shared back with Twitter. For more, see our Privacy

Best regards,


On Mon, Nov 19, 2012 at 10:50 AM, Mohammad Tariq <dontariq@gmail.com> wrote:

> Hello Kashif,
>     You can visit this link and see if it is of any help to you. I have
> shared some of my initial experience here.
> http://api.twitter.com/oauth/authorize?oauth_token=ndACNGIkLSeMJdeMIeQYowyzpjDtvvmqo5ja9We7zo
> You may want to skip the build part and download the release directly and
> start off with that.
> Regards,
>     Mohammad Tariq
> On Mon, Nov 19, 2012 at 4:14 PM, kashif khan <drkashif8310@gmail.com> wrote:
>> Hi,
>> I am generating files continuously in a local folder on my base machine.
>> How can I now use Flume to stream the generated files from the local folder
>> to HDFS?
>> I don't know exactly how to configure the sources, sinks and HDFS.
>> 1) location of the folder where the files are generated: /usr/datastorage/
>> 2) namenode address: hdfs://hadoop1.example.com:8020
>> Please help me.
>> Many thanks
>> Best regards,
>> KK
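
For a setup like the one described above, a minimal sketch of a Flume agent
configuration is shown below. It assumes Flume 1.3+ with the spooling-directory
source; the agent name "agent1" and the target HDFS directory are illustrative
choices, while the spool folder and namenode address are taken from the question.

# agent1.conf - sketch of a spooling-directory-to-HDFS agent (names are assumptions)
agent1.sources = src1
agent1.channels = ch1
agent1.sinks = sink1

# Spooling-directory source: ingests completed files dropped into the folder
agent1.sources.src1.type = spooldir
agent1.sources.src1.spoolDir = /usr/datastorage
agent1.sources.src1.channels = ch1

# In-memory channel buffering events between source and sink
agent1.channels.ch1.type = memory
agent1.channels.ch1.capacity = 10000
agent1.channels.ch1.transactionCapacity = 1000

# HDFS sink: writes events into the cluster named in the question
# (the /user/flume/datastorage target directory is an assumed example)
agent1.sinks.sink1.type = hdfs
agent1.sinks.sink1.channel = ch1
agent1.sinks.sink1.hdfs.path = hdfs://hadoop1.example.com:8020/user/flume/datastorage
agent1.sinks.sink1.hdfs.fileType = DataStream
agent1.sinks.sink1.hdfs.rollInterval = 0
agent1.sinks.sink1.hdfs.rollSize = 134217728
agent1.sinks.sink1.hdfs.rollCount = 0

The agent can then be started with
flume-ng agent --conf conf --conf-file agent1.conf --name agent1
and files placed in /usr/datastorage/ are renamed with a .COMPLETED suffix once
Flume has delivered them.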
