hadoop-hdfs-dev mailing list archives

From harryxiyou <harryxi...@gmail.com>
Subject Re: [cloudxy] RE: [Hadoop]Environment variable CLASSPATH not set!
Date Sun, 10 Feb 2013 00:58:19 GMT
On Sun, Feb 10, 2013 at 5:15 AM, John Gordon <John.Gordon@microsoft.com> wrote:
Hi John,

> Env variables hang off of the session context and are specific to both the
> user profile and their shell-specific preferences.  If your driver is
> loading in kernel mode, it cannot depend on env variables.

Our driver interacts with Libvirt, HDFS, and QEMU directly, and, IIUC, these
all run in user mode. I suspect GNULIB, which is called by Libvirt, may be
changing the env variables.

> This will be a problem for the other environment variables like hadoop_home.
> Instead of using Java directly in kernel mode, I suggest splitting the
> problem:
> 1. fs abstraction for the kernel
>    a. Like the nfs filesystem kernel driver implementation for example -- a
> remote mount fs.
>    b. use a c impl of the protocol
>       I. To avoid issues, use hadoop 2.0 for protobuffs, since they yield a
> versioned protocol to avoid hangs and dumps when the protocol changes.
>       II.  OR push most of your implementation into a proxy service
>           a. Surface NFS directly, and just use the nfs kernel driver
>           b. Surface your own protocol to be consumed in the kernel mode
> driver.
> 2.  Start hdfs elsewhere, as an independent service in user mode like cups,
> httpd, or xinetd.
>     a.  Will have a session and the ability to configure env vars.
> Not sure if that exactly answers the question, but I hope it was helpful.

Absolutely helpful; we will take the approaches you suggested into account.
Thanks very much for your help ;-)

Harry Wei
