ignite-user mailing list archives

From C Reid <reidddc...@outlook.com>
Subject Re: Hadoop Accelerator doesn't work when use SnappyCodec compression
Date Thu, 26 Oct 2017 02:57:06 GMT

So far, using Ignite as an HDFS cache layer and as a computation framework works well, but when we
switch back to the YARN framework we hit this snappy problem.

I'd prefer to set aside this issue for now, since there are some Kerberos issues waiting for me.

Evgenii, thank you for your patience.


________________________________
From: C Reid <reidddchan@outlook.com>
Sent: 25 October 2017 11:20
To: user@ignite.apache.org; Evgenii Zhuravlev
Subject: Re: Hadoop Accelerator doesn't work when use SnappyCodec compression

As shown in the attached picture (inline image not preserved). Btw, yarn's version is 2.8.1 and hdfs's version is
2.6.0.
________________________________
From: Evgenii Zhuravlev <e.zhuravlev.wk@gmail.com>
Sent: 24 October 2017 14:15
To: C Reid; user@ignite.apache.org
Subject: Fwd: Hadoop Accelerator doesn't work when use SnappyCodec compression


---------- Forwarded message ----------
From: Evgenii Zhuravlev <e.zhuravlev.wk@gmail.com>
Date: 2017-10-20 12:31 GMT+03:00
Subject: Re: Hadoop Accelerator doesn't work when use SnappyCodec compression
To: C Reid <reidddchan@outlook.com>


A few days ago I ran Hive, Hadoop and Ignite with snappy compression without any problem.
It was Hadoop 2.7.1, but I think your version should work too. The Apache Ignite codebase contains
tests for the snappy codec. One of them, with small changes, is attached - please run
it in your environment and show us the results.

Thanks,
Evgenii

2017-10-20 11:30 GMT+03:00 C Reid <reidddchan@outlook.com>:
Yeah, I tried all the methods I found on Google, and the results were the same.

Also, since it's just an "export LD_LIBRARY_PATH=..." statement in 'ignite.sh', I'm not
sure whether it takes effect on grid start-up.

We are planning to run more than 1000 grids in a cluster, but the production env has plenty of
.snappy files, so I'm struggling now...
Btw, my hadoop version is 2.6.0; does it matter?

Thanks for your patience.
________________________________
From: Evgenii Zhuravlev <e.zhuravlev.wk@gmail.com>
Sent: 19 October 2017 17:07

To: user@ignite.apache.org
Subject: Re: Hadoop Accelerator doesn't work when use SnappyCodec compression

Could you also try setting the LD_LIBRARY_PATH variable to the path of the folder with the native libraries?
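For example, such an export could go before launching ignite.sh (the paths below are a sketch based on the locations mentioned elsewhere in this thread, not a confirmed setup):

```shell
# Point LD_LIBRARY_PATH at the directories holding the native libraries
# (libhadoop.so, libsnappy.so) before starting the Ignite node.
export LD_LIBRARY_PATH=${HADOOP_HOME}/lib/native:/usr/lib64:${LD_LIBRARY_PATH}
```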

2017-10-17 17:56 GMT+03:00 C Reid <reidddchan@outlook.com>:
I just tried and got the same:
"Unable to load native-hadoop library for your platform... using builtin-java classes where
applicable"
"java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z"

I also tried adding all the related native libraries under one of the JDK folders where all the *.so
files are located, but Ignite just couldn't load them. It's strange.
________________________________
From: Evgenii Zhuravlev <e.zhuravlev.wk@gmail.com>
Sent: 17 October 2017 21:25

To: user@ignite.apache.org
Subject: Re: Hadoop Accelerator doesn't work when use SnappyCodec compression

Have you tried removing the ${HADOOP_HOME}/lib/native libraries from the path and adding only the /usr/lib64/
folder?

2017-10-17 12:18 GMT+03:00 C Reid <reidddchan@outlook.com>:
Tried, and did not work.

________________________________
From: Evgenii Zhuravlev <e.zhuravlev.wk@gmail.com>
Sent: 17 October 2017 16:41
To: C Reid
Subject: Re: Hadoop Accelerator doesn't work when use SnappyCodec compression

I'd recommend adding /usr/lib64/ to JAVA_LIBRARY_PATH

Evgenii

2017-10-17 11:29 GMT+03:00 C Reid <reidddchan@outlook.com>:
Yes, IgniteNode runs on the DataNode machine.

[hadoop@hadoop-offline033.dx.momo.com ignite]$ echo $HADOOP_HOME
/opt/hadoop-2.8.1-all
[hadoop@hadoop-offline033.dx.momo.com ignite]$ echo $IGNITE_HOME
/opt/apache-ignite-hadoop-2.2.0-bin

and in ignite.sh
JVM_OPTS="${JVM_OPTS} -Djava.library.path=${HADOOP_HOME}/lib/native:/usr/lib64/libsnappy.so.1:${HADOOP_HOME}/lib/native/libhadoop.so"

But exception is thrown as mentioned.
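One thing worth noting about the JVM_OPTS line above (my reading, not something confirmed in this thread): java.library.path is a list of directories to search, so entries that name individual files, like /usr/lib64/libsnappy.so.1 or .../libhadoop.so, are ignored by the JVM's loader. A sketch of the directory-only form:

```shell
# java.library.path entries must be directories, not .so files;
# listing the folders lets the JVM locate libhadoop.so and libsnappy.so itself.
JVM_OPTS="${JVM_OPTS} -Djava.library.path=${HADOOP_HOME}/lib/native:/usr/lib64"
```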
________________________________
From: Evgenii Zhuravlev <e.zhuravlev.wk@gmail.com>
Sent: 17 October 2017 15:44

To: user@ignite.apache.org
Subject: Re: Hadoop Accelerator doesn't work when use SnappyCodec compression

Do you run Ignite on the same machine as hadoop?

I'd recommend checking these env variables:
IGNITE_HOME, HADOOP_HOME and JAVA_LIBRARY_PATH. JAVA_LIBRARY_PATH should contain the path to
the folder with the libsnappy files.
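A quick sanity check of those variables might look like this (the libsnappy location is the one reported by checknative elsewhere in this thread; adjust to your environment):

```shell
# Confirm the variables are set and the snappy library is where
# JAVA_LIBRARY_PATH says it should be.
echo "IGNITE_HOME=${IGNITE_HOME}"
echo "HADOOP_HOME=${HADOOP_HOME}"
echo "JAVA_LIBRARY_PATH=${JAVA_LIBRARY_PATH}"
ls -l /usr/lib64/libsnappy.so*   # this folder should appear in the path above
```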

Evgenii

2017-10-17 8:45 GMT+03:00 C Reid <reidddchan@outlook.com>:
Hi Evgenii,

Checked, as shown:

17/10/17 13:43:12 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop
library...
17/10/17 13:43:12 DEBUG util.NativeCodeLoader: Loaded the native-hadoop library
17/10/17 13:43:12 WARN bzip2.Bzip2Factory: Failed to load/initialize native-bzip2 library
system-native, will use pure-Java version
17/10/17 13:43:12 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib
library
Native library checking:
hadoop:  true /opt/hadoop-2.8.1-all/lib/native/libhadoop.so
zlib:    true /lib64/libz.so.1
snappy:  true /usr/lib64/libsnappy.so.1
lz4:     true revision:10301
bzip2:   false
openssl: true /usr/lib64/libcrypto.so

________________________________
From: Evgenii Zhuravlev <e.zhuravlev.wk@gmail.com>
Sent: 17 October 2017 13:34
To: user@ignite.apache.org
Subject: Re: Hadoop Accelerator doesn't work when use SnappyCodec compression

Hi,

Have you checked "hadoop checknative -a"? What does it show for snappy?

Evgenii

2017-10-17 7:12 GMT+03:00 C Reid <reidddchan@outlook.com>:
Hi all igniters,

I have tried many ways to include the native jar and the snappy jar, but the exceptions below kept being thrown.
(I'm sure HDFS and YARN support snappy, since jobs run fine in the YARN framework with SnappyCodec.)
Hoping to get some help and suggestions from the community.

[NativeCodeLoader] Unable to load native-hadoop library for your platform... using builtin-java
classes where applicable

and

java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
        at org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native Method)
        at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:63)
        at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:136)
        at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:150)
        at org.apache.hadoop.io.compress.CompressionCodec$Util.createOutputStreamWithCodecPool(CompressionCodec.java:131)
        at org.apache.hadoop.io.compress.SnappyCodec.createOutputStream(SnappyCodec.java:101)
        at org.apache.hadoop.mapreduce.lib.output.TextOutputFormat.getRecordWriter(TextOutputFormat.java:126)
        at org.apache.ignite.internal.processors.hadoop.impl.v2.HadoopV2Task.prepareWriter(HadoopV2Task.java:104)
        at org.apache.ignite.internal.processors.hadoop.impl.v2.HadoopV2ReduceTask.run0(HadoopV2ReduceTask.java:64)
        at org.apache.ignite.internal.processors.hadoop.impl.v2.HadoopV2Task.run(HadoopV2Task.java:55)
        at org.apache.ignite.internal.processors.hadoop.impl.v2.HadoopV2TaskContext.run(HadoopV2TaskContext.java:266)
        at org.apache.ignite.internal.processors.hadoop.taskexecutor.HadoopRunnableTask.runTask(HadoopRunnableTask.java:209)
        at org.apache.ignite.internal.processors.hadoop.taskexecutor.HadoopRunnableTask.call0(HadoopRunnableTask.java:144)
        at org.apache.ignite.internal.processors.hadoop.taskexecutor.HadoopRunnableTask$1.call(HadoopRunnableTask.java:116)
        at org.apache.ignite.internal.processors.hadoop.taskexecutor.HadoopRunnableTask$1.call(HadoopRunnableTask.java:114)
        at org.apache.ignite.internal.processors.hadoop.impl.v2.HadoopV2TaskContext.runAsJobOwner(HadoopV2TaskContext.java:573)
        at org.apache.ignite.internal.processors.hadoop.taskexecutor.HadoopRunnableTask.call(HadoopRunnableTask.java:114)
        at org.apache.ignite.internal.processors.hadoop.taskexecutor.HadoopRunnableTask.call(HadoopRunnableTask.java:46)
        at org.apache.ignite.internal.processors.hadoop.taskexecutor.HadoopExecutorService$2.body(HadoopExecutorService.java:186)
        at org.apache.ignite.internal.util.worker.GridWorker.run(GridWorker.java:110)


Regards,

RC.







