ignite-user mailing list archives

From Evgenii Zhuravlev <e.zhuravlev...@gmail.com>
Subject Re: Hadoop Accelerator doesn't work when use SnappyCodec compression
Date Thu, 19 Oct 2017 09:07:54 GMT
Could you also try setting the LD_LIBRARY_PATH variable to the path of the folder
that contains the native libraries?
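For example, something along these lines before starting the node (the paths are only
an illustration, adjust them to where libhadoop.so and libsnappy.so actually live):

# point the dynamic loader at the directories (not the .so files themselves)
export LD_LIBRARY_PATH=${HADOOP_HOME}/lib/native:/usr/lib64:${LD_LIBRARY_PATH}
# then start Ignite from the same shell
${IGNITE_HOME}/bin/ignite.sh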

2017-10-17 17:56 GMT+03:00 C Reid <reidddchan@outlook.com>:

> I just tried, got the same:
> "Unable to load native-hadoop library for your platform... using
> builtin-java classes where applicable"
> "java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.
> buildSupportsSnappy()Z"
>
> I also tried adding all the related native libraries under one of the JDK folders
> where all the *.so files are located, but Ignite just couldn't load them, which is
> strange.
> ------------------------------
> *From:* Evgenii Zhuravlev <e.zhuravlev.wk@gmail.com>
> *Sent:* 17 October 2017 21:25
>
> *To:* user@ignite.apache.org
> *Subject:* Re: Hadoop Accelerator doesn't work when use SnappyCodec
> compression
>
> Have you tried removing the ${HADOOP_HOME}/lib/native libraries from the path and
> adding only the /usr/lib64/ folder?
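> Note that java.library.path takes directories rather than .so files, so as a quick
> test the option could look something like this (the directory is illustrative):
>
> # in ignite.sh: point java.library.path at the single directory only
> JVM_OPTS="${JVM_OPTS} -Djava.library.path=/usr/lib64"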
>
> 2017-10-17 12:18 GMT+03:00 C Reid <reidddchan@outlook.com>:
>
>> Tried, and did not work.
>>
>> ------------------------------
>> *From:* Evgenii Zhuravlev <e.zhuravlev.wk@gmail.com>
>> *Sent:* 17 October 2017 16:41
>> *To:* C Reid
>> *Subject:* Re: Hadoop Accelerator doesn't work when use SnappyCodec
>> compression
>>
>> I'd recommend adding /usr/lib64/ to JAVA_LIBRARY_PATH
>>
>> Evgenii
>>
>> 2017-10-17 11:29 GMT+03:00 C Reid <reidddchan@outlook.com>:
>>
>>> Yes, IgniteNode runs on the DataNode machine.
>>>
>>> [hadoop@hadoop-offline033.dx.momo.com ignite]$ echo $HADOOP_HOME
>>> /opt/hadoop-2.8.1-all
>>> [hadoop@hadoop-offline033.dx.momo.com ignite]$ echo $IGNITE_HOME
>>> /opt/apache-ignite-hadoop-2.2.0-bin
>>>
>>> and in ignite.sh
>>> JVM_OPTS="${JVM_OPTS} -Djava.library.path=${HADOOP_HOME}/lib/native:/usr/lib64/libsnappy.so.1:${HADOOP_HOME}/lib/native/libhadoop.so"
>>>
>>> But the exception is still thrown, as mentioned.
>>> ------------------------------
>>> *From:* Evgenii Zhuravlev <e.zhuravlev.wk@gmail.com>
>>> *Sent:* 17 October 2017 15:44
>>>
>>> *To:* user@ignite.apache.org
>>> *Subject:* Re: Hadoop Accelerator doesn't work when use SnappyCodec
>>> compression
>>>
>>> Do you run Ignite on the same machine as Hadoop?
>>>
>>> I'd recommend checking these env variables:
>>> IGNITE_HOME, HADOOP_HOME and JAVA_LIBRARY_PATH. JAVA_LIBRARY_PATH
>>> should contain the path to the folder with the libsnappy files.
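>>> For example (the directory is just a guess, check where libsnappy really is):
>>>
>>> # find the directory that holds the libsnappy files
>>> ls /usr/lib64/libsnappy*
>>> # and point JAVA_LIBRARY_PATH at that directory before starting the node
>>> export JAVA_LIBRARY_PATH=/usr/lib64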
>>>
>>> Evgenii
>>>
>>> 2017-10-17 8:45 GMT+03:00 C Reid <reidddchan@outlook.com>:
>>>
>>>> Hi Evgenii,
>>>>
>>>> Checked, as shown:
>>>>
>>>> 17/10/17 13:43:12 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
>>>> 17/10/17 13:43:12 DEBUG util.NativeCodeLoader: Loaded the native-hadoop library
>>>> 17/10/17 13:43:12 WARN bzip2.Bzip2Factory: Failed to load/initialize native-bzip2 library system-native, will use pure-Java version
>>>> 17/10/17 13:43:12 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
>>>> Native library checking:
>>>> hadoop:  true /opt/hadoop-2.8.1-all/lib/native/libhadoop.so
>>>> zlib:    true /lib64/libz.so.1
>>>> snappy:  true /usr/lib64/libsnappy.so.1
>>>> lz4:     true revision:10301
>>>> bzip2:   false
>>>> openssl: true /usr/lib64/libcrypto.so
>>>>
>>>> ------------------------------
>>>> *From:* Evgenii Zhuravlev <e.zhuravlev.wk@gmail.com>
>>>> *Sent:* 17 October 2017 13:34
>>>> *To:* user@ignite.apache.org
>>>> *Subject:* Re: Hadoop Accelerator doesn't work when use SnappyCodec
>>>> compression
>>>>
>>>> Hi,
>>>>
>>>> Have you checked "hadoop checknative -a"? What does it show for snappy?
>>>>
>>>> Evgenii
>>>>
>>>> 2017-10-17 7:12 GMT+03:00 C Reid <reidddchan@outlook.com>:
>>>>
>>>>> Hi all igniters,
>>>>>
>>>>> I have tried many ways to include the native jar and the snappy jar, but the
>>>>> exceptions below kept being thrown. (I'm sure HDFS and YARN support Snappy,
>>>>> because the same job runs with SnappyCodec in the YARN framework.) I hope to get
>>>>> some help and suggestions from the community.
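>>>>> For context, Snappy output compression for the job is set with the standard
>>>>> MapReduce properties, something like this (whether they are passed via -D or
>>>>> mapred-site.xml shouldn't matter here):
>>>>>
>>>>> mapreduce.map.output.compress=true
>>>>> mapreduce.map.output.compress.codec=org.apache.hadoop.io.compress.SnappyCodec
>>>>> mapreduce.output.fileoutputformat.compress=true
>>>>> mapreduce.output.fileoutputformat.compress.codec=org.apache.hadoop.io.compress.SnappyCodec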
>>>>>
>>>>> [NativeCodeLoader] Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
>>>>>
>>>>> and
>>>>>
>>>>> java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
>>>>>         at org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native Method)
>>>>>         at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:63)
>>>>>         at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:136)
>>>>>         at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:150)
>>>>>         at org.apache.hadoop.io.compress.CompressionCodec$Util.createOutputStreamWithCodecPool(CompressionCodec.java:131)
>>>>>         at org.apache.hadoop.io.compress.SnappyCodec.createOutputStream(SnappyCodec.java:101)
>>>>>         at org.apache.hadoop.mapreduce.lib.output.TextOutputFormat.getRecordWriter(TextOutputFormat.java:126)
>>>>>         at org.apache.ignite.internal.processors.hadoop.impl.v2.HadoopV2Task.prepareWriter(HadoopV2Task.java:104)
>>>>>         at org.apache.ignite.internal.processors.hadoop.impl.v2.HadoopV2ReduceTask.run0(HadoopV2ReduceTask.java:64)
>>>>>         at org.apache.ignite.internal.processors.hadoop.impl.v2.HadoopV2Task.run(HadoopV2Task.java:55)
>>>>>         at org.apache.ignite.internal.processors.hadoop.impl.v2.HadoopV2TaskContext.run(HadoopV2TaskContext.java:266)
>>>>>         at org.apache.ignite.internal.processors.hadoop.taskexecutor.HadoopRunnableTask.runTask(HadoopRunnableTask.java:209)
>>>>>         at org.apache.ignite.internal.processors.hadoop.taskexecutor.HadoopRunnableTask.call0(HadoopRunnableTask.java:144)
>>>>>         at org.apache.ignite.internal.processors.hadoop.taskexecutor.HadoopRunnableTask$1.call(HadoopRunnableTask.java:116)
>>>>>         at org.apache.ignite.internal.processors.hadoop.taskexecutor.HadoopRunnableTask$1.call(HadoopRunnableTask.java:114)
>>>>>         at org.apache.ignite.internal.processors.hadoop.impl.v2.HadoopV2TaskContext.runAsJobOwner(HadoopV2TaskContext.java:573)
>>>>>         at org.apache.ignite.internal.processors.hadoop.taskexecutor.HadoopRunnableTask.call(HadoopRunnableTask.java:114)
>>>>>         at org.apache.ignite.internal.processors.hadoop.taskexecutor.HadoopRunnableTask.call(HadoopRunnableTask.java:46)
>>>>>         at org.apache.ignite.internal.processors.hadoop.taskexecutor.HadoopExecutorService$2.body(HadoopExecutorService.java:186)
>>>>>         at org.apache.ignite.internal.util.worker.GridWorker.run(GridWorker.java:110)
>>>>>
>>>>>
>>>>> Regards,
>>>>>
>>>>> RC.
>>>>>
>>>>
>>>>
>>>
>>
>
