From: Evgenii Zhuravlev
Date: Tue, 17 Oct 2017 08:34:44 +0300
Subject: Re: Hadoop Accelerator doesn't work when use SnappyCodec compression
To: user@ignite.apache.org

Hi,

Have you checked "hadoop checknative -a"? What does it show for snappy?

Evgenii

2017-10-17 7:12 GMT+03:00 C Reid <reidddchan@outlook.com>:

> Hi all igniters,
>
> I have tried many ways to include the native jar and the snappy jar, but
> the exceptions below keep being thrown. (I'm sure HDFS and YARN support
> Snappy, because running the same job in the YARN framework with
> SnappyCodec works.) Hoping to get some help and suggestions from the
> community.
>
> [NativeCodeLoader] Unable to load native-hadoop library for your
> platform... using builtin-java classes where applicable
>
> and
>
> java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
>         at org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native Method)
>         at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:63)
>         at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:136)
>         at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:150)
>         at org.apache.hadoop.io.compress.CompressionCodec$Util.createOutputStreamWithCodecPool(CompressionCodec.java:131)
>         at org.apache.hadoop.io.compress.SnappyCodec.createOutputStream(SnappyCodec.java:101)
>         at org.apache.hadoop.mapreduce.lib.output.TextOutputFormat.getRecordWriter(TextOutputFormat.java:126)
>         at org.apache.ignite.internal.processors.hadoop.impl.v2.HadoopV2Task.prepareWriter(HadoopV2Task.java:104)
>         at org.apache.ignite.internal.processors.hadoop.impl.v2.HadoopV2ReduceTask.run0(HadoopV2ReduceTask.java:64)
>         at org.apache.ignite.internal.processors.hadoop.impl.v2.HadoopV2Task.run(HadoopV2Task.java:55)
>         at org.apache.ignite.internal.processors.hadoop.impl.v2.HadoopV2TaskContext.run(HadoopV2TaskContext.java:266)
>         at org.apache.ignite.internal.processors.hadoop.taskexecutor.HadoopRunnableTask.runTask(HadoopRunnableTask.java:209)
>         at org.apache.ignite.internal.processors.hadoop.taskexecutor.HadoopRunnableTask.call0(HadoopRunnableTask.java:144)
>         at org.apache.ignite.internal.processors.hadoop.taskexecutor.HadoopRunnableTask$1.call(HadoopRunnableTask.java:116)
>         at org.apache.ignite.internal.processors.hadoop.taskexecutor.HadoopRunnableTask$1.call(HadoopRunnableTask.java:114)
>         at org.apache.ignite.internal.processors.hadoop.impl.v2.HadoopV2TaskContext.runAsJobOwner(HadoopV2TaskContext.java:573)
>         at org.apache.ignite.internal.processors.hadoop.taskexecutor.HadoopRunnableTask.call(HadoopRunnableTask.java:114)
>         at org.apache.ignite.internal.processors.hadoop.taskexecutor.HadoopRunnableTask.call(HadoopRunnableTask.java:46)
>         at org.apache.ignite.internal.processors.hadoop.taskexecutor.HadoopExecutorService$2.body(HadoopExecutorService.java:186)
>         at org.apache.ignite.internal.util.worker.GridWorker.run(GridWorker.java:110)
>
> Regards,
>
> RC.
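[For context: the check behind "hadoop checknative -a" comes down to whether the JVM that runs the task can load libhadoop (and, through it, libsnappy) from java.library.path; when that load fails, NativeCodeLoader falls back to the builtin-java classes, and a native-only codec such as Snappy then throws UnsatisfiedLinkError exactly as in the trace above. A minimal sketch of that check, runnable on any JVM — the library name "hadoop" is the standard one for libhadoop.so, and the catch block only mirrors NativeCodeLoader's fallback, it is not Hadoop's actual code:]

```java
// Show where this JVM searches for native libraries, and whether
// libhadoop can be loaded from there. If the load fails, Hadoop's
// NativeCodeLoader falls back to builtin-java classes and Snappy
// (which has no pure-Java implementation) becomes unusable.
public class NativePathCheck {
    public static void main(String[] args) {
        System.out.println("java.library.path = "
                + System.getProperty("java.library.path"));
        try {
            System.loadLibrary("hadoop"); // looks for libhadoop.so on the path above
            System.out.println("libhadoop loaded");
        } catch (UnsatisfiedLinkError e) {
            System.out.println("libhadoop not found on java.library.path");
        }
    }
}
```

[Running this inside the same JVM that executes the Ignite task shows whether the Hadoop native directory is actually on the search path of that process, as opposed to the path used by the YARN containers where the job succeeds.]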