Subject: Error While Processing SequenceFile with LZO Compression in Hive External Table (CDH4.3)
From: samir das mohapatra <samir.helpdoc@gmail.com>
To: user@hadoop.apache.org, user-help@hadoop.apache.org, cdh-user@cloudera.com, cdh-user@cloudera.org
Date: Thu, 20 Jun 2013 10:48:17 +0530

Dear All,

Has anyone faced this type of issue?

I am getting an error while processing a SequenceFile with LZO compression in a Hive query on the CDH 4.3.x distribution.

Settings and files:

SET hive.exec.compress.output=true;
SET mapred.output.compression.codec=com.hadoop.compression.lzo.LzopCodec;

-rw-r--r--   3 myuser supergroup   25172 2013-06-19 21:25 /user/myDir/000000_0  -- LZO-compressed sequence file
-rw-r--r--   3 myuser supergroup   71007 2013-06-19 21:42 /user/myDir/000000_0  -- normal sequence file
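For completeness, the writer-side session that produced the compressed file looks roughly like this (a sketch, not the exact job; the BLOCK compression type and the table names my_lzo_output / my_source_table are only illustrative):

    SET hive.exec.compress.output=true;
    SET mapred.output.compression.codec=com.hadoop.compression.lzo.LzopCodec;
    -- illustrative: sequence files are typically written block-compressed
    SET mapred.output.compression.type=BLOCK;

    INSERT OVERWRITE TABLE my_lzo_output
    SELECT userip, usertid
    FROM my_source_table;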
Now the problem: if I create an external table on top of the directory to read the data, it gives me this error:

Failed with exception java.io.IOException:java.io.EOFException: Premature EOF from inputStream

Table creation:

CREATE EXTERNAL TABLE IF NOT EXISTS MyTable
(
  userip  string,
  usertid string
)
ROW FORMAT DELIMITED
  FIELDS TERMINATED BY '\001' ESCAPED BY '\020'
  COLLECTION ITEMS TERMINATED BY '\002'
  MAP KEYS TERMINATED BY '\003'
  LINES TERMINATED BY '\012'
STORED AS SEQUENCEFILE
LOCATION '/path/to/file';

After that, querying the table (see the exact query in the P.S. below) fails with the same error:

Failed with exception java.io.IOException:java.io.EOFException: Premature EOF from inputStream

Why does this happen?

Regards,
samir
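P.S. For clarity, the query that triggers the error is just a plain select (the column list and LIMIT are only for illustration; any select on the table fails the same way):

    SELECT userip, usertid
    FROM MyTable
    LIMIT 10;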