hive-dev mailing list archives

From "Hive QA (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (HIVE-5783) Native Parquet Support in Hive
Date Fri, 07 Feb 2014 10:07:19 GMT

    [ https://issues.apache.org/jira/browse/HIVE-5783?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13894366#comment-13894366 ]

Hive QA commented on HIVE-5783:
-------------------------------



{color:red}Overall{color}: -1 no tests executed

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12627508/HIVE-5783.patch

Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/1230/testReport
Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/1230/console

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hive-ptest/working/scratch/source-prep.sh' failed with exit status 1 and
output '+ [[ -n '' ]]
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'M2_OPTS=-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
+ M2_OPTS='-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
+ cd /data/hive-ptest/working/
+ tee /data/hive-ptest/logs/PreCommit-HIVE-Build-1230/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ svn = \s\v\n ]]
+ [[ -n '' ]]
+ [[ -d apache-svn-trunk-source ]]
+ [[ ! -d apache-svn-trunk-source/.svn ]]
+ [[ ! -d apache-svn-trunk-source ]]
+ cd apache-svn-trunk-source
+ svn revert -R .
++ egrep -v '^X|^Performing status on external'
++ awk '{print $2}'
++ svn status --no-ignore
+ rm -rf target datanucleus.log ant/target shims/target shims/0.20/target shims/0.20S/target
shims/0.23/target shims/aggregator/target shims/common/target shims/common-secure/target packaging/target
hbase-handler/target testutils/target jdbc/target metastore/target itests/target itests/hcatalog-unit/target
itests/test-serde/target itests/qtest/target itests/hive-unit/target itests/custom-serde/target
itests/util/target hcatalog/target hcatalog/storage-handlers/hbase/target hcatalog/server-extensions/target
hcatalog/core/target hcatalog/webhcat/svr/target hcatalog/webhcat/java-client/target hcatalog/hcatalog-pig-adapter/target
hwi/target common/target common/src/gen service/target contrib/target serde/target beeline/target
odbc/target cli/target ql/dependency-reduced-pom.xml ql/target
+ svn update

Fetching external item into 'hcatalog/src/test/e2e/harness'
External at revision 1565604.

At revision 1565604.
+ patchCommandPath=/data/hive-ptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hive-ptest/working/scratch/build.patch
+ [[ -f /data/hive-ptest/working/scratch/build.patch ]]
+ chmod +x /data/hive-ptest/working/scratch/smart-apply-patch.sh
+ /data/hive-ptest/working/scratch/smart-apply-patch.sh /data/hive-ptest/working/scratch/build.patch
The patch does not appear to apply with p0, p1, or p2
+ exit 1
'
{noformat}
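
The failure above is in the final patch-application step: smart-apply-patch.sh could not find a strip level (p0, p1, or p2) at which the attachment applies to the trunk checkout. Below is a minimal bash sketch of that kind of probe, assuming a plain GNU patch dry run; it is an illustration only and not the actual contents of smart-apply-patch.sh.

{noformat}
#!/usr/bin/env bash
# Illustration only -- not the real smart-apply-patch.sh. Probes which
# strip level, if any, lets a patch apply cleanly against the checkout.
patch_file=${1:?usage: try-patch.sh <patch-file>}

for p in 0 1 2; do
  # --dry-run checks applicability without modifying the working copy
  if patch -p"$p" --dry-run -s < "$patch_file" > /dev/null 2>&1; then
    echo "patch applies with -p$p"
    exit 0
  fi
done

echo "The patch does not appear to apply with p0, p1, or p2" >&2
exit 1
{noformat}

When no strip level succeeds, the precommit build exits with status 1 before any tests run, which is why the overall result is "-1 no tests executed"; the usual fix is to regenerate the patch against current trunk.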

This message is automatically generated.

ATTACHMENT ID: 12627508

> Native Parquet Support in Hive
> ------------------------------
>
>                 Key: HIVE-5783
>                 URL: https://issues.apache.org/jira/browse/HIVE-5783
>             Project: Hive
>          Issue Type: New Feature
>          Components: Serializers/Deserializers
>            Reporter: Justin Coffey
>            Assignee: Justin Coffey
>            Priority: Minor
>             Fix For: 0.13.0
>
>         Attachments: HIVE-5783.noprefix.patch, HIVE-5783.patch, HIVE-5783.patch, HIVE-5783.patch,
HIVE-5783.patch, HIVE-5783.patch, HIVE-5783.patch, HIVE-5783.patch, HIVE-5783.patch, HIVE-5783.patch,
HIVE-5783.patch, HIVE-5783.patch, HIVE-5783.patch, HIVE-5783.patch, HIVE-5783.patch, HIVE-5783.patch
>
>
> Problem Statement:
> Hive would be easier to use if it had native Parquet support. Our organization, Criteo,
uses Hive extensively. Therefore we built the Parquet Hive integration and would like to now
contribute that integration to Hive.
> About Parquet:
> Parquet is a columnar storage format for Hadoop and integrates with many Hadoop ecosystem
tools such as Thrift, Avro, Hadoop MapReduce, Cascading, Pig, Drill, Crunch, and Hive. Pig,
Crunch, and Drill all contain native Parquet integration.
> Changes Details:
> Parquet was built with dependency management in mind and therefore only a single Parquet
jar will be added as a dependency.
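
As a rough, non-authoritative illustration of what the native integration enables from the Hive CLI: the table name, columns, source table, and the STORED AS PARQUET shorthand below are assumptions for the sketch, not details taken from the attached patch, and the exact syntax shipped for HIVE-5783 may differ.

{noformat}
# Hypothetical usage sketch; identifiers and DDL shorthand are assumptions.
hive -e "
  CREATE TABLE clicks_parquet (user_id BIGINT, url STRING, ts TIMESTAMP)
  STORED AS PARQUET;

  INSERT OVERWRITE TABLE clicks_parquet
  SELECT user_id, url, ts FROM clicks;
"
{noformat}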



--
This message was sent by Atlassian JIRA
(v6.1.5#6160)
