hive-dev mailing list archives

From "Hive QA (JIRA)" <>
Subject [jira] [Commented] (HIVE-8347) Use base-64 encoding instead of custom encoding for serialized objects
Date Sat, 04 Oct 2014 22:36:33 GMT


Hive QA commented on HIVE-8347:

{color:red}Overall{color}: -1 at least one test failed

Here are the results of testing the latest attachment:

{color:red}ERROR:{color} -1 due to 1 failed/errored test(s), 6524 tests executed
*Failed tests:*

Test results:
Console output:
Test logs:

Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 1 tests failed

This message is automatically generated.


> Use base-64 encoding instead of custom encoding for serialized objects
> ----------------------------------------------------------------------
>                 Key: HIVE-8347
>                 URL:
>             Project: Hive
>          Issue Type: Improvement
>          Components: HCatalog
>    Affects Versions: 0.13.1
>            Reporter: Mariappan Asokan
>         Attachments: HIVE-8347.patch
> Serialized objects that are shipped via Hadoop {{Configuration}} are encoded using a custom
> encoding (see {{HCatUtil.encodeBytes()}} and its complement {{HCatUtil.decodeBytes()}}) which
> has 100% overhead. In other words, each byte in the serialized object becomes 2 bytes after
> encoding. This might be one of the reasons for the problem reported in HCATALOG-453.
> The patch for HCATALOG-453 compressed serialized {{InputJobInfo}} objects to solve the problem.
> By using Base64 encoding, the overhead will be reduced to about 33%. This will alleviate
> the problem for all serialized objects.
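The overhead figures above can be sketched with a small comparison. The {{hexEncode}} helper below is a hypothetical stand-in for any two-characters-per-byte scheme (an assumption; it is not the actual {{HCatUtil.encodeBytes()}} implementation, which the report only characterizes as having 100% overhead), while the Base64 side uses the standard {{java.util.Base64}} codec:

```java
import java.util.Base64;

public class EncodingOverhead {

    // Hypothetical stand-in for a custom two-chars-per-byte encoding:
    // every input byte becomes two output characters (100% overhead).
    static String hexEncode(byte[] data) {
        StringBuilder sb = new StringBuilder(data.length * 2);
        for (byte b : data) {
            sb.append(String.format("%02x", b));
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // Stand-in payload for a serialized object such as InputJobInfo.
        byte[] serialized = new byte[3000];

        String hex = hexEncode(serialized);
        String b64 = Base64.getEncoder().encodeToString(serialized);

        // Base64 maps every 3 input bytes to 4 output characters (~33% overhead).
        System.out.println("original bytes : " + serialized.length); // 3000
        System.out.println("hex chars      : " + hex.length());      // 6000
        System.out.println("base64 chars   : " + b64.length());      // 4000
    }
}
```

For a 3000-byte payload, the two-chars-per-byte scheme emits 6000 characters while Base64 emits 4000, matching the 100% vs. ~33% overhead figures in the description.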

This message was sent by Atlassian JIRA
