hadoop-pig-dev mailing list archives

From "Daniel Dai (JIRA)" <j...@apache.org>
Subject [jira] Commented: (PIG-337) If limit size exceeds number of records in the file, a few records get dropped
Date Fri, 25 Jul 2008 06:31:31 GMT

    [ https://issues.apache.org/jira/browse/PIG-337?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12616772#action_12616772 ]

Daniel Dai commented on PIG-337:
--------------------------------

More generally, any input file containing duplicate records is potentially affected, even when the
limit size is within the number of records in the file. Will submit a patch shortly.
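To illustrate the class of bug being described (this is a hypothetical sketch, not Pig's actual limit implementation): if a limit operator compares records by value while counting, duplicate records get silently skipped, so the output falls short of the requested size even when the limit exceeds the input size.

```python
def buggy_limit(records, n):
    """Hypothetical limit that wrongly deduplicates by value while counting."""
    seen = set()
    out = []
    for rec in records:
        if len(out) >= n:
            break
        if rec in seen:   # value-based comparison drops duplicate records
            continue
        seen.add(rec)
        out.append(rec)
    return out

# Requesting 100 rows from a 4-row input still loses the duplicate "alice":
rows = ["alice", "bob", "alice", "carol"]
print(buggy_limit(rows, 100))  # ['alice', 'bob', 'carol'] -- one record dropped
```

A correct limit would count records without comparing them, returning all 4 rows here.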

> If limit size exceeds number of records in the file, a few records get dropped
> ------------------------------------------------------------------------------
>
>                 Key: PIG-337
>                 URL: https://issues.apache.org/jira/browse/PIG-337
>             Project: Pig
>          Issue Type: Bug
>          Components: impl
>    Affects Versions: types_branch
>            Reporter: Alan Gates
>             Fix For: types_branch
>
>
> Given a file with 10k records, the following script returned 9996 records:
> a = load 'studenttab10k';
> b = limit a 100000;
> dump b;
> It looks like maybe the limit operator isn't returning its last record or something.

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.

