flink-dev mailing list archives

From Bastian Köcher (JIRA) <j...@apache.org>
Subject [jira] [Created] (FLINK-941) Possible deadlock after increasing my data set size
Date Mon, 16 Jun 2014 11:57:02 GMT
Bastian Köcher created FLINK-941:
------------------------------------

             Summary: Possible deadlock after increasing my data set size
                 Key: FLINK-941
                 URL: https://issues.apache.org/jira/browse/FLINK-941
             Project: Flink
          Issue Type: Bug
    Affects Versions: pre-apache-0.5.1
            Reporter: Bastian Köcher


If I increase the size of my data set, my algorithm stops at some point and makes no further progress.
I have waited for quite a while, but nothing happens. The Linux process monitor also shows
that the process is sleeping, waiting for something to happen, so this could be a deadlock.

I attached the source of my program; the class HAC_2 contains the actual algorithm.
Changing line 271 from "if(Integer.parseInt(tokens[0]) > 282)" to "if(Integer.parseInt(tokens[0])
> 283)" on my PC "enables" the bug. The numbers 282 and 283 correspond to the number of documents
in my test data, and this line skips all documents with an id greater than that threshold.
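Since the attachment is not available in the archive, the filter around line 271 can only be sketched. The following is a minimal, hypothetical reconstruction assuming tokens[0] holds the document id as the first field of a delimited input line; the class and method names (DocFilter, skip) are illustrative, not from HAC_2:

```java
// Hypothetical sketch of the reported filter (line 271 of HAC_2),
// assuming tab-delimited records whose first field is the document id.
public class DocFilter {
    // Threshold 282 keeps ids up to 282; raising it to 283 admits one more
    // document, which is reportedly enough to make the job hang.
    static final int MAX_ID = 282;

    static boolean skip(String line) {
        String[] tokens = line.split("\t");
        return Integer.parseInt(tokens[0]) > MAX_ID;
    }

    public static void main(String[] args) {
        System.out.println(skip("282\tsome text")); // false: document is kept
        System.out.println(skip("283\tsome text")); // true: document is skipped
    }
}
```

The point of the sketch is that the filter itself is trivial; the hang appears to depend only on how many records pass it, which is consistent with a size-dependent deadlock in the runtime rather than a bug in the filter.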



--
This message was sent by Atlassian JIRA
(v6.2#6252)
