flink-user mailing list archives

From Greg Hogan <c...@greghogan.com>
Subject Re: Flink error: Too few memory segments provided
Date Thu, 20 Oct 2016 15:53:46 GMT
By default Flink only allocates 2048 network buffers (64 MiB at 32
KiB/buffer). Have you increased the value for
taskmanager.network.numberOfBuffers in flink-conf.yaml?
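For example, the value can be raised in flink-conf.yaml. This is a sketch, not a recommendation — the figure of 4096 is an illustrative assumption; size it to your job and available memory:

```yaml
# flink-conf.yaml
# Each network buffer is 32 KiB, so 4096 buffers reserve 128 MiB.
# The default of 2048 (64 MiB) can be too small for iterative (Gelly) jobs.
taskmanager.network.numberOfBuffers: 4096
```

The TaskManager reads this file at startup, so it must be restarted for the change to take effect.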

On Thu, Oct 20, 2016 at 11:24 AM, otherwise777 <wouter@onzichtbaar.net>
wrote:

> I got this error in Gelly, which I believe comes from Flink itself:
>
> Exception in thread "main"
> org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
>         at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$8.apply$mcV$sp(JobManager.scala:822)
>         at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$8.apply(JobManager.scala:768)
>         at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$8.apply(JobManager.scala:768)
>         at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
>         at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
>         at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:41)
>         at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:401)
>         at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
>         at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
>         at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
>         at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
> Caused by: java.lang.IllegalArgumentException: Too few memory segments
> provided. Hash Table needs at least 33 memory segments.
>         at org.apache.flink.runtime.operators.hash.CompactingHashTable.<init>(CompactingHashTable.java:206)
>         at org.apache.flink.runtime.operators.hash.CompactingHashTable.<init>(CompactingHashTable.java:191)
>         at org.apache.flink.runtime.iterative.task.IterationHeadTask.initCompactingHashTable(IterationHeadTask.java:175)
>         at org.apache.flink.runtime.iterative.task.IterationHeadTask.run(IterationHeadTask.java:272)
>         at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:351)
>         at org.apache.flink.runtime.taskmanager.Task.run(Task.java:584)
>         at java.lang.Thread.run(Thread.java:745)
>
> I found a related topic:
> http://mail-archives.apache.org/mod_mbox/flink-dev/201503.mbox/%3CCAK5ODX4KJ9TB4yJ=BcNwsozbOoXwdB7HM9qvWoa1P9HK-Gb-Dg@mail.gmail.com%3E
> but I don't think the problem is the same.
>
> The code is as follows:
>
>         ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
>         DataSource twitterEdges = env.readCsvFile("./datasets/out.munmun_twitter_social").fieldDelimiter(" ").ignoreComments("%").types(Long.class, Long.class);
>         Graph graph = Graph.fromTuple2DataSet(twitterEdges, new testinggraph.InitVertices(), env);
>         DataSet verticesWithCommunity = (DataSet) graph.run(new LabelPropagation(1));
>         System.out.println(verticesWithCommunity.count());
>
> The dataset has only a couple of edges.
>
> I tried adding a config file to the project to set a couple of the settings
> found here:
> https://ci.apache.org/projects/flink/flink-docs-release-0.8/config.html
> but that didn't work either.
>
> I have no idea how to fix this at the moment. It's not just LabelPropagation
> that goes wrong; all Gelly methods that use an iteration give this exact
> error.
>
>
> --
> View this message in context: http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/Flink-error-Too-few-memory-segments-provided-tp9657.html
> Sent from the Apache Flink User Mailing List archive. mailing list archive at Nabble.com.
>
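Since the snippet in the question looks like it runs from an IDE, where flink-conf.yaml is not read, the buffer count can instead be passed programmatically to a local environment. The sketch below is a hedged illustration for Flink's 1.x batch API; the value of 4096 and the class name `LocalBufferConfig` are assumptions for this example, not something from the thread:

```java
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.configuration.Configuration;

public class LocalBufferConfig {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Same key as in flink-conf.yaml; 4096 buffers x 32 KiB = 128 MiB.
        conf.setInteger("taskmanager.network.numberOfBuffers", 4096);
        // createLocalEnvironment(Configuration) starts an embedded mini-cluster
        // that honors this configuration, unlike the plain default environment.
        ExecutionEnvironment env = ExecutionEnvironment.createLocalEnvironment(conf);
        // ... build the Gelly graph and run the iterative job with this env ...
    }
}
```

The same setting applied via flink-conf.yaml only affects standalone/cluster TaskManagers, which is one possible reason the poster's project-local config file had no effect.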
