kafka-users mailing list archives

From Jay Kreps <jay.kr...@gmail.com>
Subject Re: Payload size exception
Date Tue, 29 Jan 2013 16:04:54 GMT
There is a setting that controls the maximum message size. It ensures that
messages can be read on the server and by all consumers without running out
of memory or exceeding the consumer fetch size. In 0.7.x this limit is
controlled by the broker configuration max.message.size.
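A minimal sketch of raising the limit in the broker's server.properties (the 2 MB value is illustrative, chosen to cover the 1772597-byte payload below; verify the key against your 0.7.x deployment):

```properties
# Kafka 0.7.x broker server.properties -- illustrative value
# Default is 1000000 (~1 MB); raise it above the largest expected payload.
max.message.size=2097152
```

Note that, per the consumer fetch size caveat above, consumers reading the topic would likewise need their fetch.size set to at least the new maximum, or they will be unable to fetch the larger messages.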

-Jay


On Tue, Jan 29, 2013 at 12:18 AM, Bo Sun <docsun.bo@gmail.com> wrote:

> Hi all,
> I've got an exception:
> kafka.common.MessageSizeTooLargeException: payload size of 1772597 larger
> than 1000000
>         at kafka.message.ByteBufferMessageSet.verifyMessageSize(ByteBufferMessageSet.scala:93)
>         at kafka.producer.SyncProducer.send(SyncProducer.scala:122)
>         at kafka.producer.ProducerPool$$anonfun$send$1.apply$mcVI$sp(ProducerPool.scala:114)
>         at kafka.producer.ProducerPool$$anonfun$send$1.apply(ProducerPool.scala:100)
>         at kafka.producer.ProducerPool$$anonfun$send$1.apply(ProducerPool.scala:100)
>         at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:57)
>         at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:43)
>         at kafka.producer.ProducerPool.send(ProducerPool.scala:100)
>         at kafka.producer.Producer.zkSend(Producer.scala:140)
>         at kafka.producer.Producer.send(Producer.scala:99)
>         at kafka.javaapi.producer.Producer.send(Producer.scala:103)
>
> I don't know why. Please help me. Thanks.
>
