avro-user mailing list archives

From Felix GV <fe...@mate1inc.com>
Subject Re: Is Avro right for me?
Date Thu, 06 Jun 2013 19:09:56 GMT
Also, if you end up choosing to use Kafka and persisting your messages into
Hadoop, then you should take a look at
Camus<https://github.com/linkedin/camus> (which
is also from LinkedIn).

If you do things the LinkedIn way right from the start (i.e., using the
AVRO-1124 schema repo and encoding time in a standard way in a header
contained in all your schemas), then you can use Camus pretty much out of
the box without any tweaking, and the solution you'll get is very flexible
and extensible (regarding the ability to evolve your schemas gracefully,
letting Camus discover new topics to persist automatically, etc.).
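As a rough illustration of what "a standard header contained in all your schemas" could look like, here is a hedged sketch of such a schema; the record and field names (PageViewEvent, EventHeader, time, server) are made up for this example, not the actual LinkedIn convention:

```json
{
  "type": "record",
  "name": "PageViewEvent",
  "fields": [
    {"name": "header", "type": {
      "type": "record",
      "name": "EventHeader",
      "fields": [
        {"name": "time", "type": "long"},
        {"name": "server", "type": "string"}
      ]
    }},
    {"name": "page", "type": "string"}
  ]
}
```

Because every schema embeds the same header record, a tool like Camus can extract the event time the same way for every topic.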

For us it was a little more complicated since we had some legacy stuff that
was not exactly how Camus expected it, but it wasn't that complicated to
integrate either...
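The versioned-schema framing discussed in the quoted message below (prefix each message with a schema version number, then look the schema up in a repo) can be sketched in plain Python. This is a minimal stand-in, not a real implementation: a dict plays the role of the AVRO-1124 schema repo, JSON stands in for Avro binary encoding to keep it standard-library-only, and the 4-byte big-endian version prefix is an assumption rather than a fixed standard:

```python
import json
import struct

# Toy stand-in for a versioned schema repository (AVRO-1124-style):
# maps a schema version number to the writer's schema.
SCHEMA_REPO = {
    1: {
        "type": "record",
        "name": "Event",
        "fields": [{"name": "page", "type": "string"}],
    },
}

def frame(version, payload):
    """Prefix the serialized payload with a 4-byte big-endian schema version."""
    return struct.pack(">I", version) + payload

def unframe(message):
    """Split a framed message into (writer_schema, payload)."""
    version, = struct.unpack(">I", message[:4])
    return SCHEMA_REPO[version], message[4:]

# JSON stands in here for the Avro binary encoding of a record.
payload = json.dumps({"page": "/home"}).encode()
msg = frame(1, payload)
schema, body = unframe(msg)
assert schema["name"] == "Event"
assert json.loads(body) == {"page": "/home"}
```

The point of the prefix is that a consumer can decode messages written with an older schema version long after producers have moved on, as long as the repo retains every version.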


On Thu, Jun 6, 2013 at 2:51 PM, Felix GV <felix@mate1inc.com> wrote:

> You can serialize avro messages into json or binary format, and then pass
> them around to anything else (send them over HTTP, publish them to a
> message broker system like Kafka or Flume, write them directly into a data
> store, etc.). You can forget about Avro RPC, as it's just one of many
> ways of doing this.
> You do need to manage schemas properly though. The easy way is to hardcode
> your schema on both ends, but then that makes it harder to evolve schemas
> (which avro can do very well otherwise). If you send single serialized avro
> messages around through a message broker system, then you should definitely
> consider using a version number for your schema at the beginning of the
> message, as Martin suggested. Then you can look up what schema each version
> number represents with something like the versioned schema repo in
> AVRO-1124 <https://issues.apache.org/jira/browse/AVRO-1124>.
> --
> Felix
> On Tue, Jun 4, 2013 at 11:10 PM, Mark <static.void.dev@gmail.com> wrote:
>> I have a question.  Say I want to use AVRO as my serialization format to
>> speak between service applications. Do I need to use AVRO RPC for this or
>> can I just exchange AVRO messages over HTTP?
>> Also, what's the difference between an IPC client and an HTTP IPC client?
>> https://github.com/apache/avro/tree/trunk/lang/ruby/test
>> Thanks
>> On May 29, 2013, at 8:02 PM, Mike Percy <mpercy@apache.org> wrote:
>> There is no Ruby support for the Netty Avro RPC protocol that I know of.
>> But I'm not sure why that matters, other than the fact that the Flume
>> Thrift support isn't in an official release yet.
>> You could also take a look at the Flume HTTP source for a REST-based
>> interface, but to accept binary data instead of JSON (the default) you
>> would need to write a small bit of Java code and plug that in.
>> Make sure you differentiate between using Avro as a data storage format
>> and as an RPC mechanism. They are two very different things and don't need
>> to be tied together. Today, the data storage aspect is more mature and has
>> much wider language support.
>> Mike
>> On Wed, May 29, 2013 at 9:30 AM, Mark <static.void.dev@gmail.com> wrote:
>>> So basically Avro RPC is out of the question? Instead I would need to do
>>> Avro Message -> Thrift -> Flume? Is that along the right lines or am I
>>> missing something?
>>> On May 28, 2013, at 5:02 PM, Mike Percy <mpercy@apache.org> wrote:
>>> Regarding Ruby support, we recently added support for Thrift RPC, so you
>>> can now send messages to Flume via Ruby and other non-JVM languages. We
>>> don't have out-of-the-box client APIs for those yet but would be happy to
>>> accept patches for it :)
