camel-dev mailing list archives

From Sandy <>
Subject Duplicate values from csv are inserted to DB using Apache Camel
Date Fri, 25 Mar 2016 04:57:27 GMT
I have a large set of CSV files, each containing around a million records, so I use
seda to take advantage of multi-threading. I split the input into chunks of 50,000
lines, process each chunk into a List of entity objects, and then split that list and
persist it to the DB using JPA. Initially I was getting an Out of Heap Memory
exception, but after moving to a machine with more memory the heap issue was resolved.
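To make the chunking concrete, here is a minimal sketch of what `<tokenize token="\n" group="50000"/>` does: lines are emitted in fixed-size groups rather than as one huge message. `LineGrouper` and `groupLines` are hypothetical names for illustration, not Camel API.

```java
import java.util.ArrayList;
import java.util.List;

public class LineGrouper {
    // Hypothetical helper, not a Camel API: collects lines into
    // fixed-size batches, plus one final partial batch if needed.
    public static List<List<String>> groupLines(List<String> lines, int groupSize) {
        List<List<String>> groups = new ArrayList<>();
        List<String> current = new ArrayList<>();
        for (String line : lines) {
            current.add(line);
            if (current.size() == groupSize) {
                groups.add(current);
                current = new ArrayList<>();
            }
        }
        if (!current.isEmpty()) {
            groups.add(current); // final partial group
        }
        return groups;
    }
}
```

With streaming enabled, Camel forwards each group downstream as soon as it is full, which is what keeps the heap bounded.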

But right now the issue is that duplicate records are being inserted into the DB:
if there are 1,000,000 records in the CSV, around 2,000,000 records end up in the
database. The records in the CSV files have no primary key, so I use Hibernate to
generate one for each entity.
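One thing worth noting: a generated surrogate key cannot deduplicate rows, because every persist gets a fresh id. The sketch below is not Hibernate itself, just a toy table (`SurrogateKeyTable` is a hypothetical name) showing why inserting the same CSV row twice yields two distinct DB rows.

```java
import java.util.ArrayList;
import java.util.List;

public class SurrogateKeyTable {
    private final List<String> rows = new ArrayList<>();
    private long nextId = 1;

    // Assigns a new generated id on every insert, analogous to an
    // identity/sequence-generated primary key: identical row content
    // still produces a distinct row with a distinct key.
    public long insert(String csvRow) {
        rows.add(csvRow);
        return nextId++;
    }

    public int rowCount() {
        return rows.size();
    }
}
```

So if the same chunk is consumed twice (for example, by two routes reading from the same endpoint), the generated key will not stop the second copy from being inserted.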

Below is my code (camel-context.xml):

<camelContext xmlns="">
    <route>
        <from uri="file:C:\Users\PPP\Desktop\input?noop=true" />
        <to uri="seda:StageIt" />
    </route>

    <route>
        <from uri="seda:StageIt?concurrentConsumers=1" />
        <split streaming="true">
            <tokenize token="\n" group="50000" />
            <to uri="seda:WriteToFile" />
        </split>
    </route>

    <route>
        <from uri="seda:WriteToFile?concurrentConsumers=8" />
        <setHeader headerName="CamelFileName">
            <!-- header expression was missing in the original message -->
        </setHeader>
        <unmarshal ref="bindyDataformat">
            <bindy type="Csv" classType="target.bindy.RealEstate" />
        </unmarshal>
        <to uri="jpa:target.bindy.RealEstate" />
    </route>
</camelContext>

Please help.
