ignite-user mailing list archives

From <itriA40...@itri.org.tw>
Subject RE: Performance of Ignite integrating with PostgreSQL
Date Tue, 27 Mar 2018 09:41:23 GMT
Hi Vinokurov,

I ran your code for 30 minutes while monitoring it with "atop".
The average write speed was about 2151.55 KB per second.
The performance is better than before, but there is still a gap compared with your result.
Is there anything I can improve?
Thanks.

Here are my hardware specifications.
CPU:
      Intel(R) Xeon(R) CPU E3-1220 V2 @ 3.10GHz
      4 cores
Memory:
      16 GB

Atop observations:
disk   busy   read/s   KB/read   writ/s   KB/writ   avque   avserv
sda    89%    29.7     14.8      116.3    18.5      13.1    6.13 ms


A sample of the printed times per putAll call:
221ms
23ms
22ms
60ms
56ms
71ms
140ms
105ms
117ms
69ms
91ms
89ms
32ms
271ms
24ms
23ms
55ms
90ms
69ms
1987ms
337ms
316ms
322ms
339ms
101ms
170ms
22ms
41ms
43ms
110ms
668ms
29ms
27ms
28ms
24ms
22ms






From: Pavel Vinokurov [mailto:vinokurov.pasha@gmail.com]
Sent: Thursday, March 22, 2018 11:07 PM
To: user@ignite.apache.org
Subject: Re: Performance of Ignite integrating with PostgreSQL

In your example you add the same key/values into the cache, so it just overwrites the
existing entries and persists only 100 entries.
Please look at the project https://bitbucket.org/vinokurov-pavel/ignite-postgres . I get
~70-100 Mb/s on my SSD.
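A minimal sketch of what per-batch unique keys could look like, so that each putAll call inserts fresh rows instead of overwriting the same 100 keys (the class and method names here are mine, not from the thread):

```java
import java.util.HashMap;
import java.util.Map;

public class UniqueKeyBatch {
    // Builds a batch whose keys never repeat across calls:
    // batchNo 0 -> keys "0".."99", batchNo 1 -> keys "100".."199", ...
    public static Map<String, String> nextBatch(long batchNo, int batchSize) {
        Map<String, String> batch = new HashMap<>();
        for (int i = 0; i < batchSize; i++) {
            batch.put(Long.toString(batchNo * batchSize + i), "writeAll_val");
        }
        return batch;
    }
}
```

In the benchmark loop this would be something like `igniteCache.putAll(UniqueKeyBatch.nextBatch(n++, 100))` rather than re-putting the same map every iteration.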

2018-03-22 11:55 GMT+03:00 <itriA40453@itri.org.tw>:
Hi Vinokurov,

I changed my code
>> IgniteCache<String, String> igniteCache = ignite.getOrCreateCache("testCache");
To
IgniteCache<String, String> igniteCache = ignite.cache("testCache");
And update to 2.4.0 version.

But the writing speed is still about 100 KB per second.


Below is the JDBC connection initialization:
@Autowired
public NamedParameterJdbcTemplate jdbcTemplate;

@Override
public void start() throws IgniteException {
    ConfigurableApplicationContext context = new ClassPathXmlApplicationContext("postgres-context.xml");
    this.jdbcTemplate = context.getBean(NamedParameterJdbcTemplate.class);
}


The PostgreSQL configuration, "postgres-context.xml":
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xmlns:context="http://www.springframework.org/schema/context"
       xsi:schemaLocation="
       http://www.springframework.org/schema/beans
       http://www.springframework.org/schema/beans/spring-beans.xsd
       http://www.springframework.org/schema/context
       http://www.springframework.org/schema/context/spring-context.xsd">

    <context:component-scan base-package="com.blu.imdg.jdbc"/>
    <context:property-placeholder location="classpath:jdbc.properties"/>

    <bean id="dataSource" class="org.springframework.jdbc.datasource.DriverManagerDataSource">
        <property name="driverClassName" value="${jdbc.driver}"/>
        <property name="url" value="${jdbc.url}"/>
        <property name="username" value="${jdbc.username}"/>
        <property name="password" value="${jdbc.password}"/>
    </bean>
    <bean id="jdbcTemplate"
          class="org.springframework.jdbc.core.namedparam.NamedParameterJdbcTemplate">
        <constructor-arg ref="dataSource"/>
    </bean>
</beans>
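One thing worth checking here: DriverManagerDataSource opens a new physical connection for every statement, which by itself can limit write-behind throughput. A hedged sketch of swapping in a pooled DataSource instead (HikariCP shown as one option; the pool size is illustrative, and the bean properties should be verified against the HikariCP documentation):

```xml
<bean id="dataSource" class="com.zaxxer.hikari.HikariDataSource">
    <property name="driverClassName" value="${jdbc.driver}"/>
    <property name="jdbcUrl" value="${jdbc.url}"/>
    <property name="username" value="${jdbc.username}"/>
    <property name="password" value="${jdbc.password}"/>
    <!-- illustrative; size the pool to match the flush thread count -->
    <property name="maximumPoolSize" value="16"/>
</bean>
```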



Thanks.


From: Vinokurov Pavel [mailto:vinokurov.pasha@gmail.com]
Sent: Thursday, March 22, 2018 1:50 PM

To: user@ignite.apache.org
Subject: Re: Performance of Ignite integrating with PostgreSQL

Also, it makes sense to use the new 2.4 version.

2018-03-22 8:37 GMT+03:00 Vinokurov Pavel <vinokurov.pasha@gmail.com>:
>> IgniteCache<String, String> igniteCache = ignite.getOrCreateCache("testCache");
Please change this to ignite.cache("testCache") to be sure that we use the configuration
from the file.

2018-03-22 8:19 GMT+03:00 Vinokurov Pavel <vinokurov.pasha@gmail.com>:
You already showed the cache configuration, but could you show the JDBC connection initialization?

2018-03-22 7:59 GMT+03:00 Vinokurov Pavel <vinokurov.pasha@gmail.com>:
Hi,

Could you please show the "PATH/example-cache.xml" file?

2018-03-21 9:40 GMT+03:00 <itriA40453@itri.org.tw>:
Hi Vinokurov,

Thanks for your reply.
I tried writing in batches of 100 entries,
but I got a worse result: the writing speed dropped to 12.09 KB per second.
Below is my code, rewritten to use putAll and writeAll.
Did I make some mistakes?



Main function:
        Ignite ignite = Ignition.start("PATH/example-cache.xml");
        IgniteCache<String, String> igniteCache = ignite.getOrCreateCache("testCache");
        Map<String, String> parameterMap = new HashMap<>();
        for (int i = 0; i < 100; i++) {
            parameterMap.put(Integer.toString(i), "writeAll_val");
        }

        while (true) {
            igniteCache.putAll(parameterMap);
        }


Write all to PostgreSQL through JDBC:
@Override
public void writeAll(Collection<Cache.Entry<? extends String, ? extends String>> entries)
        throws CacheWriterException {
    Iterator<Cache.Entry<? extends String, ? extends String>> it = entries.iterator();
    Map<String, Object> parameterMap = new HashMap<>();
    int count = 1;
    while (it.hasNext()) {
        Cache.Entry<? extends String, ? extends String> entry = it.next();
        parameterMap.put("val" + count, entry.getValue());
        count++;
        it.remove();
    }

    String sqlString = "INSERT INTO test_writeall(val) VALUES "
            + "(:val1),(:val2),(:val3),(:val4),(:val5),(:val6),(:val7),(:val8),(:val9),(:val10),"
            + "(:val11),(:val12),(:val13),(:val14),(:val15),(:val16),(:val17),(:val18),(:val19),(:val20),"
            + "(:val21),(:val22),(:val23),(:val24),(:val25),(:val26),(:val27),(:val28),(:val29),(:val30),"
            + "(:val31),(:val32),(:val33),(:val34),(:val35),(:val36),(:val37),(:val38),(:val39),(:val40),"
            + "(:val41),(:val42),(:val43),(:val44),(:val45),(:val46),(:val47),(:val48),(:val49),(:val50),"
            + "(:val51),(:val52),(:val53),(:val54),(:val55),(:val56),(:val57),(:val58),(:val59),(:val60),"
            + "(:val61),(:val62),(:val63),(:val64),(:val65),(:val66),(:val67),(:val68),(:val69),(:val70),"
            + "(:val71),(:val72),(:val73),(:val74),(:val75),(:val76),(:val77),(:val78),(:val79),(:val80),"
            + "(:val81),(:val82),(:val83),(:val84),(:val85),(:val86),(:val87),(:val88),(:val89),(:val90),"
            + "(:val91),(:val92),(:val93),(:val94),(:val95),(:val96),(:val97),(:val98),(:val99),(:val100);";

    jdbcTemplate.update(sqlString, parameterMap);
}
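The hard-coded placeholder list above only works when writeAll receives exactly 100 entries; write-behind flushes can hand over smaller batches, in which case not all named parameters are bound. A sketch of building the VALUES list from the actual batch size (the helper class name is mine, not from the thread):

```java
import java.util.StringJoiner;

public class InsertSqlBuilder {
    // Builds "INSERT INTO test_writeall(val) VALUES (:val1),(:val2),...,(:valN);"
    // for whatever batch size writeAll actually receives.
    public static String buildInsertSql(int batchSize) {
        StringJoiner values = new StringJoiner(",");
        for (int i = 1; i <= batchSize; i++) {
            values.add("(:val" + i + ")");
        }
        return "INSERT INTO test_writeall(val) VALUES " + values + ";";
    }
}
```

Inside writeAll this would replace the fixed sqlString with `buildInsertSql(parameterMap.size())`.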



From: Vinokurov Pavel [mailto:vinokurov.pasha@gmail.com]
Sent: Wednesday, March 14, 2018 5:42 PM
To: user@ignite.apache.org
Subject: Re: Performance of Ignite integrating with PostgreSQL

Hi,

You could try to use igniteCache.putAll to write batches of 1000 entries.
Use the following SQL in the PostgresDBStore#writeAll method to put data into the database:
String sqlString = "INSERT INTO test(val) VALUES (:val1),(:val2),(:val3);";


2018-03-14 11:58 GMT+03:00 <itriA40453@itri.org.tw>:
Hi,
I am trying to use Ignite integrated with PostgreSQL,
and I use "atop" to monitor the data written to PostgreSQL.
I observed that the writing speed is about 1 MB per second.
This performance is not really good. Below are my configuration and code. Please help me
improve them.
Thanks.

Here is my cache configuration:
<bean class="org.apache.ignite.configuration.CacheConfiguration">
    <property name="name" value="testCache"/>
    <property name="cacheMode" value="PARTITIONED"/>
    <property name="atomicityMode" value="ATOMIC"/>
    <property name="atomicWriteOrderMode" value="PRIMARY"/>
    <property name="readThrough" value="true"/>
    <property name="writeThrough" value="true"/>
    <property name="writeBehindEnabled" value="true"/>

    <property name="writeBehindFlushThreadCount" value="64"/>
    <property name="writeBehindBatchSize" value="131072"/>
    <property name="writeBehindFlushSize" value="131072"/>

    <property name="offHeapMaxMemory" value="0"/>
    <property name="cacheStoreFactory">
        <bean class="javax.cache.configuration.FactoryBuilder$SingletonFactory">
            <constructor-arg>
                <bean class="com.blu.imdg.jdbc.PostgresDBStore"/>
            </constructor-arg>
        </bean>
    </property>
    <property name="backups" value="0"/>
    <property name="indexedTypes">
        <list>
            <value>java.lang.String</value>
            <value>java.lang.String</value>
        </list>
    </property>
</bean>
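For reference, with writeBehindFlushSize and writeBehindBatchSize both at 131072 the store flushes rarely and then in large bursts, which would match the latency spikes seen elsewhere in the thread. Ignite's CacheConfiguration also exposes a time-based trigger, writeBehindFlushFrequency (in milliseconds); the values below are illustrative, not tuned:

```xml
<property name="writeBehindFlushThreadCount" value="4"/>
<property name="writeBehindBatchSize" value="512"/>
<property name="writeBehindFlushSize" value="10240"/>
<!-- flush at least once per second, even below the size thresholds -->
<property name="writeBehindFlushFrequency" value="1000"/>
```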


Main function:
        Ignite ignite = Ignition.start("PATH/example-cache.xml");
        IgniteCache<String, String> igniteCache = ignite.getOrCreateCache("testCache");
        int seqint = 0;
        while (true) {
            igniteCache.put(Integer.toString(seqint), "valueString");
            seqint++;
        }


Write behind to PostgreSQL through JDBC:
@Override
public void write(Cache.Entry<? extends String, ? extends String> entry)
        throws CacheWriterException {
    Map<String, Object> parameterMap = new HashMap<>();
    parameterMap.put("val", entry.getValue());
    String sqlString = "INSERT INTO test(val) VALUES (:val);";
    jdbcTemplate.update(sqlString, parameterMap);
}



--
本信件可能包含工研院機密資訊,非指定之收件者,請勿使用或揭露本信件內容,並請銷毀此信件。
This email may contain confidential information. Please do not use or disclose it in any way
and delete it if you are not the intended recipient.



--

Regards

Pavel Vinokurov


--
本信件可能包含工研院機密資訊,非指定之收件者,請勿使用或揭露本信件內容,並請銷毀此信件。
This email may contain confidential information. Please do not use or disclose it in any way
and delete it if you are not the intended recipient.



--

Regards

Pavel Vinokurov



--

Regards

Pavel Vinokurov



--

Regards

Pavel Vinokurov



--

Regards

Pavel Vinokurov


--
本信件可能包含工研院機密資訊,非指定之收件者,請勿使用或揭露本信件內容,並請銷毀此信件。
This email may contain confidential information. Please do not use or disclose it in any way
and delete it if you are not the intended recipient.



--

Regards

Pavel Vinokurov


--
本信件可能包含工研院機密資訊,非指定之收件者,請勿使用或揭露本信件內容,並請銷毀此信件。
This email may contain confidential information. Please do not use or disclose it in any way
and delete it if you are not the intended recipient.
Mime
View raw message