asterixdb-dev mailing list archives

From: Mohiuddin Qader <mabdu...@ucr.edu>
Subject: Re: AsterixDB merge compaction issue with rtree index
Date: Sun, 15 Jul 2018 09:21:44 GMT
After debugging it myself and looking through the logs, I tracked down the
problem. It looks like the merge compaction fails due to a data type issue for
the points during the MBR calculation at the end of bulk loading (I don't know
why) and throws the exception below. But the main bug here is that there is no
follow-up to clean up the failed merged file, and thus no subsequent compaction
can occur. I have attached a screenshot.


org.apache.hyracks.algebricks.common.exceptions.NotImplementedException: Value provider for type bigint is not implemented.
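
To show what I mean by the data type issue: the MBR calculation apparently
needs a per-type primitive value provider for every field type that can show
up in the RTree key, and none seems to be registered for bigint. The snippet
below is only my own illustration of that pattern, with made-up names, not the
actual Hyracks code:

import java.util.EnumMap;
import java.util.Map;

// My own illustration (hypothetical names, not the real Hyracks classes) of a
// per-type value provider registry: every type tag that can appear in an RTree
// key needs a provider, and an unmapped tag fails much like the error above.
public class ValueProviderRegistrySketch {

    enum TypeTag { DOUBLE, BIGINT }

    interface PrimitiveValueProvider {
        double getValue(byte[] bytes, int offset);
    }

    private final Map<TypeTag, PrimitiveValueProvider> providers = new EnumMap<>(TypeTag.class);

    public ValueProviderRegistrySketch() {
        // Only DOUBLE gets a provider here; BIGINT is left out on purpose to
        // mirror the "not implemented" error seen during the merge.
        providers.put(TypeTag.DOUBLE,
                (bytes, offset) -> java.nio.ByteBuffer.wrap(bytes, offset, 8).getDouble());
    }

    public PrimitiveValueProvider forType(TypeTag tag) {
        PrimitiveValueProvider provider = providers.get(tag);
        if (provider == null) {
            throw new UnsupportedOperationException(
                    "Value provider for type " + tag + " is not implemented.");
        }
        return provider;
    }
}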

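And for the missing follow-up: something like the sketch below is what I would
expect around the merge. This is just my own sketch with made-up names and
plain java.nio.file calls, not the actual Hyracks I/O layer, but it shows the
idea: if the merge fails after the target component files were created, delete
them so a retried merge does not hit "file already exists".

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

// My own sketch (made-up names, plain java.nio.file instead of the Hyracks I/O
// layer) of the follow-up that seems to be missing: when a merge fails after
// its target component files were created, delete the partial files so that a
// retried merge does not fail with "file already exists".
public class MergeCleanupSketch {

    public static void mergeWithCleanup(Runnable doMerge, List<Path> targetComponentFiles)
            throws IOException {
        try {
            doMerge.run();
        } catch (RuntimeException e) {
            // Best-effort cleanup of the partially created merged component.
            for (Path file : targetComponentFiles) {
                Files.deleteIfExists(file);
            }
            throw e;
        }
    }
}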

On Sat, Jul 14, 2018 at 9:53 PM, Mohiuddin Qader <mabdu002@ucr.edu> wrote:

> Hello everyone,
>
> I have been running some experiments on the current AsterixDB master with the
> default merge policy. I am using simple OpenStreetMap and Twitter datasets
> with id as the primary key and an RTree index on the location attribute.
>
> Then I used a feed to ingest the data into the database. After running for
> some time (about 2 million insertions), the RTree index
> (LSMRTreeWithAntiMatterTuples) throws a strange exception saying the file
> already exists. After that, the cluster keeps throwing the same exception
> repeatedly and becomes unusable:
>
> 20:36:55.980 [Executor-24:asterix_nc1] ERROR org.apache.hyracks.storage.am.lsm.common.impls.LSMHarness - Failed merge operation on {"class" : "LSMRTreeWithAntiMatterTuples", "dir" : "/home/mohiuddin/asterix-hyracks/asterixdb/target/io/dir/asterix_nc1/target/tmp/asterix_nc1/iodevice1/storage/partition_0/experiments/OpenStreetMap/0/OSMlocation", "memory" : 2, "disk" : 5}
> org.apache.hyracks.api.exceptions.HyracksDataException: HYR0082: Failed to create the file /home/mohiuddin/asterix-hyracks/asterixdb/target/io/dir/asterix_nc1/target/tmp/asterix_nc1/iodevice1/storage/partition_0/experiments/OpenStreetMap/0/OSMlocation/2018-07-14-20-36-09-733_2018-07-14-20-34-55-555 because it already exists
>     at org.apache.hyracks.api.exceptions.HyracksDataException.create(HyracksDataException.java:55) ~[classes/:?]
>     at org.apache.hyracks.api.util.IoUtil.create(IoUtil.java:87) ~[classes/:?]
>     at org.apache.hyracks.storage.common.buffercache.BufferCache.createFile(BufferCache.java:809) ~[classes/:?]
>     at org.apache.hyracks.storage.am.common.impls.AbstractTreeIndex.create(AbstractTreeIndex.java:83) ~[classes/:?]
>     at org.apache.hyracks.storage.am.lsm.common.impls.AbstractLSMDiskComponent.activate(AbstractLSMDiskComponent.java:158) ~[classes/:?]
>     at org.apache.hyracks.storage.am.lsm.common.impls.AbstractLSMIndex.createDiskComponent(AbstractLSMIndex.java:427) ~[classes/:?]
>     at org.apache.hyracks.storage.am.lsm.rtree.impls.LSMRTreeWithAntiMatterTuples.doMerge(LSMRTreeWithAntiMatterTuples.java:237) ~[classes/:?]
>     at org.apache.hyracks.storage.am.lsm.common.impls.AbstractLSMIndex.merge(AbstractLSMIndex.java:728) ~[classes/:?]
>     at org.apache.hyracks.storage.am.lsm.common.impls.LSMHarness.merge(LSMHarness.java:645) [classes/:?]
>     at org.apache.hyracks.storage.am.lsm.common.impls.LSMTreeIndexAccessor.merge(LSMTreeIndexAccessor.java:128) [classes/:?]
>     at org.apache.hyracks.storage.am.lsm.common.impls.MergeOperation.call(MergeOperation.java:45) [classes/:?]
>     at org.apache.hyracks.storage.am.lsm.common.impls.MergeOperation.call(MergeOperation.java:30) [classes/:?]
>     at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_45-internal]
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_45-internal]
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_45-internal]
>     at java.lang.Thread.run(Thread.java:745) [?:1.8.0_45-internal]
>
>
> I have tried reinstalling everything multiple times, removing all old storage
> files, and pausing during and before ingestion through feeds. Nothing seems to
> work; the exception occurs every time after some ingestion. Do any of you have
> an idea of what is happening? I have attached the DDL I was using.
>
>
>
> --
> Regards,
> Mohiuddin Abdul Qader
> Dept of Computer Science
> University of California Riverside
>



-- 
Regards,
Mohiuddin Abdul Qader
Dept of Computer Science
University of California Riverside
