ignite-user mailing list archives

From barham <barry.ham...@gmail.com>
Subject HDFS Caching
Date Fri, 29 Apr 2016 16:35:11 GMT
I'm running the Ignite 1.5.0 Hadoop Accelerator on top of CDH 5.  I'm
trying to write my own SecondaryFileSystem, and as a first step I created
one that just funnels all of the calls down to
IgniteHadoopIgfsSecondaryFileSystem and logs a message every time one of
its methods is called.  I'm using the default configuration provided in the
Hadoop Accelerator binary distribution, except that I added my secondary
file system to the configuration.
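For reference, wiring a custom secondary file system into the Accelerator's Spring config would look roughly like this. This is only a sketch: the bean class com.example.LoggingSecondaryFileSystem, its constructor, and the hdfs:// URI are placeholders, not taken from my actual config.

```xml
<!-- Fragment of FileSystemConfiguration; class name, constructor wiring,
     and HDFS URI below are illustrative placeholders. -->
<property name="secondaryFileSystem">
  <bean class="com.example.LoggingSecondaryFileSystem">
    <constructor-arg>
      <bean class="org.apache.ignite.hadoop.fs.IgniteHadoopIgfsSecondaryFileSystem">
        <constructor-arg value="hdfs://namenode:8020/"/>
      </bean>
    </constructor-arg>
  </bean>
</property>
```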

Every time I run hadoop fs -cat <filename> from the command line, or
ignite.fileSystem("igfs").open(<filename>) from inside a Java app, the log
statement in my SecondaryFileSystem's open method is printed, even when I
read the same file over and over.  To me, that means my files aren't being
cached inside Ignite (which is the whole reason I'm looking into Ignite).  I
feel like I must be missing something obvious.  In case my HDFS files were
too big to cache, I also tried creating a tiny (10-byte) ASCII text file and
reading that, with the same result.
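The delegating wrapper I described can be sketched like this. Ignite's real interface is org.apache.ignite.igfs.secondary.IgfsSecondaryFileSystem (with methods such as open(IgfsPath, int)); here it is stubbed with a minimal local interface so the delegation-and-logging pattern stands on its own, and all names are illustrative.

```java
// Minimal stand-in for IgfsSecondaryFileSystem (illustration only).
interface SecondaryFs {
    String open(String path);  // stand-in for open(IgfsPath path, int bufSize)
}

// Wrapper that logs every call before funneling it to the delegate,
// the way my SecondaryFileSystem funnels calls to
// IgniteHadoopIgfsSecondaryFileSystem.
class LoggingSecondaryFs implements SecondaryFs {
    private final SecondaryFs delegate;

    LoggingSecondaryFs(SecondaryFs delegate) {
        this.delegate = delegate;
    }

    @Override
    public String open(String path) {
        // Every read that reaches the secondary file system shows up here.
        System.out.println("secondary open(" + path + ")");
        return delegate.open(path);
    }
}

public class Demo {
    public static void main(String[] args) {
        // Pretend delegate standing in for the real HDFS-backed file system.
        SecondaryFs wrapped = new LoggingSecondaryFs(p -> "contents of " + p);
        System.out.println(wrapped.open("/tmp/test.txt"));
    }
}
```

My expectation was that only the first read of a file would hit the wrapper's open method, with subsequent reads served from Ignite's cache.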

Thanks for any help.

View this message in context: http://apache-ignite-users.70518.x6.nabble.com/HDFS-Caching-tp4695.html
Sent from the Apache Ignite Users mailing list archive at Nabble.com.
