Harsh brought up HBase, and I was responding to that specifically.

In more general terms... when you're writing a file in Hadoop, your file is really a set of files, one for each block.

If the goal is to encrypt the file so no one else can access it... why not just use file permissions of 600 in a 700 directory?
If you want to protect the data itself... you would be better off encrypting the data first and then writing it to the file.

Again, as with HBase, you're encrypting at the data level.

One other thing to consider: if you encrypt at the file level, you will lose all parallelism. Read: single-mapper access only.
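To make the data-level approach concrete, here is a minimal sketch using only javax.crypto: encrypt each value before it ever reaches HBase or HDFS, and decrypt after reading it back. The AES-GCM choice, the IV-prepended wire format, and the class/method names are all illustrative assumptions, not anything HBase or the CryptoCodec actually does.

```java
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;
import java.security.SecureRandom;
import java.util.Arrays;

public class CellCrypto {
    // Encrypt one value before storing it (e.g. before an HBase Put).
    // Output format (illustrative): 12-byte random IV followed by ciphertext+tag.
    static byte[] encrypt(SecretKey key, byte[] plain) throws Exception {
        byte[] iv = new byte[12];
        new SecureRandom().nextBytes(iv);
        Cipher c = Cipher.getInstance("AES/GCM/NoPadding");
        c.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(128, iv));
        byte[] ct = c.doFinal(plain);
        byte[] out = new byte[iv.length + ct.length];
        System.arraycopy(iv, 0, out, 0, iv.length);
        System.arraycopy(ct, 0, out, iv.length, ct.length);
        return out;
    }

    // Reverse of encrypt(): split off the IV, then decrypt and verify the tag.
    static byte[] decrypt(SecretKey key, byte[] blob) throws Exception {
        Cipher c = Cipher.getInstance("AES/GCM/NoPadding");
        c.init(Cipher.DECRYPT_MODE, key,
               new GCMParameterSpec(128, Arrays.copyOfRange(blob, 0, 12)));
        return c.doFinal(Arrays.copyOfRange(blob, 12, blob.length));
    }

    public static void main(String[] args) throws Exception {
        SecretKey key = KeyGenerator.getInstance("AES").generateKey();
        byte[] value = "sensitive cell value".getBytes("UTF-8");
        byte[] stored = encrypt(key, value);          // what lands on disk
        byte[] roundTrip = decrypt(key, stored);      // what the reader sees
        System.out.println(new String(roundTrip, "UTF-8"));
    }
}
```

Because only the values are opaque, HBase and MapReduce still see ordinary bytes and keep their parallelism; the trade-off moves to key management, which this sketch deliberately ignores.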

On Aug 10, 2012, at 1:46 AM, Farrokh Shahriari <mohandes.zebeleh.67@gmail.com> wrote:

Hi Adam,
Because I have very important data, I want to use encryption to prevent physical access by other users.
Michael, how can I use a coprocessor for encryption? Is there any function for doing that?

Thanks for helping me

On Tue, Aug 7, 2012 at 10:23 PM, Adam Brown <adam@hortonworks.com> wrote:
Hi Farrokh,

What is your concern here? Non-Hadoop processes accessing the data, or
Hadoop users accessing data they are not supposed to?


On Tue, Aug 7, 2012 at 1:13 AM, Farrokh Shahriari
<mohandes.zebeleh.67@gmail.com> wrote:
> Thanks,
> What if I want to use this encryption in a cluster with HBase running on top
> of Hadoop? Can't Hadoop be configured to automatically encrypt each file
> that is going to be written to it?
> If not, I should probably be asking how to enable encryption on HBase, and
> asking this question on the HBase mailing list, right?
> On Tue, Aug 7, 2012 at 12:32 PM, Harsh J <harsh@cloudera.com> wrote:
>> Farrokh,
>> The codec org.apache.hadoop.io.compress.crypto.CyptoCodec needs to be
>> used. What you've done so far is merely add it to be loaded by Hadoop
>> at runtime, but you will need to use it in your programs if you wish
>> for it to get applied.
>> For example, for MapReduce outputs to be compressed, you may run an MR
>> job with the following option set on its configuration:
>> "-Dmapred.output.compression.codec=org.apache.hadoop.io.compress.crypto.CyptoCodec"
>> You should then see that your output files were all properly
>> encrypted with the above codec.
>> Likewise, if you're using direct HDFS writes, you will need to wrap
>> your outputstream with this codec. Look at the CompressionCodec API to
>> see how:
>> http://hadoop.apache.org/common/docs/stable/api/org/apache/hadoop/io/compress/CompressionCodec.html#createOutputStream(java.io.OutputStream)
>> (Where your CompressionCodec must be the
>> org.apache.hadoop.io.compress.crypto.CyptoCodec instance).
>> On Tue, Aug 7, 2012 at 1:11 PM, Farrokh Shahriari
>> <mohandes.zebeleh.67@gmail.com> wrote:
>> >
>> > Hello
>> > I use "Hadoop Crypto Compressor" from this
>> > site, https://github.com/geisbruch/HadoopCryptoCompressor, to encrypt
>> > HDFS files.
>> > I've downloaded the complete code, created the jar file, and changed the
>> > properties in core-site.xml as the site says.
>> > But when I add a new file, nothing happens and encryption isn't
>> > working.
>> > What can I do to encrypt HDFS files? Does anyone know how I should
>> > use this class?
>> >
>> > Tnx
>> --
>> Harsh J
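Harsh's suggestion of wrapping the HDFS output stream with the codec follows the same shape as javax.crypto's CipherOutputStream. Since the CryptoCodec classes aren't on a stock classpath, here is a sketch of that wrapping pattern using only the JDK; the AES/CTR mode, the all-zero IV, and the class names are demo assumptions, not the codec's actual behavior.

```java
import javax.crypto.Cipher;
import javax.crypto.CipherInputStream;
import javax.crypto.CipherOutputStream;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.IvParameterSpec;
import java.io.*;

public class StreamWrap {
    // Same idea as codec.createOutputStream(fs.create(path)):
    // wrap whatever raw OutputStream you were about to write to.
    static OutputStream wrap(OutputStream raw, SecretKey key, byte[] iv) throws Exception {
        Cipher c = Cipher.getInstance("AES/CTR/NoPadding");
        c.init(Cipher.ENCRYPT_MODE, key, new IvParameterSpec(iv));
        return new CipherOutputStream(raw, c);
    }

    // Mirror of wrap() for the read path (codec.createInputStream analogue).
    static InputStream unwrap(InputStream raw, SecretKey key, byte[] iv) throws Exception {
        Cipher c = Cipher.getInstance("AES/CTR/NoPadding");
        c.init(Cipher.DECRYPT_MODE, key, new IvParameterSpec(iv));
        return new CipherInputStream(raw, c);
    }

    public static void main(String[] args) throws Exception {
        SecretKey key = KeyGenerator.getInstance("AES").generateKey();
        byte[] iv = new byte[16]; // fixed zero IV only for this demo; never reuse IVs in practice
        ByteArrayOutputStream sink = new ByteArrayOutputStream(); // stands in for the HDFS stream
        try (OutputStream out = wrap(sink, key, iv)) {
            out.write("hello hdfs".getBytes("UTF-8"));
        }
        ByteArrayOutputStream plain = new ByteArrayOutputStream();
        try (InputStream in = unwrap(new ByteArrayInputStream(sink.toByteArray()), key, iv)) {
            int b;
            while ((b = in.read()) != -1) plain.write(b);
        }
        System.out.println(plain.toString("UTF-8")); // round-trips the original text
    }
}
```

The point of the pattern is that the writer code barely changes: you write to the wrapped stream exactly as you would to the raw one, and only bytes that have passed through the cipher ever reach disk.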

Adam Brown
Enablement Engineer