From: Abdelrahman Shettia <ashettia@hortonworks.com>
Subject: Re: using "-libjars" in Hadoop 2.2.1
Date: Wed, 16 Apr 2014 08:43:38 -0700
To: user@hadoop.apache.org

Hi Kim,

It looks like the jar path is being resolved as an HDFS location. Can you create the HDFS directory and put the jar there? Hope this helps.

Thanks,
Rahman
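[A minimal sketch of that workaround; the staging directory /user/kchew/lib is a hypothetical choice, and only the local jar path and command come from the thread below.]

    # Stage the jar on HDFS so the job client can find it there
    hadoop fs -mkdir -p /user/kchew/lib
    hadoop fs -put /home/kchew/test-libs/testJar.jar /user/kchew/lib/

    # Then reference the HDFS copy from -libjars instead of the local path
    hadoop jar target/myJar.jar Foo -libjars hdfs:///user/kchew/lib/testJar.jar ...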
On Apr 16, 2014, at 8:39 AM, Rahul Singh wrote:

> Any help? All suggestions are welcome.
>
> On Wed, Apr 16, 2014 at 1:13 PM, Rahul Singh wrote:
> Hi,
> I am running the following command, but the jar is still not available to the mappers and reducers.
>
> hadoop jar /home/hduser/workspace/Minerva.jar my.search.Minerva /user/hduser/input_minerva_actual /user/hduser/output_merva_actual3 -libjars /home/hduser/Documents/Lib/json-simple-1.1.1.jar -Dmapreduce.user.classpath.first=true
>
> Error log:
>
> 14/04/16 13:08:37 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
> 14/04/16 13:08:37 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
> 14/04/16 13:08:37 WARN mapreduce.JobSubmitter: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
> 14/04/16 13:08:37 INFO mapred.FileInputFormat: Total input paths to process : 1
> 14/04/16 13:08:37 INFO mapreduce.JobSubmitter: number of splits:10
> 14/04/16 13:08:37 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1397534064728_0028
> 14/04/16 13:08:38 INFO impl.YarnClientImpl: Submitted application application_1397534064728_0028
> 14/04/16 13:08:38 INFO mapreduce.Job: The url to track the job: http://L-Rahul-Tech:8088/proxy/application_1397534064728_0028/
> 14/04/16 13:08:38 INFO mapreduce.Job: Running job: job_1397534064728_0028
> 14/04/16 13:08:47 INFO mapreduce.Job: Job job_1397534064728_0028 running in uber mode : false
> 14/04/16 13:08:47 INFO mapreduce.Job:  map 0% reduce 0%
> 14/04/16 13:08:58 INFO mapreduce.Job: Task Id : attempt_1397534064728_0028_m_000005_0, Status : FAILED
> Error: java.lang.RuntimeException: Error in configuring object
>     at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:109)
>     at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:75)
>     at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
>     at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:426)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
>     at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:416)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
>     at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
> Caused by: java.lang.reflect.InvocationTargetException
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:622)
>     at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:106)
>     ... 9 more
> Caused by: java.lang.NoClassDefFoundError: org/json/simple/parser/ParseException
>     at java.lang.Class.forName0(Native Method)
>     at java.lang.Class.forName(Class.java:270)
>     at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:1821)
>     at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1786)
>     at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1880)
>     at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1906)
>     at org.apache.hadoop.mapred.JobConf.getMapperClass(JobConf.java:1107)
>     at org.apache.hadoop.mapred.MapRunner.configure(MapRunner.java:38)
>     ... 14 more
> Caused by: java.lang.ClassNotFoundException: org.json.simple.parser.ParseException
>     at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:323)
>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:268)
>     ... 22 more
>
> When I analyzed the logs, I found:
> "14/04/16 13:08:37 WARN mapreduce.JobSubmitter: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this."
>
> But I have implemented the Tool interface as shown below:
>
> package my.search;
>
> import org.apache.hadoop.conf.Configured;
> import org.apache.hadoop.fs.Path;
> import org.apache.hadoop.io.Text;
> import org.apache.hadoop.mapred.FileInputFormat;
> import org.apache.hadoop.mapred.FileOutputFormat;
> import org.apache.hadoop.mapred.JobClient;
> import org.apache.hadoop.mapred.JobConf;
> import org.apache.hadoop.mapred.TextInputFormat;
> import org.apache.hadoop.mapred.TextOutputFormat;
> import org.apache.hadoop.util.Tool;
> import org.apache.hadoop.util.ToolRunner;
>
> public class Minerva extends Configured implements Tool
> {
>     public int run(String[] args) throws Exception {
>         JobConf conf = new JobConf(Minerva.class);
>         conf.setJobName("minerva sample job");
>
>         conf.setMapOutputKeyClass(Text.class);
>         conf.setMapOutputValueClass(TextArrayWritable.class);
>
>         conf.setOutputKeyClass(Text.class);
>         conf.setOutputValueClass(Text.class);
>
>         conf.setMapperClass(Map.class);
>         // conf.setCombinerClass(Reduce.class);
>         conf.setReducerClass(Reduce.class);
>
>         conf.setInputFormat(TextInputFormat.class);
>         conf.setOutputFormat(TextOutputFormat.class);
>
>         FileInputFormat.setInputPaths(conf, new Path(args[0]));
>         FileOutputFormat.setOutputPath(conf, new Path(args[1]));
>
>         JobClient.runJob(conf);
>
>         return 0;
>     }
>
>     public static void main(String[] args) throws Exception {
>         int res = ToolRunner.run(new Minerva(), args);
>         System.exit(res);
>     }
> }
>
> Please let me know if you see any issues.
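[Two things stand out in the message above, offered as hedged suggestions rather than a definitive diagnosis. First, GenericOptionsParser stops at the first non-option argument, so -libjars and -D are only honored when they come before the application's own arguments. A sketch of the reordered invocation, using the same paths and flags as the original command:]

    hadoop jar /home/hduser/workspace/Minerva.jar my.search.Minerva \
        -libjars /home/hduser/Documents/Lib/json-simple-1.1.1.jar \
        -Dmapreduce.user.classpath.first=true \
        /user/hduser/input_minerva_actual /user/hduser/output_merva_actual3

[Second, the JobSubmitter warning can appear even when Tool is implemented, because run() above builds a fresh JobConf and thereby discards the Configuration that ToolRunner populated, including the -libjars entries. A sketch of run() that threads the parsed configuration through, with the rest of the class unchanged:]

    public int run(String[] args) throws Exception {
        // Seed the JobConf with the Configuration that ToolRunner /
        // GenericOptionsParser already filled in; creating it from
        // scratch drops the -libjars and -D settings.
        JobConf conf = new JobConf(getConf(), Minerva.class);
        conf.setJobName("minerva sample job");
        // ... remaining job setup exactly as in the original run() ...
        JobClient.runJob(conf);
        return 0;
    }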
>
> On Thu, Apr 10, 2014 at 9:29 AM, Shengjun Xin wrote:
> Add '-Dmapreduce.user.classpath.first=true' to your command and try again.
>
> On Wed, Apr 9, 2014 at 6:27 AM, Kim Chew wrote:
> It seems to me that in Hadoop 2.2.1, the "-libjars" option does not look for the jars on the local file system but on HDFS. For example,
>
> hadoop jar target/myJar.jar Foo -libjars /home/kchew/test-libs/testJar.jar /user/kchew/inputs/raw.vector /user/kchew/outputs hdfs://remoteNN:8020 remoteJT:8021
>
> 14/04/08 15:11:02 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
> 14/04/08 15:11:02 INFO mapreduce.JobSubmitter: Cleaning up the staging area file:/tmp/hadoop-kchew/mapred/staging/kchew202924688/.staging/job_local202924688_0001
> 14/04/08 15:11:02 ERROR security.UserGroupInformation: PriviledgedActionException as:kchew (auth:SIMPLE) cause:java.io.FileNotFoundException: File does not exist: hdfs://remoteNN:8020/home/kchew/test-libs/testJar.jar
> java.io.FileNotFoundException: File does not exist: hdfs://remoteNN:8020/home/kchew/test-libs/testJar.jar
>     at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1110)
>     at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1102)
>     at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
>     at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1102)
>     at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:288)
>     at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:224)
>     at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestamps(ClientDistributedCacheManager.java:93)
>     at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestampsAndCacheVisibilities(ClientDistributedCacheManager.java:57)
>     at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:264)
>
> So under Hadoop 2.2.1, do I have to explicitly set some configuration so that the "-libjars" option copies the file to HDFS from the local fs?
>
> TIA
>
> Kim
>
> --
> Regards
> Shengjun
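[A closing note on Kim's question: the trace above suggests the scheme-less local path was qualified against the default filesystem (hdfs://remoteNN:8020) rather than copied from the local disk. One hedged experiment is to make the scheme explicit so the path cannot be re-resolved against HDFS; whether this suffices depends on how the job configuration is wired up:]

    # Hedged experiment: scheme-qualify the local jar so it is not
    # resolved against fs.defaultFS (hdfs://remoteNN:8020)
    hadoop jar target/myJar.jar Foo \
        -libjars file:///home/kchew/test-libs/testJar.jar \
        /user/kchew/inputs/raw.vector /user/kchew/outputs hdfs://remoteNN:8020 remoteJT:8021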
NOTICE: This message is = intended for the use of the individual or entity to which it is addressed a= nd may contain information that is confidential, privileged and exempt from= disclosure under applicable law. If the reader of this message is not the = intended recipient, you are hereby notified that any printing, copying, dis= semination, distribution, disclosure or forwarding of this communication is= strictly prohibited. If you have received this communication in error, ple= ase contact the sender immediately and delete it from your system. Thank Yo= u. --Apple-Mail=_DC935522-CDFB-4E33-AC5D-169B40579277--