falcon-dev mailing list archives

From: Venkat R <verama...@yahoo.com.INVALID>
Subject: Re: Replication Job throws GSSException
Date: Fri, 11 Jul 2014 16:07:32 GMT
I am able to run Oozie jobs on both clusters (primaryCluster and backupCluster, both secured).


I'm also able to run the hdfs -ls command against primaryCluster from the
backupCluster Oozie/Falcon machine.
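
Concretely, that check was along the lines of the following; the keytab path
and namenode URI here are placeholders rather than our actual values:

  kinit -kt /path/to/falcon.keytab falcon/host@EXAMPLE.COM
  hdfs dfs -ls hdfs://primary-nn.example.com:8020/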

It's the replication job that kicks off on a backupCluster compute node that
fails when trying to talk to the primaryCluster namenode.


Both Falcon cluster definitions have the NN principal set.
The core-site.xml on both the primary and backup clusters includes the
oozie/falcon machines in the hadoop.proxyuser.oozie and
hadoop.proxyuser.falcon properties.
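
For completeness, the falcon proxyuser entries mirror the oozie ones quoted
further down in this thread; roughly (host names here are placeholders):

  <property>
    <name>hadoop.proxyuser.falcon.hosts</name>
    <value>falcon-host1.example.com,falcon-host2.example.com</value>
  </property>
  <property>
    <name>hadoop.proxyuser.falcon.groups</name>
    <value>users</value>
  </property>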

I will try the command you mentioned shortly and reply. 



On Friday, July 11, 2014 8:53 AM, Arpit Gupta <arpit@hortonworks.com> wrote:
 


Hmm, we have been running this setup and it works for us. Are you able to
run any other job through Oozie (without Falcon)? If so, can you do the
following:

kinit as some user and make the following call using curl:

curl --negotiate -u : "http://eat1-nertznn01.grid.linkedin.com:50070/webhdfs/v1/?op=GETDELEGATIONTOKEN&user.name=veramach"

See if this works. I am at a loss right now; I will have to see what we are
doing in our configs.


On Thu, Jul 10, 2014 at 6:11 PM, Venkat R <veramacha@yahoo.com.invalid>
wrote:

> Hi Arpit,
>
> The jersey-server and jersey-core jars were missing; I copied them to
> WEB-INF, and the coordinator is now able to talk to the source cluster
> namenode to identify the new dirs and kick off the workflow.
>
> But the workflow fails with an exception similar to the hftp one (unable to
> get the token) -- exception below:
>
> Thanks
> Venkat
>
> Failing Oozie Launcher, Main class
> [org.apache.falcon.latedata.LateDataHandler], main() threw exception,
> Authentication failed, url=
> http://eat1-nertznn01.grid.linkedin.com:50070/webhdfs/v1/?op=GETDELEGATIONTOKEN&user.name=veramach
> java.io.IOException: Authentication failed, url=
> http://eat1-nertznn01.grid.linkedin.com:50070/webhdfs/v1/?op=GETDELEGATIONTOKEN&user.name=veramach
> at
> org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.init(WebHdfsFileSystem.java:490)
> at
> org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.run(WebHdfsFileSystem.java:531)
> at
> org.apache.hadoop.hdfs.web.WebHdfsFileSystem.run(WebHdfsFileSystem.java:424)
> at
> org.apache.hadoop.hdfs.web.WebHdfsFileSystem.getDelegationToken(WebHdfsFileSystem.java:953)
> at
> org.apache.hadoop.hdfs.web.TokenAspect.ensureTokenInitialized(TokenAspect.java:143)
> at
> org.apache.hadoop.hdfs.web.WebHdfsFileSystem.getDelegationToken(WebHdfsFileSystem.java:227)
> at
> org.apache.hadoop.hdfs.web.WebHdfsFileSystem.getAuthParameters(WebHdfsFileSystem.java:381)
> at
> org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toUrl(WebHdfsFileSystem.java:402)
> at
> org.apache.hadoop.hdfs.web.WebHdfsFileSystem$FsPathRunner.getUrl(WebHdfsFileSystem.java:652)
> at
> org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.init(WebHdfsFileSystem.java:485)
> at
> org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.run(WebHdfsFileSystem.java:531)
> at
> org.apache.hadoop.hdfs.web.WebHdfsFileSystem.run(WebHdfsFileSystem.java:424)
> at
> org.apache.hadoop.hdfs.web.WebHdfsFileSystem.getHdfsFileStatus(WebHdfsFileSystem.java:678)
> at
> org.apache.hadoop.hdfs.web.WebHdfsFileSystem.getFileStatus(WebHdfsFileSystem.java:689)
> at org.apache.hadoop.fs.Globber.getFileStatus(Globber.java:57)
> at org.apache.hadoop.fs.Globber.glob(Globber.java:238)
> at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1624)
> at
> org.apache.falcon.latedata.LateDataHandler.usage(LateDataHandler.java:269)
> at
> org.apache.falcon.latedata.LateDataHandler.getFileSystemUsageMetric(LateDataHandler.java:252)
> at
> org.apache.falcon.latedata.LateDataHandler.computeStorageMetric(LateDataHandler.java:224)
> at
> org.apache.falcon.latedata.LateDataHandler.computeMetrics(LateDataHandler.java:170)
> at org.apache.falcon.latedata.LateDataHandler.run(LateDataHandler.java:147)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
> at org.apache.falcon.latedata.LateDataHandler.main(LateDataHandler.java:60)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at
> org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:226)
> at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
> at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:430)
> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
> at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:167)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:415)
> at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
> at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
> Caused by:
> org.apache.hadoop.security.authentication.client.AuthenticationException:
> GSSException: No valid credentials provided (Mechanism level: Failed to
> find any Kerberos tgt)
> at
> org.apache.hadoop.security.authentication.client.KerberosAuthenticator.doSpnegoSequence(KerberosAuthenticator.java:306)
> at
> org.apache.hadoop.security.authentication.client.KerberosAuthenticator.authenticate(KerberosAuthenticator.java:196)
> at
> org.apache.hadoop.security.authentication.client.AuthenticatedURL.openConnection(AuthenticatedURL.java:232)
> at
> org.apache.hadoop.hdfs.web.URLConnectionFactory.openConnection(URLConnectionFactory.java:164)
> at
> org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.openHttpUrlConnection(WebHdfsFileSystem.java:475)
> at
> org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.access$200(WebHdfsFileSystem.java:431)
> at
> org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner$1.run(WebHdfsFileSystem.java:457)
> at
> org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner$1.run(WebHdfsFileSystem.java:454)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:415)
> at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
> at
> org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.getHttpUrlConnection(WebHdfsFileSystem.java:453)
> at
> org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.init(WebHdfsFileSystem.java:487)
> ... 36 more
> Caused by: GSSException: No valid credentials provided (Mechanism level:
> Failed to find any Kerberos tgt)
> at
> sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
> at
> sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
> at
> sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
> at
> sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
> at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
> at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
> at
> org.apache.hadoop.security.authentication.client.KerberosAuthenticator$1.run(KerberosAuthenticator.java:285)
> at
> org.apache.hadoop.security.authentication.client.KerberosAuthenticator$1.run(KerberosAuthenticator.java:261)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:415)
> at
> org.apache.hadoop.security.authentication.client.KerberosAuthenticator.doSpnegoSequence(KerberosAuthenticator.java:261)
> ... 48 more
>
>
>
> On Thursday, July 10, 2014 5:58 PM, Arpit Gupta <arpit@hortonworks.com>
> wrote:
>
>
>
> This looks like the Oozie war file is missing some jars that Hadoop needs.
> What version of Hadoop are you running and how did you do the Oozie war
> setup?
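>
> For reference, with a stock Apache Oozie build the usual way to get extra
> jars such as jersey-core/jersey-server into the war is the libext
> mechanism, roughly (paths are illustrative):
>
>   cp jersey-core-*.jar jersey-server-*.jar $OOZIE_HOME/libext/
>   $OOZIE_HOME/bin/oozie-setup.sh prepare-war
>
> rather than copying jars into WEB-INF by hand.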
>
> On Thursday, July 10, 2014, Venkat R <veramacha@yahoo.com.invalid> wrote:
>
> > Switched to webhdfs, but the coordinator keeps failing with the following
> > exception and thinks the data on the other side is not present. I am
> > running the Apache version of Oozie (4.0.1).
> > Any thoughts?
> >
> > Venkat
> >
> > ACTION[0000006-140710220847349-oozie-oozi-C@1] Error,
> > java.lang.NoClassDefFoundError: Could not initialize class
> > javax.ws.rs.core.MediaType
> > at
> >
> org.apache.hadoop.hdfs.web.WebHdfsFileSystem.jsonParse(WebHdfsFileSystem.java:287)
> > at
> >
> org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.getResponse(WebHdfsFileSystem.java:630)
> > at
> >
> org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.run(WebHdfsFileSystem.java:535)
> > at
> >
> org.apache.hadoop.hdfs.web.WebHdfsFileSystem.run(WebHdfsFileSystem.java:424)
> > at
> >
> org.apache.hadoop.hdfs.web.WebHdfsFileSystem.getDelegationToken(WebHdfsFileSystem.java:953)
> > at
> >
> org.apache.hadoop.hdfs.web.TokenAspect.ensureTokenInitialized(TokenAspect.java:143)
> > at
> >
> org.apache.hadoop.hdfs.web.WebHdfsFileSystem.getDelegationToken(WebHdfsFileSystem.java:227)
> > at
> >
> org.apache.hadoop.hdfs.web.WebHdfsFileSystem.getAuthParameters(WebHdfsFileSystem.java:381)
> > at
> >
> org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toUrl(WebHdfsFileSystem.java:402)
> > at
> >
> org.apache.hadoop.hdfs.web.WebHdfsFileSystem$FsPathRunner.getUrl(WebHdfsFileSystem.java:652)
> > at
> >
> org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.init(WebHdfsFileSystem.java:485)
> > at
> >
> org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.run(WebHdfsFileSystem.java:531)
> > at
> >
> org.apache.hadoop.hdfs.web.WebHdfsFileSystem.run(WebHdfsFileSystem.java:424)
> > at
> >
> org.apache.hadoop.hdfs.web.WebHdfsFileSystem.getHdfsFileStatus(WebHdfsFileSystem.java:678)
> > at
> >
> org.apache.hadoop.hdfs.web.WebHdfsFileSystem.getFileStatus(WebHdfsFileSystem.java:689)
> > at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1399)
> > at org.apache.oozie.dependency.FSURIHandler.exists(FSURIHandler.java:100)
> > at
> >
> org.apache.oozie.command.coord.CoordActionInputCheckXCommand.pathExists(CoordActionInputCheckXCommand.java:484)
> > at
> >
> org.apache.oozie.command.coord.CoordActionInputCheckXCommand.checkListOfPaths(CoordActionInputCheckXCommand.java:455)
> > at
> >
> org.apache.oozie.command.coord.CoordActionInputCheckXCommand.checkResolvedUris(CoordActionInputCheckXCommand.java:425)
> > at
> >
> org.apache.oozie.command.coord.CoordActionInputCheckXCommand.checkInput(CoordActionInputCheckXCommand.java:255)
> > at
> >
> org.apache.oozie.command.coord.CoordActionInputCheckXCommand.execute(CoordActionInputCheckXCommand.java:130)
> > at
> >
> org.apache.oozie.command.coord.CoordActionInputCheckXCommand.execute(CoordActionInputCheckXCommand.java:65)
> > at org.apache.oozie.command.XCommand.call(XCommand.java:280)
> > at
> >
> org.apache.oozie.service.CallableQueueService$CompositeCallable.call(CallableQueueService.java:326)
> > at
> >
> org.apache.oozie.service.CallableQueueService$CompositeCallable.call(CallableQueueService.java:255)
> > at
> >
> org.apache.oozie.service.CallableQueueService$CallableWrapper.run(CallableQueueService.java:175)
> > at
> >
> java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
> > at
> >
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
> > at java.lang.Thread.run(Thread.java:662)
> >
> >
> > On Thursday, July 10, 2014 2:42 PM, Venkat R <veramacha@yahoo.com.INVALID> wrote:
> >
> >
> >
> > ok, will try now and see.
> >
> >
> >
> > On Thursday, July 10, 2014 2:37 PM, Arpit Gupta <arpit@hortonworks.com> wrote:
> >
> >
> >
> > From the stack trace it looks like you are using hftp. We ran into issues
> > when running tests against secure Hadoop + hftp:
> >
> > https://issues.apache.org/jira/browse/HDFS-5842
> >
> > I recommend switching the readonly interface to webhdfs.
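> >
> > In the Falcon cluster entity that means pointing the readonly interface
> > at webhdfs instead of hftp, roughly (endpoint and version below are
> > placeholders for your setup):
> >
> >   <interface type="readonly" endpoint="webhdfs://nn.example.com:50070"
> >              version="2.2.0"/>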
> >
> > --
> > Arpit Gupta
> > Hortonworks Inc.
> > http://hortonworks.com/
> >
> >
> > On Jul 10, 2014, at 2:16 PM, Arpit Gupta <arpit@hortonworks.com> wrote:
> >
> > > You need to provide the nn principal in the cluster.xml for each
> > > cluster. The following property needs to be provided in each cluster's
> > > xml:
> > >
> > > dfs.namenode.kerberos.principal
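> > >
> > > e.g. as a cluster entity property, along these lines (the principal
> > > below is a placeholder for your realm):
> > >
> > >   <properties>
> > >     <property name="dfs.namenode.kerberos.principal"
> > >               value="nn/_HOST@EXAMPLE.COM"/>
> > >   </properties>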
> > > --
> > > Arpit Gupta
> > > Hortonworks Inc.
> > > http://hortonworks.com/
> > >
> > > On Jul 10, 2014, at 2:08 PM, Venkat R <veramacha@yahoo.com> wrote:
> > >
> > >> I am using the demo example. There is a replication job that copies a
> > >> dataset from the Source to the Target cluster by launching a REPLICATION
> > >> job on the Target cluster's Oozie. But it fails with the following
> > >> GSSException:
> > >>
> > >> I have added both Oozie servers (one each for the source and target
> > >> clusters) to the core-site.xml of both clusters as proxyuser machines,
> > >> as below:
> > >>
> > >> source-cluster and target-cluster: core-site.xml has the following:
> > >>
> > >>   <property>
> > >>     <name>hadoop.proxyuser.oozie.groups</name>
> > >>     <value>users</value>
> > >>   </property>
> > >>   <property>
> > >>     <name>hadoop.proxyuser.oozie.hosts</name>
> > >>     <value>eat1-hcl0758.grid.linkedin.com,eat1-hcl0759.grid.linkedin.com</value>
> > >>   </property>
> > >>
> > >> Appreciate any pointers.
> > >> Venkat
> > >>
> > >> Failing Oozie Launcher, Main class
> > [org.apache.falcon.latedata.LateDataHandler], main() threw exception,
> > Unable to obtain remote token
> > >> java.io.IOException: Unable to obtain remote token
> > >>     at
> >
> org.apache.hadoop.hdfs.tools.DelegationTokenFetcher.getDTfromRemote(DelegationTokenFetcher.java:249)
> > >>     at
> > org.apache.hadoop.hdfs.web.HftpFileSystem$2.run(HftpFileSystem.java:251)
> > >>     at
> > org.apache.hadoop.hdfs.web.HftpFileSystem$2.run(HftpFileSystem.java:246)
> > >>     at java.security.AccessController.doPrivileged(Native Method)
> > >>     at javax.security.auth.Subject.doAs(Subject.java:415)
> > >>     at
> >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
> > >>     at
> >
> org.apache.hadoop.hdfs.web.HftpFileSystem.getDelegationToken(HftpFileSystem.java:246)
> > >>     at
> >
> org.apache.hadoop.hdfs.web.TokenAspect.ensureTokenInitialized(TokenAspect.java:143)
> > >>     at
> >
> org.apache.hadoop.hdfs.web.HftpFileSystem.addDelegationTokenParam(HftpFileSystem.java:336)
> > >>     at
> >
> org.apache.hadoop.hdfs.web.HftpFileSystem.openConnection(HftpFileSystem.java:323)
> > >>     at
> >
> org.apache.hadoop.hdfs.web.HftpFileSystem$LsParser.fetchList(HftpFileSystem.java:455)
> > >>     at
> >
> org.apache.hadoop.hdfs.web.HftpFileSystem$LsParser.getFileStatus(HftpFileSystem.java:470)
> > >>     at
> >
> org.apache.hadoop.hdfs.web.HftpFileSystem.getFileStatus(HftpFileSystem.java:499)
> > >>     at org.apache.hadoop.fs.Globber.getFileStatus(Globber.java:57)
> > >>     at org.apache.hadoop.fs.Globber.glob(Globber.java:238)
> > >>     at
> org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1624)
> > >>     at
> >
> org.apache.falcon.latedata.LateDataHandler.usage(LateDataHandler.java:269)
> > >>     at
> >
> org.apache.falcon.latedata.LateDataHandler.getFileSystemUsageMetric(LateDataHandler.java:252)
> > >>     at
> >
> org.apache.falcon.latedata.LateDataHandler.computeStorageMetric(LateDataHandler.java:224)
> > >>     at
> >
> org.apache.falcon.latedata.LateDataHandler.computeMetrics(LateDataHandler.java:170)
> > >>     at
> > org.apache.falcon.latedata.LateDataHandler.run(LateDataHandler.java:147)
> > >>     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
> > >>     at
> > org.apache.falcon.latedata.LateDataHandler.main(LateDataHandler.java:60)
> > >>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >>     at
> >
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> > >>     at
> >
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > >>     at java.lang.reflect.Method.invoke(Method.java:606)
> > >>     at
> >
> org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:226)
> > >>     at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
> > >>     at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:430)
> > >>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
> > >>     at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:167)
> > >>     at java.security.AccessController.doPrivileged(Native Method)
> > >>     at javax.security.auth.Subject.doAs(Subject.java:415)
> > >>     at
> >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
> > >>     at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
> > >> Caused by:
> > org.apache.hadoop.security.authentication.client.AuthenticationException:
> > GSSException: No valid credentials provided (Mechanism level: Failed to
> > find any Kerberos tgt)
> > >>     at
> >
> org.apache.hadoop.security.authentication.client.KerberosAuthenticator.doSpnegoSequence(KerberosAuthenticator.java:306)
> > >>     at
> >
> org.apache.hadoop.security.authentication.client.KerberosAuthenticator.authenticate(KerberosAuthenticator.java:196)
> > >>     at
> >
> org.apache.hadoop.security.authentication.client.AuthenticatedURL.openConnection(AuthenticatedURL.java:232)
> > >>     at
> >
> org.apache.hadoop.hdfs.web.URLConnectionFactory.openConnection(URLConnectionFactory.java:164)
> > >>     at
> >
> org.apache.hadoop.hdfs.tools.DelegationTokenFetcher.run(DelegationTokenFetcher.java:371)
> > >>     at
> >
> org.apache.hadoop.hdfs.tools.DelegationTokenFetcher.getDTfromRemote(DelegationTokenFetcher.java:238)
> > >>     ... 35 more
> > >> Caused by: GSSException: No valid credentials provided (Mechanism
> > level: Failed to find any Kerberos tgt)
> > >>     at
> >
> sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
> > >>     at
> >
> sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
> > >>     at
> >
> sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
> > >>     at
> >
> sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
> > >>     at
> > sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
> > >>     at
> > sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
> > >>     at
> >
> org.apache.hadoop.security.authentication.client.KerberosAuthenticator$1.run(KerberosAuthenticator.java:285)
> > >>     at
> >
> org.apache.hadoop.security.authentication.client.KerberosAuthenticator$1.run(KerberosAuthenticator.java:261)
> > >>     at java.security.AccessController.doPrivileged(Native Method)
> > >>     at javax.security.auth.Subject.doAs(Subject.java:415)
> > >>     at
> >
> org.apache.hadoop.security.authentication.client.KerberosAuthenticator.doSpnegoSequence(KerberosAuthenticator.java:261)
> > >>     ... 40 more
> > >>
> > >> Oozie Launcher failed, finishing Hadoop job gracefully
> > >>
> > >
> >
> >
