hadoop-mapreduce-user mailing list archives

From Ian Jiang <ianjiang....@gmail.com>
Subject PriviledgedActionException when starting JobTracker and TaskTracker of hadoop 1.0.0 with security
Date Tue, 14 Feb 2012 05:34:58 GMT
I am setting up a two-node Hadoop cluster with security enabled, following
Cloudera's "Hadoop Security Guide".

The JobTracker throws a PriviledgedActionException when the
TaskTracker connects.

I posted to the "hadoop-common-user" mailing list a few days ago (Sat, 11 Feb
2012), but I have more debug info today, because Kerberos
debugging was enabled according to
https://ccp.cloudera.com/display/CDHDOC/Appendix+D+-+Enabling+Debugging+Output+for+the+Sun+Kerberos+Classes.
The JobTracker and the TaskTracker were running on "slavez" and
"slavey" respectively.
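For anyone reproducing this, the extra Kerberos output in the logs below comes from the Sun JDK debug system property. A minimal sketch of how it is typically enabled (the file location and variable layout here are assumptions, not quoted from the Cloudera appendix; the flag itself is confirmed by the JobTracker log, which shows 'sun.security.krb5.debug' is true):

```shell
# Hypothetical hadoop-env.sh fragment. -Dsun.security.krb5.debug=true is the
# standard Sun/Oracle JDK flag that produces the ">>> KrbKdcReq ..." tracing
# seen in the logs below.
export HADOOP_OPTS="$HADOOP_OPTS -Dsun.security.krb5.debug=true"

# The log also shows javax.net.debug=ssl, which adds SSL/handshake detail:
export HADOOP_OPTS="$HADOOP_OPTS -Djavax.net.debug=ssl"
```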

(1) First, the NameNode at slavez starts, the DataNode at slavey joins,
and HDFS works well. (I am able to put files from the local fs
into HDFS and to modify file attributes.)
(2) Then I start the JobTracker at slavez, and it begins waiting for
RPC calls from the TaskTracker.
(3) When I start the TaskTracker at the other node, slavey, the
JobTracker logs a "security.UserGroupInformation" error:
ERROR security.UserGroupInformation: PriviledgedActionException
as:mapred/slavez.demodomain@HADOOP.DEMO.COM
cause:javax.security.sasl.SaslException: Failure to initialize
security context [Caused by GSSException: No valid credentials
provided (Mechanism level: Failed to find any Kerberos Key)]
(4) And the TaskTracker throws:
PriviledgedActionException as:mapred/slavey@HADOOP.DEMO.COM
cause:org.apache.hadoop.ipc.RemoteException: Failure to initialize
security context
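The "Failed to find any Kerberos Key" message generally means the server process could not find a key in its keytab matching the principal the client presented. One way to double-check, sketched here under the assumption that the keytab path from the JobTracker log (/etc/hadoop/conf/mapred.keytab) is also the one the TaskTracker uses:

```shell
# List the principals and key version numbers stored in the keytab
# (path taken from the JobTracker log; adjust if yours differs).
klist -k -t /etc/hadoop/conf/mapred.keytab

# Verify the keytab can actually obtain a TGT for the TaskTracker's
# principal (the one appearing in the TaskTracker error above).
kinit -k -t /etc/hadoop/conf/mapred.keytab \
    mapred/slavey.demodomain@HADOOP.DEMO.COM
```

If kinit fails here, the keytab/principal mismatch exists independently of Hadoop and can be fixed on the KDC side first.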

Detailed logs follow; any suggestions are appreciated!

Version info.
---------------
hadoop-1.0.0
kerberos-1.8.5
JDK 1.6.0_29
Red Hat Enterprise Linux 4 Update 3


Cluster info.
---------------
host "slavez.demodomain" as NameNode
host "slavey.demodomain" as DataNode
host "master.demodomain" as Kerberos KDC and NTP server


Log of JobTracker
-----------------------


[mapred@slavez hadoop-1.0.0]$ ./bin/hadoop jobtracker
12/02/14 12:55:34 INFO mapred.JobTracker: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting JobTracker
STARTUP_MSG:   host = slavez.demodomain/192.168.0.14
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 1.0.0
STARTUP_MSG:   build =
https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r
1214675; compiled by 'hortonfo' on Thu Dec 15 16:36:35 UTC 2011
************************************************************/
12/02/14 12:55:34 INFO impl.MetricsConfig: loaded properties from
hadoop-metrics2.properties
12/02/14 12:55:34 DEBUG impl.MetricsConfig: poking parent
PropertiesConfiguration for period
12/02/14 12:55:34 DEBUG impl.MetricsSourceAdapter: Updating attr cache...
12/02/14 12:55:34 DEBUG impl.MetricsSourceAdapter: Done. numMetrics=18
12/02/14 12:55:34 DEBUG impl.MetricsSourceAdapter: Updating info cache...
12/02/14 12:55:34 DEBUG impl.MetricsSourceAdapter: Done
12/02/14 12:55:34 INFO impl.MetricsSourceAdapter: MBean for source
MetricsSystem,sub=Stats registered.
12/02/14 12:55:34 DEBUG impl.MetricsSourceAdapter: Stacktrace:
java.lang.Throwable
12/02/14 12:55:34 INFO impl.MetricsSystemImpl: Scheduled snapshot
period at 10 second(s).
12/02/14 12:55:34 INFO impl.MetricsSystemImpl: JobTracker metrics system started
12/02/14 12:55:34 DEBUG impl.MetricsSourceAdapter: Updating attr cache...
12/02/14 12:55:34 DEBUG impl.MetricsSourceAdapter: Done. numMetrics=22
12/02/14 12:55:34 DEBUG impl.MetricsSourceAdapter: Updating info cache...
12/02/14 12:55:34 DEBUG impl.MetricsSourceAdapter: Done
12/02/14 12:55:34 INFO impl.MetricsSourceAdapter: MBean for source
QueueMetrics,q=default registered.
12/02/14 12:55:34 DEBUG impl.MetricsSourceAdapter: Stacktrace:
java.lang.Throwable
12/02/14 12:55:34 DEBUG impl.MetricsSystemImpl: Registered source
QueueMetrics,q=default
Config name: /etc/krb5.conf
12/02/14 12:55:35 DEBUG impl.MetricsSourceAdapter: Updating attr cache...
12/02/14 12:55:35 DEBUG impl.MetricsSourceAdapter: Done. numMetrics=6
12/02/14 12:55:35 DEBUG impl.MetricsSourceAdapter: Updating info cache...
12/02/14 12:55:35 DEBUG impl.MetricsSourceAdapter: Done
12/02/14 12:55:35 INFO impl.MetricsSourceAdapter: MBean for source ugi
registered.
12/02/14 12:55:35 DEBUG impl.MetricsSourceAdapter: Stacktrace:
java.lang.Throwable
12/02/14 12:55:35 DEBUG impl.MetricsSystemImpl: Registered source ugi
>>> KeyTabInputStream, readName(): HADOOP.DEMO.COM
>>> KeyTabInputStream, readName(): mapred
>>> KeyTabInputStream, readName(): slavez.demodomain
>>> KeyTab: load() entry length: 83; type: 16
>>> KeyTabInputStream, readName(): HADOOP.DEMO.COM
>>> KeyTabInputStream, readName(): mapred
>>> KeyTabInputStream, readName(): slavez.demodomain
>>> KeyTab: load() entry length: 67; type: 1
>>> KeyTabInputStream, readName(): HADOOP.DEMO.COM
>>> KeyTabInputStream, readName(): mapred
>>> KeyTabInputStream, readName(): slavez
>>> KeyTab: load() entry length: 72; type: 16
>>> KeyTabInputStream, readName(): HADOOP.DEMO.COM
>>> KeyTabInputStream, readName(): mapred
>>> KeyTabInputStream, readName(): slavez
>>> KeyTab: load() entry length: 56; type: 1
>>> KeyTabInputStream, readName(): HADOOP.DEMO.COM
>>> KeyTabInputStream, readName(): host
>>> KeyTabInputStream, readName(): slavez.demodomain
>>> KeyTab: load() entry length: 81; type: 16
>>> KeyTabInputStream, readName(): HADOOP.DEMO.COM
>>> KeyTabInputStream, readName(): host
>>> KeyTabInputStream, readName(): slavez.demodomain
>>> KeyTab: load() entry length: 65; type: 1
>>> KeyTabInputStream, readName(): HADOOP.DEMO.COM
>>> KeyTabInputStream, readName(): host
>>> KeyTabInputStream, readName(): slavez
>>> KeyTab: load() entry length: 70; type: 16
>>> KeyTabInputStream, readName(): HADOOP.DEMO.COM
>>> KeyTabInputStream, readName(): host
>>> KeyTabInputStream, readName(): slavez
>>> KeyTab: load() entry length: 54; type: 1
>>> KeyTabInputStream, readName(): HADOOP.DEMO.COM
>>> KeyTabInputStream, readName(): mapred
>>> KeyTabInputStream, readName(): slavey.demodomain
>>> KeyTab: load() entry length: 83; type: 16
>>> KeyTabInputStream, readName(): HADOOP.DEMO.COM
>>> KeyTabInputStream, readName(): mapred
>>> KeyTabInputStream, readName(): slavey.demodomain
>>> KeyTab: load() entry length: 67; type: 1
>>> KeyTabInputStream, readName(): HADOOP.DEMO.COM
>>> KeyTabInputStream, readName(): mapred
>>> KeyTabInputStream, readName(): slavey
>>> KeyTab: load() entry length: 72; type: 16
>>> KeyTabInputStream, readName(): HADOOP.DEMO.COM
>>> KeyTabInputStream, readName(): mapred
>>> KeyTabInputStream, readName(): slavey
>>> KeyTab: load() entry length: 56; type: 1
>>> KeyTabInputStream, readName(): HADOOP.DEMO.COM
>>> KeyTabInputStream, readName(): host
>>> KeyTabInputStream, readName(): slavey.demodomain
>>> KeyTab: load() entry length: 81; type: 16
>>> KeyTabInputStream, readName(): HADOOP.DEMO.COM
>>> KeyTabInputStream, readName(): host
>>> KeyTabInputStream, readName(): slavey.demodomain
>>> KeyTab: load() entry length: 65; type: 1
>>> KeyTabInputStream, readName(): HADOOP.DEMO.COM
>>> KeyTabInputStream, readName(): host
>>> KeyTabInputStream, readName(): slavey
>>> KeyTab: load() entry length: 70; type: 16
>>> KeyTabInputStream, readName(): HADOOP.DEMO.COM
>>> KeyTabInputStream, readName(): host
>>> KeyTabInputStream, readName(): slavey
>>> KeyTab: load() entry length: 54; type: 1
Added key: 1version: 1
Added key: 16version: 1
Ordering keys wrt default_tkt_enctypes list
Using builtin default etypes for default_tkt_enctypes
default etypes for default_tkt_enctypes: 3 1 23 16 17.
>>> KdcAccessibility: reset
Using builtin default etypes for default_tkt_enctypes
default etypes for default_tkt_enctypes: 3 1 23 16 17.
>>> KrbAsReq calling createMessage
>>> KrbAsReq in createMessage
>>> KrbKdcReq send: kdc=master.demodomain UDP:88, timeout=30000, number of retries
=3, #bytes=168
>>> KDCCommunication: kdc=master.demodomain UDP:88, timeout=30000,Attempt =1, #bytes=168
>>> KrbKdcReq send: #bytes read=298
>>> KrbKdcReq send: #bytes read=298
>>> KdcAccessibility: remove master.demodomain:88
>>> KDCRep: init() encoding tag is 126 req type is 11
>>>KRBError:
	 cTime is Tue Feb 14 12:55:35 CST 2012 1329195335000
	 sTime is Tue Feb 14 12:55:35 CST 2012 1329195335000
	 suSec is 464901
	 error code is 25
	 error Message is Additional pre-authentication required
	 crealm is HADOOP.DEMO.COM
	 cname is mapred/slavez.demodomain
	 realm is HADOOP.DEMO.COM
	 sname is krbtgt/HADOOP.DEMO.COM
	 eData provided.
	 msgType is 30
>>>Pre-Authentication Data:
	 PA-DATA type = 2
	 PA-ENC-TIMESTAMP
>>>Pre-Authentication Data:
	 PA-DATA type = 136
>>>Pre-Authentication Data:
	 PA-DATA type = 11
	 PA-ETYPE-INFO etype = 16
	 PA-ETYPE-INFO salt = null
>>>Pre-Authentication Data:
	 PA-DATA type = 19
	 PA-ETYPE-INFO2 etype = 16
	 PA-ETYPE-INFO2 salt = null
>>>Pre-Authentication Data:
	 PA-DATA type = 13
>>>Pre-Authentication Data:
	 PA-DATA type = 133
KRBError received: NEEDED_PREAUTH
AcquireTGT: PREAUTH FAILED/REQUIRED, re-send AS-REQ
>>>KrbAsReq salt is HADOOP.DEMO.COMmapredslavez.demodomain
Using builtin default etypes for default_tkt_enctypes
default etypes for default_tkt_enctypes: 3 1 23 16 17.
Pre-Authenticaton: find key for etype = 16
AS-REQ: Add PA_ENC_TIMESTAMP now
>>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>> KrbAsReq calling createMessage
>>> KrbAsReq in createMessage
>>> KrbKdcReq send: kdc=master.demodomain UDP:88, timeout=30000, number of retries
=3, #bytes=260
>>> KDCCommunication: kdc=master.demodomain UDP:88, timeout=30000,Attempt =1, #bytes=260
>>> KrbKdcReq send: #bytes read=718
>>> KrbKdcReq send: #bytes read=718
>>> KdcAccessibility: remove master.demodomain:88
>>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>> KrbAsRep cons in KrbAsReq.getReply mapred/slavez.demodomain
12/02/14 12:55:35 INFO security.UserGroupInformation: Login successful
for user mapred/slavez.demodomain@HADOOP.DEMO.COM using keytab file
/etc/hadoop/conf/mapred.keytab
12/02/14 12:55:35 INFO
delegation.AbstractDelegationTokenSecretManager: Updating the current
master key for generating delegation tokens
12/02/14 12:55:35 INFO mapred.JobTracker: Scheduler configured with
(memSizeForMapSlotOnJT, memSizeForReduceSlotOnJT,
limitMaxMemForMapTasks, limitMaxMemForReduceTasks) (-1, -1, -1, -1)
12/02/14 12:55:35 INFO util.HostsFileReader: Refreshing hosts
(include/exclude) list
12/02/14 12:55:35 INFO mapred.JobTracker: Starting jobtracker with
owner as mapred
12/02/14 12:55:35 INFO
delegation.AbstractDelegationTokenSecretManager: Starting expired
delegation token remover thread, tokenRemoverScanInterval=60 min(s)
12/02/14 12:55:35 INFO
delegation.AbstractDelegationTokenSecretManager: Updating the current
master key for generating delegation tokens
12/02/14 12:55:35 DEBUG impl.MetricsSourceAdapter: Updating attr cache...
12/02/14 12:55:35 DEBUG impl.MetricsSourceAdapter: Done. numMetrics=3
12/02/14 12:55:35 DEBUG impl.MetricsSourceAdapter: Updating info cache...
12/02/14 12:55:35 DEBUG impl.MetricsSourceAdapter: Done
12/02/14 12:55:35 INFO impl.MetricsSourceAdapter: MBean for source
RpcDetailedActivityForPort9001 registered.
12/02/14 12:55:35 DEBUG impl.MetricsSourceAdapter: Stacktrace:
java.lang.Throwable
12/02/14 12:55:35 DEBUG impl.MetricsSystemImpl: Registered source
RpcDetailedActivityForPort9001
12/02/14 12:55:35 DEBUG impl.MetricsSourceAdapter: Updating attr cache...
12/02/14 12:55:35 DEBUG impl.MetricsSourceAdapter: Done. numMetrics=15
12/02/14 12:55:35 DEBUG impl.MetricsSourceAdapter: Updating info cache...
12/02/14 12:55:35 DEBUG impl.MetricsSourceAdapter: Done
12/02/14 12:55:35 INFO impl.MetricsSourceAdapter: MBean for source
RpcActivityForPort9001 registered.
12/02/14 12:55:35 DEBUG impl.MetricsSourceAdapter: Stacktrace:
java.lang.Throwable
12/02/14 12:55:35 DEBUG impl.MetricsSystemImpl: Registered source
RpcActivityForPort9001
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property
'java.runtime.name' is Java(TM) SE Runtime Environment
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property
'sun.boot.library.path' is /usr/java/jdk1.6.0_29/jre/lib/amd64
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property 'java.vm.version'
is 20.4-b02
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property
'hadoop.root.logger' is INFO,console
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property 'java.vm.vendor'
is Sun Microsystems Inc.
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property 'java.vendor.url'
is http://java.sun.com/
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property
'java.rmi.server.randomIDs' is true
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property 'path.separator' is :
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property 'java.vm.name' is
Java HotSpot(TM) 64-Bit Server VM
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property
'file.encoding.pkg' is sun.io
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property
'sun.java.launcher' is SUN_STANDARD
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property 'user.country' is US
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property
'sun.os.patch.level' is unknown
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property
'java.vm.specification.name' is Java Virtual Machine Specification
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property 'user.dir' is
/home/hadoop0214/hadoop-cluster/hadoop-1.0.0
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property
'java.runtime.version' is 1.6.0_29-b11
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property
'java.awt.graphicsenv' is sun.awt.X11GraphicsEnvironment
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property
'java.endorsed.dirs' is /usr/java/jdk1.6.0_29/jre/lib/endorsed
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property 'os.arch' is amd64
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property 'java.io.tmpdir' is /tmp
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property 'line.separator' is

12/02/14 12:55:35 DEBUG mapred.JobTracker: Property 'hadoop.log.file'
is hadoop.log
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property
'java.vm.specification.vendor' is Sun Microsystems Inc.
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property 'os.name' is Linux
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property 'hadoop.id.str' is
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property 'sun.jnu.encoding' is UTF-8
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property
'java.library.path' is
/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/native/Linux-amd64-64
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property 'hadoop.home.dir'
is /home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/..
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property
'java.specification.name' is Java Platform API Specification
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property 'java.class.version' is 50.0
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property
'sun.management.compiler' is HotSpot 64-Bit Tiered Compilers
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property 'os.version' is
2.6.9-34.ELsmp
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property 'user.home' is /home/mapred
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property 'user.timezone' is PRC
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property
'java.awt.printerjob' is sun.print.PSPrinterJob
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property 'file.encoding' is UTF-8
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property
'java.specification.version' is 1.6
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property 'java.class.path'
is /home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../conf:/usr/java/default/lib/tools.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/..:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../hadoop-core-1.0.0.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/asm-3.2.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/aspectjrt-1.6.5.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/aspectjtools-1.6.5.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/commons-beanutils-1.7.0.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/commons-beanutils-core-1.8.0.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/commons-cli-1.2.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/commons-codec-1.4.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/commons-collections-3.2.1.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/commons-configuration-1.6.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/commons-daemon-1.0.1.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/commons-digester-1.8.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/commons-el-1.0.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/commons-httpclient-3.0.1.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/commons-lang-2.4.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/commons-logging-1.1.1.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/commons-logging-api-1.0.4.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/commons-math-2.1.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/commons-net-1.4.1.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/core-3.1.1.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/hadoop-capacity-scheduler-1.0.0.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/hadoop-fairscheduler-1.
0.0.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/hadoop-thriftfs-1.0.0.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/hsqldb-1.8.0.10.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/jackson-core-asl-1.0.1.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/jackson-mapper-asl-1.0.1.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/jasper-compiler-5.5.12.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/jasper-runtime-5.5.12.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/jdeb-0.8.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/jersey-core-1.8.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/jersey-json-1.8.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/jersey-server-1.8.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/jets3t-0.6.1.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/jetty-6.1.26.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/jetty-util-6.1.26.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/jsch-0.1.42.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/junit-4.5.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/kfs-0.2.2.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/log4j-1.2.15.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/mockito-all-1.8.5.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/oro-2.0.8.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/servlet-api-2.5-20081211.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/slf4j-api-1.4.3.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/slf4j-log4j12-1.4.3.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/xmlenc-0.52.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/jsp-2.1/jsp-2.1.jar:/home/hadoop0214/hadoop-cluster/hadoop-1.0.0/libexec/../lib/
jsp-2.1/jsp-api-2.1.jar
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property 'user.name' is mapred
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property
'com.sun.management.jmxremote' is
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property
'sun.security.krb5.debug' is true
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property
'java.vm.specification.version' is 1.0
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property 'sun.java.command'
is org.apache.hadoop.mapred.JobTracker
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property 'java.home' is
/usr/java/jdk1.6.0_29/jre
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property 'sun.arch.data.model' is 64
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property
'hadoop.security.logger' is INFO,DRFAS
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property 'user.language' is en
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property
'java.specification.vendor' is Sun Microsystems Inc.
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property 'hadoop.log.dir'
is /home/hadoop0214/hadoop-cluster/logs
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property 'java.vm.info' is mixed mode
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property 'javax.net.debug' is ssl
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property 'java.version' is 1.6.0_29
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property 'java.ext.dirs' is
/usr/java/jdk1.6.0_29/jre/lib/ext:/usr/java/packages/lib/ext
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property
'sun.boot.class.path' is
/usr/java/jdk1.6.0_29/jre/lib/resources.jar:/usr/java/jdk1.6.0_29/jre/lib/rt.jar:/usr/java/jdk1.6.0_29/jre/lib/sunrsasign.jar:/usr/java/jdk1.6.0_29/jre/lib/jsse.jar:/usr/java/jdk1.6.0_29/jre/lib/jce.jar:/usr/java/jdk1.6.0_29/jre/lib/charsets.jar:/usr/java/jdk1.6.0_29/jre/lib/modules/jdk.boot.jar:/usr/java/jdk1.6.0_29/jre/classes
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property 'java.vendor' is
Sun Microsystems Inc.
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property 'file.separator' is /
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property
'java.vendor.url.bug' is http://java.sun.com/cgi-bin/bugreport.cgi
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property
'sun.io.unicode.encoding' is UnicodeLittle
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property 'sun.cpu.endian' is little
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property 'proc_jobtracker' is
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property
'hadoop.policy.file' is hadoop-policy.xml
12/02/14 12:55:35 DEBUG mapred.JobTracker: Property 'sun.cpu.isalist' is
12/02/14 12:55:35 INFO ipc.Server: Starting SocketReader
12/02/14 12:55:35 INFO mortbay.log: Logging to
org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via
org.mortbay.log.Slf4jLog
12/02/14 12:55:35 INFO http.HttpServer: Added global filtersafety
(class=org.apache.hadoop.http.HttpServer$QuotingInputFilter)
12/02/14 12:55:35 WARN util.NativeCodeLoader: Unable to load
native-hadoop library for your platform... using builtin-java classes
where applicable
12/02/14 12:55:35 INFO http.HttpServer: Port returned by
webServer.getConnectors()[0].getLocalPort() before open() is -1.
Opening the listener on 50030
12/02/14 12:55:35 INFO http.HttpServer: listener.getLocalPort()
returned 50030 webServer.getConnectors()[0].getLocalPort() returned
50030
12/02/14 12:55:35 INFO http.HttpServer: Jetty bound to port 50030
12/02/14 12:55:35 INFO mortbay.log: jetty-6.1.26
12/02/14 12:55:35 WARN mortbay.log: Can't reuse
/tmp/Jetty_0_0_0_0_50030_job____yn7qmk, using
/tmp/Jetty_0_0_0_0_50030_job____yn7qmk_6821405796694386531
12/02/14 12:55:36 INFO mortbay.log: Started SelectChannelConnector@0.0.0.0:50030
12/02/14 12:55:36 DEBUG mapred.JobTracker: instrumentation class=null
12/02/14 12:55:36 DEBUG impl.MetricsSourceAdapter: Updating attr cache...
12/02/14 12:55:36 DEBUG impl.MetricsSourceAdapter: Done. numMetrics=20
12/02/14 12:55:36 DEBUG impl.MetricsSourceAdapter: Updating info cache...
12/02/14 12:55:36 DEBUG impl.MetricsSourceAdapter: Done
12/02/14 12:55:36 INFO impl.MetricsSourceAdapter: MBean for source jvm
registered.
12/02/14 12:55:36 DEBUG impl.MetricsSourceAdapter: Stacktrace:
java.lang.Throwable
12/02/14 12:55:36 DEBUG impl.MetricsSystemImpl: Registered source jvm
12/02/14 12:55:36 DEBUG impl.MetricsSourceAdapter: Updating attr cache...
12/02/14 12:55:36 DEBUG impl.MetricsSourceAdapter: Done. numMetrics=34
12/02/14 12:55:36 DEBUG impl.MetricsSourceAdapter: Updating info cache...
12/02/14 12:55:36 DEBUG impl.MetricsSourceAdapter: Done
12/02/14 12:55:36 INFO impl.MetricsSourceAdapter: MBean for source
JobTrackerMetrics registered.
12/02/14 12:55:36 DEBUG impl.MetricsSourceAdapter: Stacktrace:
java.lang.Throwable
12/02/14 12:55:36 DEBUG impl.MetricsSystemImpl: Registered source
JobTrackerMetrics
12/02/14 12:55:36 INFO mapred.JobTracker: JobTracker up at: 9001
12/02/14 12:55:36 INFO mapred.JobTracker: JobTracker webserver: 50030
Found ticket for mapred/slavez.demodomain@HADOOP.DEMO.COM to go to
krbtgt/HADOOP.DEMO.COM@HADOOP.DEMO.COM expiring on Wed Feb 15 12:55:35
CST 2012
Entered Krb5Context.initSecContext with state=STATE_NEW
Found ticket for mapred/slavez.demodomain@HADOOP.DEMO.COM to go to
krbtgt/HADOOP.DEMO.COM@HADOOP.DEMO.COM expiring on Wed Feb 15 12:55:35
CST 2012
Service ticket not found in the subject
>>> Credentials acquireServiceCreds: same realm
Using builtin default etypes for default_tgs_enctypes
default etypes for default_tgs_enctypes: 3 1 23 16 17.
>>> CksumType: sun.security.krb5.internal.crypto.RsaMd5CksumType
>>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>> KrbKdcReq send: kdc=master.demodomain UDP:88, timeout=30000, number of retries
=3, #bytes=708
>>> KDCCommunication: kdc=master.demodomain UDP:88, timeout=30000,Attempt =1, #bytes=708
>>> KrbKdcReq send: #bytes read=695
>>> KrbKdcReq send: #bytes read=695
>>> KdcAccessibility: remove master.demodomain:88
>>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>> KrbApReq: APOptions are 00100000 00000000 00000000 00000000
>>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
Krb5Context setting mySeqNumber to: 922394223
Created InitSecContextToken:
0000: 01 00 6E 82 02 71 30 82   02 6D A0 03 02 01 05 A1  ..n..q0..m......
0010: 03 02 01 0E A2 07 03 05   00 20 00 00 00 A3 82 01  ......... ......
0020: 71 61 82 01 6D 30 82 01   69 A0 03 02 01 05 A1 11  qa..m0..i.......
0030: 1B 0F 48 41 44 4F 4F 50   2E 44 45 4D 4F 2E 43 4F  ..HADOOP.DEMO.CO
0040: 4D A2 19 30 17 A0 03 02   01 00 A1 10 30 0E 1B 04  M..0........0...
0050: 68 64 66 73 1B 06 73 6C   61 76 65 7A A3 82 01 32  hdfs..slavez...2
0060: 30 82 01 2E A0 03 02 01   10 A1 03 02 01 01 A2 82  0...............
0070: 01 20 04 82 01 1C 4E E0   01 9C 3D F4 AA C8 92 C6  . ....N...=.....
0080: 1D 71 FC 4B 82 F9 6E E4   46 01 64 50 7E A9 63 42  .q.K..n.F.dP..cB
0090: 1D 9F 7A 52 18 43 6D 27   76 8A 71 57 1A E1 12 90  ..zR.Cm'v.qW....
00A0: 94 50 40 9B 5E F3 C6 FB   1A 69 EC C6 C0 D5 78 CB  .P@.^....i....x.
00B0: BB 52 01 96 91 FE 50 32   06 5D 0E FB C2 70 81 61  .R....P2.]...p.a
00C0: 9B 54 E8 4C EF 1B CE 32   E3 8B 0A E9 6F 71 80 F7  .T.L...2....oq..
00D0: B4 FC 66 01 DE EF 20 C0   FA FC 92 5A 4B DB F7 DB  ..f... ....ZK...
00E0: A9 FA 93 CF 12 42 D1 6C   18 4B 8F 93 E9 E6 69 C8  .....B.l.K....i.
00F0: E5 1B 3D 25 E6 6B 50 44   66 99 7A F6 F3 B9 C1 A6  ..=%.kPDf.z.....
0100: 65 D4 DB D8 94 56 37 4E   FF D4 B7 7A FD 39 F7 37  e....V7N...z.9.7
0110: A8 E5 8A E2 85 40 BA 32   4C D5 0C 7A 82 4C D0 90  .....@.2L..z.L..
0120: 8E 98 5B AB EF 54 B0 9E   04 34 32 09 01 6C 09 56  ..[..T...42..l.V
0130: 0B 5C 18 75 31 3E 47 87   7E A4 77 15 11 45 D8 D7  .\.u1>G...w..E..
0140: 06 DD 9B 01 81 43 05 73   ED 5A EE C3 D3 8B B4 99  .....C.s.Z......
0150: 8A F4 40 17 24 BA 50 BB   85 D1 A8 D8 BD 64 48 9F  ..@.$.P......dH.
0160: FD 81 51 C0 28 D1 01 23   DC 4C D0 60 B0 1D C7 25  ..Q.(..#.L.`...%
0170: A2 9C 65 00 88 BF B2 47   96 A4 6A 13 A9 8B DA 1A  ..e....G..j.....
0180: B7 6C 3B CD A8 9E 52 BD   90 94 D3 4B 47 F5 55 90  .l;...R....KG.U.
0190: CE F7 A4 81 E2 30 81 DF   A0 03 02 01 10 A2 81 D7  .....0..........
01A0: 04 81 D4 DB 7B AC 74 4E   CD 97 18 A9 0A 12 05 F8  ......tN........
01B0: 7F 57 DE 58 AF 4A CA 3B   96 E9 51 30 74 AD 3B 87  .W.X.J.;..Q0t.;.
01C0: F3 89 91 EB AE A3 1E A0   43 92 B1 C8 51 4F 02 3D  ........C...QO.=
01D0: C5 C8 7A F1 2A 0C 28 DE   31 4C 47 1C 7C B5 C1 A0  ..z.*.(.1LG.....
01E0: 6C AC 12 C0 0A CD 14 9B   73 19 87 C5 D5 55 A9 13  l.......s....U..
01F0: 60 35 DA 85 46 E0 3A 04   38 B9 6E A1 B8 36 8D F7  `5..F.:.8.n..6..
0200: 78 43 04 29 08 A3 5D 30   C8 74 EE 73 1F E6 6C 3C  xC.)..]0.t.s..l<
0210: 78 69 55 58 81 14 A1 4F   B7 70 B2 B1 E1 5F 3D AA  xiUX...O.p..._=.
0220: 2D 07 21 35 A4 E7 09 F4   F5 36 93 3F 19 1F 6F A3  -.!5.....6.?..o.
0230: A8 79 8B 9C F2 F0 B9 B2   B0 C1 A6 66 CE E0 D0 F6  .y.........f....
0240: DC 2A D9 75 D3 80 F3 1A   D1 73 DD 18 FE 81 10 E7  .*.u.....s......
0250: 19 E6 9D 06 DB A3 A5 9D   04 45 CF D7 9A 2B F9 8C  .........E...+..
0260: 67 A4 B2 B7 B4 88 E6 88   2F 2A 93 FE 75 76 73 8B  g......./*..uvs.
0270: 46 20 12 A9 41 64 DB                               F ..Ad.

Entered Krb5Context.initSecContext with state=STATE_IN_PROCESS
>>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
Krb5Context setting peerSeqNumber to: 98166231
Krb5Context.unwrap: token=[60 3f 06 09 2a 86 48 86 f7 12 01 02 02 02
01 04 00 ff ff ff ff 76 d9 83 3b 8f 61 47 db 5a 31 af 59 c5 cd ce 7e
f6 06 eb 6a e0 ca 84 67 51 a0 76 d2 23 a2 10 c0 cb f2 b6 f0 01 01 00
00 04 04 04 04 ]
Krb5Context.unwrap: data=[01 01 00 00 ]
Krb5Context.wrap: data=[01 01 00 00 ]
Krb5Context.wrap: token=[60 3f 06 09 2a 86 48 86 f7 12 01 02 02 02 01
04 00 ff ff ff ff a7 02 43 11 48 f2 2f 8f 10 6f 8a 61 b3 dc 58 6b fc
58 94 12 b5 87 57 93 fd 49 c9 56 ef 26 15 74 a2 86 87 e3 01 01 00 00
04 04 04 04 ]
12/02/14 12:55:36 INFO mapred.JobTracker: Cleaning up the system directory
12/02/14 12:55:36 WARN fs.FileSystem: "slavez.demodomain" is a
deprecated filesystem name. Use "hdfs://slavez.demodomain/" instead.
12/02/14 12:55:36 INFO mapred.JobHistory: Creating DONE folder at
file:/home/hadoop0214/hadoop-cluster/logs/history/done
12/02/14 12:55:36 WARN fs.FileSystem: "slavez.demodomain" is a
deprecated filesystem name. Use "hdfs://slavez.demodomain/" instead.
12/02/14 12:55:36 INFO mapred.JobTracker: History server being
initialized in embedded mode
12/02/14 12:55:36 WARN fs.FileSystem: "slavez.demodomain" is a
deprecated filesystem name. Use "hdfs://slavez.demodomain/" instead.
12/02/14 12:55:36 WARN fs.FileSystem: "slavez.demodomain" is a
deprecated filesystem name. Use "hdfs://slavez.demodomain/" instead.
12/02/14 12:55:36 INFO mapred.JobHistoryServer: Started job history
server at: localhost:50030
12/02/14 12:55:36 INFO mapred.JobTracker: Job History Server web
address: localhost:50030
12/02/14 12:55:36 INFO mapred.CompletedJobStatusStore: Completed job
store is inactive
12/02/14 12:55:36 INFO mapred.JobTracker: Refreshing hosts information
12/02/14 12:55:36 INFO util.HostsFileReader: Setting the includes file to
12/02/14 12:55:36 INFO util.HostsFileReader: Setting the excludes file to
12/02/14 12:55:36 INFO util.HostsFileReader: Refreshing hosts
(include/exclude) list
12/02/14 12:55:36 INFO mapred.JobTracker: Decommissioning 0 nodes
12/02/14 12:55:36 INFO ipc.Server: IPC Server Responder: starting
12/02/14 12:55:36 INFO ipc.Server: IPC Server listener on 9001: starting
12/02/14 12:55:36 INFO ipc.Server: IPC Server handler 0 on 9001: starting
12/02/14 12:55:36 INFO ipc.Server: IPC Server handler 1 on 9001: starting
12/02/14 12:55:36 INFO ipc.Server: IPC Server handler 2 on 9001: starting
12/02/14 12:55:36 INFO ipc.Server: IPC Server handler 3 on 9001: starting
12/02/14 12:55:36 INFO ipc.Server: IPC Server handler 4 on 9001: starting
12/02/14 12:55:36 INFO ipc.Server: IPC Server handler 5 on 9001: starting
12/02/14 12:55:36 INFO ipc.Server: IPC Server handler 6 on 9001: starting
12/02/14 12:55:36 INFO ipc.Server: IPC Server handler 7 on 9001: starting
12/02/14 12:55:36 INFO ipc.Server: IPC Server handler 8 on 9001: starting
12/02/14 12:55:36 INFO mapred.JobTracker: Starting RUNNING
12/02/14 12:55:36 INFO ipc.Server: IPC Server handler 9 on 9001: starting


(waiting for TaskTracker...)



(TaskTracker started)


12/02/14 12:58:10 ERROR security.UserGroupInformation:
PriviledgedActionException as:mapred/slavez.demodomain@HADOOP.DEMO.COM
cause:javax.security.sasl.SaslException: Failure to initialize
security context [Caused by GSSException: No valid credentials
provided (Mechanism level: Failed to find any Kerberos Key)]
12/02/14 12:58:10 INFO ipc.Server: IPC Server listener on 9001:
readAndProcess threw exception javax.security.sasl.SaslException:
Failure to initialize security context [Caused by GSSException: No
valid credentials provided (Mechanism level: Failed to find any
Kerberos Key)]. Count of bytes read: 0
javax.security.sasl.SaslException: Failure to initialize security
context [Caused by GSSException: No valid credentials provided
(Mechanism level: Failed to find any Kerberos Key)]
	at com.sun.security.sasl.gsskerb.GssKrb5Server.<init>(GssKrb5Server.java:95)
	at com.sun.security.sasl.gsskerb.FactoryImpl.createSaslServer(FactoryImpl.java:67)
	at javax.security.sasl.Sasl.createSaslServer(Sasl.java:491)
	at org.apache.hadoop.ipc.Server$Connection$1.run(Server.java:989)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1083)
	at org.apache.hadoop.ipc.Server$Connection.saslReadAndProcess(Server.java:986)
	at org.apache.hadoop.ipc.Server$Connection.readAndProcess(Server.java:1180)
	at org.apache.hadoop.ipc.Server$Listener.doRead(Server.java:537)
	at org.apache.hadoop.ipc.Server$Listener$Reader.run(Server.java:344)
	at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
	at java.lang.Thread.run(Thread.java:662)
Caused by: GSSException: No valid credentials provided (Mechanism
level: Failed to find any Kerberos Key)
	at sun.security.jgss.krb5.Krb5AcceptCredential.getInstance(Krb5AcceptCredential.java:95)
	at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:111)
	at sun.security.jgss.GSSManagerImpl.getCredentialElement(GSSManagerImpl.java:178)
	at sun.security.jgss.GSSCredentialImpl.add(GSSCredentialImpl.java:384)
	at sun.security.jgss.GSSCredentialImpl.<init>(GSSCredentialImpl.java:42)
	at sun.security.jgss.GSSManagerImpl.createCredential(GSSManagerImpl.java:139)
	at com.sun.security.sasl.gsskerb.GssKrb5Server.<init>(GssKrb5Server.java:78)
	... 13 more
12/02/14 12:58:12 ERROR security.UserGroupInformation:
PriviledgedActionException as:mapred/slavez.demodomain@HADOOP.DEMO.COM
cause:javax.security.sasl.SaslException: Failure to initialize
security context [Caused by GSSException: No valid credentials
provided (Mechanism level: Failed to find any Kerberos Key)]
(the same readAndProcess SaslException and stack trace as above
repeats each time the TaskTracker retries the connection)
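
The "Failed to find any Kerberos Key" on the JobTracker side suggests the
keytab the JobTracker loads does not contain a key for the principal it
resolved (mapred/slavez.demodomain@HADOOP.DEMO.COM). For reference, these
are the mapred-site.xml security properties in the pattern the Cloudera
guide describes; the keytab path below is illustrative, not necessarily my
exact path:

```xml
<!-- mapred-site.xml: Kerberos settings for the MapReduce daemons.
     _HOST is substituted at runtime with each daemon's own FQDN, so the
     JobTracker on slavez resolves to mapred/slavez.demodomain@HADOOP.DEMO.COM
     and the TaskTracker on slavey to mapred/slavey.demodomain@HADOOP.DEMO.COM. -->
<property>
  <name>mapreduce.jobtracker.kerberos.principal</name>
  <value>mapred/_HOST@HADOOP.DEMO.COM</value>
</property>
<property>
  <name>mapreduce.jobtracker.keytab.file</name>
  <!-- illustrative path; must exist and be readable by the mapred user -->
  <value>/etc/hadoop/conf/mapred.keytab</value>
</property>
<property>
  <name>mapreduce.tasktracker.kerberos.principal</name>
  <value>mapred/_HOST@HADOOP.DEMO.COM</value>
</property>
<property>
  <name>mapreduce.tasktracker.keytab.file</name>
  <!-- illustrative path -->
  <value>/etc/hadoop/conf/mapred.keytab</value>
</property>
```

On slavez, `klist -kt` against that keytab should list an entry for
mapred/slavez.demodomain@HADOOP.DEMO.COM; if the keytab only holds keys for
other principals, the JobTracker has no key for its own principal, which
would match the GSSException above.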
