hbase-user mailing list archives

From Ted Yu <yuzhih...@gmail.com>
Subject Re: HBaseTestingUtility startMiniCluster throw exception
Date Sat, 02 Aug 2014 18:02:40 GMT
In your config, I see:
    <property>
        <name>hbase.rootdir</name>
        <value>file:///scratch/mingtzha/hbase/test</value>
    </property>
    <property>
        <name>hbase.cluster.distributed</name>
        <value>true</value>
    </property>
The default value for hbase.cluster.distributed is false (for standalone
mode).

Since your code is for testing, you should keep hbase.cluster.distributed set to
false.
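
For reference, a minimal hbase-site.xml for a mini-cluster test might look like the following (keeping your rootdir path; the false value is the part that matters):

```xml
<property>
    <name>hbase.rootdir</name>
    <value>file:///scratch/mingtzha/hbase/test</value>
</property>
<property>
    <name>hbase.cluster.distributed</name>
    <value>false</value>
</property>
```

Alternatively, drop the hbase.cluster.distributed property entirely and let it default to false. (HBaseTestingUtility usually manages its own temp directories for the mini cluster, so you may not need hbase.rootdir at all.)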

Cheers


On Sat, Aug 2, 2014 at 9:51 AM, Mingtao Zhang <mail2mingtao@gmail.com>
wrote:

> Hi,
>
> I am really stuck with this, so I am putting the stack trace, Java file,
> hbase-site file, and pom file here.
>
> I have zero knowledge of Hadoop and was expecting it to be transparent for my
> integration test :(.
>
> Thanks in advance!
>
> Best Regards,
> Mingtao
>
> The stack trace:
>
> 09:42:33.191 [main] DEBUG org.mortbay.log - TLD=jar:file:/home/mingtzha/.m2/repository/org/mortbay/jetty/jsp-2.1/6.1.14/jsp-2.1-6.1.14.jar!/META-INF/x.tld
> 09:42:33.194 [main] DEBUG org.mortbay.log - TLD=jar:file:/home/mingtzha/.m2/repository/org/mortbay/jetty/jsp-2.1/6.1.14/jsp-2.1-6.1.14.jar!/META-INF/c-1_0-rt.tld
> 09:42:33.194 [main] DEBUG org.mortbay.log - resolveEntity(-//Sun Microsystems, Inc.//DTD JSP Tag Library 1.2//EN, http://java.sun.com/dtd/web-jsptaglibrary_1_2.dtd)
> 09:42:33.194 [main] DEBUG org.mortbay.log - Can't exact match entity in redirect map, trying web-jsptaglibrary_1_2.dtd
> 09:42:33.195 [main] DEBUG org.mortbay.log - Redirected entity http://java.sun.com/dtd/web-jsptaglibrary_1_2.dtd --> jar:file:/home/mingtzha/.m2/repository/org/mortbay/jetty/jsp-api-2.1/6.1.14/jsp-api-2.1-6.1.14.jar!/javax/servlet/jsp/resources/web-jsptaglibrary_1_2.dtd
> 09:42:33.200 [main] DEBUG org.mortbay.log - TLD=jar:file:/home/mingtzha/.m2/repository/org/mortbay/jetty/jsp-2.1/6.1.14/jsp-2.1-6.1.14.jar!/META-INF/fmt.tld
> 09:42:33.204 [main] DEBUG org.mortbay.log - Container Server@9f51be6 + org.mortbay.jetty.servlet.HashSessionIdManager@445e0565 as sessionIdManager
> 09:42:33.204 [main] DEBUG org.mortbay.log - Init SecureRandom.
> 09:42:33.204 [main] DEBUG org.mortbay.log - started org.mortbay.jetty.servlet.HashSessionIdManager@445e0565
> 09:42:33.205 [main] DEBUG org.mortbay.log - started org.mortbay.jetty.servlet.HashSessionManager@738f651f
> 09:42:33.206 [main] DEBUG org.mortbay.log - filterNameMap={safety=safety, krb5Filter=krb5Filter}
> 09:42:33.206 [main] DEBUG org.mortbay.log - pathFilters=[(F=safety,[/*],[],15)]
> 09:42:33.206 [main] DEBUG org.mortbay.log - servletFilterMap=null
> 09:42:33.206 [main] DEBUG org.mortbay.log - servletPathMap={*.XSP=jsp, *.jsp=jsp, /getimage=getimage, /cancelDelegationToken=cancelDelegationToken, *.JSPF=jsp, *.jspx=jsp, /listPaths/*=listPaths, /conf=conf, *.xsp=jsp, /=default, /fsck=fsck, /stacks=stacks, /logLevel=logLevel, *.JSPX=jsp, *.jspf=jsp, /data/*=data, /contentSummary/*=contentSummary, /renewDelegationToken=renewDelegationToken, /getDelegationToken=getDelegationToken, /fileChecksum/*=checksum, *.JSP=jsp, /jmx=jmx}
> 09:42:33.206 [main] DEBUG org.mortbay.log - servletNameMap={getDelegationToken=getDelegationToken, jsp=jsp, jmx=jmx, data=data, checksum=checksum, conf=conf, stacks=stacks, fsck=fsck, cancelDelegationToken=cancelDelegationToken, listPaths=listPaths, default=default, logLevel=logLevel, contentSummary=contentSummary, getimage=getimage, renewDelegationToken=renewDelegationToken}
> 09:42:33.206 [main] DEBUG org.mortbay.log - starting ServletHandler@3fd5e2ae
> 09:42:33.206 [main] DEBUG org.mortbay.log - started ServletHandler@3fd5e2ae
> 09:42:33.206 [main] DEBUG org.mortbay.log - starting SecurityHandler@51f35aea
> 09:42:33.207 [main] DEBUG org.mortbay.log - started SecurityHandler@51f35aea
> 09:42:33.207 [main] DEBUG org.mortbay.log - starting SessionHandler@73152e3f
> 09:42:33.207 [main] DEBUG org.mortbay.log - started SessionHandler@73152e3f
> 09:42:33.207 [main] DEBUG org.mortbay.log - starting org.mortbay.jetty.webapp.WebAppContext@7cbc11d {/,jar:file:/home/mingtzha/.m2/repository/org/apache/hadoop/hadoop-core/1.2.1/hadoop-core-1.2.1.jar!/webapps/hdfs}
> 09:42:33.207 [main] DEBUG org.mortbay.log - starting ErrorPageErrorHandler@4b38117e
> 09:42:33.207 [main] DEBUG org.mortbay.log - started ErrorPageErrorHandler@4b38117e
> 09:42:33.207 [main] DEBUG org.mortbay.log - loaded class org.apache.hadoop.security.Krb5AndCertsSslSocketConnector$Krb5SslFilter from sun.misc.Launcher$AppClassLoader@23137792
> 09:42:33.207 [main] DEBUG org.mortbay.log - Holding class org.apache.hadoop.security.Krb5AndCertsSslSocketConnector$Krb5SslFilter
> 09:42:33.208 [main] DEBUG org.mortbay.log - started krb5Filter
> 09:42:33.208 [main] DEBUG org.mortbay.log - loaded class org.apache.hadoop.http.HttpServer$QuotingInputFilter from sun.misc.Launcher$AppClassLoader@23137792
> 09:42:33.208 [main] DEBUG org.mortbay.log - Holding class org.apache.hadoop.http.HttpServer$QuotingInputFilter
> 09:42:33.210 [main] DEBUG org.mortbay.log - started safety
> 09:42:33.211 [main] DEBUG org.mortbay.log - started conf
> 09:42:33.211 [main] DEBUG org.mortbay.log - started cancelDelegationToken
> 09:42:33.211 [main] DEBUG org.mortbay.log - started contentSummary
> 09:42:33.211 [main] DEBUG org.mortbay.log - started checksum
> 09:42:33.211 [main] DEBUG org.mortbay.log - started data
> 09:42:33.211 [main] DEBUG org.mortbay.log - started fsck
> 09:42:33.211 [main] DEBUG org.mortbay.log - started getDelegationToken
> 09:42:33.212 [main] DEBUG org.mortbay.log - started getimage
> 09:42:33.212 [main] DEBUG org.mortbay.log - started listPaths
> 09:42:33.212 [main] DEBUG org.mortbay.log - started renewDelegationToken
> 09:42:33.212 [main] DEBUG org.mortbay.log - started stacks
> 09:42:33.212 [main] DEBUG org.mortbay.log - started jmx
> 09:42:33.212 [main] DEBUG org.mortbay.log - started logLevel
> 09:42:33.212 [main] DEBUG org.mortbay.log - loaded class org.apache.jasper.servlet.JspServlet from sun.misc.Launcher$AppClassLoader@23137792
> 09:42:33.212 [main] DEBUG org.mortbay.log - Holding class org.apache.jasper.servlet.JspServlet
> 09:42:33.250 [main] DEBUG o.a.j.compiler.JspRuntimeContext - PWC5965: Parent class loader is: ContextLoader@WepAppsContext([]) / sun.misc.Launcher$AppClassLoader@23137792
> 09:42:33.252 [main] DEBUG org.apache.jasper.servlet.JspServlet - PWC5964: Scratch dir for the JSP engine is: /tmp/Jetty_localhost_localdomain_1543_hdfs____.om70mh/jsp
> 09:42:33.252 [main] DEBUG org.apache.jasper.servlet.JspServlet - PWC5966: IMPORTANT: Do not modify the generated servlets
> 09:42:33.252 [main] DEBUG org.mortbay.log - started jsp
> 09:42:33.252 [main] DEBUG org.mortbay.log - loaded class org.mortbay.jetty.servlet.DefaultServlet
> 09:42:33.252 [main] DEBUG org.mortbay.log - loaded class org.mortbay.jetty.servlet.DefaultServlet from sun.misc.Launcher$AppClassLoader@23137792
> 09:42:33.252 [main] DEBUG org.mortbay.log - Holding class org.mortbay.jetty.servlet.DefaultServlet
> 09:42:33.258 [main] DEBUG org.mortbay.log - started org.mortbay.jetty.servlet.DefaultServlet$NIOResourceCache@576f8821
> 09:42:33.258 [main] DEBUG org.mortbay.log - started org.mortbay.jetty.ResourceCache@5b525b5f
> 09:42:33.258 [main] DEBUG org.mortbay.log - resource base = file:/tmp/Jetty_localhost_localdomain_1543_hdfs____.om70mh/webapp/
> 09:42:33.258 [main] DEBUG org.mortbay.log - started default
> 09:42:33.258 [main] DEBUG org.mortbay.log - started org.mortbay.jetty.webapp.WebAppContext@7cbc11d {/,jar:file:/home/mingtzha/.m2/repository/org/apache/hadoop/hadoop-core/1.2.1/hadoop-core-1.2.1.jar!/webapps/hdfs}
> 09:42:33.258 [main] DEBUG org.mortbay.log - Container org.mortbay.jetty.servlet.Context@4e048dc6 {/logs,file:/scratch/mingtzha/repository/12.1/branches/12.1.4.0.3/webcentersites/siteanalytics/integration-test/repository-itest/target/test-data/830f8900-2879-4ed0-b011-550620ca032f/hadoop-log-dir} + ErrorHandler@7bece8cf as errorHandler
> 09:42:33.259 [main] DEBUG org.mortbay.log - filterNameMap={safety=safety}
> 09:42:33.259 [main] DEBUG org.mortbay.log - pathFilters=[(F=safety,[/*],[],15)]
> 09:42:33.259 [main] DEBUG org.mortbay.log - servletFilterMap=null
> 09:42:33.259 [main] DEBUG org.mortbay.log - servletPathMap={/=org.apache.hadoop.http.AdminAuthorizedServlet-1117590713}
> 09:42:33.259 [main] DEBUG org.mortbay.log - servletNameMap={org.apache.hadoop.http.AdminAuthorizedServlet-1117590713=org.apache.hadoop.http.AdminAuthorizedServlet-1117590713}
> 09:42:33.259 [main] DEBUG org.mortbay.log - starting ServletHandler@cf7ea2e
> 09:42:33.259 [main] DEBUG org.mortbay.log - started ServletHandler@cf7ea2e
> 09:42:33.259 [main] DEBUG org.mortbay.log - starting org.mortbay.jetty.servlet.Context@4e048dc6 {/logs,file:/scratch/mingtzha/repository/12.1/branches/12.1.4.0.3/webcentersites/siteanalytics/integration-test/repository-itest/target/test-data/830f8900-2879-4ed0-b011-550620ca032f/hadoop-log-dir}
> 09:42:33.259 [main] DEBUG org.mortbay.log - starting ErrorHandler@7bece8cf
> 09:42:33.259 [main] DEBUG org.mortbay.log - started ErrorHandler@7bece8cf
> 09:42:33.259 [main] DEBUG org.mortbay.log - Holding class org.apache.hadoop.http.HttpServer$QuotingInputFilter
> 09:42:33.259 [main] DEBUG org.mortbay.log - started safety
> 09:42:33.259 [main] DEBUG org.mortbay.log - Holding class org.apache.hadoop.http.AdminAuthorizedServlet
> 09:42:33.259 [main] DEBUG org.mortbay.log - started org.apache.hadoop.http.AdminAuthorizedServlet-1117590713
> 09:42:33.259 [main] DEBUG org.mortbay.log - started org.mortbay.jetty.servlet.Context@4e048dc6 {/logs,file:/scratch/mingtzha/repository/12.1/branches/12.1.4.0.3/webcentersites/siteanalytics/integration-test/repository-itest/target/test-data/830f8900-2879-4ed0-b011-550620ca032f/hadoop-log-dir}
> 09:42:33.259 [main] DEBUG org.mortbay.log - Container org.mortbay.jetty.servlet.Context@6e4f7806 {/static,jar:file:/home/mingtzha/.m2/repository/org/apache/hadoop/hadoop-core/1.2.1/hadoop-core-1.2.1.jar!/webapps/static} + ErrorHandler@7ea8ad98 as errorHandler
> 09:42:33.259 [main] DEBUG org.mortbay.log - filterNameMap={safety=safety}
> 09:42:33.259 [main] DEBUG org.mortbay.log - pathFilters=[(F=safety,[/*],[],15)]
> 09:42:33.260 [main] DEBUG org.mortbay.log - servletFilterMap=null
> 09:42:33.260 [main] DEBUG org.mortbay.log - servletPathMap={/*=org.mortbay.jetty.servlet.DefaultServlet-1788226358}
> 09:42:33.260 [main] DEBUG org.mortbay.log - servletNameMap={org.mortbay.jetty.servlet.DefaultServlet-1788226358=org.mortbay.jetty.servlet.DefaultServlet-1788226358}
> 09:42:33.260 [main] DEBUG org.mortbay.log - starting ServletHandler@23510a7e
> 09:42:33.260 [main] DEBUG org.mortbay.log - started ServletHandler@23510a7e
> 09:42:33.260 [main] DEBUG org.mortbay.log - starting org.mortbay.jetty.servlet.Context@6e4f7806 {/static,jar:file:/home/mingtzha/.m2/repository/org/apache/hadoop/hadoop-core/1.2.1/hadoop-core-1.2.1.jar!/webapps/static}
> 09:42:33.260 [main] DEBUG org.mortbay.log - starting ErrorHandler@7ea8ad98
> 09:42:33.260 [main] DEBUG org.mortbay.log - started ErrorHandler@7ea8ad98
> 09:42:33.260 [main] DEBUG org.mortbay.log - Holding class org.apache.hadoop.http.HttpServer$QuotingInputFilter
> 09:42:33.260 [main] DEBUG org.mortbay.log - started safety
> 09:42:33.260 [main] DEBUG org.mortbay.log - Holding class org.mortbay.jetty.servlet.DefaultServlet
> 09:42:33.260 [main] DEBUG org.mortbay.log - started org.mortbay.jetty.servlet.DefaultServlet-1788226358
> 09:42:33.260 [main] DEBUG org.mortbay.log - started org.mortbay.jetty.servlet.Context@6e4f7806 {/static,jar:file:/home/mingtzha/.m2/repository/org/apache/hadoop/hadoop-core/1.2.1/hadoop-core-1.2.1.jar!/webapps/static}
> 09:42:33.260 [main] DEBUG org.mortbay.log - starting ContextHandlerCollection@5a4950dd
> 09:42:33.260 [main] DEBUG org.mortbay.log - started ContextHandlerCollection@5a4950dd
> 09:42:33.260 [main] DEBUG org.mortbay.log - starting Server@9f51be6
> 09:42:33.264 [main] DEBUG org.mortbay.log - started org.mortbay.jetty.nio.SelectChannelConnector$1@501a7f06
> 09:42:33.272 [main] INFO org.mortbay.log - Started SelectChannelConnector@localhost.localdomain:1543
> 09:42:33.273 [main] DEBUG org.mortbay.log - started SelectChannelConnector@localhost.localdomain:1543
> 09:42:33.273 [main] DEBUG org.mortbay.log - started Server@9f51be6
> 09:42:33.273 [main] INFO o.a.h.hdfs.server.namenode.NameNode - Web-server up at: localhost.localdomain:1543
> 09:42:33.274 [IPC Server listener on 41118] INFO org.apache.hadoop.ipc.Server - IPC Server listener on 41118: starting
> 09:42:33.274 [IPC Server Responder] INFO org.apache.hadoop.ipc.Server - IPC Server Responder: starting
> 09:42:33.275 [IPC Server handler 0 on 41118] INFO org.apache.hadoop.ipc.Server - IPC Server handler 0 on 41118: starting
> 09:42:33.276 [IPC Server handler 1 on 41118] INFO org.apache.hadoop.ipc.Server - IPC Server handler 1 on 41118: starting
> 09:42:33.277 [IPC Server handler 3 on 41118] INFO org.apache.hadoop.ipc.Server - IPC Server handler 3 on 41118: starting
> 09:42:33.277 [IPC Server handler 4 on 41118] INFO org.apache.hadoop.ipc.Server - IPC Server handler 4 on 41118: starting
> 09:42:33.277 [IPC Server handler 2 on 41118] INFO org.apache.hadoop.ipc.Server - IPC Server handler 2 on 41118: starting
> 09:42:33.281 [IPC Server handler 5 on 41118] INFO org.apache.hadoop.ipc.Server - IPC Server handler 5 on 41118: starting
> 09:42:33.281 [IPC Server handler 6 on 41118] INFO org.apache.hadoop.ipc.Server - IPC Server handler 6 on 41118: starting
> 09:42:33.281 [IPC Server handler 7 on 41118] INFO org.apache.hadoop.ipc.Server - IPC Server handler 7 on 41118: starting
> 09:42:33.281 [IPC Server handler 8 on 41118] INFO org.apache.hadoop.ipc.Server - IPC Server handler 8 on 41118: starting
> 09:42:33.283 [IPC Server handler 9 on 41118] INFO org.apache.hadoop.ipc.Server - IPC Server handler 9 on 41118: starting
> 09:42:33.287 [main] DEBUG org.apache.hadoop.fs.FileSystem - Creating filesystem for hdfs://slc05muw.us.**.com:41118
> 09:42:33.321 [main] DEBUG o.apache.hadoop.io.retry.RetryUtils - multipleLinearRandomRetry = null
> 09:42:33.328 [main] DEBUG org.apache.hadoop.ipc.Client - The ping interval is60000ms.
> 09:42:33.330 [main] DEBUG org.apache.hadoop.ipc.Client - Use SIMPLE authentication for protocol ClientProtocol
> 09:42:33.330 [main] DEBUG org.apache.hadoop.ipc.Client - Connecting to slc05muw.us.**.com/10.241.3.35:41118
> 09:42:33.337 [main] DEBUG org.apache.hadoop.ipc.Client - IPC Client (47) connection to slc05muw.us.**.com/10.241.3.35:41118 from mingtzha sending #0
> 09:42:33.337 [IPC Client (47) connection to slc05muw.us.**.com/10.241.3.35:41118 from mingtzha] DEBUG org.apache.hadoop.ipc.Client - IPC Client (47) connection to slc05muw.us.**.com/10.241.3.35:41118 from mingtzha: starting, having connections 1
> 09:42:33.337 [IPC Server listener on 41118] DEBUG org.apache.hadoop.ipc.Server - Server connection from 10.241.3.35:24701; # active connections: 1; # queued calls: 0
> 09:42:33.338 [pool-1-thread-1] DEBUG org.apache.hadoop.ipc.Server - Successfully authorized org.apache.hadoop.hdfs.protocol.ClientProtocol-mingtzha
> 09:42:33.338 [pool-1-thread-1] DEBUG org.apache.hadoop.ipc.Server - got #0
> 09:42:33.338 [IPC Server handler 0 on 41118] DEBUG org.apache.hadoop.ipc.Server - IPC Server handler 0 on 41118: has #0 from 10.241.3.35:24701
> 09:42:33.339 [IPC Server handler 0 on 41118] DEBUG o.a.h.security.UserGroupInformation - PriviledgedAction as:mingtzha from:org.apache.hadoop.ipc.Server$Handler.run(Server.java:1426)
> 09:42:33.339 [IPC Server handler 0 on 41118] DEBUG org.apache.hadoop.ipc.Server - Served: getProtocolVersion queueTime= 1 procesingTime= 0
> 09:42:33.340 [IPC Server handler 0 on 41118] DEBUG org.apache.hadoop.ipc.Server - IPC Server Responder: responding to #0 from 10.241.3.35:24701
> 09:42:33.340 [IPC Server handler 0 on 41118] DEBUG org.apache.hadoop.ipc.Server - IPC Server Responder: responding to #0 from 10.241.3.35:24701 Wrote 22 bytes.
> 09:42:33.340 [IPC Client (47) connection to slc05muw.us.**.com/10.241.3.35:41118 from mingtzha] DEBUG org.apache.hadoop.ipc.Client - IPC Client (47) connection to slc05muw.us.**.com/10.241.3.35:41118 from mingtzha got value #0
> 09:42:33.341 [main] DEBUG org.apache.hadoop.ipc.RPC - Call: getProtocolVersion 17
> 09:42:33.341 [main] DEBUG org.apache.hadoop.hdfs.DFSClient - Short circuit read is false
> 09:42:33.341 [main] DEBUG org.apache.hadoop.hdfs.DFSClient - Connect to datanode via hostname is false
> 09:42:33.343 [main] DEBUG o.apache.hadoop.io.retry.RetryUtils - multipleLinearRandomRetry = null
> 09:42:33.343 [main] DEBUG org.apache.hadoop.ipc.Client - IPC Client (47) connection to slc05muw.us.**.com/10.241.3.35:41118 from mingtzha sending #1
> 09:42:33.344 [pool-1-thread-1] DEBUG org.apache.hadoop.ipc.Server - got #1
> 09:42:33.344 [IPC Server handler 1 on 41118] DEBUG org.apache.hadoop.ipc.Server - IPC Server handler 1 on 41118: has #1 from 10.241.3.35:24701
> 09:42:33.344 [IPC Server handler 1 on 41118] DEBUG o.a.h.security.UserGroupInformation - PriviledgedAction as:mingtzha from:org.apache.hadoop.ipc.Server$Handler.run(Server.java:1426)
> 09:42:33.344 [IPC Server handler 1 on 41118] DEBUG org.apache.hadoop.ipc.Server - Served: getProtocolVersion queueTime= 0 procesingTime= 0
> 09:42:33.344 [IPC Server handler 1 on 41118] DEBUG org.apache.hadoop.ipc.Server - IPC Server Responder: responding to #1 from 10.241.3.35:24701
> 09:42:33.344 [IPC Server handler 1 on 41118] DEBUG org.apache.hadoop.ipc.Server - IPC Server Responder: responding to #1 from 10.241.3.35:24701 Wrote 22 bytes.
> 09:42:33.344 [IPC Client (47) connection to slc05muw.us.**.com/10.241.3.35:41118 from mingtzha] DEBUG org.apache.hadoop.ipc.Client - IPC Client (47) connection to slc05muw.us.**.com/10.241.3.35:41118 from mingtzha got value #1
> 09:42:33.344 [main] DEBUG org.apache.hadoop.ipc.RPC - Call: getProtocolVersion 1
> 09:42:33.345 [main] DEBUG org.apache.hadoop.hdfs.DFSClient - Short circuit read is false
> 09:42:33.345 [main] DEBUG org.apache.hadoop.hdfs.DFSClient - Connect to datanode via hostname is false
> 09:42:33.345 [main] DEBUG org.apache.hadoop.ipc.Client - IPC Client (47) connection to slc05muw.us.**.com/10.241.3.35:41118 from mingtzha sending #2
> 09:42:33.345 [pool-1-thread-1] DEBUG org.apache.hadoop.ipc.Server - got #2
> 09:42:33.345 [IPC Server handler 3 on 41118] DEBUG org.apache.hadoop.ipc.Server - IPC Server handler 3 on 41118: has #2 from 10.241.3.35:24701
> 09:42:33.345 [IPC Server handler 3 on 41118] DEBUG o.a.h.security.UserGroupInformation - PriviledgedAction as:mingtzha from:org.apache.hadoop.ipc.Server$Handler.run(Server.java:1426)
> 09:42:33.356 [IPC Server handler 3 on 41118] DEBUG org.apache.hadoop.security.Groups - Returning fetched groups for 'mingtzha'
> 09:42:33.356 [IPC Server handler 3 on 41118] DEBUG org.apache.hadoop.ipc.Server - Served: getDatanodeReport queueTime= 0 procesingTime= 11
> 09:42:33.357 [IPC Server handler 3 on 41118] DEBUG org.apache.hadoop.ipc.Server - IPC Server Responder: responding to #2 from 10.241.3.35:24701
> 09:42:33.357 [IPC Server handler 3 on 41118] DEBUG org.apache.hadoop.ipc.Server - IPC Server Responder: responding to #2 from 10.241.3.35:24701 Wrote 61 bytes.
> 09:42:33.357 [IPC Client (47) connection to slc05muw.us.**.com/10.241.3.35:41118 from mingtzha] DEBUG org.apache.hadoop.ipc.Client - IPC Client (47) connection to slc05muw.us.**.com/10.241.3.35:41118 from mingtzha got value #2
> 09:42:33.357 [main] DEBUG org.apache.hadoop.ipc.RPC - Call: getDatanodeReport 12
> Cluster is active
> 09:42:33.376 [main] INFO o.a.zookeeper.server.ZooKeeperServer - Server environment:zookeeper.version=3.4.5-1392090, built on 09/30/2012 17:52 GMT
> 09:42:33.376 [main] INFO o.a.zookeeper.server.ZooKeeperServer - Server environment:host.name=slc05muw.us.**.com
> 09:42:33.376 [main] INFO o.a.zookeeper.server.ZooKeeperServer - Server environment:java.version=1.7.0_45
> 09:42:33.376 [main] INFO o.a.zookeeper.server.ZooKeeperServer - Server environment:java.vendor=** Corporation
> 09:42:33.376 [main] INFO o.a.zookeeper.server.ZooKeeperServer - Server environment:java.home=/scratch/mingtzha/jdk1.7.0_45/jre
> 09:42:33.376 [main] INFO o.a.zookeeper.server.ZooKeeperServer - Server
> environment:java.class.path=/scratch/mingtzha/eclipses/eclipse/plugins/org.testng.eclipse_6.8.6.20141201_2240/lib/testng.jar:/scratch/mingtzha/repository/12.1/branches/12.1.4.0.3/webcentersites/siteanalytics/integration-test/repository-itest/target/test-classes:/scratch/mingtzha/repository/12.1/branches/12.1.4.0.3/webcentersites/siteanalytics/integration-test/repository-itest/target/classes:/scratch/mingtzha/repository/12.1/branches/12.1.4.0.3/webcentersites/siteanalytics/app/test/test-integ/target/classes:/scratch/mingtzha/repository/12.1/branches/12.1.4.0.3/webcentersites/siteanalytics/app/test/test-core/target/classes:/home/mingtzha/.m2/repository/org/testng/testng/6.8.7/testng-6.8.7.jar:/home/mingtzha/.m2/repository/junit/junit/4.10/junit-4.10.jar:/home/mingtzha/.m2/repository/org/hamcrest/hamcrest-core/1.1/hamcrest-core-1.1.jar:/home/mingtzha/.m2/repository/org/beanshell/bsh/2.0b4/bsh-2.0b4.jar:/home/mingtzha/.m2/repository/com/beust/jcommander/1.27/jcommander-1.27.jar:/home/mingtzha/.m2/repository/org/mockito/mockito-all/1.9.5/mockito-all-1.9.5.jar:/home/mingtzha/.m2/repository/org/assertj/assertj-core/1.5.0/assertj-core-1.5.0.jar:/home/mingtzha/.m2/repository/org/glassfish/hk2/hk2-testng/2.3.0-b01/hk2-testng-2.3.0-b01.jar:/home/mingtzha/.m2/repository/org/glassfish/hk2/hk2/2.3.0-b01/hk2-2.3.0-b01.jar:/home/mingtzha/.m2/repository/org/glassfish/hk2/config-types/2.3.0-b01/config-types-2.3.0-b01.jar:/home/mingtzha/.m2/repository/org/glassfish/hk2/core/2.3.0-b01/core-2.3.0-b01.jar:/home/mingtzha/.m2/repository/org/glassfish/hk2/hk2-config/2.3.0-b01/hk2-config-2.3.0-b01.jar:/home/mingtzha/.m2/repository/org/jvnet/tiger-types/1.4/tiger-types-1.4.jar:/home/mingtzha/.m2/repository/org/glassfish/hk2/external/bean-validator/2.3.0-b01/bean-validator-2.3.0-b01.jar:/home/mingtzha/.m2/repository/org/glassfish/hk2/hk2-runlevel/2.3.0-b01/hk2-runlevel-2.3.0-b01.jar:/home/mingtzha/.m2/repository/org/glassfish/hk2/class-model/2.3.0-b01/class-model-2.3.0-b01.jar:/home/mingtzha
/.m2/repository/org/glassfish/hk2/external/asm-all-repackaged/2.3.0-b01/asm-all-repackaged-2.3.0-b01.jar:/scratch/mingtzha/repository/12.1/branches/12.1.4.0.3/webcentersites/siteanalytics/app/config/config-core/target/classes:/home/mingtzha/.m2/repository/org/yaml/snakeyaml/1.13/snakeyaml-1.13.jar:/home/mingtzha/.m2/repository/org/apache/kafka/kafka_2.10/0.8.0/kafka_2.10-0.8.0.jar:/home/mingtzha/.m2/repository/org/scala-lang/scala-library/2.10.1/scala-library-2.10.1.jar:/home/mingtzha/.m2/repository/net/sf/jopt-simple/jopt-simple/3.2/jopt-simple-3.2.jar:/home/mingtzha/.m2/repository/org/scala-lang/scala-compiler/2.10.1/scala-compiler-2.10.1.jar:/home/mingtzha/.m2/repository/org/scala-lang/scala-reflect/2.10.1/scala-reflect-2.10.1.jar:/home/mingtzha/.m2/repository/com/101tec/zkclient/0.3/zkclient-0.3.jar:/home/mingtzha/.m2/repository/org/xerial/snappy/snappy-java/
>
> 1.0.4.1/snappy-java-1.0.4.1.jar:/home/mingtzha/.m2/repository/com/yammer/metrics/metrics-annotation/2.2.0/metrics-annotation-2.2.0.jar:/home/mingtzha/.m2/repository/commons-io/commons-io/2.4/commons-io-2.4.jar:/scratch/mingtzha/repository/12.1/branches/12.1.4.0.3/webcentersites/siteanalytics/integration-test/itest-core/target/classes:/scratch/mingtzha/repository/12.1/branches/12.1.4.0.3/webcentersites/siteanalytics/app/core/core-model/target/classes:/scratch/mingtzha/repository/12.1/branches/12.1.4.0.3/webcentersites/siteanalytics/app/core/core-api/target/classes:/scratch/mingtzha/repository/12.1/branches/12.1.4.0.3/webcentersites/siteanalytics/app/core/core-data/target/classes:/scratch/mingtzha/repository/12.1/branches/12.1.4.0.3/webcentersites/siteanalytics/app/core/core-avro/target/classes:/scratch/mingtzha/repository/12.1/branches/12.1.4.0.3/webcentersites/siteanalytics/app/config/config-dev/target/classes:/scratch/mingtzha/repository/12.1/branches/12.1.4.0.3/webcentersites/siteanalytics/app/config/config-shared/target/classes:/scratch/mingtzha/repository/12.1/branches/12.1.4.0.3/webcentersites/siteanalytics/app/repository/repository-core/target/classes:/scratch/mingtzha/repository/12.1/branches/12.1.4.0.3/webcentersites/siteanalytics/app/repository/repository-spi/target/classes:/scratch/mingtzha/repository/12.1/branches/12.1.4.0.3/webcentersites/siteanalytics/app/core/core-common/target/classes:/home/mingtzha/.m2/repository/com/googlecode/owasp-java-html-sanitizer/owasp-java-html-sanitizer/r209/owasp-java-html-sanitizer-r209.jar:/home/mingtzha/.m2/repository/com/google/code/findbugs/jsr305/3.0.0/jsr305-3.0.0.jar:/home/mingtzha/.m2/repository/com/fasterxml/uuid/java-uuid-generator/3.1.3/java-uuid-generator-3.1.3.jar:/scratch/mingtzha/repository/12.1/branches/12.1.4.0.3/webcentersites/siteanalytics/app/repository/repository-hbase/target/classes:/home/mingtzha/.m2/repository/org/apache/avro/avro/1.7.5/avro-1.7.5.jar:/home/mingtzha/.m2/repository/com/thoughtworks
/paranamer/paranamer/2.3/paranamer-2.3.jar:/home/mingtzha/.m2/repository/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar:/home/mingtzha/.m2/repository/org/tukaani/xz/1.0/xz-1.0.jar:/home/mingtzha/.m2/repository/org/apache/hbase/hbase/0.94.15/hbase-0.94.15.jar:/home/mingtzha/.m2/repository/org/apache/hbase/hbase/0.94.21/hbase-0.94.21-tests.jar:/home/mingtzha/.m2/repository/com/yammer/metrics/metrics-core/2.1.2/metrics-core-2.1.2.jar:/home/mingtzha/.m2/repository/commons-cli/commons-cli/1.2/commons-cli-1.2.jar:/home/mingtzha/.m2/repository/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.jar:/home/mingtzha/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar:/home/mingtzha/.m2/repository/commons-digester/commons-digester/1.8/commons-digester-1.8.jar:/home/mingtzha/.m2/repository/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar:/home/mingtzha/.m2/repository/commons-beanutils/commons-beanutils-core/1.8.0/commons-beanutils-core-1.8.0.jar:/home/mingtzha/.m2/repository/com/github/stephenc/high-scale-lib/high-scale-lib/1.1.1/high-scale-lib-1.1.1.jar:/home/mingtzha/.m2/repository/commons-codec/commons-codec/1.4/commons-codec-1.4.jar:/home/mingtzha/.m2/repository/commons-httpclient/commons-httpclient/3.1/commons-httpclient-3.1.jar:/home/mingtzha/.m2/repository/commons-lang/commons-lang/2.5/commons-lang-2.5.jar:/home/mingtzha/.m2/repository/commons-logging/commons-logging/1.1.1/commons-logging-1.1.1.jar:/home/mingtzha/.m2/repository/org/apache/avro/avro-ipc/1.5.3/avro-ipc-1.5.3.jar:/home/mingtzha/.m2/repository/org/jboss/netty/netty/3.2.4.Final/netty-3.2.4.Final.jar:/home/mingtzha/.m2/repository/org/apache/velocity/velocity/1.7/velocity-1.7.jar:/home/mingtzha/.m2/repository/org/apache/zookeeper/zookeeper/3.4.5/zookeeper-3.4.5.jar:/home/mingtzha/.m2/repository/org/apache/thrift/libthrift/0.8.0/libthrift-0.8.0.jar:/home/mingtzha/.m2/repository/org/apache/httpcomponents/httpcl
ient/4.1.2/httpclient-4.1.2.jar:/home/mingtzha/.m2/repository/org/apache/httpcomponents/httpcore/4.1.3/httpcore-4.1.3.jar:/home/mingtzha/.m2/repository/org/jruby/jruby-complete/1.6.5/jruby-complete-1.6.5.jar:/home/mingtzha/.m2/repository/org/mortbay/jetty/jetty/6.1.26/jetty-6.1.26.jar:/home/mingtzha/.m2/repository/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar:/home/mingtzha/.m2/repository/org/mortbay/jetty/jsp-2.1/6.1.14/jsp-2.1-6.1.14.jar:/home/mingtzha/.m2/repository/org/mortbay/jetty/jsp-api-2.1/6.1.14/jsp-api-2.1-6.1.14.jar:/home/mingtzha/.m2/repository/org/mortbay/jetty/servlet-api-2.5/6.1.14/servlet-api-2.5-6.1.14.jar:/home/mingtzha/.m2/repository/org/codehaus/jackson/jackson-core-asl/1.8.8/jackson-core-asl-1.8.8.jar:/home/mingtzha/.m2/repository/org/codehaus/jackson/jackson-mapper-asl/1.8.8/jackson-mapper-asl-1.8.8.jar:/home/mingtzha/.m2/repository/org/codehaus/jackson/jackson-jaxrs/1.8.8/jackson-jaxrs-1.8.8.jar:/home/mingtzha/.m2/repository/org/codehaus/jackson/jackson-xc/1.8.8/jackson-xc-1.8.8.jar:/home/mingtzha/.m2/repository/tomcat/jasper-compiler/5.5.23/jasper-compiler-5.5.23.jar:/home/mingtzha/.m2/repository/tomcat/jasper-runtime/5.5.23/jasper-runtime-5.5.23.jar:/home/mingtzha/.m2/repository/org/jamon/jamon-runtime/2.3.1/jamon-runtime-2.3.1.jar:/home/mingtzha/.m2/repository/com/google/protobuf/protobuf-java/2.4.0a/protobuf-java-2.4.0a.jar:/home/mingtzha/.m2/repository/com/sun/jersey/jersey-core/1.8/jersey-core-1.8.jar:/home/mingtzha/.m2/repository/com/sun/jersey/jersey-json/1.8/jersey-json-1.8.jar:/home/mingtzha/.m2/repository/org/codehaus/jettison/jettison/1.1/jettison-1.1.jar:/home/mingtzha/.m2/repository/com/sun/xml/bind/jaxb-impl/2.2.3-1/jaxb-impl-2.2.3-1.jar:/home/mingtzha/.m2/repository/com/sun/jersey/jersey-server/1.8/jersey-server-1.8.jar:/home/mingtzha/.m2/repository/asm/asm/3.1/asm-3.1.jar:/home/mingtzha/.m2/repository/javax/xml/bind/jaxb-api/2.1/jaxb-api-2.1.jar:/home/mingtzha/.m2/repository/javax/activation/activation/1.1/activat
ion-1.1.jar:/home/mingtzha/.m2/repository/stax/stax-api/1.0.1/stax-api-1.0.1.jar:/home/mingtzha/.m2/repository/org/apache/hadoop/hadoop-core/1.2.1/hadoop-core-1.2.1.jar:/home/mingtzha/.m2/repository/xmlenc/xmlenc/0.52/xmlenc-0.52.jar:/home/mingtzha/.m2/repository/org/apache/commons/commons-math/2.1/commons-math-2.1.jar:/home/mingtzha/.m2/repository/commons-net/commons-net/1.4.1/commons-net-1.4.1.jar:/home/mingtzha/.m2/repository/commons-el/commons-el/1.0/commons-el-1.0.jar:/home/mingtzha/.m2/repository/net/java/dev/jets3t/jets3t/0.6.1/jets3t-0.6.1.jar:/home/mingtzha/.m2/repository/hsqldb/hsqldb/1.8.0.10/hsqldb-1.8.0.10.jar:/home/mingtzha/.m2/repository/oro/oro/2.0.8/oro-2.0.8.jar:/home/mingtzha/.m2/repository/org/eclipse/jdt/core/3.1.1/core-3.1.1.jar:/home/mingtzha/.m2/repository/org/apache/hadoop/hadoop-test/1.2.1/hadoop-test-1.2.1.jar:/home/mingtzha/.m2/repository/org/apache/ftpserver/ftplet-api/1.0.0/ftplet-api-1.0.0.jar:/home/mingtzha/.m2/repository/org/apache/mina/mina-core/2.0.0-M5/mina-core-2.0.0-M5.jar:/home/mingtzha/.m2/repository/org/apache/ftpserver/ftpserver-core/1.0.0/ftpserver-core-1.0.0.jar:/home/mingtzha/.m2/repository/org/apache/ftpserver/ftpserver-deprecated/1.0.0-M2/ftpserver-deprecated-1.0.0-M2.jar:/home/mingtzha/.m2/repository/org/slf4j/slf4j-api/1.7.5/slf4j-api-1.7.5.jar:/home/mingtzha/.m2/repository/org/slf4j/slf4j-ext/1.7.5/slf4j-ext-1.7.5.jar:/home/mingtzha/.m2/repository/ch/qos/cal10n/cal10n-api/0.7.4/cal10n-api-0.7.4.jar:/home/mingtzha/.m2/repository/org/slf4j/jcl-over-slf4j/1.7.5/jcl-over-slf4j-1.7.5.jar:/home/mingtzha/.m2/repository/org/slf4j/log4j-over-slf4j/1.7.5/log4j-over-slf4j-1.7.5.jar:/home/mingtzha/.m2/repository/org/slf4j/jul-to-slf4j/1.7.5/jul-to-slf4j-1.7.5.jar:/home/mingtzha/.m2/repository/ch/qos/logback/logback-classic/1.0.13/logback-classic-1.0.13.jar:/home/mingtzha/.m2/repository/ch/qos/logback/logback-core/1.0.13/logback-core-1.0.13.jar:/home/mingtzha/.m2/repository/log4j/log4j/1.2.17/log4j-1.2.17.jar:/home/mingtzha/.m2/r
epository/org/fusesource/jansi/jansi/1.11/jansi-1.11.jar:/scratch/mingtzha/repository/12.1/branches/12.1.4.0.3/webcentersites/config-zookeeper/target/classes:/home/mingtzha/.m2/repository/com/google/guava/guava/16.0.1/guava-16.0.1.jar:/home/mingtzha/.m2/repository/joda-time/joda-time/2.3/joda-time-2.3.jar:/home/mingtzha/.m2/repository/org/glassfish/hk2/hk2-locator/2.3.0-b01/hk2-locator-2.3.0-b01.jar:/home/mingtzha/.m2/repository/org/glassfish/hk2/external/javax.inject/2.3.0-b01/javax.inject-2.3.0-b01.jar:/home/mingtzha/.m2/repository/org/glassfish/hk2/external/aopalliance-repackaged/2.3.0-b01/aopalliance-repackaged-2.3.0-b01.jar:/home/mingtzha/.m2/repository/org/glassfish/hk2/hk2-api/2.3.0-b01/hk2-api-2.3.0-b01.jar:/home/mingtzha/.m2/repository/javax/inject/javax.inject/1/javax.inject-1.jar:/home/mingtzha/.m2/repository/org/glassfish/hk2/hk2-utils/2.3.0-b01/hk2-utils-2.3.0-b01.jar:/home/mingtzha/.m2/repository/org/javassist/javassist/3.18.1-GA/javassist-3.18.1-GA.jar
> 09:42:33.377 [main] INFO o.a.zookeeper.server.ZooKeeperServer - Server environment:java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
> 09:42:33.377 [main] INFO o.a.zookeeper.server.ZooKeeperServer - Server environment:java.io.tmpdir=/tmp
> 09:42:33.377 [main] INFO o.a.zookeeper.server.ZooKeeperServer - Server environment:java.compiler=<NA>
> 09:42:33.377 [main] INFO o.a.zookeeper.server.ZooKeeperServer - Server environment:os.name=Linux
> 09:42:33.377 [main] INFO o.a.zookeeper.server.ZooKeeperServer - Server environment:os.arch=amd64
> 09:42:33.377 [main] INFO o.a.zookeeper.server.ZooKeeperServer - Server environment:os.version=2.6.39-300.20.1.el6uek.x86_64
> 09:42:33.377 [main] INFO o.a.zookeeper.server.ZooKeeperServer - Server environment:user.name=mingtzha
> 09:42:33.377 [main] INFO o.a.zookeeper.server.ZooKeeperServer - Server environment:user.home=/home/mingtzha
> 09:42:33.377 [main] INFO o.a.zookeeper.server.ZooKeeperServer - Server environment:user.dir=/scratch/mingtzha/repository/12.1/branches/12.1.4.0.3/webcentersites/siteanalytics/integration-test/repository-itest
> 09:42:33.380 [main] DEBUG o.a.z.s.persistence.FileTxnSnapLog - Opening datadir:/scratch/mingtzha/repository/12.1/branches/12.1.4.0.3/webcentersites/siteanalytics/integration-test/repository-itest/target/test-data/830f8900-2879-4ed0-b011-550620ca032f/dfscluster_de01abd7-7001-4642-9a00-f1100be0d193/zookeeper_0 snapDir:/scratch/mingtzha/repository/12.1/branches/12.1.4.0.3/webcentersites/siteanalytics/integration-test/repository-itest/target/test-data/830f8900-2879-4ed0-b011-550620ca032f/dfscluster_de01abd7-7001-4642-9a00-f1100be0d193/zookeeper_0
> 09:42:33.394 [main] INFO o.a.zookeeper.server.ZooKeeperServer - Created server with tickTime 2000 minSessionTimeout 4000 maxSessionTimeout 40000 datadir /scratch/mingtzha/repository/12.1/branches/12.1.4.0.3/webcentersites/siteanalytics/integration-test/repository-itest/target/test-data/830f8900-2879-4ed0-b011-550620ca032f/dfscluster_de01abd7-7001-4642-9a00-f1100be0d193/zookeeper_0/version-2 snapdir /scratch/mingtzha/repository/12.1/branches/12.1.4.0.3/webcentersites/siteanalytics/integration-test/repository-itest/target/test-data/830f8900-2879-4ed0-b011-550620ca032f/dfscluster_de01abd7-7001-4642-9a00-f1100be0d193/zookeeper_0/version-2
> 09:42:33.400 [main] INFO o.a.z.server.NIOServerCnxnFactory - binding to port 0.0.0.0/0.0.0.0:51126
> 09:42:33.405 [main] INFO o.a.z.s.persistence.FileTxnSnapLog - Snapshotting: 0x0 to /scratch/mingtzha/repository/12.1/branches/12.1.4.0.3/webcentersites/siteanalytics/integration-test/repository-itest/target/test-data/830f8900-2879-4ed0-b011-550620ca032f/dfscluster_de01abd7-7001-4642-9a00-f1100be0d193/zookeeper_0/version-2/snapshot.0
> 09:42:33.431 [NIOServerCxn.Factory:0.0.0.0/0.0.0.0:51126] INFO o.a.z.server.NIOServerCnxnFactory - Accepted socket connection from /10.241.3.35:44625
> 09:42:33.437 [NIOServerCxn.Factory:0.0.0.0/0.0.0.0:51126] INFO o.a.zookeeper.server.NIOServerCnxn - Processing stat command from /10.241.3.35:44625
> 09:42:33.442 [Thread-25] INFO o.a.zookeeper.server.NIOServerCnxn - Stat command output
> 09:42:33.442 [Thread-25] INFO o.a.zookeeper.server.NIOServerCnxn - Closed socket connection for client /10.241.3.35:44625 (no session established for client)
> 09:42:33.442 [main] INFO o.a.h.h.z.MiniZooKeeperCluster - Started MiniZK Cluster and connect 1 ZK server on client port: 51126
> 09:42:33.443 [main] DEBUG org.apache.hadoop.hdfs.DFSClient - /user/mingtzha/hbase: masked=rwxr-xr-x
> 09:42:33.443 [main] DEBUG org.apache.hadoop.ipc.Client - IPC Client (47) connection to slc05muw.us.**.com/10.241.3.35:41118 from mingtzha sending #3
> 09:42:33.444 [pool-1-thread-1] DEBUG org.apache.hadoop.ipc.Server -  got #3
> 09:42:33.445 [IPC Server handler 4 on 41118] DEBUG org.apache.hadoop.ipc.Server - IPC Server handler 4 on 41118: has #3 from 10.241.3.35:24701
> 09:42:33.445 [IPC Server handler 4 on 41118] DEBUG o.a.h.security.UserGroupInformation - PriviledgedAction as:mingtzha from:org.apache.hadoop.ipc.Server$Handler.run(Server.java:1426)
> 09:42:33.445 [IPC Server handler 4 on 41118] DEBUG org.apache.hadoop.hdfs.StateChange - *DIR* NameNode.mkdirs: /user/mingtzha/hbase
> 09:42:33.445 [IPC Server handler 4 on 41118] DEBUG org.apache.hadoop.hdfs.StateChange - DIR* mkdirs: /user/mingtzha/hbase
> 09:42:33.445 [IPC Server handler 4 on 41118] DEBUG org.apache.hadoop.security.Groups - Returning cached groups for 'mingtzha'
> 09:42:33.447 [IPC Server handler 4 on 41118] DEBUG org.apache.hadoop.hdfs.StateChange - DIR* FSDirectory.mkdirs: created directory /user
> 09:42:33.447 [IPC Server handler 4 on 41118] DEBUG org.apache.hadoop.hdfs.StateChange - DIR* FSDirectory.mkdirs: created directory /user/mingtzha
> 09:42:33.447 [IPC Server handler 4 on 41118] DEBUG org.apache.hadoop.hdfs.StateChange - DIR* FSDirectory.mkdirs: created directory /user/mingtzha/hbase
> 09:42:33.448 [IPC Server handler 4 on 41118] DEBUG o.a.h.h.server.namenode.FSNamesystem - Preallocated 1048576 bytes at the end of the edit log (offset 4)
> 09:42:33.452 [IPC Server handler 4 on 41118] DEBUG o.a.h.h.server.namenode.FSNamesystem - Preallocated 1048576 bytes at the end of the edit log (offset 4)
> 09:42:33.455 [IPC Server handler 4 on 41118] INFO o.a.h.h.s.n.FSNamesystem.audit - ugi=mingtzha    ip=/10.241.3.35    cmd=mkdirs    src=/user/mingtzha/hbase    dst=null    perm=mingtzha:supergroup:rwxr-xr-x
> 09:42:33.455 [IPC Server handler 4 on 41118] DEBUG org.apache.hadoop.ipc.Server - Served: mkdirs queueTime= 0 procesingTime= 10
> 09:42:33.455 [IPC Server handler 4 on 41118] DEBUG org.apache.hadoop.ipc.Server - IPC Server Responder: responding to #3 from 10.241.3.35:24701
> 09:42:33.455 [IPC Server handler 4 on 41118] DEBUG org.apache.hadoop.ipc.Server - IPC Server Responder: responding to #3 from 10.241.3.35:24701 Wrote 18 bytes.
> 09:42:33.455 [IPC Client (47) connection to slc05muw.us.**.com/10.241.3.35:41118 from mingtzha] DEBUG org.apache.hadoop.ipc.Client - IPC Client (47) connection to slc05muw.us.**.com/10.241.3.35:41118 from mingtzha got value #3
> 09:42:33.455 [main] DEBUG org.apache.hadoop.ipc.RPC - Call: mkdirs 12
> 09:42:33.461 [main] DEBUG org.apache.hadoop.hdfs.DFSClient - /user/mingtzha/hbase/hbase.version: masked=rwxr-xr-x
> 09:42:33.468 [main] DEBUG org.apache.hadoop.hdfs.DFSClient - computePacketChunkSize: src=/user/mingtzha/hbase/hbase.version, chunkSize=516, chunksPerPacket=127, packetSize=65557
> 09:42:33.469 [main] DEBUG org.apache.hadoop.ipc.Client - IPC Client (47) connection to slc05muw.us.**.com/10.241.3.35:41118 from mingtzha sending #4
> 09:42:33.469 [pool-1-thread-1] DEBUG org.apache.hadoop.ipc.Server -  got #4
> 09:42:33.470 [IPC Server handler 2 on 41118] DEBUG org.apache.hadoop.ipc.Server - IPC Server handler 2 on 41118: has #4 from 10.241.3.35:24701
> 09:42:33.470 [IPC Server handler 2 on 41118] DEBUG o.a.h.security.UserGroupInformation - PriviledgedAction as:mingtzha from:org.apache.hadoop.ipc.Server$Handler.run(Server.java:1426)
> 09:42:33.479 [IPC Server handler 2 on 41118] DEBUG org.apache.hadoop.hdfs.StateChange - *DIR* NameNode.create: /user/mingtzha/hbase/hbase.version for DFSClient_NONMAPREDUCE_-237185081_1 at 10.241.3.35
> 09:42:33.479 [IPC Server handler 2 on 41118] DEBUG org.apache.hadoop.hdfs.StateChange - DIR* startFile: src=/user/mingtzha/hbase/hbase.version, holder=DFSClient_NONMAPREDUCE_-237185081_1, clientMachine=10.241.3.35, createParent=true, replication=0, overwrite=true, append=false
> 09:42:33.479 [IPC Server handler 2 on 41118] DEBUG org.apache.hadoop.security.Groups - Returning cached groups for 'mingtzha'
> 09:42:33.479 [IPC Server handler 2 on 41118] WARN org.apache.hadoop.hdfs.StateChange - DIR* startFile: failed to create file /user/mingtzha/hbase/hbase.version on client 10.241.3.35. Requested replication 0 is less than the required minimum 1
> 09:42:33.480 [IPC Server handler 2 on 41118] ERROR o.a.h.security.UserGroupInformation - PriviledgedActionException as:mingtzha cause:java.io.IOException: failed to create file /user/mingtzha/hbase/hbase.version on client 10.241.3.35. Requested replication 0 is less than the required minimum 1
> 09:42:33.480 [IPC Server handler 2 on 41118] INFO org.apache.hadoop.ipc.Server - IPC Server handler 2 on 41118, call create(/user/mingtzha/hbase/hbase.version, rwxr-xr-x, DFSClient_NONMAPREDUCE_-237185081_1, true, 0, 67108864) from 10.241.3.35:24701: error: java.io.IOException: failed to create file /user/mingtzha/hbase/hbase.version on client 10.241.3.35. Requested replication 0 is less than the required minimum 1
> java.io.IOException: failed to create file /user/mingtzha/hbase/hbase.version on client 10.241.3.35. Requested replication 0 is less than the required minimum 1
>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1591) ~[hadoop-core-1.2.1.jar:na]
>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:1527) ~[hadoop-core-1.2.1.jar:na]
>     at org.apache.hadoop.hdfs.server.namenode.NameNode.create(NameNode.java:710) ~[hadoop-core-1.2.1.jar:na]
>     at org.apache.hadoop.hdfs.server.namenode.NameNode.create(NameNode.java:689) ~[hadoop-core-1.2.1.jar:na]
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.7.0_45]
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) ~[na:1.7.0_45]
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.7.0_45]
>     at java.lang.reflect.Method.invoke(Method.java:606) ~[na:1.7.0_45]
>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:587) ~[hadoop-core-1.2.1.jar:na]
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1432) ~[hadoop-core-1.2.1.jar:na]
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1428) ~[hadoop-core-1.2.1.jar:na]
>     at java.security.AccessController.doPrivileged(Native Method) ~[na:1.7.0_45]
>     at javax.security.auth.Subject.doAs(Subject.java:415) ~[na:1.7.0_45]
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190) ~[hadoop-core-1.2.1.jar:na]
>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1426) ~[hadoop-core-1.2.1.jar:na]
> 09:42:33.481 [IPC Server handler 2 on 41118] DEBUG org.apache.hadoop.ipc.Server - IPC Server Responder: responding to #4 from 10.241.3.35:24701
> 09:42:33.481 [IPC Server handler 2 on 41118] DEBUG org.apache.hadoop.ipc.Server - IPC Server Responder: responding to #4 from 10.241.3.35:24701 Wrote 1285 bytes.
> 09:42:33.482 [IPC Client (47) connection to slc05muw.us.**.com/10.241.3.35:41118 from mingtzha] DEBUG org.apache.hadoop.ipc.Client - IPC Client (47) connection to slc05muw.us.**.com/10.241.3.35:41118 from mingtzha got value #4
> 09:42:33.482 [main] WARN org.apache.hadoop.hbase.util.FSUtils - Unable to create version file at hdfs://slc05muw.us.**.com:41118/user/mingtzha/hbase, retrying: java.io.IOException: failed to create file /user/mingtzha/hbase/hbase.version on client 10.241.3.35. Requested replication 0 is less than the required minimum 1
>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1591)
>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:1527)
>     at org.apache.hadoop.hdfs.server.namenode.NameNode.create(NameNode.java:710)
>     at org.apache.hadoop.hdfs.server.namenode.NameNode.create(NameNode.java:689)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:606)
>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:587)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1432)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1428)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:415)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1426)
>
> 09:42:33.483 [main] DEBUG org.apache.hadoop.ipc.Client - IPC Client (47) connection to slc05muw.us.**.com/10.241.3.35:41118 from mingtzha sending #5
> 09:42:33.483 [pool-1-thread-1] DEBUG org.apache.hadoop.ipc.Server -  got #5
> 09:42:33.483 [IPC Server handler 5 on 41118] DEBUG org.apache.hadoop.ipc.Server - IPC Server handler 5 on 41118: has #5 from 10.241.3.35:24701
> 09:42:33.483 [IPC Server handler 5 on 41118] DEBUG o.a.h.security.UserGroupInformation - PriviledgedAction as:mingtzha from:org.apache.hadoop.ipc.Server$Handler.run(Server.java:1426)
> 09:42:33.483 [IPC Server handler 5 on 41118] DEBUG org.apache.hadoop.hdfs.StateChange - *DIR* Namenode.delete: src=/user/mingtzha/hbase/hbase.version, recursive=false
> 09:42:33.483 [IPC Server handler 5 on 41118] DEBUG org.apache.hadoop.hdfs.StateChange - DIR* delete: /user/mingtzha/hbase/hbase.version
> 09:42:33.484 [IPC Server handler 5 on 41118] DEBUG org.apache.hadoop.security.Groups - Returning cached groups for 'mingtzha'
> 09:42:33.484 [IPC Server handler 5 on 41118] DEBUG org.apache.hadoop.hdfs.StateChange - DIR* FSDirectory.delete: /user/mingtzha/hbase/hbase.version
> 09:42:33.484 [IPC Server handler 5 on 41118] DEBUG org.apache.hadoop.hdfs.StateChange - DIR* FSDirectory.unprotectedDelete: failed to remove /user/mingtzha/hbase/hbase.version because it does not exist
> 09:42:33.484 [IPC Server handler 5 on 41118] DEBUG org.apache.hadoop.ipc.Server - Served: delete queueTime= 0 procesingTime= 1
> 09:42:33.484 [IPC Server handler 5 on 41118] DEBUG org.apache.hadoop.ipc.Server - IPC Server Responder: responding to #5 from 10.241.3.35:24701
> 09:42:33.484 [IPC Server handler 5 on 41118] DEBUG org.apache.hadoop.ipc.Server - IPC Server Responder: responding to #5 from 10.241.3.35:24701 Wrote 18 bytes.
> 09:42:33.484 [IPC Client (47) connection to slc05muw.us.**.com/10.241.3.35:41118 from mingtzha] DEBUG org.apache.hadoop.ipc.Client - IPC Client (47) connection to slc05muw.us.**.com/10.241.3.35:41118 from mingtzha got value #5
> 09:42:33.484 [main] DEBUG org.apache.hadoop.ipc.RPC - Call: delete 1
> 09:42:33.484 [main] DEBUG org.apache.hadoop.hdfs.DFSClient - /user/mingtzha/hbase/hbase.version: masked=rwxr-xr-x
> 09:42:33.484 [main] DEBUG org.apache.hadoop.hdfs.DFSClient - computePacketChunkSize: src=/user/mingtzha/hbase/hbase.version, chunkSize=516, chunksPerPacket=127, packetSize=65557
> 09:42:33.485 [main] DEBUG org.apache.hadoop.ipc.Client - IPC Client (47) connection to slc05muw.us.**.com/10.241.3.35:41118 from mingtzha sending #6
> 09:42:33.485 [pool-1-thread-1] DEBUG org.apache.hadoop.ipc.Server -  got #6
> 09:42:33.485 [IPC Server handler 6 on 41118] DEBUG org.apache.hadoop.ipc.Server - IPC Server handler 6 on 41118: has #6 from 10.241.3.35:24701
> 09:42:33.485 [IPC Server handler 6 on 41118] DEBUG o.a.h.security.UserGroupInformation - PriviledgedAction as:mingtzha from:org.apache.hadoop.ipc.Server$Handler.run(Server.java:1426)
> 09:42:33.485 [IPC Server handler 6 on 41118] DEBUG org.apache.hadoop.hdfs.StateChange - *DIR* NameNode.create: /user/mingtzha/hbase/hbase.version for DFSClient_NONMAPREDUCE_-237185081_1 at 10.241.3.35
> 09:42:33.486 [IPC Server handler 6 on 41118] DEBUG org.apache.hadoop.hdfs.StateChange - DIR* startFile: src=/user/mingtzha/hbase/hbase.version, holder=DFSClient_NONMAPREDUCE_-237185081_1, clientMachine=10.241.3.35, createParent=true, replication=0, overwrite=true, append=false
> 09:42:33.486 [IPC Server handler 6 on 41118] DEBUG org.apache.hadoop.security.Groups - Returning cached groups for 'mingtzha'
> 09:42:33.486 [IPC Server handler 6 on 41118] WARN org.apache.hadoop.hdfs.StateChange - DIR* startFile: failed to create file /user/mingtzha/hbase/hbase.version on client 10.241.3.35. Requested replication 0 is less than the required minimum 1
> 09:42:33.486 [IPC Server handler 6 on 41118] ERROR o.a.h.security.UserGroupInformation - PriviledgedActionException as:mingtzha cause:java.io.IOException: failed to create file /user/mingtzha/hbase/hbase.version on client 10.241.3.35. Requested replication 0 is less than the required minimum 1
> 09:42:33.486 [IPC Server handler 6 on 41118] INFO org.apache.hadoop.ipc.Server - IPC Server handler 6 on 41118, call create(/user/mingtzha/hbase/hbase.version, rwxr-xr-x, DFSClient_NONMAPREDUCE_-237185081_1, true, 0, 67108864) from 10.241.3.35:24701: error: java.io.IOException: failed to create file /user/mingtzha/hbase/hbase.version on client 10.241.3.35. Requested replication 0 is less than the required minimum 1
> java.io.IOException: failed to create file /user/mingtzha/hbase/hbase.version on client 10.241.3.35. Requested replication 0 is less than the required minimum 1
>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1591) ~[hadoop-core-1.2.1.jar:na]
>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:1527) ~[hadoop-core-1.2.1.jar:na]
>     at org.apache.hadoop.hdfs.server.namenode.NameNode.create(NameNode.java:710) ~[hadoop-core-1.2.1.jar:na]
>     at org.apache.hadoop.hdfs.server.namenode.NameNode.create(NameNode.java:689) ~[hadoop-core-1.2.1.jar:na]
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.7.0_45]
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) ~[na:1.7.0_45]
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.7.0_45]
>     at java.lang.reflect.Method.invoke(Method.java:606) ~[na:1.7.0_45]
>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:587) ~[hadoop-core-1.2.1.jar:na]
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1432) ~[hadoop-core-1.2.1.jar:na]
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1428) ~[hadoop-core-1.2.1.jar:na]
>     at java.security.AccessController.doPrivileged(Native Method) ~[na:1.7.0_45]
>     at javax.security.auth.Subject.doAs(Subject.java:415) ~[na:1.7.0_45]
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190) ~[hadoop-core-1.2.1.jar:na]
>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1426) ~[hadoop-core-1.2.1.jar:na]
> 09:42:33.487 [IPC Server handler 6 on 41118] DEBUG org.apache.hadoop.ipc.Server - IPC Server Responder: responding to #6 from 10.241.3.35:24701
> 09:42:33.487 [IPC Server handler 6 on 41118] DEBUG org.apache.hadoop.ipc.Server - IPC Server Responder: responding to #6 from 10.241.3.35:24701 Wrote 1285 bytes.
> 09:42:33.487 [IPC Client (47) connection to slc05muw.us.**.com/10.241.3.35:41118 from mingtzha] DEBUG org.apache.hadoop.ipc.Client - IPC Client (47) connection to slc05muw.us.**.com/10.241.3.35:41118 from mingtzha got value #6
> 09:42:33.487 [main] WARN org.apache.hadoop.hbase.util.FSUtils - Unable to create version file at hdfs://slc05muw.us.**.com:41118/user/mingtzha/hbase, retrying: java.io.IOException: failed to create file /user/mingtzha/hbase/hbase.version on client 10.241.3.35. Requested replication 0 is less than the required minimum 1
>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1591)
>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:1527)
>     at org.apache.hadoop.hdfs.server.namenode.NameNode.create(NameNode.java:710)
>     at org.apache.hadoop.hdfs.server.namenode.NameNode.create(NameNode.java:689)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:606)
>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:587)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1432)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1428)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:415)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1426)
>
> 09:42:33.487 [main] DEBUG org.apache.hadoop.ipc.Client - IPC Client (47) connection to slc05muw.us.**.com/10.241.3.35:41118 from mingtzha sending #7
> 09:42:33.488 [pool-1-thread-1] DEBUG org.apache.hadoop.ipc.Server -  got #7
> 09:42:33.488 [IPC Server handler 7 on 41118] DEBUG org.apache.hadoop.ipc.Server - IPC Server handler 7 on 41118: has #7 from 10.241.3.35:24701
> 09:42:33.488 [IPC Server handler 7 on 41118] DEBUG o.a.h.security.UserGroupInformation - PriviledgedAction as:mingtzha from:org.apache.hadoop.ipc.Server$Handler.run(Server.java:1426)
> 09:42:33.488 [IPC Server handler 7 on 41118] DEBUG org.apache.hadoop.hdfs.StateChange - *DIR* Namenode.delete: src=/user/mingtzha/hbase/hbase.version, recursive=false
> 09:42:33.488 [IPC Server handler 7 on 41118] DEBUG org.apache.hadoop.hdfs.StateChange - DIR* delete: /user/mingtzha/hbase/hbase.version
> 09:42:33.488 [IPC Server handler 7 on 41118] DEBUG org.apache.hadoop.security.Groups - Returning cached groups for 'mingtzha'
> 09:42:33.488 [IPC Server handler 7 on 41118] DEBUG org.apache.hadoop.hdfs.StateChange - DIR* FSDirectory.delete: /user/mingtzha/hbase/hbase.version
> 09:42:33.488 [IPC Server handler 7 on 41118] DEBUG org.apache.hadoop.hdfs.StateChange - DIR* FSDirectory.unprotectedDelete: failed to remove /user/mingtzha/hbase/hbase.version because it does not exist
> 09:42:33.488 [IPC Server handler 7 on 41118] DEBUG org.apache.hadoop.ipc.Server - Served: delete queueTime= 0 procesingTime= 0
> 09:42:33.489 [IPC Server handler 7 on 41118] DEBUG org.apache.hadoop.ipc.Server - IPC Server Responder: responding to #7 from 10.241.3.35:24701
> 09:42:33.489 [IPC Server handler 7 on 41118] DEBUG org.apache.hadoop.ipc.Server - IPC Server Responder: responding to #7 from 10.241.3.35:24701 Wrote 18 bytes.
> 09:42:33.489 [IPC Client (47) connection to slc05muw.us.**.com/10.241.3.35:41118 from mingtzha] DEBUG org.apache.hadoop.ipc.Client - IPC Client (47) connection to slc05muw.us.**.com/10.241.3.35:41118 from mingtzha got value #7
> 09:42:33.489 [main] DEBUG org.apache.hadoop.ipc.RPC - Call: delete 2
> 09:42:33.489 [main] DEBUG org.apache.hadoop.hdfs.DFSClient - /user/mingtzha/hbase/hbase.version: masked=rwxr-xr-x
> 09:42:33.489 [main] DEBUG org.apache.hadoop.hdfs.DFSClient - computePacketChunkSize: src=/user/mingtzha/hbase/hbase.version, chunkSize=516, chunksPerPacket=127, packetSize=65557
> 09:42:33.489 [main] DEBUG org.apache.hadoop.ipc.Client - IPC Client (47) connection to slc05muw.us.**.com/10.241.3.35:41118 from mingtzha sending #8
> 09:42:33.489 [pool-1-thread-1] DEBUG org.apache.hadoop.ipc.Server -  got #8
> 09:42:33.490 [IPC Server handler 8 on 41118] DEBUG org.apache.hadoop.ipc.Server - IPC Server handler 8 on 41118: has #8 from 10.241.3.35:24701
> 09:42:33.490 [IPC Server handler 8 on 41118] DEBUG o.a.h.security.UserGroupInformation - PriviledgedAction as:mingtzha from:org.apache.hadoop.ipc.Server$Handler.run(Server.java:1426)
> 09:42:33.490 [IPC Server handler 8 on 41118] DEBUG org.apache.hadoop.hdfs.StateChange - *DIR* NameNode.create: /user/mingtzha/hbase/hbase.version for DFSClient_NONMAPREDUCE_-237185081_1 at 10.241.3.35
> 09:42:33.490 [IPC Server handler 8 on 41118] DEBUG org.apache.hadoop.hdfs.StateChange - DIR* startFile: src=/user/mingtzha/hbase/hbase.version, holder=DFSClient_NONMAPREDUCE_-237185081_1, clientMachine=10.241.3.35, createParent=true, replication=0, overwrite=true, append=false
> 09:42:33.490 [IPC Server handler 8 on 41118] DEBUG org.apache.hadoop.security.Groups - Returning cached groups for 'mingtzha'
> 09:42:33.491 [IPC Server handler 8 on 41118] WARN org.apache.hadoop.hdfs.StateChange - DIR* startFile: failed to create file /user/mingtzha/hbase/hbase.version on client 10.241.3.35. Requested replication 0 is less than the required minimum 1
> 09:42:33.491 [IPC Server handler 8 on 41118] ERROR o.a.h.security.UserGroupInformation - PriviledgedActionException as:mingtzha cause:java.io.IOException: failed to create file /user/mingtzha/hbase/hbase.version on client 10.241.3.35. Requested replication 0 is less than the required minimum 1
> 09:42:33.491 [IPC Server handler 8 on 41118] INFO org.apache.hadoop.ipc.Server - IPC Server handler 8 on 41118, call create(/user/mingtzha/hbase/hbase.version, rwxr-xr-x, DFSClient_NONMAPREDUCE_-237185081_1, true, 0, 67108864) from 10.241.3.35:24701: error: java.io.IOException: failed to create file /user/mingtzha/hbase/hbase.version on client 10.241.3.35. Requested replication 0 is less than the required minimum 1
> java.io.IOException: failed to create file /user/mingtzha/hbase/hbase.version on client 10.241.3.35. Requested replication 0 is less than the required minimum 1
>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1591) ~[hadoop-core-1.2.1.jar:na]
>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:1527) ~[hadoop-core-1.2.1.jar:na]
>     at org.apache.hadoop.hdfs.server.namenode.NameNode.create(NameNode.java:710) ~[hadoop-core-1.2.1.jar:na]
>     at org.apache.hadoop.hdfs.server.namenode.NameNode.create(NameNode.java:689) ~[hadoop-core-1.2.1.jar:na]
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.7.0_45]
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) ~[na:1.7.0_45]
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.7.0_45]
>     at java.lang.reflect.Method.invoke(Method.java:606) ~[na:1.7.0_45]
>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:587) ~[hadoop-core-1.2.1.jar:na]
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1432) ~[hadoop-core-1.2.1.jar:na]
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1428) ~[hadoop-core-1.2.1.jar:na]
>     at java.security.AccessController.doPrivileged(Native Method) ~[na:1.7.0_45]
>     at javax.security.auth.Subject.doAs(Subject.java:415) ~[na:1.7.0_45]
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190) ~[hadoop-core-1.2.1.jar:na]
>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1426) ~[hadoop-core-1.2.1.jar:na]
> 09:42:33.492 [IPC Server handler 8 on 41118] DEBUG org.apache.hadoop.ipc.Server - IPC Server Responder: responding to #8 from 10.241.3.35:24701
> 09:42:33.492 [IPC Server handler 8 on 41118] DEBUG org.apache.hadoop.ipc.Server - IPC Server Responder: responding to #8 from 10.241.3.35:24701 Wrote 1285 bytes.
> 09:42:33.492 [IPC Client (47) connection to slc05muw.us.**.com/10.241.3.35:41118 from mingtzha] DEBUG org.apache.hadoop.ipc.Client - IPC Client (47) connection to slc05muw.us.**.com/10.241.3.35:41118 from mingtzha got value #8
> 09:42:33.492 [main] WARN org.apache.hadoop.hbase.util.FSUtils - Unable to create version file at hdfs://slc05muw.us.**.com:41118/user/mingtzha/hbase, retrying: java.io.IOException: failed to create file /user/mingtzha/hbase/hbase.version on client 10.241.3.35. Requested replication 0 is less than the required minimum 1
>     at
>
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1591)
>     at
>
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:1527)
>     at
> org.apache.hadoop.hdfs.server.namenode.NameNode.create(NameNode.java:710)
>     at
> org.apache.hadoop.hdfs.server.namenode.NameNode.create(NameNode.java:689)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at
>
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>     at
>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:606)
>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:587)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1432)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1428)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:415)
>     at
>
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1426)
>
> 09:42:33.492 [main] DEBUG org.apache.hadoop.ipc.Client - IPC Client (47) connection to slc05muw.us.**.com/10.241.3.35:41118 from mingtzha sending #9
> 09:42:33.493 [pool-1-thread-1] DEBUG org.apache.hadoop.ipc.Server -  got #9
> 09:42:33.493 [IPC Server handler 9 on 41118] DEBUG org.apache.hadoop.ipc.Server - IPC Server handler 9 on 41118: has #9 from 10.241.3.35:24701
> 09:42:33.493 [IPC Server handler 9 on 41118] DEBUG o.a.h.security.UserGroupInformation - PriviledgedAction as:mingtzha from:org.apache.hadoop.ipc.Server$Handler.run(Server.java:1426)
> 09:42:33.493 [IPC Server handler 9 on 41118] DEBUG org.apache.hadoop.hdfs.StateChange - *DIR* Namenode.delete: src=/user/mingtzha/hbase/hbase.version, recursive=false
> 09:42:33.493 [IPC Server handler 9 on 41118] DEBUG org.apache.hadoop.hdfs.StateChange - DIR* delete: /user/mingtzha/hbase/hbase.version
> 09:42:33.493 [IPC Server handler 9 on 41118] DEBUG org.apache.hadoop.security.Groups - Returning cached groups for 'mingtzha'
> 09:42:33.493 [IPC Server handler 9 on 41118] DEBUG org.apache.hadoop.hdfs.StateChange - DIR* FSDirectory.delete: /user/mingtzha/hbase/hbase.version
> 09:42:33.493 [IPC Server handler 9 on 41118] DEBUG org.apache.hadoop.hdfs.StateChange - DIR* FSDirectory.unprotectedDelete: failed to remove /user/mingtzha/hbase/hbase.version because it does not exist
> 09:42:33.493 [IPC Server handler 9 on 41118] DEBUG org.apache.hadoop.ipc.Server - Served: delete queueTime= 0 procesingTime= 0
> 09:42:33.493 [IPC Server handler 9 on 41118] DEBUG org.apache.hadoop.ipc.Server - IPC Server Responder: responding to #9 from 10.241.3.35:24701
> 09:42:33.494 [IPC Server handler 9 on 41118] DEBUG org.apache.hadoop.ipc.Server - IPC Server Responder: responding to #9 from 10.241.3.35:24701 Wrote 18 bytes.
> 09:42:33.494 [IPC Client (47) connection to slc05muw.us.**.com/10.241.3.35:41118 from mingtzha] DEBUG org.apache.hadoop.ipc.Client - IPC Client (47) connection to slc05muw.us.**.com/10.241.3.35:41118 from mingtzha got value #9
> 09:42:33.494 [main] DEBUG org.apache.hadoop.ipc.RPC - Call: delete 2
> 09:42:33.494 [main] DEBUG org.apache.hadoop.hdfs.DFSClient - /user/mingtzha/hbase/hbase.version: masked=rwxr-xr-x
> 09:42:33.494 [main] DEBUG org.apache.hadoop.hdfs.DFSClient - computePacketChunkSize: src=/user/mingtzha/hbase/hbase.version, chunkSize=516, chunksPerPacket=127, packetSize=65557
> 09:42:33.494 [main] DEBUG org.apache.hadoop.ipc.Client - IPC Client (47) connection to slc05muw.us.**.com/10.241.3.35:41118 from mingtzha sending #10
> 09:42:33.494 [pool-1-thread-1] DEBUG org.apache.hadoop.ipc.Server -  got #10
> 09:42:33.495 [IPC Server handler 0 on 41118] DEBUG org.apache.hadoop.ipc.Server - IPC Server handler 0 on 41118: has #10 from 10.241.3.35:24701
> 09:42:33.495 [IPC Server handler 0 on 41118] DEBUG o.a.h.security.UserGroupInformation - PriviledgedAction as:mingtzha from:org.apache.hadoop.ipc.Server$Handler.run(Server.java:1426)
> 09:42:33.495 [IPC Server handler 0 on 41118] DEBUG org.apache.hadoop.hdfs.StateChange - *DIR* NameNode.create: /user/mingtzha/hbase/hbase.version for DFSClient_NONMAPREDUCE_-237185081_1 at 10.241.3.35
> 09:42:33.495 [IPC Server handler 0 on 41118] DEBUG org.apache.hadoop.hdfs.StateChange - DIR* startFile: src=/user/mingtzha/hbase/hbase.version, holder=DFSClient_NONMAPREDUCE_-237185081_1, clientMachine=10.241.3.35, createParent=true, replication=0, overwrite=true, append=false
> 09:42:33.495 [IPC Server handler 0 on 41118] DEBUG org.apache.hadoop.security.Groups - Returning cached groups for 'mingtzha'
> 09:42:33.495 [IPC Server handler 0 on 41118] WARN org.apache.hadoop.hdfs.StateChange - DIR* startFile: failed to create file /user/mingtzha/hbase/hbase.version on client 10.241.3.35. Requested replication 0 is less than the required minimum 1
> 09:42:33.495 [IPC Server handler 0 on 41118] ERROR o.a.h.security.UserGroupInformation - PriviledgedActionException as:mingtzha cause:java.io.IOException: failed to create file /user/mingtzha/hbase/hbase.version on client 10.241.3.35. Requested replication 0 is less than the required minimum 1
> 09:42:33.496 [IPC Server handler 0 on 41118] INFO org.apache.hadoop.ipc.Server - IPC Server handler 0 on 41118, call create(/user/mingtzha/hbase/hbase.version, rwxr-xr-x, DFSClient_NONMAPREDUCE_-237185081_1, true, 0, 67108864) from 10.241.3.35:24701: error: java.io.IOException: failed to create file /user/mingtzha/hbase/hbase.version on client 10.241.3.35. Requested replication 0 is less than the required minimum 1
> java.io.IOException: failed to create file
> /user/mingtzha/hbase/hbase.version on client 10.241.3.35.
> Requested replication 0 is less than the required minimum 1
>     at
>
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1591)
> ~[hadoop-core-1.2.1.jar:na]
>     at
>
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:1527)
> ~[hadoop-core-1.2.1.jar:na]
>     at
> org.apache.hadoop.hdfs.server.namenode.NameNode.create(NameNode.java:710)
> ~[hadoop-core-1.2.1.jar:na]
>     at
> org.apache.hadoop.hdfs.server.namenode.NameNode.create(NameNode.java:689)
> ~[hadoop-core-1.2.1.jar:na]
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> ~[na:1.7.0_45]
>     at
>
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> ~[na:1.7.0_45]
>     at
>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> ~[na:1.7.0_45]
>     at java.lang.reflect.Method.invoke(Method.java:606) ~[na:1.7.0_45]
>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:587)
> ~[hadoop-core-1.2.1.jar:na]
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1432)
> ~[hadoop-core-1.2.1.jar:na]
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1428)
> ~[hadoop-core-1.2.1.jar:na]
>     at java.security.AccessController.doPrivileged(Native Method)
> ~[na:1.7.0_45]
>     at javax.security.auth.Subject.doAs(Subject.java:415) ~[na:1.7.0_45]
>     at
>
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
> ~[hadoop-core-1.2.1.jar:na]
>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1426)
> ~[hadoop-core-1.2.1.jar:na]
> 09:42:33.496 [IPC Server handler 0 on 41118] DEBUG org.apache.hadoop.ipc.Server - IPC Server Responder: responding to #10 from 10.241.3.35:24701
> 09:42:33.496 [IPC Server handler 0 on 41118] DEBUG org.apache.hadoop.ipc.Server - IPC Server Responder: responding to #10 from 10.241.3.35:24701 Wrote 1285 bytes.
> 09:42:33.497 [IPC Client (47) connection to slc05muw.us.**.com/10.241.3.35:41118 from mingtzha] DEBUG org.apache.hadoop.ipc.Client - IPC Client (47) connection to slc05muw.us.**.com/10.241.3.35:41118 from mingtzha got value #10
> 09:42:33.497 [main] INFO test -  > Finished HBaseTestSample.setup
> 09:42:33.506 [main] INFO test -  > Started HBaseTestSample.testInsert
> 09:42:33.506 [main] INFO test -  > Finished HBaseTestSample.testInsert
> FAILED CONFIGURATION: @BeforeMethod setup
> org.apache.hadoop.ipc.RemoteException: java.io.IOException: failed to
> create file /user/mingtzha/hbase/hbase.version on client 10.241.3.35.
> Requested replication 0 is less than the required minimum 1
>     at
>
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1591)
>     at
>
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:1527)
>     at
> org.apache.hadoop.hdfs.server.namenode.NameNode.create(NameNode.java:710)
>     at
> org.apache.hadoop.hdfs.server.namenode.NameNode.create(NameNode.java:689)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at
>
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>     at
>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:606)
>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:587)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1432)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1428)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:415)
>     at
>
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1426)
>
>     at org.apache.hadoop.ipc.Client.call(Client.java:1113)
>     at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:229)
>     at com.sun.proxy.$Proxy10.create(Unknown Source)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at
>
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>     at
>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:606)
>     at
>
> org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:85)
>     at
>
> org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:62)
>     at com.sun.proxy.$Proxy10.create(Unknown Source)
>     at
>
> org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.<init>(DFSClient.java:3451)
>     at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:870)
>     at
>
> org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:205)
>     at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:564)
>     at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:545)
>     at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:452)
>     at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:444)
>     at org.apache.hadoop.hbase.util.FSUtils.setVersion(FSUtils.java:475)
>     at org.apache.hadoop.hbase.util.FSUtils.setVersion(FSUtils.java:360)
>     at
>
> org.apache.hadoop.hbase.HBaseTestingUtility.createRootDir(HBaseTestingUtility.java:774)
>     at
>
> org.apache.hadoop.hbase.HBaseTestingUtility.startMiniHBaseCluster(HBaseTestingUtility.java:646)
>     at
>
> org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:628)
>     at
>
> org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:576)
>     at
>
> org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:563)
>     at
>
> com.**.sites.analytics.repository.itest.endeca.HBaseTestSample.setup(HBaseTestSample.java:101)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at
>
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>     at
>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:606)
>     at
>
> org.testng.internal.MethodInvocationHelper.invokeMethod(MethodInvocationHelper.java:84)
>     at
>
> org.testng.internal.MethodInvocationHelper$2.runConfigurationMethod(MethodInvocationHelper.java:292)
>     at
>
> org.jvnet.testing.hk2testng.HK2TestListenerAdapter.run(HK2TestListenerAdapter.java:97)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at
>
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>     at
>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:606)
>     at
>
> org.testng.internal.MethodInvocationHelper.invokeConfigurable(MethodInvocationHelper.java:304)
>     at
> org.testng.internal.Invoker.invokeConfigurationMethod(Invoker.java:556)
>     at org.testng.internal.Invoker.invokeConfigurations(Invoker.java:213)
>     at org.testng.internal.Invoker.invokeMethod(Invoker.java:653)
>     at org.testng.internal.Invoker.invokeTestMethod(Invoker.java:901)
>     at org.testng.internal.Invoker.invokeTestMethods(Invoker.java:1231)
>     at
>
> org.testng.internal.TestMethodWorker.invokeTestMethods(TestMethodWorker.java:127)
>     at org.testng.internal.TestMethodWorker.run(TestMethodWorker.java:111)
>     at org.testng.TestRunner.privateRun(TestRunner.java:767)
>     at org.testng.TestRunner.run(TestRunner.java:617)
>     at org.testng.SuiteRunner.runTest(SuiteRunner.java:334)
>     at org.testng.SuiteRunner.runSequentially(SuiteRunner.java:329)
>     at org.testng.SuiteRunner.privateRun(SuiteRunner.java:291)
>     at org.testng.SuiteRunner.run(SuiteRunner.java:240)
>     at org.testng.SuiteRunnerWorker.runSuite(SuiteRunnerWorker.java:52)
>     at org.testng.SuiteRunnerWorker.run(SuiteRunnerWorker.java:86)
>     at org.testng.TestNG.runSuitesSequentially(TestNG.java:1224)
>     at org.testng.TestNG.runSuitesLocally(TestNG.java:1149)
>     at org.testng.TestNG.run(TestNG.java:1057)
>     at org.testng.remote.RemoteTestNG.run(RemoteTestNG.java:111)
>     at org.testng.remote.RemoteTestNG.initAndRun(RemoteTestNG.java:204)
>     at org.testng.remote.RemoteTestNG.main(RemoteTestNG.java:175)
>
> SKIPPED CONFIGURATION: @AfterMethod destroy
> SKIPPED: testInsert
>
> ===============================================
>     Default test
>     Tests run: 1, Failures: 0, Skips: 1
>     Configuration Failures: 1, Skips: 1
> ===============================================
>
> 09:42:33.535 [main] INFO test - Finished Suite [Default suite]
>
> ===============================================
> Default suite
> Total tests run: 1, Failures: 0, Skips: 1
> Configuration Failures: 1, Skips: 1
> ===============================================
>
> [TestNG] Time taken by org.testng.reporters.XMLReporter@71aeef97: 6 ms
> [TestNG] Time taken by [FailedReporter passed=0 failed=0 skipped=0]: 4 ms
> [TestNG] Time taken by org.testng.reporters.jq.Main@2b430201: 24 ms
> [TestNG] Time taken by org.testng.reporters.JUnitReportReporter@3309b429: 4 ms
> [TestNG] Time taken by org.testng.reporters.SuiteHTMLReporter@7224eaaa: 8 ms
> [TestNG] Time taken by org.testng.reporters.EmailableReporter2@53b74706: 3 ms
> 09:42:33.588 [Thread-0] DEBUG org.apache.hadoop.fs.FileSystem - Starting clear of FileSystem cache with 1 elements.
> 09:42:33.588 [Thread-0] DEBUG org.apache.hadoop.ipc.Client - Stopping client
> 09:42:33.589 [IPC Client (47) connection to slc05muw.us.**.com/10.241.3.35:41118 from mingtzha] DEBUG org.apache.hadoop.ipc.Client - IPC Client (47) connection to slc05muw.us.**.com/10.241.3.35:41118 from mingtzha: closed
> 09:42:33.589 [IPC Client (47) connection to slc05muw.us.**.com/10.241.3.35:41118 from mingtzha] DEBUG org.apache.hadoop.ipc.Client - IPC Client (47) connection to slc05muw.us.**.com/10.241.3.35:41118 from mingtzha: stopped, remaining connections 0
> 09:42:33.589 [pool-1-thread-1] DEBUG org.apache.hadoop.ipc.Server - IPC Server listener on 41118: disconnecting client 10.241.3.35:24701. Number of active connections: 1
> 09:42:33.689 [Thread-0] DEBUG org.apache.hadoop.fs.FileSystem - Removing filesystem for hdfs://slc05muw.us.**.com:41118
> 09:42:33.689 [Thread-0] DEBUG org.apache.hadoop.fs.FileSystem - Done clearing cache
>
> The java code:
>
> import java.io.BufferedReader;
> import java.io.InputStreamReader;
>
> import org.apache.hadoop.conf.Configuration;
> import org.apache.hadoop.hbase.HBaseConfiguration;
> import org.apache.hadoop.hbase.HBaseTestingUtility;
> import org.apache.hadoop.hbase.client.Delete;
> import org.apache.hadoop.hbase.client.Get;
> import org.apache.hadoop.hbase.client.HTable;
> import org.apache.hadoop.hbase.client.Put;
> import org.apache.hadoop.hbase.client.Result;
> import org.apache.hadoop.hbase.util.Bytes;
> import org.apache.hadoop.hbase.zookeeper.MiniZooKeeperCluster;
> import org.testng.annotations.AfterMethod;
> import org.testng.annotations.BeforeMethod;
> import org.testng.annotations.Test;
>
> public class HBaseTestSample {
>
>     private static HBaseTestingUtility utility;
>     byte[] CF = "CF".getBytes();
>     byte[] QUALIFIER = "CQ-1".getBytes();
>
>     @BeforeMethod
>     public void setup() throws Exception {
>         Configuration hbaseConf = HBaseConfiguration.create();
>
>         utility = new HBaseTestingUtility(hbaseConf);
>
>         Process process = Runtime.getRuntime().exec("/bin/sh -c umask");
>         BufferedReader br = new BufferedReader(new InputStreamReader(
>                 process.getInputStream()));
>         int rc = process.waitFor();
>         if (rc == 0) {
>             String umask = br.readLine();
>
>             int umaskBits = Integer.parseInt(umask, 8);
>             int permBits = 0777 & ~umaskBits;
>             String perms = Integer.toString(permBits, 8);
>
>             utility.getConfiguration().set("dfs.datanode.data.dir.perm",
> perms);
>         }
>
>         utility.startMiniCluster(0);
>
>     }
>
>     @Test
>     public void testInsert() throws Exception {
>         HTable table = utility.createTable(CF, QUALIFIER);
>
>         System.out.println("create table t-f");
>
>         // byte [] family, byte [] qualifier, byte [] value
>         table.put(new Put("r".getBytes()).add("f".getBytes(),
> "c1".getBytes(),
>                 "v".getBytes()));
>         Result result = table.get(new Get("r".getBytes()));
>
>         System.out.println(result.list().size());
>
>         table.delete(new Delete("r".getBytes()));
>
>         System.out.println("clean up");
>
>     }
>
>     @AfterMethod
>     public void destroy() throws Exception {
>         utility.cleanupTestDir();
>     }
> }
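[Editor's note] The umask handling in setup() above shells out to read the umask and derives datanode directory permissions from it. That arithmetic can be checked in isolation; the sketch below extracts it into a standalone method (the class and method names are hypothetical, not part of the original test):

```java
public class UmaskPerms {
    // Convert a shell umask string (octal, e.g. "0022") into the
    // permission bits that survive for a default 0777 mode, rendered
    // back as an octal string (e.g. "755") -- the same expression the
    // test uses to populate dfs.datanode.data.dir.perm.
    public static String permsFromUmask(String umask) {
        int umaskBits = Integer.parseInt(umask.trim(), 8); // parse octal umask
        int permBits = 0777 & ~umaskBits;                  // clear masked bits
        return Integer.toString(permBits, 8);              // render as octal
    }

    public static void main(String[] args) {
        System.out.println(permsFromUmask("0022")); // 755
        System.out.println(permsFromUmask("0077")); // 700
    }
}
```

With the common default umask of 0022 this yields "755", matching the permissions HDFS expects on datanode data directories.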
>
> hbase-site.xml:
>
> <?xml version="1.0"?>
> <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
> <configuration>
>     <property>
>         <name>hbase.rootdir</name>
>         <value>file:///scratch/mingtzha/hbase/test</value>
>     </property>
>     <property>
>         <name>hbase.tmp.dir</name>
>         <value>/tmp/hbase</value>
>     </property>
>
>     <property>
>         <name>hbase.zookeeper.quorum</name>
>         <value>localhost</value>
>     </property>
>     <property>
>         <name>hbase.cluster.distributed</name>
>         <value>true</value>
>     </property>
>     <property>
>         <name>hbase.ipc.warn.response.time</name>
>         <value>1</value>
>     </property>
>
>     <!-- http://hbase.apache.org/book/ops.monitoring.html -->
>     <!-- -1 => Disable logging by size -->
>     <!-- <property> -->
>     <!-- <name>hbase.ipc.warn.response.size</name> -->
>     <!-- <value>-1</value> -->
>     <!-- </property> -->
> </configuration>
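[Editor's note] As the reply above points out, hbase.cluster.distributed defaults to false (standalone mode), which is what HBaseTestingUtility expects; forcing it to true in a test config is the suspect setting here. A corrected fragment would either drop the property or set it explicitly:

```xml
<!-- Sketch of the suggested fix: keep the mini cluster in standalone
     mode (false is also the default, so the property can be omitted). -->
<property>
    <name>hbase.cluster.distributed</name>
    <value>false</value>
</property>
```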
>
> pom.xml
>
> <?xml version="1.0" encoding="UTF-8"?>
> <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="
> http://www.w3.org/2001/XMLSchema-instance"
>     xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
> http://maven.apache.org/xsd/maven-4.0.0.xsd">
>     <modelVersion>4.0.0</modelVersion>
>     <parent>
>         <groupId>com.**.sites.analytics.tests</groupId>
>         <artifactId>integration-test</artifactId>
>         <version>1.0-SNAPSHOT</version>
>     </parent>
>
>     <artifactId>repository-itest</artifactId>
>     <name>repository-itest</name>
>
>     <dependencies>
>         <dependency>
>             <groupId>com.**.sites.analytics</groupId>
>             <artifactId>test-integ</artifactId>
>             <version>${project.version}</version>
>             <scope>test</scope>
>         </dependency>
>         <dependency>
>             <groupId>com.**.sites.analytics.tests</groupId>
>             <artifactId>itest-core</artifactId>
>             <version>${project.version}</version>
>         </dependency>
>         <dependency>
>             <groupId>com.**.sites.analytics</groupId>
>             <artifactId>config-dev</artifactId>
>             <version>${project.version}</version>
>             <scope>test</scope>
>         </dependency>
>         <dependency>
>             <groupId>com.**.sites.analytics</groupId>
>             <artifactId>repository-core</artifactId>
>             <version>${project.version}</version>
>         </dependency>
>
>         <dependency>
>             <groupId>com.**.sites.analytics</groupId>
>             <artifactId>repository-hbase</artifactId>
>             <version>${project.version}</version>
>         </dependency>
>
>         <dependency>
>             <groupId>org.apache.hbase</groupId>
>             <artifactId>hbase</artifactId>
>             <version>0.94.21</version>
>             <classifier>tests</classifier>
>             <exclusions>
>                 <exclusion>
>                     <artifactId>slf4j-log4j12</artifactId>
>                     <groupId>org.slf4j</groupId>
>                 </exclusion>
>             </exclusions>
>         </dependency>
>         <dependency>
>             <groupId>org.apache.hadoop</groupId>
>             <artifactId>hadoop-core</artifactId>
>             <version>1.2.1</version>
>             <exclusions>
>                 <exclusion>
>                     <artifactId>slf4j-log4j12</artifactId>
>                     <groupId>org.slf4j</groupId>
>                 </exclusion>
>             </exclusions>
>         </dependency>
>         <dependency>
>             <groupId>org.apache.hadoop</groupId>
>             <artifactId>hadoop-test</artifactId>
>             <version>1.2.1</version>
>             <exclusions>
>                 <exclusion>
>                     <artifactId>slf4j-log4j12</artifactId>
>                     <groupId>org.slf4j</groupId>
>                 </exclusion>
>             </exclusions>
>         </dependency>
>     </dependencies>
> </project>
>
