kylin-dev mailing list archives

From Santosh Akhilesh <santoshakhil...@gmail.com>
Subject Re: Cube Build Failed at Last Step//RE: Error while making cube & Measure option is not responding on GUI
Date Sun, 01 Mar 2015 12:34:19 GMT
Hi Shaofeng,

I have raised the bug; please suggest a resolution or an alternative ASAP:
https://issues.apache.org/jira/browse/KYLIN-617

Regards,
Santosh Akhilesh


On Sun, Mar 1, 2015 at 5:02 PM, Shi, Shaofeng <shaoshi@ebay.com> wrote:

> Hi Santosh, this is very likely the problem; we will verify this on
> Monday. In the meantime, could you please report a new JIRA with this
> problem and your findings? I appreciate your input!
>
> On 3/1/15, 3:03 PM, "Santosh Akhilesh" <santoshakhilesh@gmail.com> wrote:
>
> >Hi Shaofeng,
> >      My MapReduce application classpath doesn't contain the HBase libs.
> >But I find that the kylin.sh start/stop script initializes the HBase
> >environment before anything else. So in kylin.log the client environment
> >loads the HBase client libs first, and HBase's bundled Hadoop client libs
> >are 2.2.0. Is this issue related to the kylin.sh startup script? I am
> >attaching my classpath setting from mapred-site.xml and the classpath
> >printed in kylin.log.
> >
> ><name>mapreduce.application.classpath</name>
> ><value>/tmp/kylin/*,/home/santosh/work/frameworks/hadoop-2.6.0/etc/hadoop,
> >/home/santosh/work/frameworks/hadoop-2.6.0/etc/hadoop,/home/santosh/work/f
> >rameworks/hadoop-2.6.0/etc/hadoop,/home/santosh/work/frameworks/hadoop-2.6
> >.0/share/hadoop/common/lib/*,/home/santosh/work/frameworks/hadoop-2.6.0/sh
> >are/hadoop/common/*,/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoo
> >p/hdfs,/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/hdfs/lib/*,
> >/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/hdfs/*,/home/santo
> >sh/work/frameworks/hadoop-2.6.0/share/hadoop/yarn/lib/*,/home/santosh/work
> >/frameworks/hadoop-2.6.0/share/hadoop/yarn/*,/home/santosh/work/frameworks
> >/hadoop-2.6.0/share/hadoop/mapreduce/lib/*,/home/santosh/work/frameworks/h
> >adoop-2.6.0/share/hadoop/mapreduce/*,/contrib/capacity-scheduler/*.jar,/ho
> >me/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/yarn/*,/home/santosh/
> >work/frameworks/hadoop-2.6.0/share/hadoop/yarn/lib/*,/home/santosh/work/fr
> >ameworks/apache-hive-1.0.0/conf,/home/santosh/work/frameworks/apache-hive-
> >1.0.0/hcatalog/share/hcatalog/*,/home/santosh/work/frameworks/apache-hive-
> >1.0.0/lib/hive-exec-1.0.0.jar</value>
> >
> >
> >Kylin.log
> >Client
> >environment:java.class.path=/etc/kylin:/home/santosh/work/software/tomcat/
> >bin/bootstrap.jar:/home/santosh/work/software/tomcat/bin/tomcat-juli.jar:/
> >home/santosh/work/software/tomcat/lib/catalina-tribes.jar:/home/santosh/wo
> >rk/software/tomcat/lib/jsp-api.jar:/home/santosh/work/software/tomcat/lib/
> >catalina-ant.jar:/home/santosh/work/software/tomcat/lib/ecj-4.4.jar:/home/
> >santosh/work/software/tomcat/lib/tomcat-dbcp.jar:/home/santosh/work/softwa
> >re/tomcat/lib/catalina.jar:/home/santosh/work/software/tomcat/lib/tomcat-a
> >pi.jar:/home/santosh/work/software/tomcat/lib/catalina-ha.jar:/home/santos
> >h/work/software/tomcat/lib/jasper-el.jar:/home/santosh/work/software/tomca
> >t/lib/tomcat7-websocket.jar:/home/santosh/work/software/tomcat/lib/jasper.
> >jar:/home/santosh/work/software/tomcat/lib/tomcat-coyote.jar:/home/santosh
> >/work/software/tomcat/lib/tomcat-i18n-ja.jar:/home/santosh/work/software/t
> >omcat/lib/tomcat-util.jar:/home/santosh/work/software/tomcat/lib/el-api.ja
> >r:/home/santosh/work/software/tomcat/lib/websocket-api.jar:/home/santosh/w
> >ork/software/tomcat/lib/servlet-api.jar:/home/santosh/work/software/tomcat
> >/lib/annotations-api.jar:/home/santosh/work/software/tomcat/lib/tomcat-i18
> >n-es.jar:/home/santosh/work/software/tomcat/lib/tomcat-i18n-fr.jar:/home/s
> >antosh/work/software/tomcat/lib/tomcat-jdbc.jar::/home/santosh/work/framew
> >orks/hbase-0.98.10/bin/../conf:/home/santosh/work/java/jdk1.7.0_75/lib/too
> >ls.jar:/home/santosh/work/frameworks/hbase-0.98.10/bin/..:/home/santosh/wo
> >rk/frameworks/hbase-0.98.10/bin/../lib/activation-1.1.jar:/home/santosh/wo
> >rk/frameworks/hbase-0.98.10/bin/../lib/aopalliance-1.0.jar:/home/santosh/w
> >ork/frameworks/hbase-0.98.10/bin/../lib/asm-3.1.jar:/home/santosh/work/fra
> >meworks/hbase-0.98.10/bin/../lib/avro-1.7.4.jar:/home/santosh/work/framewo
> >rks/hbase-0.98.10/bin/../lib/commons-beanutils-1.7.0.jar:/home/santosh/wor
> >k/frameworks/hbase-0.98.10/bin/../lib/commons-beanutils-core-1.8.0.jar:/ho
> >me/santosh/work/frameworks/hbase-0.98.10/bin/../lib/commons-cli-1.2.jar:/h
> >ome/santosh/work/frameworks/hbase-0.98.10/bin/../lib/commons-codec-1.7.jar
> >:/home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/commons-collection
> >s-3.2.1.jar:/home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/commons
> >-compress-1.4.1.jar:/home/santosh/work/frameworks/hbase-0.98.10/bin/../lib
> >/commons-configuration-1.6.jar:/home/santosh/work/frameworks/hbase-0.98.10
> >/bin/../lib/commons-daemon-1.0.13.jar:/home/santosh/work/frameworks/hbase-
> >0.98.10/bin/../lib/commons-digester-1.8.jar:/home/santosh/work/frameworks/
> >hbase-0.98.10/bin/../lib/commons-el-1.0.jar:/home/santosh/work/frameworks/
> >hbase-0.98.10/bin/../lib/commons-httpclient-3.1.jar:/home/santosh/work/fra
> >meworks/hbase-0.98.10/bin/../lib/commons-io-2.4.jar:/home/santosh/work/fra
> >meworks/hbase-0.98.10/bin/../lib/commons-lang-2.6.jar:/home/santosh/work/f
> >rameworks/hbase-0.98.10/bin/../lib/commons-logging-1.1.1.jar:/home/santosh
> >/work/frameworks/hbase-0.98.10/bin/../lib/commons-math-2.1.jar:/home/santo
> >sh/work/frameworks/hbase-0.98.10/bin/../lib/commons-net-3.1.jar:/home/sant
> >osh/work/frameworks/hbase-0.98.10/bin/../lib/findbugs-annotations-1.3.9-1.
> >jar:/home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/gmbal-api-only-
> >3.0.0-b023.jar:/home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/griz
> >zly-framework-2.1.2.jar:/home/santosh/work/frameworks/hbase-0.98.10/bin/..
> >/lib/grizzly-http-2.1.2.jar:/home/santosh/work/frameworks/hbase-0.98.10/bi
> >n/../lib/grizzly-http-server-2.1.2.jar:/home/santosh/work/frameworks/hbase
> >-0.98.10/bin/../lib/grizzly-http-servlet-2.1.2.jar:/home/santosh/work/fram
> >eworks/hbase-0.98.10/bin/../lib/grizzly-rcm-2.1.2.jar:/home/santosh/work/f
> >rameworks/hbase-0.98.10/bin/../lib/guava-12.0.1.jar:/home/santosh/work/fra
> >meworks/hbase-0.98.10/bin/../lib/guice-3.0.jar:/home/santosh/work/framewor
> >ks/hbase-0.98.10/bin/../lib/guice-servlet-3.0.jar:/home/santosh/work/frame
> >works/hbase-0.98.10/bin/../lib/hadoop-annotations-2.2.0.jar:/home/santosh/
> >work/frameworks/hbase-0.98.10/bin/../lib/hadoop-auth-2.2.0.jar:/home/santo
> >sh/work/frameworks/hbase-0.98.10/bin/../lib/hadoop-client-2.2.0.jar:/home/
> >santosh/work/frameworks/hbase-0.98.10/bin/../lib/hadoop-common-2.2.0.jar:/
> >home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/hadoop-hdfs-2.2.0.ja
> >r:/home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/hadoop-mapreduce-
> >client-app-2.2.0.jar:/home/santosh/work/frameworks/hbase-0.98.10/bin/../li
> >b/hadoop-mapreduce-client-common-2.2.0.jar:/home/santosh/work/frameworks/h
> >base-0.98.10/bin/../lib/hadoop-mapreduce-client-core-2.2.0.jar:/home/santo
> >sh/work/frameworks/hbase-0.98.10/bin/../lib/hadoop-mapreduce-client-jobcli
> >ent-2.2.0.jar:/home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/hadoo
> >p-mapreduce-client-shuffle-2.2.0.jar:/home/santosh/work/frameworks/hbase-0
> >.98.10/bin/../lib/hadoop-yarn-api-2.2.0.jar:/home/santosh/work/frameworks/
> >hbase-0.98.10/bin/../lib/hadoop-yarn-client-2.2.0.jar:/home/santosh/work/f
> >rameworks/hbase-0.98.10/bin/../lib/hadoop-yarn-common-2.2.0.jar:/home/sant
> >osh/work/frameworks/hbase-0.98.10/bin/../lib/hadoop-yarn-server-common-2.2
> >.0.jar:/home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/hadoop-yarn-
> >server-nodemanager-2.2.0.jar:/home/santosh/work/frameworks/hbase-0.98.10/b
> >in/../lib/hamcrest-core-1.3.jar:/home/santosh/work/frameworks/hbase-0.98.1
> >0/bin/../lib/hbase-annotations-0.98.10-hadoop2.jar:/home/santosh/work/fram
> >eworks/hbase-0.98.10/bin/../lib/hbase-checkstyle-0.98.10-hadoop2.jar:/home
> >/santosh/work/frameworks/hbase-0.98.10/bin/../lib/hbase-client-0.98.10-had
> >oop2.jar:/home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/hbase-comm
> >on-0.98.10-hadoop2.jar:/home/santosh/work/frameworks/hbase-0.98.10/bin/../
> >lib/hbase-common-0.98.10-hadoop2-tests.jar:/home/santosh/work/frameworks/h
> >base-0.98.10/bin/../lib/hbase-examples-0.98.10-hadoop2.jar:/home/santosh/w
> >ork/frameworks/hbase-0.98.10/bin/../lib/hbase-hadoop2-compat-0.98.10-hadoo
> >p2.jar:/home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/hbase-hadoop
> >-compat-0.98.10-hadoop2.jar:/home/santosh/work/frameworks/hbase-0.98.10/bi
> >n/../lib/hbase-it-0.98.10-hadoop2.jar:/home/santosh/work/frameworks/hbase-
> >0.98.10/bin/../lib/hbase-it-0.98.10-hadoop2-tests.jar:/home/santosh/work/f
> >rameworks/hbase-0.98.10/bin/../lib/hbase-prefix-tree-0.98.10-hadoop2.jar:/
> >home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/hbase-protocol-0.98.
> >10-hadoop2.jar:/home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/hbas
> >e-rest-0.98.10-hadoop2.jar:/home/santosh/work/frameworks/hbase-0.98.10/bin
> >/../lib/hbase-server-0.98.10-hadoop2.jar:/home/santosh/work/frameworks/hba
> >se-0.98.10/bin/../lib/hbase-server-0.98.10-hadoop2-tests.jar:/home/santosh
> >/work/frameworks/hbase-0.98.10/bin/../lib/hbase-shell-0.98.10-hadoop2.jar:
> >/home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/hbase-testing-util-
> >0.98.10-hadoop2.jar:/home/santosh/work/frameworks/hbase-0.98.10/bin/../lib
> >/hbase-thrift-0.98.10-hadoop2.jar:/home/santosh/work/frameworks/hbase-0.98
> >.10/bin/../lib/high-scale-lib-1.1.1.jar:/home/santosh/work/frameworks/hbas
> >e-0.98.10/bin/../lib/htrace-core-2.04.jar:/home/santosh/work/frameworks/hb
> >ase-0.98.10/bin/../lib/httpclient-4.1.3.jar:/home/santosh/work/frameworks/
> >hbase-0.98.10/bin/../lib/httpcore-4.1.3.jar:/home/santosh/work/frameworks/
> >hbase-0.98.10/bin/../lib/jackson-core-asl-1.8.8.jar:/home/santosh/work/fra
> >meworks/hbase-0.98.10/bin/../lib/jackson-jaxrs-1.8.8.jar:/home/santosh/wor
> >k/frameworks/hbase-0.98.10/bin/../lib/jackson-mapper-asl-1.8.8.jar:/home/s
> >antosh/work/frameworks/hbase-0.98.10/bin/../lib/jackson-xc-1.8.8.jar:/home
> >/santosh/work/frameworks/hbase-0.98.10/bin/../lib/jamon-runtime-2.3.1.jar:
> >/home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/jasper-compiler-5.5
> >.23.jar:/home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/jasper-runt
> >ime-5.5.23.jar:/home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/java
> >x.inject-1.jar:/home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/java
> >x.servlet-3.1.jar:/home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/j
> >avax.servlet-api-3.0.1.jar:/home/santosh/work/frameworks/hbase-0.98.10/bin
> >/../lib/jaxb-api-2.2.2.jar:/home/santosh/work/frameworks/hbase-0.98.10/bin
> >/../lib/jaxb-impl-2.2.3-1.jar:/home/santosh/work/frameworks/hbase-0.98.10/
> >bin/../lib/jcodings-1.0.8.jar:/home/santosh/work/frameworks/hbase-0.98.10/
> >bin/../lib/jersey-client-1.9.jar:/home/santosh/work/frameworks/hbase-0.98.
> >10/bin/../lib/jersey-core-1.8.jar:/home/santosh/work/frameworks/hbase-0.98
> >.10/bin/../lib/jersey-grizzly2-1.9.jar:/home/santosh/work/frameworks/hbase
> >-0.98.10/bin/../lib/jersey-guice-1.9.jar:/home/santosh/work/frameworks/hba
> >se-0.98.10/bin/../lib/jersey-json-1.8.jar:/home/santosh/work/frameworks/hb
> >ase-0.98.10/bin/../lib/jersey-server-1.8.jar:/home/santosh/work/frameworks
> >/hbase-0.98.10/bin/../lib/jersey-test-framework-core-1.9.jar:/home/santosh
> >/work/frameworks/hbase-0.98.10/bin/../lib/jersey-test-framework-grizzly2-1
> >.9.jar:/home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/jets3t-0.6.1
> >.jar:/home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/jettison-1.3.1
> >.jar:/home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/jetty-6.1.26.j
> >ar:/home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/jetty-sslengine-
> >6.1.26.jar:/home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/jetty-ut
> >il-6.1.26.jar:/home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/joni-
> >2.1.2.jar:/home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/jruby-com
> >plete-1.6.8.jar:/home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/jsc
> >h-0.1.42.jar:/home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/jsp-2.
> >1-6.1.14.jar:/home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/jsp-ap
> >i-2.1-6.1.14.jar:/home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/js
> >r305-1.3.9.jar:/home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/juni
> >t-4.11.jar:/home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/libthrif
> >t-0.9.0.jar:/home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/log4j-1
> >.2.17.jar:/home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/managemen
> >t-api-3.0.0-b012.jar:/home/santosh/work/frameworks/hbase-0.98.10/bin/../li
> >b/metrics-core-2.2.0.jar:/home/santosh/work/frameworks/hbase-0.98.10/bin/.
> >./lib/netty-3.6.6.Final.jar:/home/santosh/work/frameworks/hbase-0.98.10/bi
> >n/../lib/paranamer-2.3.jar:/home/santosh/work/frameworks/hbase-0.98.10/bin
> >/../lib/protobuf-java-2.5.0.jar:/home/santosh/work/frameworks/hbase-0.98.1
> >0/bin/../lib/servlet-api-2.5-6.1.14.jar:/home/santosh/work/frameworks/hbas
> >e-0.98.10/bin/../lib/slf4j-api-1.6.4.jar:/home/santosh/work/frameworks/hba
> >se-0.98.10/bin/../lib/slf4j-log4j12-1.6.4.jar:/home/santosh/work/framework
> >s/hbase-0.98.10/bin/../lib/snappy-java-1.0.4.1.jar:/home/santosh/work/fram
> >eworks/hbase-0.98.10/bin/../lib/xmlenc-0.52.jar:/home/santosh/work/framewo
> >rks/hbase-0.98.10/bin/../lib/xz-1.0.jar:/home/santosh/work/frameworks/hbas
> >e-0.98.10/bin/../lib/zookeeper-3.4.6.jar:/home/santosh/work/frameworks/had
> >oop-2.6.0/etc/hadoop:/home/santosh/work/frameworks/hadoop-2.6.0/share/hado
> >op/common/lib/jaxb-api-2.2.2.jar:/home/santosh/work/frameworks/hadoop-2.6.
> >0/share/hadoop/common/lib/curator-framework-2.6.0.jar:/home/santosh/work/f
> >rameworks/hadoop-2.6.0/share/hadoop/common/lib/commons-io-2.4.jar:/home/sa
> >ntosh/work/frameworks/hadoop-2.6.0/share/hadoop/common/lib/jackson-jaxrs-1
> >.9.13.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/common/l
> >ib/protobuf-java-2.5.0.jar:/home/santosh/work/frameworks/hadoop-2.6.0/shar
> >e/hadoop/common/lib/snappy-java-1.0.4.1.jar:/home/santosh/work/frameworks/
> >hadoop-2.6.0/share/hadoop/common/lib/paranamer-2.3.jar:/home/santosh/work/
> >frameworks/hadoop-2.6.0/share/hadoop/common/lib/log4j-1.2.17.jar:/home/san
> >tosh/work/frameworks/hadoop-2.6.0/share/hadoop/common/lib/jsr305-1.3.9.jar
> >:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/common/lib/common
> >s-beanutils-core-1.8.0.jar:/home/santosh/work/frameworks/hadoop-2.6.0/shar
> >e/hadoop/common/lib/commons-el-1.0.jar:/home/santosh/work/frameworks/hadoo
> >p-2.6.0/share/hadoop/common/lib/servlet-api-2.5.jar:/home/santosh/work/fra
> >meworks/hadoop-2.6.0/share/hadoop/common/lib/jsch-0.1.42.jar:/home/santosh
> >/work/frameworks/hadoop-2.6.0/share/hadoop/common/lib/commons-configuratio
> >n-1.6.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/common/l
> >ib/jackson-core-asl-1.9.13.jar:/home/santosh/work/frameworks/hadoop-2.6.0/
> >share/hadoop/common/lib/activation-1.1.jar:/home/santosh/work/frameworks/h
> >adoop-2.6.0/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/home/santosh/wo
> >rk/frameworks/hadoop-2.6.0/share/hadoop/common/lib/apacheds-i18n-2.0.0-M15
> >.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/common/lib/ht
> >tpcore-4.2.5.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/c
> >ommon/lib/xmlenc-0.52.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share
> >/hadoop/common/lib/curator-recipes-2.6.0.jar:/home/santosh/work/frameworks
> >/hadoop-2.6.0/share/hadoop/common/lib/xz-1.0.jar:/home/santosh/work/framew
> >orks/hadoop-2.6.0/share/hadoop/common/lib/hadoop-auth-2.6.0.jar:/home/sant
> >osh/work/frameworks/hadoop-2.6.0/share/hadoop/common/lib/commons-cli-1.2.j
> >ar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/common/lib/avro
> >-1.7.4.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/common/
> >lib/jets3t-0.9.0.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hado
> >op/common/lib/jettison-1.1.jar:/home/santosh/work/frameworks/hadoop-2.6.0/
> >share/hadoop/common/lib/hadoop-annotations-2.6.0.jar:/home/santosh/work/fr
> >ameworks/hadoop-2.6.0/share/hadoop/common/lib/hamcrest-core-1.3.jar:/home/
> >santosh/work/frameworks/hadoop-2.6.0/share/hadoop/common/lib/commons-diges
> >ter-1.8.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/common
> >/lib/commons-math3-3.1.1.jar:/home/santosh/work/frameworks/hadoop-2.6.0/sh
> >are/hadoop/common/lib/api-util-1.0.0-M20.jar:/home/santosh/work/frameworks
> >/hadoop-2.6.0/share/hadoop/common/lib/commons-compress-1.4.1.jar:/home/san
> >tosh/work/frameworks/hadoop-2.6.0/share/hadoop/common/lib/jetty-util-6.1.2
> >6.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/common/lib/j
> >ackson-xc-1.9.13.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hado
> >op/common/lib/jersey-core-1.9.jar:/home/santosh/work/frameworks/hadoop-2.6
> >.0/share/hadoop/common/lib/java-xmlbuilder-0.4.jar:/home/santosh/work/fram
> >eworks/hadoop-2.6.0/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:
> >/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/common/lib/curator
> >-client-2.6.0.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/
> >common/lib/jersey-server-1.9.jar:/home/santosh/work/frameworks/hadoop-2.6.
> >0/share/hadoop/common/lib/httpclient-4.2.5.jar:/home/santosh/work/framewor
> >ks/hadoop-2.6.0/share/hadoop/common/lib/zookeeper-3.4.6.jar:/home/santosh/
> >work/frameworks/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.j
> >ar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/common/lib/nett
> >y-3.6.2.Final.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/
> >common/lib/commons-collections-3.2.1.jar:/home/santosh/work/frameworks/had
> >oop-2.6.0/share/hadoop/common/lib/commons-lang-2.6.jar:/home/santosh/work/
> >frameworks/hadoop-2.6.0/share/hadoop/common/lib/commons-net-3.1.jar:/home/
> >santosh/work/frameworks/hadoop-2.6.0/share/hadoop/common/lib/commons-beanu
> >tils-1.7.0.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/com
> >mon/lib/commons-logging-1.1.3.jar:/home/santosh/work/frameworks/hadoop-2.6
> >.0/share/hadoop/common/lib/jsp-api-2.1.jar:/home/santosh/work/frameworks/h
> >adoop-2.6.0/share/hadoop/common/lib/slf4j-api-1.7.5.jar:/home/santosh/work
> >/frameworks/hadoop-2.6.0/share/hadoop/common/lib/mockito-all-1.8.5.jar:/ho
> >me/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/common/lib/guava-11.0
> >.2.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/common/lib/
> >asm-3.2.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/common
> >/lib/gson-2.2.4.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoo
> >p/common/lib/htrace-core-3.0.4.jar:/home/santosh/work/frameworks/hadoop-2.
> >6.0/share/hadoop/common/lib/jasper-compiler-5.5.23.jar:/home/santosh/work/
> >frameworks/hadoop-2.6.0/share/hadoop/common/lib/api-asn1-api-1.0.0-M20.jar
> >:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/common/lib/jetty-
> >6.1.26.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/common/
> >lib/apacheds-kerberos-codec-2.0.0-M15.jar:/home/santosh/work/frameworks/ha
> >doop-2.6.0/share/hadoop/common/lib/jersey-json-1.9.jar:/home/santosh/work/
> >frameworks/hadoop-2.6.0/share/hadoop/common/lib/junit-4.11.jar:/home/santo
> >sh/work/frameworks/hadoop-2.6.0/share/hadoop/common/lib/commons-httpclient
> >-3.1.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/common/li
> >b/stax-api-1.0-2.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hado
> >op/common/lib/commons-codec-1.4.jar:/home/santosh/work/frameworks/hadoop-2
> >.6.0/share/hadoop/common/lib/jasper-runtime-5.5.23.jar:/home/santosh/work/
> >frameworks/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0-tests.jar:
> >/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/common/hadoop-comm
> >on-2.6.0.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/commo
> >n/hadoop-nfs-2.6.0.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/ha
> >doop/hdfs:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/hdfs/lib
> >/commons-io-2.4.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoo
> >p/hdfs/lib/protobuf-java-2.5.0.jar:/home/santosh/work/frameworks/hadoop-2.
> >6.0/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/home/santosh/work/frameworks/h
> >adoop-2.6.0/share/hadoop/hdfs/lib/jsr305-1.3.9.jar:/home/santosh/work/fram
> >eworks/hadoop-2.6.0/share/hadoop/hdfs/lib/commons-el-1.0.jar:/home/santosh
> >/work/frameworks/hadoop-2.6.0/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/h
> >ome/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/hdfs/lib/jackson-cor
> >e-asl-1.9.13.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/h
> >dfs/lib/xmlenc-0.52.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/h
> >adoop/hdfs/lib/commons-cli-1.2.jar:/home/santosh/work/frameworks/hadoop-2.
> >6.0/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/home/santosh/work/framewo
> >rks/hadoop-2.6.0/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/home/santosh/w
> >ork/frameworks/hadoop-2.6.0/share/hadoop/hdfs/lib/commons-daemon-1.0.13.ja
> >r:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/hdfs/lib/jackson
> >-mapper-asl-1.9.13.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/ha
> >doop/hdfs/lib/jersey-server-1.9.jar:/home/santosh/work/frameworks/hadoop-2
> >.6.0/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/home/santosh/work/framew
> >orks/hadoop-2.6.0/share/hadoop/hdfs/lib/commons-lang-2.6.jar:/home/santosh
> >/work/frameworks/hadoop-2.6.0/share/hadoop/hdfs/lib/xml-apis-1.3.04.jar:/h
> >ome/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/hdfs/lib/xercesImpl-
> >2.9.1.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/hdfs/lib
> >/commons-logging-1.1.3.jar:/home/santosh/work/frameworks/hadoop-2.6.0/shar
> >e/hadoop/hdfs/lib/jsp-api-2.1.jar:/home/santosh/work/frameworks/hadoop-2.6
> >.0/share/hadoop/hdfs/lib/guava-11.0.2.jar:/home/santosh/work/frameworks/ha
> >doop-2.6.0/share/hadoop/hdfs/lib/asm-3.2.jar:/home/santosh/work/frameworks
> >/hadoop-2.6.0/share/hadoop/hdfs/lib/htrace-core-3.0.4.jar:/home/santosh/wo
> >rk/frameworks/hadoop-2.6.0/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/home/sa
> >ntosh/work/frameworks/hadoop-2.6.0/share/hadoop/hdfs/lib/commons-codec-1.4
> >.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/hdfs/lib/jasp
> >er-runtime-5.5.23.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/had
> >oop/hdfs/hadoop-hdfs-nfs-2.6.0.jar:/home/santosh/work/frameworks/hadoop-2.
> >6.0/share/hadoop/hdfs/hadoop-hdfs-2.6.0-tests.jar:/home/santosh/work/frame
> >works/hadoop-2.6.0/share/hadoop/hdfs/hadoop-hdfs-2.6.0.jar:/home/santosh/w
> >ork/frameworks/hadoop-2.6.0/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar:/home
> >/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/yarn/lib/commons-io-2.4
> >.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/yarn/lib/jack
> >son-jaxrs-1.9.13.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hado
> >op/yarn/lib/jline-0.9.94.jar:/home/santosh/work/frameworks/hadoop-2.6.0/sh
> >are/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/home/santosh/work/frameworks/
> >hadoop-2.6.0/share/hadoop/yarn/lib/log4j-1.2.17.jar:/home/santosh/work/fra
> >meworks/hadoop-2.6.0/share/hadoop/yarn/lib/jsr305-1.3.9.jar:/home/santosh/
> >work/frameworks/hadoop-2.6.0/share/hadoop/yarn/lib/servlet-api-2.5.jar:/ho
> >me/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/yarn/lib/javax.inject
> >-1.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/yarn/lib/ja
> >ckson-core-asl-1.9.13.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share
> >/hadoop/yarn/lib/activation-1.1.jar:/home/santosh/work/frameworks/hadoop-2
> >.6.0/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar:/home/santosh/work/framew
> >orks/hadoop-2.6.0/share/hadoop/yarn/lib/xz-1.0.jar:/home/santosh/work/fram
> >eworks/hadoop-2.6.0/share/hadoop/yarn/lib/commons-cli-1.2.jar:/home/santos
> >h/work/frameworks/hadoop-2.6.0/share/hadoop/yarn/lib/jettison-1.1.jar:/hom
> >e/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/yarn/lib/jersey-client
> >-1.9.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/yarn/lib/
> >commons-compress-1.4.1.jar:/home/santosh/work/frameworks/hadoop-2.6.0/shar
> >e/hadoop/yarn/lib/guice-servlet-3.0.jar:/home/santosh/work/frameworks/hado
> >op-2.6.0/share/hadoop/yarn/lib/jetty-util-6.1.26.jar:/home/santosh/work/fr
> >ameworks/hadoop-2.6.0/share/hadoop/yarn/lib/jackson-xc-1.9.13.jar:/home/sa
> >ntosh/work/frameworks/hadoop-2.6.0/share/hadoop/yarn/lib/jersey-core-1.9.j
> >ar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/yarn/lib/jackso
> >n-mapper-asl-1.9.13.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/h
> >adoop/yarn/lib/leveldbjni-all-1.8.jar:/home/santosh/work/frameworks/hadoop
> >-2.6.0/share/hadoop/yarn/lib/jersey-server-1.9.jar:/home/santosh/work/fram
> >eworks/hadoop-2.6.0/share/hadoop/yarn/lib/aopalliance-1.0.jar:/home/santos
> >h/work/frameworks/hadoop-2.6.0/share/hadoop/yarn/lib/zookeeper-3.4.6.jar:/
> >home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/yarn/lib/netty-3.6.
> >2.Final.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/yarn/l
> >ib/commons-collections-3.2.1.jar:/home/santosh/work/frameworks/hadoop-2.6.
> >0/share/hadoop/yarn/lib/commons-lang-2.6.jar:/home/santosh/work/frameworks
> >/hadoop-2.6.0/share/hadoop/yarn/lib/commons-logging-1.1.3.jar:/home/santos
> >h/work/frameworks/hadoop-2.6.0/share/hadoop/yarn/lib/guava-11.0.2.jar:/hom
> >e/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/yarn/lib/asm-3.2.jar:/
> >home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/yarn/lib/jetty-6.1.
> >26.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/yarn/lib/je
> >rsey-json-1.9.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/
> >yarn/lib/guice-3.0.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/ha
> >doop/yarn/lib/commons-httpclient-3.1.jar:/home/santosh/work/frameworks/had
> >oop-2.6.0/share/hadoop/yarn/lib/stax-api-1.0-2.jar:/home/santosh/work/fram
> >eworks/hadoop-2.6.0/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/home/santo
> >sh/work/frameworks/hadoop-2.6.0/share/hadoop/yarn/lib/commons-codec-1.4.ja
> >r:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/yarn/hadoop-yarn
> >-server-nodemanager-2.6.0.jar:/home/santosh/work/frameworks/hadoop-2.6.0/s
> >hare/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-2.6.0.jar:/h
> >ome/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/yarn/hadoop-yarn-cli
> >ent-2.6.0.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/yarn
> >/hadoop-yarn-server-web-proxy-2.6.0.jar:/home/santosh/work/frameworks/hado
> >op-2.6.0/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-
> >2.6.0.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/yarn/had
> >oop-yarn-server-tests-2.6.0.jar:/home/santosh/work/frameworks/hadoop-2.6.0
> >/share/hadoop/yarn/hadoop-yarn-server-common-2.6.0.jar:/home/santosh/work/
> >frameworks/hadoop-2.6.0/share/hadoop/yarn/hadoop-yarn-api-2.6.0.jar:/home/
> >santosh/work/frameworks/hadoop-2.6.0/share/hadoop/yarn/hadoop-yarn-server-
> >resourcemanager-2.6.0.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share
> >/hadoop/yarn/hadoop-yarn-common-2.6.0.jar:/home/santosh/work/frameworks/ha
> >doop-2.6.0/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.6
> >.0.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/yarn/hadoop
> >-yarn-registry-2.6.0.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/
> >hadoop/mapreduce/lib/commons-io-2.4.jar:/home/santosh/work/frameworks/hado
> >op-2.6.0/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/home/santosh/
> >work/frameworks/hadoop-2.6.0/share/hadoop/mapreduce/lib/snappy-java-1.0.4.
> >1.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/mapreduce/li
> >b/paranamer-2.3.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoo
> >p/mapreduce/lib/log4j-1.2.17.jar:/home/santosh/work/frameworks/hadoop-2.6.
> >0/share/hadoop/mapreduce/lib/javax.inject-1.jar:/home/santosh/work/framewo
> >rks/hadoop-2.6.0/share/hadoop/mapreduce/lib/jackson-core-asl-1.9.13.jar:/h
> >ome/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/mapreduce/lib/xz-1.0
> >.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/mapreduce/lib
> >/avro-1.7.4.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/ma
> >preduce/lib/hadoop-annotations-2.6.0.jar:/home/santosh/work/frameworks/had
> >oop-2.6.0/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar:/home/santosh/w
> >ork/frameworks/hadoop-2.6.0/share/hadoop/mapreduce/lib/commons-compress-1.
> >4.1.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/mapreduce/
> >lib/guice-servlet-3.0.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share
> >/hadoop/mapreduce/lib/jersey-core-1.9.jar:/home/santosh/work/frameworks/ha
> >doop-2.6.0/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/home/
> >santosh/work/frameworks/hadoop-2.6.0/share/hadoop/mapreduce/lib/leveldbjni
> >-all-1.8.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/mapre
> >duce/lib/jersey-server-1.9.jar:/home/santosh/work/frameworks/hadoop-2.6.0/
> >share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/home/santosh/work/framewor
> >ks/hadoop-2.6.0/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/home/san
> >tosh/work/frameworks/hadoop-2.6.0/share/hadoop/mapreduce/lib/asm-3.2.jar:/
> >home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/mapreduce/lib/junit
> >-4.11.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/mapreduc
> >e/lib/guice-3.0.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoo
> >p/mapreduce/lib/jersey-guice-1.9.jar:/home/santosh/work/frameworks/hadoop-
> >2.6.0/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar:/home/san
> >tosh/work/frameworks/hadoop-2.6.0/share/hadoop/mapreduce/hadoop-mapreduce-
> >client-jobclient-2.6.0.jar:/home/santosh/work/frameworks/hadoop-2.6.0/shar
> >e/hadoop/mapreduce/hadoop-mapreduce-client-common-2.6.0.jar:/home/santosh/
> >work/frameworks/hadoop-2.6.0/share/hadoop/mapreduce/hadoop-mapreduce-clien
> >t-core-2.6.0.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/m
> >apreduce/hadoop-mapreduce-client-shuffle-2.6.0.jar:/home/santosh/work/fram
> >eworks/hadoop-2.6.0/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.6
> >.0.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/mapreduce/h
> >adoop-mapreduce-client-jobclient-2.6.0-tests.jar:/home/santosh/work/framew
> >orks/hadoop-2.6.0/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugin
> >s-2.6.0.jar:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/mapred
> >uce/hadoop-mapreduce-client-hs-2.6.0.jar:/contrib/capacity-scheduler/*.jar
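> >
> >[Editor's note: a minimal sketch, not part of the original thread. To
> >confirm which jars win on a classpath like the one above, print where the
> >JVM actually loads the MR client classes from; if HBase's lib directory
> >comes first, the output will point at the bundled hadoop-*-2.2.0.jar files.]
> >
> >public class WhichJar {
> >    public static void main(String[] args) throws Exception {
> >        String[] names = {
> >            "org.apache.hadoop.mapreduce.JobCounter",
> >            "org.apache.hadoop.mapred.YARNRunner"
> >        };
> >        for (String name : names) {
> >            Class<?> c = Class.forName(name);
> >            // CodeSource is non-null for classes loaded from jars.
> >            System.out.println(name + " -> "
> >                + c.getProtectionDomain().getCodeSource().getLocation());
> >        }
> >    }
> >}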
> >
> >
> >On Sat, Feb 28, 2015 at 9:06 AM, Shi, Shaofeng <shaoshi@ebay.com> wrote:
> >
> >> I don't think downgrading HBase can fix that; the jars bundled with HBase
> >> are the older ones. I suggest checking mapred-site.xml: find the property
> >> mapreduce.application.classpath, which is the classpath loaded in MR, and
> >> check whether the HBase folder was put ahead of the Hadoop folders.
> >>
> >> On 2/28/15, 11:25 AM, "Santosh Akhilesh" <santoshakhilesh@gmail.com>
> >> wrote:
> >>
> >> >The only difference I find in my setup is HBase: mine is 0.98.10 while
> >> >Kylin builds against 0.98.4. I will try downgrading my HBase, though I
> >> >really doubt that this will solve the problem. But since there is no
> >> >alternative option in sight, I will give it a try anyway.
> >> >
> >> >Sent from Outlook on iPhone
> >> >
> >> >On Fri, Feb 27, 2015 at 7:00 PM -0800 "Shi, Shaofeng" <shaoshi@ebay.com> wrote:
> >> >
> >> >Hmm… please use client jars of the same level; in Kylin's pom.xml, it
> >> >compiles with 2.6.0 jars:
> >> >https://github.com/KylinOLAP/Kylin/blob/master/pom.xml#L19
> >> >
> >> >
> >> >On 2/27/15, 8:53 PM, "Santoshakhilesh"  wrote:
> >> >
> >> >>Hi Shaofeng,
> >> >>    I checked the HBase libs; I am using hbase-0.98.10-hadoop2 and it
> >> >>ships hadoop-mapreduce-client-app-2.2.0.jar,
> >> >>but Hadoop itself is 2.6.0.
> >> >>
> >> >>Is this the issue?
> >> >>
> >> >>I checked the Kylin POM; it uses 0.98.4-hadoop2.
> >> >>
> >> >>Is this problem due to this mismatch? Do you suggest I try changing my
> >> >>HBase version?
> >> >>
> >> >>Regards,
> >> >>Santosh Akhilesh
> >> >>Bangalore R&D
> >> >>HUAWEI TECHNOLOGIES CO.,LTD.
> >> >>
> >> >>www.huawei.com
> >>
> >>>>-----------------------------------------------------------------------
> >>>>--
> >> >>-
> >> >>-----------------------------------------------------------
> >> >>This e-mail and its attachments contain confidential information from
> >> >>HUAWEI, which
> >> >>is intended only for the person or entity whose address is listed
> >>above.
> >> >>Any use of the
> >> >>information contained herein in any way (including, but not limited
> >>to,
> >> >>total or partial
> >> >>disclosure, reproduction, or dissemination) by persons other than the
> >> >>intended
> >> >>recipient(s) is prohibited. If you receive this e-mail in error,
> >>please
> >> >>notifythe sender by
> >> >>phone or email immediately and delete it!
> >> >>
> >> >>________________________________________
> >> >>From: Santoshakhilesh [santosh.akhilesh@huawei.com]
> >> >>Sent: Friday, February 27, 2015 4:49 PM
> >> >>To: dev@kylin.incubator.apache.org
> >> >>Cc: Kulbhushan Rana
> >> >>Subject: RE: Cube Build Failed at Last Step//RE: Error while making
> >>cube
> >> >>& Measure option is not responding on GUI
> >> >>
> >> >>Hi Shaofeng,
> >> >>    I configured the job history server and no longer get the connection
> >> >>exception; now I get the MR counter exception which we were suspecting.
> >> >>    My Hadoop version is indeed 2.6.0, so any idea what can be done
> >> >>about this?
> >> >>
> >> >>[QuartzScheduler_Worker-8]:[2015-02-28 00:36:26,507][DEBUG][com.kylinolap.job.tools.HadoopStatusChecker.checkStatus(HadoopStatusChecker.java:74)] - State of Hadoop job: job_1424957178195_0031:FINISHED-SUCCEEDED
> >> >>[QuartzScheduler_Worker-8]:[2015-02-28 00:36:27,204][ERROR][com.kylinolap.job.cmd.JavaHadoopCmdOutput.updateJobCounter(JavaHadoopCmdOutput.java:176)] - No enum constant org.apache.hadoop.mapreduce.JobCounter.MB_MILLIS_REDUCES
> >> >>java.lang.IllegalArgumentException: No enum constant org.apache.hadoop.mapreduce.JobCounter.MB_MILLIS_REDUCES
> >> >> at java.lang.Enum.valueOf(Enum.java:236)
> >> >> at org.apache.hadoop.mapreduce.counters.FrameworkCounterGroup.valueOf(FrameworkCounterGroup.java:148)
> >> >> at org.apache.hadoop.mapreduce.counters.FrameworkCounterGroup.findCounter(FrameworkCounterGroup.java:182)
> >> >> at org.apache.hadoop.mapreduce.counters.AbstractCounters.findCounter(AbstractCounters.java:154)
> >> >> at org.apache.hadoop.mapreduce.TypeConverter.fromYarn(TypeConverter.java:240)
> >> >> at org.apache.hadoop.mapred.ClientServiceDelegate.getJobCounters(ClientServiceDelegate.java:370)
> >> >> at org.apache.hadoop.mapred.YARNRunner.getJobCounters(YARNRunner.java:511)
> >> >> at org.apache.hadoop.mapreduce.Job$7.run(Job.java:756)
> >> >> at org.apache.hadoop.mapreduce.Job$7.run(Job.java:753)
> >> >> at java.security.AccessController.doPrivileged(Native Method)
> >> >> at javax.security.auth.Subject.doAs(Subject.java:415)
> >> >> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
> >> >> at org.apache.hadoop.mapreduce.Job.getCounters(Job.java:753)
> >> >> at com.kylinolap.job.hadoop.AbstractHadoopJob.getCounters(AbstractHadoopJob.java:287)
> >> >> at com.kylinolap.job.cmd.JavaHadoopCmdOutput.updateJobCounter(JavaHadoopCmdOutput.java:162)
> >> >> at com.kylinolap.job.cmd.JavaHadoopCmdOutput.getStatus(JavaHadoopCmdOutput.java:85)
> >> >> at com.kylinolap.job.flow.AsyncJobFlowNode.execute(AsyncJobFlowNode.java:86)
> >> >> at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
> >> >> at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)
> >> >>
> >> >>Regards,
> >> >>Santosh Akhilesh
> >> >>Bangalore R&D
> >> >>HUAWEI TECHNOLOGIES CO.,LTD.
> >> >>
> >> >>www.huawei.com
> >>
> >>>>-----------------------------------------------------------------------
> >>>>--
> >> >>-
> >> >>-----------------------------------------------------------
> >> >>This e-mail and its attachments contain confidential information from
> >> >>HUAWEI, which
> >> >>is intended only for the person or entity whose address is listed
> >>above.
> >> >>Anyuse of the
> >> >>information contained herein in any way (including, but not limited
> >>to,
> >> >total or partial
> >> >>disclosure, reproduction, or dissemination) by persons other thanthe
> >> >>intended
> >> >>recipient(s) is prohibited. If you receive this e-mail in error,
> >>please
> >> >>notify th sender by
> >> >>phone or email immediately and delete it!
> >> >>
> >> >>________________________________________
> >> >>From: Shi, Shaofeng [shaoshi@ebay.com]
> >> >>Sent: Friday, February 27, 2015 3:10 PM
> >> >>To: dev@kylin.incubator.apache.org
> >> >>Subject: Re: Cube Build Failed at Last Step//RE: Error while making
> >>cube
> >> >>& Measure option is not responding on GUI
> >> >>
> >> >>0.0.0.0:10020 isn’t a valid network address I think; please check the
> >> >>“mapreduce.jobhistory.address” in your mapred-site.xml; it should be
> >> >>something like:
> >> >>
> >> >><property>
> >> >>  <name>mapreduce.jobhistory.address</name>
> >> >>  <value>sandbox.hortonworks.com:10020</value>
> >> >></property>
> >> >>
> >> >>On 2/27/15, 5:29 PM, "Santoshakhilesh" wrote:
> >> >>
> >> >>>Hi Shaofeng,
> >> >>>   No, I have not found the MR counter exception. I get the following
> >> >>>exception frequently; I think it is related to the Hadoop job history
> >> >>>server.
> >> >>>[QuartzScheduler_Worker-23]:[2015-02-27 22:18:37,299][ERROR][com.kylinolap.job.cmd.JavaHadoopCmdOutput.updateJobCounter(JavaHadoopCmdOutput.java:176)] - java.io.IOException: java.net.ConnectException: Call From linux/10.19.93.68 to 0.0.0.0:10020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
> >> >>>com.kylinolap.job.exception.JobException: java.io.IOException: java.net.ConnectException: Call From linux/10.19.93.68 to 0.0.0.0:10020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
> >> >>> at com.kylinolap.job.hadoop.AbstractHadoopJob.getCounters(AbstractHadoopJob.java:289)
> >> >>> at com.kylinolap.job.cmd.JavaHadoopCmdOutput.updateJobCounter(JavaHadoopCmdOutput.java:162)
> >> >>> at com.kylinolap.job.cmd.JavaHadoopCmdOutput.getStatus(JavaHadoopCmdOutput.java:85)
> >> >>> at com.kylinolap.job.flow.AsyncJobFlowNode.execute(AsyncJobFlowNode.java:86)
> >> >>> at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
> >> >>> at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)
> >> >>>Caused by: java.io.IOException: java.net.ConnectException: Call From linux/10.19.93.68 to 0.0.0.0:10020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
> >> >>>
> >> >>>Regards,
> >> >>>Santosh Akhilesh
> >> >>>Bangalore R&D
> >> >>>HUAWEI TECHNOLOGIES CO.,LTD.
> >> >>>
> >> >>>www.huawei.com
> >>
> >>>>>----------------------------------------------------------------------
> >>>>>--
> >> >>>-
> >> >>>-
> >> >>>-----------------------------------------------------------
> >> >>>This e-mail and its attachments contain confidential information from
> >> >>>HUAWEI, which
> >> >>>is intended only for the person or entity whose address is listed
> >>above.
> >> >>>Any use of the
> >> >>>information contained herein in any way (including, but not limited
> >>to,
> >> >>>total or partial
> >> >>>disclosure, reproduction, or dissemination) by persons other than the
> >> >>>intended
> >> >>>recipient(s) is prohibited. If you receive this e-mail in error,
> >>please
> >> >>>notify the sender by
> >> >>>phone or email immediately and delete it!
> >> >>>
> >> >>>________________________________________
> >> >>>From: Shi, Shaofeng [shaoshi@ebay.com]
> >> >>>Sent: Friday, February 27, 2015 2:47 PM
> >> >>>To: dev@kylin.incubator.apache.org
> >> >>>Cc: Kulbhushan Rana
> >> >>>Subject: Re: Cube Build Failed at Last Step//RE: Error while making
> >>cube
> >> >>>& Measure option is not responding on GUI
> >> >>>
> >> >>>Did you figure out the exception "No enum constant
> >> >>>org.apache.hadoop.mapreduce.JobCounter.MB_MILLIS_REDUCES"? Is it still
> >> >>>being thrown in the logs? In the last step, Kylin needs to parse the MR
> >> >>>counters to update the cube size; please refer to
> >> >>>https://issues.apache.org/jira/browse/MAPREDUCE-5831 for that error.
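> >> >>>
> >> >>>[Editor's note: a minimal sketch, not part of the original thread, of
> >> >>>why the old client jar fails here. JobCounter in the Hadoop 2.2.0 jars
> >> >>>has no MB_MILLIS_REDUCES constant (it was added after 2.2.0; see
> >> >>>MAPREDUCE-5831), so Enum.valueOf throws when a 2.6.0 cluster reports
> >> >>>that counter. A tolerant lookup, assuming only the stock JobCounter
> >> >>>enum, could look like this:]
> >> >>>
> >> >>>import org.apache.hadoop.mapreduce.JobCounter;
> >> >>>
> >> >>>public class SafeCounterLookup {
> >> >>>    // Returns null instead of throwing when the server reports a
> >> >>>    // counter name that this client's jar does not know about.
> >> >>>    static JobCounter tryValueOf(String name) {
> >> >>>        try {
> >> >>>            return JobCounter.valueOf(name);
> >> >>>        } catch (IllegalArgumentException e) {
> >> >>>            return null; // unknown to this jar version; skip it
> >> >>>        }
> >> >>>    }
> >> >>>
> >> >>>    public static void main(String[] args) {
> >> >>>        // Prints the constant on a 2.6.0 classpath, null on 2.2.0.
> >> >>>        System.out.println(tryValueOf("MB_MILLIS_REDUCES"));
> >> >>>    }
> >> >>>}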
> >> >>>
> >> >>>On 2/27/15, 5:04 PM, "Santoshakhilesh" wrote:
> >> >>>
> >> >>>>Hi Shaofeng,
> >> >>>>          Cube building failed at the last step, while loading the
> >> >>>>HFile into HBase, with the exception "Can't get cube segment size.".
> >> >>>>What could be the reason?
> >> >>>>
> >> >>>>parameter : -input
> >> >>>>/tmp/kylin-17a4606f-905b-4ea1-922a-27c2bfb5c68b/RetailCube/hfile/
> >> >>>>-htablename KYLIN_K27LDMX63W -cubename RetailCube
> >> >>>>
> >> >>>>Log:
> >> >>>>
> >> >>>>Start to execute command:
> >> >>>> -input /tmp/kylin-17a4606f-905b-4ea1-922a-27c2bfb5c68b/RetailCube/hfile/ -htablename KYLIN_K27LDMX63W -cubename RetailCube
> >> >>>>Command execute return code 0
> >> >>>>Failed with Exception:java.lang.RuntimeException: Can't get cube segment size.
> >> >>>> at com.kylinolap.job.flow.JobFlowListener.updateCubeSegmentInfoOnSucceed(JobFlowListener.java:247)
> >> >>>> at com.kylinolap.job.flow.JobFlowListener.jobWasExecuted(JobFlowListener.java:101)
> >> >>>> at org.quartz.core.QuartzScheduler.notifyJobListenersWasExecuted(QuartzScheduler.java:1985)
> >> >>>> at org.quartz.core.JobRunShell.notifyJobListenersComplete(JobRunShell.java:340)
> >> >>>> at org.quartz.core.JobRunShell.run(JobRunShell.java:224)
> >> >>>> at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)
> >> >>>>
> >> >>>>I have checked in the HBase shell, and the following tables exist in HBase:
> >> >>>>hbase(main):001:0> list
> >> >>>>TABLE
> >> >>>>
> >> >>>>KYLIN_K27LDMX63W
> >> >>>>kylin_metadata_qa
> >> >>>>kylin_metadata_qa_acl
> >> >>>>kylin_metadata_qa_cube
> >> >>>>kylin_metadata_qa_dict
> >> >>>>kylin_metadata_qa_invertedindex
> >> >>>>kylin_metadata_qa_job
> >> >>>>kylin_metadata_qa_job_output
> >> >>>>kylin_metadata_qa_proj
> >> >>>>kylin_metadata_qa_table_snapshot
> >> >>>>kylin_metadata_qa_user
> >> >>>>11 row(s) in 0.8990 seconds
> >> >>>>
> >> >>>>
> >> >>>>Regards,
> >> >>>>Santosh Akhilesh
> >> >>>>Bangalore R&D
> >> >>>>HUAWEI TECHNOLOGIES CO.,LTD.
> >> >>>>
> >> >>>>www.huawei.com
> >>
> >>>>>>---------------------------------------------------------------------
> >>>>>>--
> >> >>>>-
> >> >>>>-
> >> >>>>-
> >> >>>>-----------------------------------------------------------
> >> >>>>This e-mail and its attachments contain confidential information
> >>from
> >> >>>>HUAWEI, which
> >> >>>>is intended only for the person or entity whose address is listed
> >> >>>>above.
> >> >>>>Any use of the
> >> >>>>information contained herein in any way (including, but not limited
> >>to,
> >> >>>>total or partial
> >> >>>>disclosure, reproduction, or dissemination) by persons other than
> >>the
> >> >>>>intended
> >> >>>>recipient(s) is prohibited. If you receive this e-mail in error,
> >>please
> >> >>>>notify the sender by
> >> >>>>phone or email immediately and delete it!
> >> >>>>
> >> >>>>________________________________________
> >> >>>>From: Santoshakhilesh
> >> >>>>Sent: Friday, February 27, 2015 2:15 PM
> >> >>>>To: dev@kylin.incubator.apache.org
> >> >>>>Subject: RE: Error while making cube & Measure option is not
> >>responding
> >> >>>>on GUI
> >> >>>>
> >> >>>>I have manually copied the jar to /tmp/kylin; now stage 2 is done,
> >> >>>>thanks.
> >> >>>>
> >> >>>>Regards,
> >> >>>>Santosh Akhilesh
> >> >>>>Bangalore R&D
> >> >>>>HUAWEI TECHNOLOGIES CO.,LTD.
> >> >>>>
> >> >>>>www.huawei.com
> >>
> >>>>>>---------------------------------------------------------------------
> >>>>>>--
> >> >>>>-
> >> >>>>-
> >> >>>>-
> >> >>>>-----------------------------------------------------------
> >> >>>>This e-mail and its attachments contain confidential information
> >>from
> >> >>>>HUAWEI, which
> >> >>>>is intended only for the person or entity whose address is listed
> >> >>>>above.
> >> >>>>Any use of the
> >> >>>>information contained herein in any way (including, but not limited
> >>to,
> >> >>>>total or partial
> >> >>>>disclosure, reproduction, or dissemination) by persons other than
> >>the
> >> >>>>intended
> >> >>>>recipient(s) is prohibited. If you receive this e-mail in error,
> >>please
> >> >>>>notify the sender by
> >> >>>>phone or email immediately and delete it!
> >> >>>>
> >> >>>>________________________________________
> >> >>>>From: Shi, Shaofeng [shaoshi@ebay.com]
> >> >>>>Sent: Friday, February 27, 2015 1:00 PM
> >> >>>>To: dev@kylin.incubator.apache.org
> >> >>>>Cc: Kulbhushan Rana
> >> >>>>Subject: Re: Error while making cube & Measure option is not
> >>responding
> >> >>>>on GUI
> >> >>>>
> >> >>>>In 0.6.x the packages are named “com.kylinolap.xxx”; from 0.7 we
> >> >>>>renamed the packages to “org.apache.kylin.xxx”. When you downgraded to
> >> >>>>0.6,
> >> >>>>did you also replace the jar location with the 0.6 one in
> >> >>>>kylin.properties?
> >> >>>>
> >> >>>>On 2/27/15, 3:13 PM, "Santoshakhilesh" wrote:
> >> >>>>
> >> >>>>>Hi Shaofeng,
> >> >>>>>         I have added my fact and dimension tables under the default
> >> >>>>>database of Hive.
> >> >>>>>         Now stage 1 of the cube build is OK, but there is a failure
> >> >>>>>at step 2: the MapReduce job for finding the distinct columns of the
> >> >>>>>fact table fails. The YARN log is below.
> >> >>>>>        Strangely, this is a class-not-found error, even though I
> >> >>>>>have checked kylin.properties and the jar is already set as below.
> >> >>>>>kylin.log has one exception connecting from linux/10.19.93.68 to
> >> >>>>>0.0.0.0:10020.
> >> >>>>> Please help me with a clue; I am also trying to investigate meanwhile.
> >> >>>>>
> >> >>>>>Thanks.
> >> >>>>>kylin.properties:
> >> >>>>># Temp folder in hdfs
> >> >>>>>kylin.hdfs.working.dir=/tmp
> >> >>>>># Path to the local (relative to job engine) job jar; the job engine will use this jar
> >> >>>>>kylin.job.jar=/tmp/kylin/kylin-job-latest.jar
> >> >>>>>
> >> >>>>>Map Reduce error
> >> >>>>>----------------------------
> >> >>>>>2015-02-27 20:24:25,262 FATAL [main] org.apache.hadoop.mapred.YarnChild: Error running child : java.lang.NoClassDefFoundError: com/kylinolap/common/mr/KylinMapper
> >> >>>>> at java.lang.ClassLoader.defineClass1(Native Method)
> >> >>>>> at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
> >> >>>>> at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
> >> >>>>> at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
> >> >>>>> at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
> >> >>>>> at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
> >> >>>>> at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> >> >>>>> at java.security.AccessController.doPrivileged(Native Method)
> >> >>>>> at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> >> >>>>> at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
> >> >>>>> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
> >> >>>>> at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
> >> >>>>> at java.lang.Class.forName0(Native Method)
> >> >>>>> at java.lang.Class.forName(Class.java:274)
> >> >>>>> at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:2013)
> >> >>>>>
> >> >>>>>Kylin.log
> >> >>>>>[QuartzScheduler_Worker-20]:[2015-02-27 20:25:00,663][DEBUG][com.kylinolap.job.engine.JobFetcher.execute(JobFetcher.java:60)] - 0 pending jobs
> >> >>>>>[QuartzScheduler_Worker-19]:[2015-02-27 20:25:01,730][ERROR][com.kylinolap.job.cmd.JavaHadoopCmdOutput.updateJobCounter(JavaHadoopCmdOutput.java:176)] - java.io.IOException: java.net.ConnectException: Call From linux/10.19.93.68 to 0.0.0.0:10020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
> >> >>>>>com.kylinolap.job.exception.JobException: java.io.IOException: java.net.ConnectException: Call From linux/10.19.93.68 to 0.0.0.0:10020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
> >> >>>>> at com.kylinolap.job.hadoop.AbstractHadoopJob.getCounters(AbstractHadoopJob.java:289)
> >> >>>>> at com.kylinolap.job.cmd.JavaHadoopCmdOutput.updateJobCounter(JavaHadoopCmdOutput.java:162)
> >> >>>>> at com.kylinolap.job.cmd.JavaHadoopCmdOutput.getStatus(JavaHadoopCmdOutput.java:85)
> >> >>>>> at com.kylinolap.job.flow.AsyncJobFlowNode.execute(AsyncJobFlowNode.java:86)
> >> >>>>> at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
> >> >>>>> at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)
> >> >>>>>Caused by: java.io.IOException: java.net.ConnectException: Call From linux/10.19.93.68 to 0.0.0.0:10020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
> >> >>>>> at org.apache.hadoop.mapred.ClientServiceDelegate.invoke(ClientServiceDelegate.java:331)
> >> >>>>> at org.apache.hadoop.mapred.ClientServiceDelegate.getJobCounters(ClientServiceDelegate.java:368)
> >> >>>>> at org.apache.hadoop.mapred.YARNRunner.getJobCounters(YARNRunner.java:511)
> >> >>>>> at org.apache.hadoop.mapreduce.Job$7.run(Job.java:756)
> >> >>>>> at org.apache.hadoop.mapreduce.Job$7.run(Job.java:753)
> >> >>>>> at java.security.AccessController.doPrivileged(Native Method)
> >> >>>>> at javax.security.auth.Subject.doAs(Subject.java:415)
> >> >>>>> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
> >> >>>>> at org.apache.hadoop.mapreduce.Job.getCounters(Job.java:753)
> >> >>>>> at com.kylinolap.job.hadoop.AbstractHadoopJob.getCounters(AbstractHadoopJob.java:287)
> >> >>>>> ... 5 more
> >> >>>>>
> >> >>>>>Regards,
> >> >>>>>Santosh Akhilesh
> >> >>>>>Bangalore R&D
> >> >>>>>HUAWEI TECHNOLOGIES CO.,LTD.
> >> >>>>>
> >> >>>>>www.huawei.com
> >>
> >>>>>>>--------------------------------------------------------------------
> >>>>>>>--
> >> >>>>>-
> >> >>>>>-
> >> >>>>>-
> >> >>>>>-
> >> >>>>>-----------------------------------------------------------
> >> >>>>>This e-mail and its attachments contain confidential information
> >>from
> >> >>>>>HUAWEI, which
> >> >>>>>is intended only for the person or entity whose address is listed
> >> >>>>>above.
> >> >>>>>Any use of the
> >> >>>>>information contained herein in any way (including, but not limited
> >> >>>>>to,
> >> >>>>>total or partial
> >> >>>>>disclosure, reproduction, or dissemination) by persons other than
> >>the
> >> >>>>>intended
> >> >>>>>recipient(s) is prohibited. If you receive this e-mail in error,
> >> >>>>>please
> >> >>>>>notify the sender by
> >> >>>>>phone or email immediately and delete it!
> >> >>>>>
> >> >>>>>________________________________________
> >> >>>>>From: Shi, Shaofeng [shaoshi@ebay.com]
> >> >>>>>Sent: Friday, February 27, 2015 8:01 AM
> >> >>>>>To: dev@kylin.incubator.apache.org
> >> >>>>>Subject: Re: Error while making cube & Measure option is not
> >> >>>>>responding
> >> >>>>>on GUI
> >> >>>>>
> >> >>>>>In 0.6.x only tables in the default database are supported; this is
> >> >>>>>a limitation. Support for non-default databases will be released in
> >> >>>>>0.7.
> >> >>>>>
> >> >>>>>To bypass this issue for now, please copy the tables to the default
> >> >>>>>database as a workaround.
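> >> >>>>>
> >> >>>>>[Editor's note: for example, something like "CREATE TABLE
> >> >>>>>default.DIM_ITEM AS SELECT * FROM retail.DIM_ITEM" in Hive should
> >> >>>>>produce a copy that 0.6.x can see; DIM_ITEM is the table named in the
> >> >>>>>error quoted later in this thread.]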
> >> >>>>>
> >> >>>>>On 2/27/15, 10:16 AM, "Santosh Akhilesh" wrote:
> >> >>>>>
> >> >>>>>>@Jason
> >> >>>>>>thanks, but as suggested by Shaofeng I am no longer using the
> >> >>>>>>inverted-index branch, as it is not stable.
> >> >>>>>>I have switched back to the 0.6 branch; on this branch last night I
> >> >>>>>>could create the cube successfully, but there is an issue while
> >> >>>>>>building it. I believe that at step 1 of the cube build, when the
> >> >>>>>>flat-table command is issued to Hive, creation fails if the tables
> >> >>>>>>are not under the default database, and so the cube build fails. My
> >> >>>>>>fact and dimension tables are under a database called retail.
> >> >>>>>>
> >> >>>>>>@Shaofeng - can you please confirm this behavior? Do I need to
> >> >>>>>>create the Hive tables under the default database?
> >> >>>>>>
> >> >>>>>>> On Fri, Feb 27, 2015 at 7:32 AM, jason zhong wrote:
> >> >>>>>>
> >> >>>>>>> @Santoshakhilesh
> >> >>>>>>>
> >> >>>>>>> 1. When I go to the measure section and click on the measure
> >> >>>>>>> option, there is no response; I want to add sum measures on qty
> >> >>>>>>> and price
> >> >>>>>>>          --bug fixed on the inverted-index branch
> >> >>>>>>>
> >> >>>>>>>
> >> >>>>>>> On Fri, Feb 27, 2015 at 3:03 AM, Santosh Akhilesh <santoshakhilesh@gmail.com> wrote:
> >> >>>>>>>
> >> >>>>>>> > Hi Shaofeng,
> >> >>>>>>> >      I have built the 0.6 version and am now able to create the
> >> >>>>>>> > cube successfully.
> >> >>>>>>> >      While building the cube, it fails at step 1 with the
> >> >>>>>>> > following error: Table not found 'DIM_ITEM'.
> >> >>>>>>> >      The table exists, but it is under the retail database and
> >> >>>>>>> > not under the default database.
> >> >>>>>>> >      Does Kylin require Hive tables to be under the default
> >> >>>>>>> > database? I see the flat table being created under the default
> >> >>>>>>> > database.
> >> >>>>>>> >
> >> >>>>>>> > Logging initialized using configuration in
> >> >>>>>>> > jar:file:/home/santosh/work/frameworks/apache-hive-1.0.0/lib/hive-common-1.0.0.jar!/hive-log4j.properties
> >> >>>>>>> > SLF4J: Class path contains multiple SLF4J bindings.
> >> >>>>>>> > SLF4J: Found binding in
> >> >>>>>>> > [jar:file:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >> >>>>>>> > SLF4J: Found binding in
> >> >>>>>>> > [jar:file:/home/santosh/work/frameworks/apache-hive-1.0.0/lib/hive-jdbc-1.0.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >> >>>>>>> > SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for
> >> >>>>>>> > an explanation.
> >> >>>>>>> > SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
> >> >>>>>>> > OK
> >> >>>>>>> > Time taken: 0.964 seconds
> >> >>>>>>> > OK
> >> >>>>>>> > Time taken: 0.948 seconds
> >> >>>>>>> > FAILED: SemanticException [Error 10001]: Line 12:11 Table not
> >> >>>>>>> > found 'DIM_ITEM'
> >> >>>>>>> >
> >> >>>>>>> >
> >> >>>>>>> >
> >> >>>>>>> > Command is as below.
> >> >>>>>>> >
> >> >>>>>>> > hive -e "DROP TABLE IF EXISTS
> >> >>>>>>> > kylin_intermediate_test_FULL_BUILD_8b30b29b_5f2c_4b63_8c0f_07d1f559dd44;
> >> >>>>>>> > CREATE EXTERNAL TABLE IF NOT EXISTS
> >> >>>>>>> > kylin_intermediate_test_FULL_BUILD_8b30b29b_5f2c_4b63_8c0f_07d1f559dd44
> >> >>>>>>> > (
> >> >>>>>>> > STOREID int
> >> >>>>>>> > ,ITEMID int
> >> >>>>>>> > ,CUSTID int
> >> >>>>>>> > ,QTY int
> >> >>>>>>> > ,AMOUNT double
> >> >>>>>>> > )
> >> >>>>>>> > ROW FORMAT DELIMITED FIELDS TERMINATED BY '\177'
> >> >>>>>>> > STORED AS SEQUENCEFILE
> >> >>>>>>> > LOCATION
> >> >>>>>>> > '/tmp/kylin-8b30b29b-5f2c-4b63-8c0f-07d1f559dd44/kylin_intermediate_test_FULL_BUILD_8b30b29b_5f2c_4b63_8c0f_07d1f559dd44';
> >> >>>>>>> > SET hive.exec.compress.output=true;
> >> >>>>>>> > SET hive.auto.convert.join.noconditionaltask = true;
> >> >>>>>>> > SET hive.auto.convert.join.noconditionaltask.size = 300000000;
> >> >>>>>>> > INSERT OVERWRITE TABLE
> >> >>>>>>> > kylin_intermediate_test_FULL_BUILD_8b30b29b_5f2c_4b63_8c0f_07d1f559dd44
> >> >>>>>>> > SELECT
> >> >>>>>>> > FACT_SALES.STOREID
> >> >>>>>>> > ,FACT_SALES.ITEMID
> >> >>>>>>> > ,FACT_SALES.CUSTID
> >> >>>>>>> > ,FACT_SALES.QTY
> >> >>>>>>> > ,FACT_SALES.AMOUNT
> >> >>>>>>> > FROM FACT_SALES
> >> >>>>>>> > INNER JOIN DIM_STORE
> >> >>>>>>> > ON FACT_SALES.STOREID = DIM_STORE.SROREID
> >> >>>>>>> > INNER JOIN DIM_ITEM
> >> >>>>>>> > ON FACT_SALES.ITEMID = DIM_ITEM.ITEMID
> >> >>>>>>> > INNER JOIN DIM_CUSTOMER
> >> >>>>>>> > ON FACT_SALES.CUSTID = DIM_CUSTOMER.CUSTID
> >> >>>>>>> > ;
> >> >>>>>>> > "
> >> >>>>>>> >
> >> >>>>>>> >
> >> >>>>>>> >
> >> >>>>>>> > On Thu, Feb 26, 2015 at 8:11 PM, Shi, Shaofeng wrote:
> >> >>>>>>> >
> >> >>>>>>> > > 0.7.1 is a test version; its package name contains the
> >> >>>>>>> > > “snapshot” suffix. We will upload a new package there; Luke
> >> >>>>>>> > > will also add a message there to avoid this confusion.
> >> >>>>>>> > >
> >> >>>>>>> > > Regarding the problem that you encountered, could you please
> >> >>>>>>> > > open a JIRA ticket for tracking? Here is the link to Apache
> >> >>>>>>> > > JIRA:
> >> >>>>>>> > >
> >> >>>>>>> > > https://issues.apache.org/jira/secure/Dashboard.jspa
> >> >>>>>>> > >
> >> >>>>>>> > >
> >> >>>>>>> > > Thanks for the feedback!
> >> >>>>>>> > >
> >> >>>>>>> > > >On 2/26/15, 10:21 PM, "Santosh Akhilesh" wrote:
> >> >>>>>>> > >
> >> >>>>>>> > > >Actually I see this being published on the kylin webpage:
> >> >>>>>>> > > >http://kylin.incubator.apache.org/download/
> >> >>>>>>> > > >I am using the 0.7.1 inverted-index branch binary distribution.
> >> >>>>>>> > > >If this is not stable, please give me the link to the stable
> >> >>>>>>> > > >branch; I will try building and testing it tonight.
> >> >>>>>>> > > >On Thu, 26 Feb 2015 at 7:30 pm, Shi, Shaofeng wrote:
> >> >>>>>>> > > >
> >> >>>>>>> > > >> Hi Santosh, it is not recommended to use the dev code branch
> >> >>>>>>> > > >> (actually I don’t know how you got the v0.7.x build and what
> >> >>>>>>> > > >> its detailed version is; each day we submit many changes to
> >> >>>>>>> > > >> it);
> >> >>>>>>> > > >>
> >> >>>>>>> > > >> The options are 1) switch back to the latest release, v0.6.5;
> >> >>>>>>> > > >> or 2) wait for the formal release of 0.7, which should be in
> >> >>>>>>> > > >> March; otherwise, we couldn’t ensure that no new problems
> >> >>>>>>> > > >> come up in your next steps;
> >> >>>>>>> > > >>
> >> >>>>>>> > > >> On 2/26/15, 5:39 PM, "Santosh Akhilesh" wrote:
> >> >>>>>>> > > >>
> >> >>>>>>> > > >> >Hi Shaofeng,
> >> >>>>>>> > > >> >So what do you suggest; how should I proceed further with
> >> >>>>>>> > > >> >this release? Will there be a patch? Is there any alternate
> >> >>>>>>> > > >> >way I can create the cube?
> >> >>>>>>> > > >> >Please suggest.
> >> >>>>>>> > > >> >Regards
> >> >>>>>>> > > >> >Santosh
> >> >>>>>>> > > >> >On Thu, 26 Feb 2015 at 3:04 pm, Shi, Shaofeng wrote:
> >> >>>>>>> > > >> >
> >> >>>>>>> > > >> >> Hi Santosh,
> >> >>>>>>> > > >> >>
> >> >>>>>>> > > >> >> 0.7.1 hasn’t been formally released; from 0.6.x to 0.7.x
> >> >>>>>>> > > >> >> there is a metadata structure change, and the web UI (cube
> >> >>>>>>> > > >> >> wizard) for this change hasn’t been stabilized yet; so it
> >> >>>>>>> > > >> >> is not strange that you ran into trouble when saving the
> >> >>>>>>> > > >> >> cube;
> >> >>>>>>> > > >> >>
> >> >>>>>>> > > >> >> @Jason, any idea about the JS error?
> >> >>>>>>> > > >> >>
> >> >>>>>>> > > >> >> On 2/26/15, 5:08 PM, "Santosh Akhilesh" <santoshakhilesh@gmail.com> wrote:
> >> >>>>>>> > > >> >>
> >> >>>>>>> > > >> >> >Hi Shaofeng,
> >> >>>>>>> > > >> >> >
> >> >>>>>>> > > >> >> >I am using the binary distribution 0.7.1. I have not been
> >> >>>>>>> > > >> >> >able to save the cube even once. I have tried creating a
> >> >>>>>>> > > >> >> >new project, both from my local machine and from the
> >> >>>>>>> > > >> >> >server machine, but I am always stuck with this error: I
> >> >>>>>>> > > >> >> >am never allowed to add measures and have never been able
> >> >>>>>>> > > >> >> >to save the cube. I also see in kylin.log that it always
> >> >>>>>>> > > >> >> >tries to save the cube in append mode. One thing I should
> >> >>>>>>> > > >> >> >mention: since I don't have a big fact table yet, I have
> >> >>>>>>> > > >> >> >not partitioned the fact table, and I skip that step. Does
> >> >>>>>>> > > >> >> >this affect saving the cube? Is it because some metadata
> >> >>>>>>> > > >> >> >is already available and it tries to modify the cube? I am
> >> >>>>>>> > > >> >> >using the latest Hadoop, 2.6.0. And yes, I have not set
> >> >>>>>>> > > >> >> >the kylin.job.jar property; I will add it and check. But
> >> >>>>>>> > > >> >> >the cube creation failure is really puzzling me; I could
> >> >>>>>>> > > >> >> >see no error logs in kylin.log.
> >> >>>>>>> > > >> >> >Regards
> >> >>>>>>> > > >> >> >Santosh
> >> >>>>>>> > > >> >> >On Thu, 26 Feb 2015 at 1:40 pm, Shi, Shaofeng wrote:
> >> >>>>>>> > > >> >> >
> >> >>>>>>> > > >> >> >> Which version or code branch are you using? I assume
> >> >>>>>>> > > >> >> >> you’re using the stable version from master; it seems
> >> >>>>>>> > > >> >> >> you’re trying to edit an existing cube to add a new
> >> >>>>>>> > > >> >> >> measure. Try refreshing your browser's cache; if the
> >> >>>>>>> > > >> >> >> cube still can’t be saved, try to create a new cube;
> >> >>>>>>> > > >> >> >>
> >> >>>>>>> > > >> >> >> The two error traces in tomcat need to be taken care of:
> >> >>>>>>> > > >> >> >>
> >> >>>>>>> > > >> >> >> 1) java.lang.NoClassDefFoundError:
> >> >>>>>>> > > >> >> >> org/apache/kylin/common/mr/KylinMapper
> >> >>>>>>> > > >> >> >>         Please check the kylin.properties file, making
> >> >>>>>>> > > >> >> >> sure “kylin.job.jar” points to the right jar file; it
> >> >>>>>>> > > >> >> >> will be loaded in Map-Reduce;
> >> >>>>>>> > > >> >> >>
> >> >>>>>>> > > >> >> >> 2) java.lang.IllegalArgumentException: No enum constant
> >> >>>>>>> > > >> >> >> org.apache.hadoop.mapreduce.JobCounter.MB_MILLIS_MAPS
> >> >>>>>>> > > >> >> >>         This indicates your hadoop version might be old;
> >> >>>>>>> > > >> >> >> please check and ensure the hadoop version is 2.2 or
> >> >>>>>>> > > >> >> >> above.
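> >> >>>>>>> > > >> >> >>
> >> >>>>>>> > > >> >> >> For example, a one-line sketch in kylin.properties (the
> >> >>>>>>> > > >> >> >> exact path and jar name depend on your install; treat
> >> >>>>>>> > > >> >> >> this as an illustration only):
> >> >>>>>>> > > >> >> >>
> >> >>>>>>> > > >> >> >> kylin.job.jar=/tmp/kylin/kylin-job-latest.jar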
> >> >>>>>>> > > >> >> >>
> >> >>>>>>> > > >> >> >> On 2/26/15, 3:21 PM, "Santoshakhilesh" wrote:
> >> >>>>>>> > > >> >> >>
> >> >>>>>>> > > >> >> >> >Hi Shaofeng ,
> >> >>>>>>> > > >> >> >> >
> >> >>>>>>> > > >> >> >> >   I am using chrome. When I click on the button to
> >> >>>>>>> > > >> >> >> >add measures, the following error appears on the
> >> >>>>>>> > > >> >> >> >chrome console. When I try to save the cube there is
> >> >>>>>>> > > >> >> >> >no error in the console; I just get an error dialog
> >> >>>>>>> > > >> >> >> >saying it failed to take action, and it gives me the
> >> >>>>>>> > > >> >> >> >JSON cube schema.
> >> >>>>>>> > > >> >> >> >
> >> >>>>>>> > > >> >> >> >Error on the chrome debug console is as below:
> >> >>>>>>> > > >> >> >> >
> >> >>>>>>> > > >> >> >> > ReferenceError: CubeDescModel is not defined
> >> >>>>>>> > > >> >> >> >    at h.$scope.addNewMeasure (scripts.min.0.js:15984)
> >> >>>>>>> > > >> >> >> >    at scripts.min.0.js:180
> >> >>>>>>> > > >> >> >> >    at scripts.min.0.js:197
> >> >>>>>>> > > >> >> >> >    at h.$eval (scripts.min.0.js:119)
> >> >>>>>>> > > >> >> >> >    at h.$apply (scripts.min.0.js:119)
> >> >>>>>>> > > >> >> >> >    at HTMLButtonElement. (scripts.min.0.js:197)
> >> >>>>>>> > > >> >> >> >    at HTMLButtonElement.m.event.dispatch (scripts.min.0.js:3)
> >> >>>>>>> > > >> >> >> >    at HTMLButtonElement.r.handle (scripts.min.0.js:3)
> >> >>>>>>> > > >> >> >> >
> >> >>>>>>> > > >> >> >> >   About the hive table import: I got past the
> >> >>>>>>> > > >> >> >> >run-shell-command exception, but it still fails; the
> >> >>>>>>> > > >> >> >> >hadoop log is:
> >> >>>>>>> > > >> >> >> >2015-02-26 20:46:48,332 INFO [main] org.apache.hadoop.mapred.YarnChild:
> >> >>>>>>> > > >> >> >> >mapreduce.cluster.local.dir for child:
> >> >>>>>>> > > >> >> >> >/tmp/hadoop-root/nm-local-dir/usercache/root/appcache/application_1424953091340_0002
> >> >>>>>>> > > >> >> >> >2015-02-26 20:46:48,776 INFO [main]
> >> >>>>>>> > > >> >> >> >org.apache.hadoop.conf.Configuration.deprecation: session.id is
> >> >>>>>>> > > >> >> >> >deprecated. Instead, use dfs.metrics.session-id
> >> >>>>>>> > > >> >> >> >2015-02-26 20:46:49,310 INFO [main] org.apache.hadoop.mapred.Task: Using
> >> >>>>>>> > > >> >> >> >ResourceCalculatorProcessTree : [ ]
> >> >>>>>>> > > >> >> >> >2015-02-26 20:46:49,386 FATAL [main] org.apache.hadoop.mapred.YarnChild:
> >> >>>>>>> > > >> >> >> >Error running child : java.lang.NoClassDefFoundError:
> >> >>>>>>> > > >> >> >> >org/apache/kylin/common/mr/KylinMapper
> >> >>>>>>> > > >> >> >> > at java.lang.ClassLoader.defineClass1(Native Method)
> >> >>>>>>> > > >> >> >> > at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
> >> >>>>>>> > > >> >> >> > at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
> >> >>>>>>> > > >> >> >> > at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
> >> >>>>>>> > > >> >> >> > at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
> >> >>>>>>> > > >> >> >> > at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
> >> >>>>>>> > > >> >> >> > at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> >> >>>>>>> > > >> >> >> > at java.security.AccessController.doPrivileged(Native Method)
> >> >>>>>>> > > >> >> >> > at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> >> >>>>>>> > > >> >> >> > at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
> >> >>>>>>> > > >> >> >> > at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
> >> >>>>>>> > > >> >> >> >
> >> >>>>>>> > > >> >> >> >tomcat logs:
> >> >>>>>>> > > >> >> >> >usage: HiveColumnCardinalityJob
> >> >>>>>>> > > >> >> >> > -output <path>    Output path
> >> >>>>>>> > > >> >> >> > -table <table>    The hive table name
> >> >>>>>>> > > >> >> >> >
> >> >>>>>>> > > >> >> >> >wrote:
> >> >>>>>>> > > >> >> >> >
> >> >>>>>>> > > >> >> >> >>Hi Shaofeng ,
> >> >>>>>>> > > >> >> >> >>   Thanks for replying.
> >> >>>>>>> > > >> >> >> >>   Yes, I am checking the yarn exception, but I find
> >> >>>>>>> > > >> >> >> >>that this error comes while importing the hive table
> >> >>>>>>> > > >> >> >> >>to kylin.
> >> >>>>>>> > > >> >> >> >>   Even when this error comes, the hive table is
> >> >>>>>>> > > >> >> >> >>exported to kylin successfully. Is this the reason
> >> >>>>>>> > > >> >> >> >>why saving the cube has failed?
> >> >>>>>>> > > >> >> >> >>
> >> >>>>>>> > > >> >> >> >>   Next, when I go on to create the cube for the
> >> >>>>>>> > > >> >> >> >>following schema, I get an error at the last step
> >> >>>>>>> > > >> >> >> >>while saving, and I am unable to add any measures;
> >> >>>>>>> > > >> >> >> >>clicking on the measure option just doesn't pop up
> >> >>>>>>> > > >> >> >> >>any dialog.
> >> >>>>>>> > > >> >> >> >>
> >> >>>>>>> > > >> >> >> >>I am using star schema with fact_sales as fact
> >>table
> >> >>>>>>>and
> >> >>>>>>> dim_*
> >> >>>>>>> > > >>as
> >> >>>>>>> > > >> >> >> >>dimension tables.
> >> >>>>>>> > > >> >> >> >>
> >> >>>>>>> > > >> >> >> >> fact_sales:
> >> >>>>>>> > > >> >> >> >> storeid                 int
> >> >>>>>>> > > >> >> >> >> itemid                  int
> >> >>>>>>> > > >> >> >> >> custid                  int
> >> >>>>>>> > > >> >> >> >> qty                     int
> >> >>>>>>> > > >> >> >> >> price                   double
> >> >>>>>>> > > >> >> >> >>
> >> >>>>>>> > > >> >> >> >>dim_customer
> >> >>>>>>> > > >> >> >> >> custid                  int
> >> >>>>>>> > > >> >> >> >> name                    string
> >> >>>>>>> > > >> >> >> >>
> >> >>>>>>> > > >> >> >> >> dim_item
> >> >>>>>>> > > >> >> >> >> itemid                  int
> >> >>>>>>> > > >> >> >> >> category                string
> >> >>>>>>> > > >> >> >> >> brand                   string
> >> >>>>>>> > > >> >> >> >> color                   string
> >> >>>>>>> > > >> >> >> >>
> >> >>>>>>> > > >> >> >> >>dim_store
> >> >>>>>>> > > >> >> >> >> storeid                 int
> >> >>>>>>> > > >> >> >> >> city                    string
> >> >>>>>>> > > >> >> >> >> state                   string
> >> >>>>>>> > > >> >> >> >>
> >> >>>>>>> > > >> >> >> >>The JSON is as below.
> >> >>>>>>> > > >> >> >> >>
> >> >>>>>>> > > >> >> >> >> {
> >> >>>>>>> > > >> >> >> >>   "name": "Retail_Cube",
> >> >>>>>>> > > >> >> >> >>   "description": "",
> >> >>>>>>> > > >> >> >> >>   "dimensions": [
> >> >>>>>>> > > >> >> >> >>     {
> >> >>>>>>> > > >> >> >> >>       "name": "RETAIL.FACT_SALES.STOREID",
> >> >>>>>>> > > >> >> >> >>       "table": "RETAIL.FACT_SALES",
> >> >>>>>>> > > >> >> >> >>       "hierarchy": false,
> >> >>>>>>> > > >> >> >> >>       "derived": null,
> >> >>>>>>> > > >> >> >> >>       "column": [
> >> >>>>>>> > > >> >> >> >>         "STOREID"
> >> >>>>>>> > > >> >> >> >>       ],
> >> >>>>>>> > > >> >> >> >>       "id": 1
> >> >>>>>>> > > >> >> >> >>     },
> >> >>>>>>> > > >> >> >> >>     {
> >> >>>>>>> > > >> >> >> >>       "name": "RETAIL.FACT_SALES.ITEMID",
> >> >>>>>>> > > >> >> >> >>       "table": "RETAIL.FACT_SALES",
> >> >>>>>>> > > >> >> >> >>       "hierarchy": false,
> >> >>>>>>> > > >> >> >> >>       "derived": null,
> >> >>>>>>> > > >> >> >> >>       "column": [
> >> >>>>>>> > > >> >> >> >>         "ITEMID"
> >> >>>>>>> > > >> >> >> >>       ],
> >> >>>>>>> > > >> >> >> >>       "id": 2
> >> >>>>>>> > > >> >> >> >>     },
> >> >>>>>>> > > >> >> >> >>     {
> >> >>>>>>> > > >> >> >> >>       "name": "RETAIL.FACT_SALES.CUSTID",
> >> >>>>>>> > > >> >> >> >>       "table": "RETAIL.FACT_SALES",
> >> >>>>>>> > > >> >> >> >>       "hierarchy": false,
> >> >>>>>>> > > >> >> >> >>       "derived": null,
> >> >>>>>>> > > >> >> >> >>       "column": [
> >> >>>>>>> > > >> >> >> >>         "CUSTID"
> >> >>>>>>> > > >> >> >> >>       ],
> >> >>>>>>> > > >> >> >> >>       "id": 3
> >> >>>>>>> > > >> >> >> >>     }
> >> >>>>>>> > > >> >> >> >>   ],
> >> >>>>>>> > > >> >> >> >>   "measures": [
> >> >>>>>>> > > >> >> >> >>     {
> >> >>>>>>> > > >> >> >> >>       "id": 1,
> >> >>>>>>> > > >> >> >> >>       "name": "_COUNT_",
> >> >>>>>>> > > >> >> >> >>       "function": {
> >> >>>>>>> > > >> >> >> >>         "expression": "COUNT",
> >> >>>>>>> > > >> >> >> >>         "returntype": "bigint",
> >> >>>>>>> > > >> >> >> >>         "parameter": {
> >> >>>>>>> > > >> >> >> >>           "type": "constant",
> >> >>>>>>> > > >> >> >> >>           "value": 1
> >> >>>>>>> > > >> >> >> >>         }
> >> >>>>>>> > > >> >> >> >>       }
> >> >>>>>>> > > >> >> >> >>     }
> >> >>>>>>> > > >> >> >> >>   ],
> >> >>>>>>> > > >> >> >> >>   "rowkey": {
> >> >>>>>>> > > >> >> >> >>     "rowkey_columns": [
> >> >>>>>>> > > >> >> >> >>       {
> >> >>>>>>> > > >> >> >> >>         "column": "STOREID",
> >> >>>>>>> > > >> >> >> >>         "length": 0,
> >> >>>>>>> > > >> >> >> >>         "dictionary": "true",
> >> >>>>>>> > > >> >> >> >>         "mandatory": false
> >> >>>>>>> > > >> >> >> >>       },
> >> >>>>>>> > > >> >> >> >>       {
> >> >>>>>>> > > >> >> >> >>         "column": "ITEMID",
> >> >>>>>>> > > >> >> >> >>         "length": 0,
> >> >>>>>>> > > >> >> >> >>         "dictionary": "true",
> >> >>>>>>> > > >> >> >> >>         "mandatory": false
> >> >>>>>>> > > >> >> >> >>       },
> >> >>>>>>> > > >> >> >> >>       {
> >> >>>>>>> > > >> >> >> >>         "column": "CUSTID",
> >> >>>>>>> > > >> >> >> >>         "length": 0,
> >> >>>>>>> > > >> >> >> >>         "dictionary": "true",
> >> >>>>>>> > > >> >> >> >>         "mandatory": false
> >> >>>>>>> > > >> >> >> >>       }
> >> >>>>>>> > > >> >> >> >>     ],
> >> >>>>>>> > > >> >> >> >>     "aggregation_groups": [
> >> >>>>>>> > > >> >> >> >>       [
> >> >>>>>>> > > >> >> >> >>         "STOREID",
> >> >>>>>>> > > >> >> >> >>         "ITEMID",
> >> >>>>>>> > > >> >> >> >>         "CUSTID"
> >> >>>>>>> > > >> >> >> >>       ]
> >> >>>>>>> > > >> >> >> >>     ]
> >> >>>>>>> > > >> >> >> >>   },
> >> >>>>>>> > > >> >> >> >>   "notify_list": [],
> >> >>>>>>> > > >> >> >> >>   "capacity": "",
> >> >>>>>>> > > >> >> >> >>   "hbase_mapping": {
> >> >>>>>>> > > >> >> >> >>     "column_family": [
> >> >>>>>>> > > >> >> >> >>       {
> >> >>>>>>> > > >> >> >> >>         "name": "f1",
> >> >>>>>>> > > >> >> >> >>         "columns": [
> >> >>>>>>> > > >> >> >> >>           {
> >> >>>>>>> > > >> >> >> >>             "qualifier": "m",
> >> >>>>>>> > > >> >> >> >>             "measure_refs": [
> >> >>>>>>> > > >> >> >> >>               "_COUNT_"
> >> >>>>>>> > > >> >> >> >>             ]
> >> >>>>>>> > > >> >> >> >>           }
> >> >>>>>>> > > >> >> >> >>         ]
> >> >>>>>>> > > >> >> >> >>       }
> >> >>>>>>> > > >> >> >> >>     ]
> >> >>>>>>> > > >> >> >> >>   },
> >> >>>>>>> > > >> >> >> >>   "project": "RetailProject",
> >> >>>>>>> > > >> >> >> >>   "model_name": "Retail_Cube"
> >> >>>>>>> > > >> >> >> >> }
> >> >>>>>>> > > >> >> >> >>
> >> >>>>>>> > > >> >> >> >>Regards,
> >> >>>>>>> > > >> >> >> >>Santosh Akhilesh
> >> >>>>>>> > > >> >> >> >>Bangalore R&D
> >> >>>>>>> > > >> >> >> >>HUAWEI TECHNOLOGIES CO.,LTD.
> >> >>>>>>> > > >> >> >> >>
> >> >>>>>>> > > >> >> >> >>www.huawei.com
> >> >>>>>>> > > >> >> >>
> >> >>>>>>> > > >> >> >> >>
> >> >>>>>>> > > >> >> >> >>________________________________________
> >> >>>>>>> > > >> >> >> >>From: Shi, Shaofeng [shaoshi@ebay.com]
> >> >>>>>>> > > >> >> >> >>Sent: Thursday, February 26, 2015 7:01 AM
> >> >>>>>>> > > >> >> >> >>To: dev@kylin.incubator.apache.org
> >> >>>>>>> > > >> >> >> >>Subject: Re: Error while making cube & Measure option is not responding on GUI
> >> >>>>>>> > > >> >> >> >>
> >> >>>>>>> > > >> >> >> >>Hi Santosh,
> >> >>>>>>> > > >> >> >> >>
> >> >>>>>>> > > >> >> >> >>It looks like hadoop failed to execute some shell
> >> >>>>>>> > > >> >> >> >>command in the container; you need to dive into
> >> >>>>>>> > > >> >> >> >>hadoop to see what the concrete error is. You can use
> >> >>>>>>> > > >> >> >> >>the yarn logs command to fetch all logs:
> >> >>>>>>> > > >> >> >> >>
> >> >>>>>>> > > >> >> >> >>yarn logs -applicationId <application ID>
> >> >>>>>>> > > >> >> >> >>
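> >> >>>>>>> > > >> >> >> >>For example, with the failing application id from
> >> >>>>>>> > > >> >> >> >>your earlier mail, the command would be:
> >> >>>>>>> > > >> >> >> >>
> >> >>>>>>> > > >> >> >> >>yarn logs -applicationId application_1424791969399_0008
> >> >>>>>>> > > >> >> >> >>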
> >> >>>>>>> > > >> >> >> >>
> >> >>>>>>> > > >> >> >> >>On 2/25/15, 7:39 PM, "Santosh Akhilesh" wrote:
> >> >>>>>>> > > >> >> >> >>
> >> >>>>>>> > > >> >> >> >>>Hi Luke / Shaofeng ,
> >> >>>>>>> > > >> >> >> >>>           Can you please help me check this issue?
> >> >>>>>>> > > >> >> >> >>>Regards,
> >> >>>>>>> > > >> >> >> >>>Santosh Akhilesh
> >> >>>>>>> > > >> >> >> >>>
> >> >>>>>>> > > >> >> >> >>>On Tue, Feb 24, 2015 at 10:41 PM, Santosh Akhilesh <santoshakhilesh@gmail.com> wrote:
> >> >>>>>>> > > >> >> >> >>>
> >> >>>>>>> > > >> >> >> >>>> Hi All ,
> >> >>>>>>> > > >> >> >> >>>>         Is it because of the following error in
> >> >>>>>>> > > >> >> >> >>>> the map reduce job? What could be the way to
> >> >>>>>>> > > >> >> >> >>>> resolve this? A google search says it is an issue
> >> >>>>>>> > > >> >> >> >>>> with the Yarn class path, but I am not sure what
> >> >>>>>>> > > >> >> >> >>>> it is.
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>> Kylin Hive Column Cardinality Job table=RETAIL.FACT_SALES
> >> >>>>>>> > > >> >> >> >>>> output=/tmp/cardinality/RETAIL.FACT_SALES
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>> Application application_1424791969399_0008 failed 2 times due to AM
> >> >>>>>>> > > >> >> >> >>>> Container for appattempt_1424791969399_0008_000002 exited with
> >> >>>>>>> > > >> >> >> >>>> exitCode: 1
> >> >>>>>>> > > >> >> >> >>>> For more detailed output, check application tracking page:
> >> >>>>>>> > > >> >> >> >>>> http://santosh:8088/proxy/application_1424791969399_0008/ Then,
> >> >>>>>>> > > >> >> >> >>>> click on links to logs of each attempt.
> >> >>>>>>> > > >> >> >> >>>> Diagnostics: Exception from container-launch.
> >> >>>>>>> > > >> >> >> >>>> Container id: container_1424791969399_0008_02_000001
> >> >>>>>>> > > >> >> >> >>>> Exit code: 1
> >> >>>>>>> > > >> >> >> >>>> Stack trace: ExitCodeException exitCode=1:
> >> >>>>>>> > > >> >> >> >>>> at org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
> >> >>>>>>> > > >> >> >> >>>> at org.apache.hadoop.util.Shell.run(Shell.java:455)
> >> >>>>>>> > > >> >> >> >>>> at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
> >> >>>>>>> > > >> >> >> >>>> at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:211)
> >> >>>>>>> > > >> >> >> >>>> at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
> >> >>>>>>> > > >> >> >> >>>> at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
> >> >>>>>>> > > >> >> >> >>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> >> >>>>>>> > > >> >> >> >>>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> >> >>>>>>> > > >> >> >> >>>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> >> >>>>>>> > > >> >> >> >>>> at java.lang.Thread.run(Thread.java:745)
> >> >>>>>>> > > >> >> >> >>>> Container exited with a non-zero exit code 1
> >> >>>>>>> > > >> >> >> >>>> Failing this attempt. Failing the application.
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>> ---------- Forwarded message ----------
> >> >>>>>>> > > >> >> >> >>>> From: Santoshakhilesh
> >> >>>>>>>
> >> >>>>>>> > > >> >> >> >>>> Date: Tue, Feb 24, 2015 at 7:41 PM
> >> >>>>>>> > > >> >> >> >>>> Subject: FW: Error while making cube & Measure option is not responding on GUI
> >> >>>>>>> > > >> >> >> >>>> To: "dev@kylin.incubator.apache.org"
> >> >>>>>>> > > >> >> >>
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>> hi ,
> >> >>>>>>> > > >> >> >> >>>>    Could someone please give me a hand to resolve
> >> >>>>>>> > > >> >> >> >>>> this issue? Thanks.
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>> Regards,
> >> >>>>>>> > > >> >> >> >>>> Santosh Akhilesh
> >> >>>>>>> > > >> >> >> >>>> Bangalore R&D
> >> >>>>>>> > > >> >> >> >>>> HUAWEI TECHNOLOGIES CO.,LTD.
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>> www.huawei.com
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>> ________________________________________
> >> >>>>>>> > > >> >> >> >>>> From: Santoshakhilesh [santosh.akhilesh@huawei.com]
> >> >>>>>>> > > >> >> >> >>>> Sent: Tuesday, February 24, 2015 12:55 PM
> >> >>>>>>> > > >> >> >> >>>> To: dev@kylin.incubator.apache.org
> >> >>>>>>> > > >> >> >> >>>> Cc: Kulbhushan Rana
> >> >>>>>>> > > >> >> >> >>>> Subject: FW: Error while making cube & Measure option is not responding on GUI
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>> 2. If I ignore this and continue, and try to save
> >> >>>>>>> > > >> >> >> >>>> the cube, I get an exception in kylin.log. I have
> >> >>>>>>> > > >> >> >> >>>> checked that the path is set correctly and that
> >> >>>>>>> > > >> >> >> >>>> HCatInputFormat is present in
> >> >>>>>>> > > >> >> >> >>>> hive-hcatalog-core-0.14.0.jar. Please let me know
> >> >>>>>>> > > >> >> >> >>>> what I can do to resolve this.
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>>  -- This was a path issue; there are no more
> >> >>>>>>> > > >> >> >> >>>> exceptions in kylin.log.
> >> >>>>>>> > > >> >> >> >>>>
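> >> >>>>>>> > > >> >> >> >>>> For anyone hitting the same NoClassDefFoundError:
> >> >>>>>>> > > >> >> >> >>>> one generic way is to put the hcatalog jar on the
> >> >>>>>>> > > >> >> >> >>>> classpath of the JVM that submits the job before
> >> >>>>>>> > > >> >> >> >>>> starting Kylin; the exact mechanism and path depend
> >> >>>>>>> > > >> >> >> >>>> on your setup, so this is only a sketch:
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>> export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/path/to/hive-hcatalog-core-0.14.0.jar
> >> >>>>>>> > > >> >> >> >>>>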
> >> >>>>>>> > > >> >> >> >>>> But saving the cube still fails with an error, and
> >> >>>>>>> > > >> >> >> >>>> I still can't add measures.
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>> Error Message
> >> >>>>>>> > > >> >> >> >>>> Failed to take action.
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>> In the log I can find no exception. The following
> >> >>>>>>> > > >> >> >> >>>> is the last entry in kylin.log:
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>> [pool-3-thread-1]:[2015-02-24
> >> >>>>>>> > > >> >> >> >>>> 20:47:15,613][INFO][org.apache.kylin.job.impl.threadpool.DefaultScheduler$FetcherRunner.run(DefaultScheduler.java:117)]
> >> >>>>>>> > > >> >> >> >>>> - Job Fetcher: 0 running, 0 actual running, 0 ready, 6 others
> >> >>>>>>> > > >> >> >> >>>> [http-bio-7070-exec-2]:[2015-02-24
> >> >>>>>>> > > >> >> >> >>>> 20:47:51,610][DEBUG][org.apache.kylin.rest.controller.CubeController.deserializeDataModelDesc(CubeController.java:459)]
> >> >>>>>>> > > >> >> >> >>>> - Saving cube {
> >> >>>>>>> > > >> >> >> >>>>   "name": "",
> >> >>>>>>> > > >> >> >> >>>>   "fact_table": "RETAIL.FACT_SALES",
> >> >>>>>>> > > >> >> >> >>>>   "lookups": [],
> >> >>>>>>> > > >> >> >> >>>>   "filter_condition": "",
> >> >>>>>>> > > >> >> >> >>>>   "capacity": "SMALL",
> >> >>>>>>> > > >> >> >> >>>>   "partition_desc": {
> >> >>>>>>> > > >> >> >> >>>>     "partition_date_column": "",
> >> >>>>>>> > > >> >> >> >>>>     "partition_date_start": 0,
> >> >>>>>>> > > >> >> >> >>>>     "partition_type": "APPEND"
> >> >>>>>>> > > >> >> >> >>>>   },
> >> >>>>>>> > > >> >> >> >>>>   "last_modified": 0
> >> >>>>>>> > > >> >> >> >>>> }
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>> The local access logs all show 200, so that seems ok.
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>> 10.18.146.105 - - [24/Feb/2015:20:46:56 +0800] "GET /kylin/api/user/authentication HTTP/1.1" 200 246
> >> >>>>>>> > > >> >> >> >>>> 10.18.146.105 - - [24/Feb/2015:20:47:07 +0800] "GET /kylin/api/user/authentication HTTP/1.1" 200 246
> >> >>>>>>> > > >> >> >> >>>> 10.18.146.105 - - [24/Feb/2015:20:47:27 +0800] "GET /kylin/api/user/authentication HTTP/1.1" 200 246
> >> >>>>>>> > > >> >> >> >>>> 10.18.146.105 - - [24/Feb/2015:20:47:28 +0800] "GET /kylin/api/user/authentication HTTP/1.1" 200 246
> >> >>>>>>> > > >> >> >> >>>> 10.18.146.105 - - [24/Feb/2015:20:47:34 +0800] "GET /kylin/api/user/authentication HTTP/1.1" 200 246
> >> >>>>>>> > > >> >> >> >>>> 10.18.146.105 - - [24/Feb/2015:20:47:48 +0800] "GET /kylin/api/user/authentication HTTP/1.1" 200 246
> >> >>>>>>> > > >> >> >> >>>> 10.18.146.105 - - [24/Feb/2015:20:47:51 +0800] "POST /kylin/api/cubes HTTP/1.1" 200 701
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>> Regards,
> >> >>>>>>> > > >> >> >> >>>> Santosh Akhilesh
> >> >>>>>>> > > >> >> >> >>>> Bangalore R&D
> >> >>>>>>> > > >> >> >> >>>> HUAWEI TECHNOLOGIES CO.,LTD.
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>> www.huawei.com
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>> ________________________________________
> >> >>>>>>> > > >> >> >> >>>> From: Santoshakhilesh [santosh.akhilesh@huawei.com]
> >> >>>>>>> > > >> >> >> >>>> Sent: Tuesday, February 24, 2015 12:09 PM
> >> >>>>>>> > > >> >> >> >>>> To: dev@kylin.incubator.apache.org
> >> >>>>>>> > > >> >> >> >>>> Cc: Kulbhushan Rana
> >> >>>>>>> > > >> >> >> >>>> Subject: Error while making cube & Measure option is not responding on GUI
> >> >>>>>>> > > >> >> >> >>>>GUI
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>> Hi All ,
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>>     I am building a simple cube for testing, using
> >> >>>>>>> > > >> >> >> >>>> the binary build 0.7.1. I have the following hive
> >> >>>>>>> > > >> >> >> >>>> tables with these columns.
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>> fact_sales:
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>> storeid                 int
> >> >>>>>>> > > >> >> >> >>>> itemid                  int
> >> >>>>>>> > > >> >> >> >>>> custid                  int
> >> >>>>>>> > > >> >> >> >>>> qty                     int
> >> >>>>>>> > > >> >> >> >>>> price                   double
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>> dim_customer
> >> >>>>>>> > > >> >> >> >>>> custid                  int
> >> >>>>>>> > > >> >> >> >>>> name                    string
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>> dim_item
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>> itemid                  int
> >> >>>>>>> > > >> >> >> >>>> category                string
> >> >>>>>>> > > >> >> >> >>>> brand                   string
> >> >>>>>>> > > >> >> >> >>>> color                   string
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>> dim_store
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>> storeid                 int
> >> >>>>>>> > > >> >> >> >>>> city                    string
> >> >>>>>>> > > >> >> >> >>>> state                   string
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>> Please help me to answer following issues;
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>> 1. When I go to the measure section and click on
> >> >>>>>>> > > >> >> >> >>>> the measure option, there is no response; I want
> >> >>>>>>> > > >> >> >> >>>> to add measures on qty and price with sum.
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>> 2. If I ignore this and continue, and try to save
> >> >>>>>>> > > >> >> >> >>>> the cube, I get an exception in kylin.log. I have
> >> >>>>>>> > > >> >> >> >>>> checked that the path is set correctly and that
> >> >>>>>>> > > >> >> >> >>>> HCatInputFormat is present in
> >> >>>>>>> > > >> >> >> >>>> hive-hcatalog-core-0.14.0.jar. Please let me know
> >> >>>>>>> > > >> >> >> >>>> what I can do to resolve this.
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>> 3. I also have another question: since this is a
> >> >>>>>>> > > >> >> >> >>>> test and the data is small, I have not partitioned
> >> >>>>>>> > > >> >> >> >>>> the fact table; is it ok to skip the partition
> >> >>>>>>> > > >> >> >> >>>> stage while building the cube?
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>> Exception
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>> [pool-4-thread-4]:[2015-02-24
> >> >>>>>>> > > >> >> >> >>>> 19:26:32,577][ERROR][org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:134)]
> >> >>>>>>> > > >> >> >> >>>> - ExecuteException job:c3532a6f-97ea-474a-b36a-218dd517cedb
> >> >>>>>>> > > >> >> >> >>>> org.apache.kylin.job.exception.ExecuteException:
> >> >>>>>>> > > >> >> >> >>>> org.apache.kylin.job.exception.ExecuteException:
> >> >>>>>>> > > >> >> >> >>>> java.lang.NoClassDefFoundError: org/apache/hive/hcatalog/mapreduce/HCatInputFormat
> >> >>>>>>> > > >> >> >> >>>>  at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:102)
> >> >>>>>>> > > >> >> >> >>>>  at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:132)
> >> >>>>>>> > > >> >> >> >>>>  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> >> >>>>>>> > > >> >> >> >>>>  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> >> >>>>>>> > > >> >> >> >>>>  at java.lang.Thread.run(Thread.java:745)
> >> >>>>>>> > > >> >> >> >>>> Caused by: org.apache.kylin.job.exception.ExecuteException:
> >> >>>>>>> > > >> >> >> >>>> java.lang.NoClassDefFoundError: org/apache/hive/hcatalog/mapreduce/HCatInputFormat
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>> The JSON is as below.
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>> {
> >> >>>>>>> > > >> >> >> >>>>   "name": "Retail_Cube",
> >> >>>>>>> > > >> >> >> >>>>   "description": "",
> >> >>>>>>> > > >> >> >> >>>>   "dimensions": [
> >> >>>>>>> > > >> >> >> >>>>     {
> >> >>>>>>> > > >> >> >> >>>>       "name": "RETAIL.FACT_SALES.STOREID",
> >> >>>>>>> > > >> >> >> >>>>       "table": "RETAIL.FACT_SALES",
> >> >>>>>>> > > >> >> >> >>>>       "hierarchy": false,
> >> >>>>>>> > > >> >> >> >>>>       "derived": null,
> >> >>>>>>> > > >> >> >> >>>>       "column": [
> >> >>>>>>> > > >> >> >> >>>>         "STOREID"
> >> >>>>>>> > > >> >> >> >>>>       ],
> >> >>>>>>> > > >> >> >> >>>>       "id": 1
> >> >>>>>>> > > >> >> >> >>>>     },
> >> >>>>>>> > > >> >> >> >>>>     {
> >> >>>>>>> > > >> >> >> >>>>       "name": "RETAIL.FACT_SALES.ITEMID",
> >> >>>>>>> > > >> >> >> >>>>       "table": "RETAIL.FACT_SALES",
> >> >>>>>>> > > >> >> >> >>>>       "hierarchy": false,
> >> >>>>>>> > > >> >> >> >>>>       "derived": null,
> >> >>>>>>> > > >> >> >> >>>>       "column": [
> >> >>>>>>> > > >> >> >> >>>>         "ITEMID"
> >> >>>>>>> > > >> >> >> >>>>       ],
> >> >>>>>>> > > >> >> >> >>>>       "id": 2
> >> >>>>>>> > > >> >> >> >>>>     },
> >> >>>>>>> > > >> >> >> >>>>     {
> >> >>>>>>> > > >> >> >> >>>>       "name": "RETAIL.FACT_SALES.CUSTID",
> >> >>>>>>> > > >> >> >> >>>>       "table": "RETAIL.FACT_SALES",
> >> >>>>>>> > > >> >> >> >>>>       "hierarchy": false,
> >> >>>>>>> > > >> >> >> >>>>       "derived": null,
> >> >>>>>>> > > >> >> >> >>>>       "column": [
> >> >>>>>>> > > >> >> >> >>>>         "CUSTID"
> >> >>>>>>> > > >> >> >> >>>>       ],
> >> >>>>>>> > > >> >> >> >>>>       "id": 3
> >> >>>>>>> > > >> >> >> >>>>     }
> >> >>>>>>> > > >> >> >> >>>>   ],
> >> >>>>>>> > > >> >> >> >>>>   "measures": [
> >> >>>>>>> > > >> >> >> >>>>     {
> >> >>>>>>> > > >> >> >> >>>>       "id": 1,
> >> >>>>>>> > > >> >> >> >>>>       "name": "_COUNT_",
> >> >>>>>>> > > >> >> >> >>>>       "function": {
> >> >>>>>>> > > >> >> >> >>>>         "expression": "COUNT",
> >> >>>>>>> > > >> >> >> >>>>         "returntype": "bigint",
> >> >>>>>>> > > >> >> >> >>>>         "parameter": {
> >> >>>>>>> > > >> >> >> >>>>           "type": "constant",
> >> >>>>>>> > > >> >> >> >>>>           "value": 1
> >> >>>>>>> > > >> >> >> >>>>         }
> >> >>>>>>> > > >> >> >> >>>>       }
> >> >>>>>>> > > >> >> >> >>>>     }
> >> >>>>>>> > > >> >> >> >>>>   ],
> >> >>>>>>> > > >> >> >> >>>>   "rowkey": {
> >> >>>>>>> > > >> >> >> >>>>     "rowkey_columns": [
> >> >>>>>>> > > >> >> >> >>>>       {
> >> >>>>>>> > > >> >> >> >>>>         "column": "STOREID",
> >> >>>>>>> > > >> >> >> >>>>         "length": 0,
> >> >>>>>>> > > >> >> >> >>>>         "dictionary": "true",
> >> >>>>>>> > > >> >> >> >>>>         "mandatory": false
> >> >>>>>>> > > >> >> >> >>>>       },
> >> >>>>>>> > > >> >> >> >>>>       {
> >> >>>>>>> > > >> >> >> >>>>         "column": "ITEMID",
> >> >>>>>>> > > >> >> >> >>>>         "length": 0,
> >> >>>>>>> > > >> >> >> >>>>         "dictionary": "true",
> >> >>>>>>> > > >> >> >> >>>>         "mandatory": false
> >> >>>>>>> > > >> >> >> >>>>       },
> >> >>>>>>> > > >> >> >> >>>>       {
> >> >>>>>>> > > >> >> >> >>>>         "column": "CUSTID",
> >> >>>>>>> > > >> >> >> >>>>         "length": 0,
> >> >>>>>>> > > >> >> >> >>>>         "dictionary": "true",
> >> >>>>>>> > > >> >> >> >>>>         "mandatory": false
> >> >>>>>>> > > >> >> >> >>>>       }
> >> >>>>>>> > > >> >> >> >>>>     ],
> >> >>>>>>> > > >> >> >> >>>>     "aggregation_groups": [
> >> >>>>>>> > > >> >> >> >>>>       [
> >> >>>>>>> > > >> >> >> >>>>         "STOREID",
> >> >>>>>>> > > >> >> >> >>>>         "ITEMID",
> >> >>>>>>> > > >> >> >> >>>>         "CUSTID"
> >> >>>>>>> > > >> >> >> >>>>       ]
> >> >>>>>>> > > >> >> >> >>>>     ]
> >> >>>>>>> > > >> >> >> >>>>   },
> >> >>>>>>> > > >> >> >> >>>>   "notify_list": [],
> >> >>>>>>> > > >> >> >> >>>>   "capacity": "",
> >> >>>>>>> > > >> >> >> >>>>   "hbase_mapping": {
> >> >>>>>>> > > >> >> >> >>>>     "column_family": [
> >> >>>>>>> > > >> >> >> >>>>       {
> >> >>>>>>> > > >> >> >> >>>>         "name": "f1",
> >> >>>>>>> > > >> >> >> >>>>         "columns": [
> >> >>>>>>> > > >> >> >> >>>>           {
> >> >>>>>>> > > >> >> >> >>>>             "qualifier": "m",
> >> >>>>>>> > > >> >> >> >>>>             "measure_refs": [
> >> >>>>>>> > > >> >> >> >>>>               "_COUNT_"
> >> >>>>>>> > > >> >> >> >>>>             ]
> >> >>>>>>> > > >> >> >> >>>>           }
> >> >>>>>>> > > >> >> >> >>>>         ]
> >> >>>>>>> > > >> >> >> >>>>       }
> >> >>>>>>> > > >> >> >> >>>>     ]
> >> >>>>>>> > > >> >> >> >>>>   },
> >> >>>>>>> > > >> >> >> >>>>   "project": "RetailProject",
> >> >>>>>>> > > >> >> >> >>>>   "model_name": "Retail_Cube"
> >> >>>>>>> > > >> >> >> >>>> }
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>> Regards,
> >> >>>>>>> > > >> >> >> >>>> Santosh Akhilesh
> >> >>>>>>> > > >> >> >> >>>> Bangalore R&D
> >> >>>>>>> > > >> >> >> >>>> HUAWEI TECHNOLOGIES CO.,LTD.
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>> www.huawei.com
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>> --
> >> >>>>>>> > > >> >> >> >>>> Regards,
> >> >>>>>>> > > >> >> >> >>>> Santosh Akhilesh
> >> >>>>>>> > > >> >> >> >>>> +91-0-9845482201
> >> >>>>>>> > > >> >> >> >>>>
> >> >>>>>>> > > >> >> >> >>>
> >> >>>>>>> > > >> >> >> >>>
> >> >>>>>>> > > >> >> >> >>>
> >> >>>>>>> > > >> >> >> >>>--
> >> >>>>>>> > > >> >> >> >>>Regards,
> >> >>>>>>> > > >> >> >> >>>Santosh Akhilesh
> >> >>>>>>> > > >> >> >> >>>+91-0-9845482201
> >> >>>>>>> > > >> >> >>
> >> >>>>>>> > > >> >> >>
> >> >>>>>>> > > >> >>
> >> >>>>>>> > > >> >>
> >> >>>>>>> > > >>
> >> >>>>>>> > > >>
> >> >>>>>>> > >
> >> >>>>>>> > >
> >> >>>>>>> >
> >> >>>>>>> >
> >> >>>>>>> > --
> >> >>>>>>> > Regards,
> >> >>>>>>> > Santosh Akhilesh
> >> >>>>>>> > +91-0-9845482201
> >> >>>>>>> >
> >> >>>>>>>
> >> >>>>>>
> >> >>>>>>
> >> >>>>>>
> >> >>>>>>--
> >> >>>>>>Regards,
> >> >>>>>>Santosh Akhilesh
> >> >>>>>>+91-0-9845482201
> >> >
> >> >
> >> >
> >> >
> >> >
> >> >
> >> >>>>>>> > > >> >> >> >[pool-4-thread-2]:[2015-02-26
> >> >>>>>>> > > >> >> >>
> >> >>>>>>>>20:47:49,936][ERROR][org.apache.kylin.job.common.HadoopShel
> >> >>>>>>> > > >> >> >> lExecutable.doW
> >> >>>>>>> > > >> >> >> >ork(HadoopShellExecutable.java:64)] - error
> >>execute
> >> >>>>>>> > > >> >> >>
> >> >>>>>>> >
> >>
> >>>>>>>>>>HadoopShellExecutable{id=d4730d26-7fe6-412e-9841-3288ab362c5b-00,
> >> >>>>>>> > > >> >> >> >name=null, state=RUNNING}
> >> >>>>>>> > > >> >> >> >java.lang.IllegalArgumentException: No enum
> >>constant
> >> >>>>>>> > > >> >> >>
> >>>org.apache.hadoop.mapreduce.JobCounter.MB_MILLIS_MAPS
> >> >>>>>>> > > >> >> >> > at java.lang.Enum.valueOf(Enum.java:236)
> >> >>>>>>> > > >> >> >> > at
> >> >>>>>>> > > >> >> >>
> >> >>>>>>> > >
> >>
> >>>>>>>>>>>>org.apache.hadoop.mapreduce.counters.FrameworkCounterGroup.valu
> >>>>>>>>>>>>eO
> >> >>>>>>>>>>f
> >> >>>>>>>>>>(
> >> >>>>>>> > > >> >> >> Framewo
> >> >>>>>>> > > >> >> >> >rkCounterGroup.java:148)
> >> >>>>>>> > > >> >> >> > at
> >> >>>>>>> > > >> >> >>
> >> >>>>>>>>org.apache.hadoop.mapreduce.counters.FrameworkCounterGroup.
> >> >>>>>>> > > >> >> >> findCounter(Fra
> >> >>>>>>> > > >> >> >> >meworkCounterGroup.java:182)
> >> >>>>>>> > > >> >> >> > at
> >> >>>>>>> > > >> >> >>
> >> >>>>>>>>org.apache.hadoop.mapreduce.counters.AbstractCounters.findC
> >> >>>>>>> > > >> >> >> ounter(Abstract
> >> >>>>>>> > > >> >> >> >Counters.java:154)
> >> >>>>>>> > > >> >> >> > at
> >> >>>>>>> > > >> >> >>
> >> >>>>>>>>org.apache.hadoop.mapreduce.TypeConverter.fromYarn(TypeConv
> >> >>>>>>> > > >> >> >> erter.java:240)
> >> >>>>>>> > > >> >> >> > at
> >> >>>>>>> > > >> >> >>
> >> >>>>>>>>org.apache.hadoop.mapred.ClientServiceDelegate.getJobCounte
> >> >>>>>>> > > >> >> >> rs(ClientServic
> >> >>>>>>> > > >> >> >> >eDelegate.java:370)
> >> >>>>>>> > > >> >> >> > at
> >> >>>>>>> > > >> >> >>
> >>>org.apache.hadoop.mapred.YARNRunner.getJobCounters(
> >> >>>>>>> > > >> >> YARNRunner.java:511)
> >> >>>>>>> > > >> >> >> > at
> >> >>>>>>>org.apache.hadoop.mapreduce.Job$7.run(Job.java:756)
> >> >>>>>>> > > >> >> >> > at
> >> >>>>>>>org.apache.hadoop.mapreduce.Job$7.run(Job.java:753)
> >> >>>>>>> > > >> >> >> > at
> >> >>>>>>>java.security.AccessController.doPrivileged(Native
> >> >>>>>>> Method)
> >> >>>>>>> > > >> >> >> > at
> >> >>>>>>>javax.security.auth.Subject.doAs(Subject.java:415)
> >> >>>>>>> > > >> >> >> > at
> >> >>>>>>> > > >> >> >>
> >> >>>>>>>>org.apache.hadoop.security.UserGroupInformation.doAs(UserGr
> >> >>>>>>> > > >> >> >> oupInformation.
> >> >>>>>>> > > >> >> >> >java:1491)
> >> >>>>>>> > > >> >> >> > at
> >> >>>>>>>org.apache.hadoop.mapreduce.Job.getCounters(Job.java:753)
> >> >>>>>>> > > >> >> >> > at
> >> >>>>>>> > > >>
> >>
> >>>>>>>>>>>org.apache.hadoop.mapreduce.Job.monitorAndPrintJob(Job.java:1361
> >>>>>>>>>>>)
> >> >>>>>>> > > >> >> >> > at
> >> >>>>>>>org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.
> >> >>>>>>> > > >> java:1289)
> >> >>>>>>> > > >> >> >> > at
> >> >>>>>>> > > >> >> >>
> >> >>>>>>>>org.apache.kylin.job.hadoop.AbstractHadoopJob.waitForComple
> >> >>>>>>> > > >> >> >> tion(AbstractHa
> >> >>>>>>> > > >> >> >> >doopJob.java:134)
> >> >>>>>>> > > >> >> >> > at
> >> >>>>>>> > > >> >> >>
> >> >>>>>>> > > >>
> >> >>>>>>> >
> >>
> >>>>>>>>>>>>org.apache.kylin.job.hadoop.cardinality.HiveColumnCardinalityJo
> >>>>>>>>>>>>b.
> >> >>>>>>>>>>r
> >> >>>>>>>>>>u
> >> >>>>>>>>>>n
> >> >>>>>>>>>>(
> >> >>>>>>> > > >> >> >> HiveC
> >> >>>>>>> > > >> >> >> >olumnCardinalityJob.java:114)
> >> >>>>>>> > > >> >> >> > at
> >> >>>>>>>org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
> >> >>>>>>> > > >> >> >> > at
> >> >>>>>>>org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
> >> >>>>>>> > > >> >> >> > at
> >> >>>>>>> > > >> >> >>
> >> >>>>>>>>org.apache.kylin.job.common.HadoopShellExecutable.doWork(Ha
> >> >>>>>>> > > >> >> >> doopShellExecut
> >> >>>>>>> > > >> >> >> >able.java:62)
> >> >>>>>>> > > >> >> >> > at
> >> >>>>>>> > > >> >> >>
> >> >>>>>>>>org.apache.kylin.job.execution.AbstractExecutable.execute(A
> >> >>>>>>> > > >> >> >> bstractExecutab
> >> >>>>>>> > > >> >> >> >le.java:99)
> >> >>>>>>> > > >> >> >> > at
> >> >>>>>>> > > >> >> >>
> >> >>>>>>>>org.apache.kylin.job.execution.DefaultChainedExecutable.doW
> >> >>>>>>> > > >> >> >> ork(DefaultChai
> >> >>>>>>> > > >> >> >> >nedExecutable.java:50)
> >> >>>>>>> > > >> >> >> > at
> >> >>>>>>> > > >> >> >>
> >> >>>>>>>>org.apache.kylin.job.execution.AbstractExecutable.execute(A
> >> >>>>>>> > > >> >> >> bstractExecutab
> >> >>>>>>> > > >> >> >> >le.java:99)
> >> >>>>>>> > > >> >> >> > at
> >> >>>>>>> > > >> >> >>
> >> >>>>>>>>org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRu
> >> >>>>>>> > > >> >> >> nner.run(Defaul
> >> >>>>>>> > > >> >> >> >tScheduler.java:132)
> >> >>>>>>> > > >> >> >> > at
> >> >>>>>>> > > >> >> >>
> >> >>>>>>>>java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoo
> >> >>>>>>> > > >> >> >> lExecutor.java:
> >> >>>>>>> > > >> >> >> >1145)
> >> >>>>>>> > > >> >> >> > at
> >> >>>>>>> > > >> >> >>
> >> >>>>>>>>java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPo
> >> >>>>>>> > > >> >> >> olExecutor.java
> >> >>>>>>> > > >> >> >> >:615)
> >> >>>>>>> > > >> >> >> > at java.lang.Thread.run(Thread.java:745)
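
The trace fails while translating job counters reported by the 2.6.0 cluster: Enum.valueOf rejects the counter name MB_MILLIS_MAPS because the JobCounter enum loaded in Kylin's JVM comes from an older (2.2.0-era) hadoop-mapreduce-client jar that predates that counter. A minimal sketch of this failure mode, using a stand-in enum (CounterMismatchDemo and JobCounterV22 are hypothetical names, not code from this thread):

    // CounterMismatchDemo.java - illustrative only. JobCounterV22 mimics a
    // Hadoop 2.2.0-era JobCounter, which defines no MB_MILLIS_MAPS constant.
    public class CounterMismatchDemo {

        // Stand-in for the old enum; the real 2.2.0 JobCounter has more
        // constants, but none named MB_MILLIS_MAPS.
        enum JobCounterV22 { TOTAL_LAUNCHED_MAPS, DATA_LOCAL_MAPS, SLOTS_MILLIS_MAPS }

        public static void main(String[] args) {
            // Counter names sent back by the cluster are resolved via
            // Enum.valueOf; an unknown name throws the same
            // "No enum constant ... MB_MILLIS_MAPS" IllegalArgumentException
            // seen in the job log above.
            Enum.valueOf(JobCounterV22.class, "MB_MILLIS_MAPS");
        }
    }

To check which jar the real enum actually comes from inside the running JVM, printing org.apache.hadoop.mapreduce.JobCounter.class.getProtectionDomain().getCodeSource().getLocation() should reveal whether a 2.2.x client jar is shadowing the 2.6.0 one.
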
>
> Regards,
> Santosh Akhilesh
> Bangalore R&D
> HUAWEI TECHNOLOGIES CO.,LTD.
>
> www.huawei.com
>
> ________________________________________
> From: Shi, Shaofeng [shaoshi@ebay.com]
> Sent: Thursday, February 26, 2015 11:32 AM
> To: dev@kylin.incubator.apache.org
> Cc: Kulbhushan Rana
> Subject: Re: Error while making cube & Measure option is not responding on GUI
>
> Hi Santosh, a hive table importing issue should not impact cube saving.
>
> If you couldn't save the cube, first check whether there is an error in Tomcat's log; if not, check your web browser. We suggest using Firefox (with the Firebug add-on) or Chrome: open the JS console (press F12), then operate the web UI and check whether any error is reported in the browser.
>
> On 2/26/15, 1:08 PM, "Santoshakhilesh"


-- 
Regards,
Santosh Akhilesh
+91-0-9845482201
