kylin-dev mailing list archives

From "Shi, Shaofeng" <shao...@ebay.com>
Subject Re: Cube Build Failed at Last Step//RE: Error while making cube & Measure option is not responding on GUI
Date Mon, 02 Mar 2015 02:10:32 GMT
Please refer to this segment in http://hbase.apache.org/book.html:

Replace the Hadoop Bundled With HBase!
Because HBase depends on Hadoop, it bundles an instance of the Hadoop jar
under its lib directory. The bundled jar is ONLY for use in standalone
mode. In distributed mode, it is critical that the version of Hadoop that
is out on your cluster match what is under HBase. Replace the hadoop jar
found in the HBase lib directory with the hadoop jar you are running on
your cluster to avoid version mismatch issues. Make sure you replace the
jar in HBase everywhere on your cluster. Hadoop version mismatch issues
have various manifestations, but often everything simply looks hung up.
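
For example, the swap could look roughly like this (a sketch only;
HBASE_HOME, HADOOP_HOME and the exact version numbers are assumptions
about your layout, and it must be repeated on every node running HBase):

# back up, then remove the Hadoop 2.2.0 jars bundled under HBase's lib dir
cd $HBASE_HOME/lib
mkdir -p /tmp/hbase-lib-backup
mv hadoop-*-2.2.0.jar /tmp/hbase-lib-backup/
# copy in the matching jars from the Hadoop actually running on the cluster
find $HADOOP_HOME/share/hadoop -name 'hadoop-*-2.6.0.jar' -exec cp {} . \;
# then restart HBase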



On 3/1/15, 8:34 PM, "Santosh Akhilesh" <santoshakhilesh@gmail.com> wrote:

>Hi Shaofeng,
>
>                 I have raised the bug; please suggest a resolution or an
>alternative ASAP:
>                  https://issues.apache.org/jira/browse/KYLIN-617
>
>Regards,
>Santosh Akhilesh
>
>
>On Sun, Mar 1, 2015 at 5:02 PM, Shi, Shaofeng <shaoshi@ebay.com> wrote:
>
>> Hi Santosh, this is very likely the problem; We will verify this on
>> Monday; In the meantime, could you please report a new JIRA with this
>> problem and your findings? I appreciate your input!
>>
>> On 3/1/15, 3:03 PM, "Santosh Akhilesh" <santoshakhilesh@gmail.com>
>>wrote:
>>
>> >Hi Shaofeng,
>> >      My MapReduce application classpath doesn't contain the HBase libs.
>> >But I find that the kylin.sh start/stop script initializes the HBase
>> >environment before anything else. So in kylin.log the client environment
>> >loads the HBase client libs before Hadoop, and the HBase client lib is
>> >2.2.0. Is this issue related to the kylin.sh startup script? I am
>> >attaching my classpath setting from mapred-site.xml and the classpath
>> >printed in kylin.log.
>> >
>> ><name>mapreduce.application.classpath</name>
>> ><value>
>> >  /tmp/kylin/*,
>> >  /home/santosh/work/frameworks/hadoop-2.6.0/etc/hadoop,
>> >  /home/santosh/work/frameworks/hadoop-2.6.0/etc/hadoop,
>> >  /home/santosh/work/frameworks/hadoop-2.6.0/etc/hadoop,
>> >  /home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/common/lib/*,
>> >  /home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/common/*,
>> >  /home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/hdfs,
>> >  /home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/hdfs/lib/*,
>> >  /home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/hdfs/*,
>> >  /home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/yarn/lib/*,
>> >  /home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/yarn/*,
>> >  /home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/mapreduce/lib/*,
>> >  /home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/mapreduce/*,
>> >  /contrib/capacity-scheduler/*.jar,
>> >  /home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/yarn/*,
>> >  /home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/yarn/lib/*,
>> >  /home/santosh/work/frameworks/apache-hive-1.0.0/conf,
>> >  /home/santosh/work/frameworks/apache-hive-1.0.0/hcatalog/share/hcatalog/*,
>> >  /home/santosh/work/frameworks/apache-hive-1.0.0/lib/hive-exec-1.0.0.jar
>> ></value>
>> >
>> >
>> >Kylin.log
>> >Client environment:java.class.path= (abridged; note that the
>> >hbase-0.98.10 lib entries, including HBase's bundled hadoop-*-2.2.0
>> >jars, come before any hadoop-2.6.0 entry)
>> >
>> >  /etc/kylin
>> >  /home/santosh/work/software/tomcat/bin/bootstrap.jar
>> >  [... the other Tomcat jars under /home/santosh/work/software/tomcat ...]
>> >  /home/santosh/work/frameworks/hbase-0.98.10/bin/../conf
>> >  /home/santosh/work/java/jdk1.7.0_75/lib/tools.jar
>> >  /home/santosh/work/frameworks/hbase-0.98.10/bin/..
>> >  /home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/hadoop-annotations-2.2.0.jar
>> >  /home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/hadoop-auth-2.2.0.jar
>> >  /home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/hadoop-client-2.2.0.jar
>> >  /home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/hadoop-common-2.2.0.jar
>> >  /home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/hadoop-hdfs-2.2.0.jar
>> >  /home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/hadoop-mapreduce-client-app-2.2.0.jar
>> >  /home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/hadoop-mapreduce-client-common-2.2.0.jar
>> >  /home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/hadoop-mapreduce-client-core-2.2.0.jar
>> >  /home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/hadoop-mapreduce-client-jobclient-2.2.0.jar
>> >  /home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/hadoop-mapreduce-client-shuffle-2.2.0.jar
>> >  /home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/hadoop-yarn-api-2.2.0.jar
>> >  /home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/hadoop-yarn-client-2.2.0.jar
>> >  /home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/hadoop-yarn-common-2.2.0.jar
>> >  /home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/hadoop-yarn-server-common-2.2.0.jar
>> >  /home/santosh/work/frameworks/hbase-0.98.10/bin/../lib/hadoop-yarn-server-nodemanager-2.2.0.jar
>> >  [... the remaining hbase-0.98.10/bin/../lib jars: hbase-*-0.98.10-hadoop2,
>> >  guava-12.0.1, protobuf-java-2.5.0, zookeeper-3.4.6, jetty-6.1.26, etc. ...]
>> >  /home/santosh/work/frameworks/hadoop-2.6.0/etc/hadoop
>> >  [... the hadoop-2.6.0 jars under share/hadoop/{common,hdfs,yarn,mapreduce},
>> >  all at version 2.6.0, e.g. hadoop-common-2.6.0.jar, hadoop-hdfs-2.6.0.jar,
>> >  hadoop-mapreduce-client-core-2.6.0.jar ...]
>> >  /contrib/capacity-scheduler/*.jar
>> >
>> >
>> >On Sat, Feb 28, 2015 at 9:06 AM, Shi, Shaofeng <shaoshi@ebay.com> 
>>wrote:
>> >
>> >> I don't think downgrading HBase can fix that; the jar version in HBase
>> >> is lower. I suggest checking mapred-site.xml: find the property
>> >> mapreduce.application.classpath, which is the classpath loaded in MR,
>> >> and check whether the hbase folder was put ahead of the hadoop
>> >> folders; see the sketch below.
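>> >>
>> >> For example, if xmllint is available, something like this prints the
>> >> entries one per line, in load order (illustrative only):
>> >>
>> >> xmllint --xpath \
>> >>   'string(//property[name="mapreduce.application.classpath"]/value)' \
>> >>   mapred-site.xml | tr ',' '\n'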
>> >>
>> >> On 2/28/15, 11:25 AM, "Santosh Akhilesh" <santoshakhilesh@gmail.com>
>> >> wrote:
>> >>
>> >> >The only difference I find in my setup is HBase: mine is 0.98.10 and
>> >> >Kylin's is 0.98.4. I will try downgrading my HBase, though I really
>> >> >doubt that this will solve the problem; but since there is no
>> >> >alternative option in sight, I will give it a try anyhow.
>> >> >
>> >> >Sent from Outlook on iPhone
>> >> >
>> >> >
>> >> >
>> >> >
>> >> >On Fri, Feb 27, 2015 at 7:00 PM -0800 "Shi, Shaofeng"
>> >><shaoshi@ebay.com>
>> >> >wrote:
>> >> >
>> >> >
>> >> >
>> >> >
>> >> >
>> >> >
>> >> >
>> >> >
>> >> >
>> >> >
>> >> >Hmm… please use client jars of the same version level as the cluster;
>> >> >Kylin's pom.xml compiles with 2.6.0 jars:
>> >> >https://github.com/KylinOLAP/Kylin/blob/master/pom.xml#L19
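>> >> >
>> >> >A quick way to spot mixed versions on disk is to list every hadoop-*
>> >> >jar under both installs (a sketch; HBASE_HOME and HADOOP_HOME are
>> >> >assumptions about your layout):
>> >> >
>> >> >for d in $HBASE_HOME/lib $HADOOP_HOME/share/hadoop/*; do
>> >> >  ls "$d" 2>/dev/null
>> >> >done | grep '^hadoop-.*\.jar$' | sort -u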
>> >> >
>> >> >
>> >> >On 2/27/15, 8:53 PM, "Santoshakhilesh"  wrote:
>> >> >
>> >> >>Hi Shaofeng,
>> >> >>    I checked the HBase libs. I am using hbase-0.98.10-hadoop2, and it
>> >> >>uses hadoop-mapreduce-client-app-2.2.0.jar, but Hadoop is at 2.6.0.
>> >> >>
>> >> >>Is this the issue?
>> >> >>
>> >> >>I checked the Kylin POM; it uses 0.98.4-hadoop2.
>> >> >>
>> >> >>Is this problem due to this mismatch? Do you suggest I try changing my
>> >> >>HBase version?
>> >> >>
>> >> >>Regards,
>> >> >>Santosh Akhilesh
>> >> >>Bangalore R&D
>> >> >>HUAWEI TECHNOLOGIES CO.,LTD.
>> >> >>
>> >> >>www.huawei.com
>> >> >>
>> >> >>________________________________________
>> >> >>From: Santoshakhilesh [santosh.akhilesh@huawei.com]
>> >> >>Sent: Friday, February 27, 2015 4:49 PM
>> >> >>To: dev@kylin.incubator.apache.org
>> >> >>Cc: Kulbhushan Rana
>> >> >>Subject: RE: Cube Build Failed at Last Step//RE: Error while making
>> >>cube
>> >> >>& Measure option is not responding on GUI
>> >> >>
>> >> >>Hi Shaofeng,
>> >> >>    I configured the job history server and there is no more
>> >> >>connection exception; now I get the MR counter exception which we
>> >> >>were suspecting.
>> >> >>    My Hadoop version is indeed 2.6.0, so any idea what can be done
>> >> >>about this?
>> >> >>
>> >> >>[QuartzScheduler_Worker-8]:[2015-02-28 00:36:26,507][DEBUG][com.kylinolap.job.tools.HadoopStatusChecker.checkStatus(HadoopStatusChecker.java:74)] - State of Hadoop job: job_1424957178195_0031:FINISHED-SUCCEEDED
>> >> >>[QuartzScheduler_Worker-8]:[2015-02-28 00:36:27,204][ERROR][com.kylinolap.job.cmd.JavaHadoopCmdOutput.updateJobCounter(JavaHadoopCmdOutput.java:176)] - No enum constant org.apache.hadoop.mapreduce.JobCounter.MB_MILLIS_REDUCES
>> >> >>java.lang.IllegalArgumentException: No enum constant org.apache.hadoop.mapreduce.JobCounter.MB_MILLIS_REDUCES
>> >> >> at java.lang.Enum.valueOf(Enum.java:236)
>> >> >> at org.apache.hadoop.mapreduce.counters.FrameworkCounterGroup.valueOf(FrameworkCounterGroup.java:148)
>> >> >> at org.apache.hadoop.mapreduce.counters.FrameworkCounterGroup.findCounter(FrameworkCounterGroup.java:182)
>> >> >> at org.apache.hadoop.mapreduce.counters.AbstractCounters.findCounter(AbstractCounters.java:154)
>> >> >> at org.apache.hadoop.mapreduce.TypeConverter.fromYarn(TypeConverter.java:240)
>> >> >> at org.apache.hadoop.mapred.ClientServiceDelegate.getJobCounters(ClientServiceDelegate.java:370)
>> >> >> at org.apache.hadoop.mapred.YARNRunner.getJobCounters(YARNRunner.java:511)
>> >> >> at org.apache.hadoop.mapreduce.Job$7.run(Job.java:756)
>> >> >> at org.apache.hadoop.mapreduce.Job$7.run(Job.java:753)
>> >> >> at java.security.AccessController.doPrivileged(Native Method)
>> >> >> at javax.security.auth.Subject.doAs(Subject.java:415)
>> >> >> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>> >> >> at org.apache.hadoop.mapreduce.Job.getCounters(Job.java:753)
>> >> >> at com.kylinolap.job.hadoop.AbstractHadoopJob.getCounters(AbstractHadoopJob.java:287)
>> >> >> at com.kylinolap.job.cmd.JavaHadoopCmdOutput.updateJobCounter(JavaHadoopCmdOutput.java:162)
>> >> >> at com.kylinolap.job.cmd.JavaHadoopCmdOutput.getStatus(JavaHadoopCmdOutput.java:85)
>> >> >> at com.kylinolap.job.flow.AsyncJobFlowNode.execute(AsyncJobFlowNode.java:86)
>> >> >> at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
>> >> >> at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)
>> >> >>
>> >> >>Regards,
>> >> >>Santosh Akhilesh
>> >> >>Bangalore R&D
>> >> >>HUAWEI TECHNOLOGIES CO.,LTD.
>> >> >>
>> >> >>www.huawei.com
>> >> >>
>> >> >>________________________________________
>> >> >>From: Shi, Shaofeng [shaoshi@ebay.com]
>> >> >>Sent: Friday, February 27, 2015 3:10 PM
>> >> >>To: dev@kylin.incubator.apache.org
>> >> >>Subject: Re: Cube Build Failed at Last Step//RE: Error while making
>> >>cube
>> >> >>& Measure option is not responding on GUI
>> >> >>
>> >> >>0.0.0.0:10020 isn’t a valid network address I think; please check 
>>the
>> >> >>“mapreduce.jobhistory.address” in your mapred-site.xml; it should 
>>be
>> >> >>something like:
>> >> >>
>> >> >><property>
>> >> >>  <name>mapreduce.jobhistory.address</name>
>> >> >>  <value>sandbox.hortonworks.com:10020</value>
>> >> >></property>
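>> >> >>
>> >> >>After setting it, also make sure the history server process is
>> >> >>actually running; on a plain Apache Hadoop 2.x install that is
>> >> >>typically:
>> >> >>
>> >> >>$HADOOP_HOME/sbin/mr-jobhistory-daemon.sh start historyserver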
>> >> >>
>> >> >>On 2/27/15, 5:29 PM, "Santoshakhilesh"
>> >> >>wrote:
>> >> >>
>> >> >>>Hi Shaofeng,
>> >> >>>   No, I have not found the MR counter exception. I get the following
>> >> >>>exception frequently; I think it is related to the Hadoop job history
>> >> >>>server.
>> >> >>>
>> >> >>>[QuartzScheduler_Worker-23]:[2015-02-27 22:18:37,299][ERROR][com.kylinolap.job.cmd.JavaHadoopCmdOutput.updateJobCounter(JavaHadoopCmdOutput.java:176)] - java.io.IOException: java.net.ConnectException: Call From linux/10.19.93.68 to 0.0.0.0:10020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
>> >> >>>com.kylinolap.job.exception.JobException: java.io.IOException: java.net.ConnectException: Call From linux/10.19.93.68 to 0.0.0.0:10020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
>> >> >>> at com.kylinolap.job.hadoop.AbstractHadoopJob.getCounters(AbstractHadoopJob.java:289)
>> >> >>> at com.kylinolap.job.cmd.JavaHadoopCmdOutput.updateJobCounter(JavaHadoopCmdOutput.java:162)
>> >> >>> at com.kylinolap.job.cmd.JavaHadoopCmdOutput.getStatus(JavaHadoopCmdOutput.java:85)
>> >> >>> at com.kylinolap.job.flow.AsyncJobFlowNode.execute(AsyncJobFlowNode.java:86)
>> >> >>> at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
>> >> >>> at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)
>> >> >>>Caused by: java.io.IOException: java.net.ConnectException: Call From linux/10.19.93.68 to 0.0.0.0:10020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
>> >> >>>
>> >> >>>Regards,
>> >> >>>Santosh Akhilesh
>> >> >>>Bangalore R&D
>> >> >>>HUAWEI TECHNOLOGIES CO.,LTD.
>> >> >>>
>> >> >>>www.huawei.com
>> >> >>>
>> >> >>>________________________________________
>> >> >>>From: Shi, Shaofeng [shaoshi@ebay.com]
>> >> >>>Sent: Friday, February 27, 2015 2:47 PM
>> >> >>>To: dev@kylin.incubator.apache.org
>> >> >>>Cc: Kulbhushan Rana
>> >> >>>Subject: Re: Cube Build Failed at Last Step//RE: Error while 
>>making
>> >>cube
>> >> >>>& Measure option is not responding on GUI
>> >> >>>
>> >> >>>Did you figure out the exception "No enum constant
>> >> >>>org.apache.hadoop.mapreduce.JobCounter.MB_MILLIS_REDUCES"? Is it
>> >> >>>still being thrown in the logs? In the last step, Kylin needs to
>> >> >>>parse the MR counters to update the cube size; please refer to
>> >> >>>https://issues.apache.org/jira/browse/MAPREDUCE-5831 for that error.
>> >> >>>
>> >> >>>On 2/27/15 5:04 PM, "Santoshakhilesh"
>> >> >>>wrote:
>> >> >>>
>> >> >>>>Hi Shaofeng,
>> >> >>>>          Cube building failed at the last step, while loading the
>> >> >>>>HFile into HBase, with the exception "Can't get cube segment size."
>> >> >>>>What could be the reason?
>> >> >>>>
>> >> >>>>parameter : -input
>> >> >>>>/tmp/kylin-17a4606f-905b-4ea1-922a-27c2bfb5c68b/RetailCube/hfile/
>> >> >>>>-htablename KYLIN_K27LDMX63W -cubename RetailCube
>> >> >>>>
>> >> >>>>Log:
>> >> >>>>
>> >> >>>>Start to execute command:
>> >> >>>> -input /tmp/kylin-17a4606f-905b-4ea1-922a-27c2bfb5c68b/RetailCube/hfile/ -htablename KYLIN_K27LDMX63W -cubename RetailCube
>> >> >>>>Command execute return code 0
>> >> >>>>Failed with Exception:java.lang.RuntimeException: Can't get cube segment size.
>> >> >>>> at com.kylinolap.job.flow.JobFlowListener.updateCubeSegmentInfoOnSucceed(JobFlowListener.java:247)
>> >> >>>> at com.kylinolap.job.flow.JobFlowListener.jobWasExecuted(JobFlowListener.java:101)
>> >> >>>> at org.quartz.core.QuartzScheduler.notifyJobListenersWasExecuted(QuartzScheduler.java:1985)
>> >> >>>> at org.quartz.core.JobRunShell.notifyJobListenersComplete(JobRunShell.java:340)
>> >> >>>> at org.quartz.core.JobRunShell.run(JobRunShell.java:224)
>> >> >>>> at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)
>> >> >>>>
>> >> >>>>I have checked in the hbase shell, and the following are the tables
>> >> >>>>in HBase:
>> >> >>>>hbase(main):001:0> list
>> >> >>>>TABLE
>> >> >>>>
>> >> >>>>KYLIN_K27LDMX63W
>> >> >>>>kylin_metadata_qa
>> >> >>>>kylin_metadata_qa_acl
>> >> >>>>kylin_metadata_qa_cube
>> >> >>>>kylin_metadata_qa_dict
>> >> >>>>kylin_metadata_qa_invertedindex
>> >> >>>>kylin_metadata_qa_job
>> >> >>>>kylin_metadata_qa_job_output
>> >> >>>>kylin_metadata_qa_proj
>> >> >>>>kylin_metadata_qa_table_snapshot
>> >> >>>>kylin_metadata_qa_user
>> >> >>>>11 row(s) in 0.8990 seconds
>> >> >>>>
>> >> >>>>
>> >> >>>>Regards,
>> >> >>>>Santosh Akhilesh
>> >> >>>>Bangalore R&D
>> >> >>>>HUAWEI TECHNOLOGIES CO.,LTD.
>> >> >>>>
>> >> >>>>www.huawei.com
>> >> >>>>
>> >> >>>>________________________________________
>> >> >>>>From: Santoshakhilesh
>> >> >>>>Sent: Friday, February 27, 2015 2:15 PM
>> >> >>>>To: dev@kylin.incubator.apache.org
>> >> >>>>Subject: RE: Error while making cube & Measure option is not
>> >>responding
>> >> >>>>on GUI
>> >> >>>>
>> >> >>>>I have manually copied the jar to /tmp/kylin; now stage 2 is done,
>> >> >>>>thanks.
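>> >> >>>>
>> >> >>>>(The copy was roughly the following; the source path is an
>> >> >>>>assumption about where the kylin-job jar was built:)
>> >> >>>>
>> >> >>>>mkdir -p /tmp/kylin
>> >> >>>>cp <path-to-kylin-build>/kylin-job-latest.jar /tmp/kylin/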
>> >> >>>>
>> >> >>>>Regards,
>> >> >>>>Santosh Akhilesh
>> >> >>>>Bangalore R&D
>> >> >>>>HUAWEI TECHNOLOGIES CO.,LTD.
>> >> >>>>
>> >> >>>>www.huawei.com
>> >> >>>>
>> >> >>>>________________________________________
>> >> >>>>From: Shi, Shaofeng [shaoshi@ebay.com]
>> >> >>>>Sent: Friday, February 27, 2015 1:00 PM
>> >> >>>>To: dev@kylin.incubator.apache.org
>> >> >>>>Cc: Kulbhushan Rana
>> >> >>>>Subject: Re: Error while making cube & Measure option is not
>> >>responding
>> >> >>>>on GUI
>> >> >>>>
>> >> >>>>In 0.6.x the packages are named "com.kylinolap.xxx"; from 0.7 we
>> >> >>>>renamed the packages to "org.apache.kylin.xxx". When you downgraded
>> >> >>>>to 0.6, did you also replace the jar location with the 0.6 one in
>> >> >>>>kylin.properties?
>> >> >>>>
>> >> >>>>On 2/27/15, 3:13 PM, "Santoshakhilesh"
>> >> >>>>wrote:
>> >> >>>>
>> >> >>>>>Hi Shaofeng,
>> >> >>>>>         I have added my fact and dimension tables under the default
>> >> >>>>>database of Hive.
>> >> >>>>>         Now stage 1 of the cube build is OK, but there is a failure
>> >> >>>>>at step 2: the MapReduce job that finds the distinct columns of the
>> >> >>>>>fact table errors out. The YARN log is below.
>> >> >>>>>        Strangely, this is a class-not-found error, although I have
>> >> >>>>>checked kylin.properties and the jar is already set as below.
>> >> >>>>>kylin.log also has one exception connecting from linux/10.19.93.68
>> >> >>>>>to 0.0.0.0:10020.
>> >> >>>>> Please help me with a clue; I am also trying to check meanwhile.
>> >> >>>>>
>> >> >>>>>Thanks.
>> >> >>>>>kylin.properties:
>> >> >>>>># Temp folder in hdfs
>> >> >>>>>kylin.hdfs.working.dir=/tmp
>> >> >>>>># Path to the local (relative to job engine) job jar; the job
>> >> >>>>># engine will use this jar
>> >> >>>>>kylin.job.jar=/tmp/kylin/kylin-job-latest.jar
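>> >> >>>>>
>> >> >>>>>(Since the error below is a missing class, a quick sanity check is
>> >> >>>>>to verify the jar really exists at the configured path:)
>> >> >>>>>
>> >> >>>>>ls -l /tmp/kylin/kylin-job-latest.jar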
>> >> >>>>>
>> >> >>>>>Map Reduce error
>> >> >>>>>----------------------------
>> >> >>>>>2015-02-27 20:24:25,262 FATAL [main] org.apache.hadoop.mapred.YarnChild: Error running child : java.lang.NoClassDefFoundError: com/kylinolap/common/mr/KylinMapper
>> >> >>>>> at java.lang.ClassLoader.defineClass1(Native Method)
>> >> >>>>> at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
>> >> >>>>> at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>> >> >>>>> at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
>> >> >>>>> at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
>> >> >>>>> at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
>> >> >>>>> at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>> >> >>>>> at java.security.AccessController.doPrivileged(Native Method)
>> >> >>>>> at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>> >> >>>>> at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>> >> >>>>> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
>> >> >>>>> at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>> >> >>>>> at java.lang.Class.forName0(Native Method)
>> >> >>>>> at java.lang.Class.forName(Class.java:274)
>> >> >>>>> at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:2013)
>> >> >>>>>
>> >> >>>>>Kylin.log
>> >> >>>>>[QuartzScheduler_Worker-20]:[2015-02-27 20:25:00,663][DEBUG][com.kylinolap.job.engine.JobFetcher.execute(JobFetcher.java:60)] - 0 pending jobs
>> >> >>>>>[QuartzScheduler_Worker-19]:[2015-02-27 20:25:01,730][ERROR][com.kylinolap.job.cmd.JavaHadoopCmdOutput.updateJobCounter(JavaHadoopCmdOutput.java:176)] - java.io.IOException: java.net.ConnectException: Call From linux/10.19.93.68 to 0.0.0.0:10020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
>> >> >>>>>com.kylinolap.job.exception.JobException: java.io.IOException: java.net.ConnectException: Call From linux/10.19.93.68 to 0.0.0.0:10020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
>> >> >>>>> at com.kylinolap.job.hadoop.AbstractHadoopJob.getCounters(AbstractHadoopJob.java:289)
>> >> >>>>> at com.kylinolap.job.cmd.JavaHadoopCmdOutput.updateJobCounter(JavaHadoopCmdOutput.java:162)
>> >> >>>>> at com.kylinolap.job.cmd.JavaHadoopCmdOutput.getStatus(JavaHadoopCmdOutput.java:85)
>> >> >>>>> at com.kylinolap.job.flow.AsyncJobFlowNode.execute(AsyncJobFlowNode.java:86)
>> >> >>>>> at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
>> >> >>>>> at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)
>> >> >>>>>Caused by: java.io.IOException: java.net.ConnectException: Call From linux/10.19.93.68 to 0.0.0.0:10020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
>> >> >>>>> at org.apache.hadoop.mapred.ClientServiceDelegate.invoke(ClientServiceDelegate.java:331)
>> >> >>>>> at org.apache.hadoop.mapred.ClientServiceDelegate.getJobCounters(ClientServiceDelegate.java:368)
>> >> >>>>> at org.apache.hadoop.mapred.YARNRunner.getJobCounters(YARNRunner.java:511)
>> >> >>>>> at org.apache.hadoop.mapreduce.Job$7.run(Job.java:756)
>> >> >>>>> at org.apache.hadoop.mapreduce.Job$7.run(Job.java:753)
>> >> >>>>> at java.security.AccessController.doPrivileged(Native Method)
>> >> >>>>> at javax.security.auth.Subject.doAs(Subject.java:415)
>> >> >>>>> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>> >> >>>>> at org.apache.hadoop.mapreduce.Job.getCounters(Job.java:753)
>> >> >>>>> at com.kylinolap.job.hadoop.AbstractHadoopJob.getCounters(AbstractHadoopJob.java:287)
>> >> >>>>> ... 5 more
>> >> >>>>>
>> >> >>>>>Regards,
>> >> >>>>>Santosh Akhilesh
>> >> >>>>>Bangalore R&D
>> >> >>>>>HUAWEI TECHNOLOGIES CO.,LTD.
>> >> >>>>>
>> >> >>>>>www.huawei.com
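
A side note on the ConnectException above: port 10020 is the default
mapreduce.jobhistory.address, so "Connection refused" against 0.0.0.0:10020
usually means the MapReduce JobHistory Server is not running (the trace shows
Kylin asking it for job counters). A minimal sketch of the usual fix, assuming
a stock Hadoop 2.x layout; $HADOOP_HOME and the hostname are placeholders:

    # start the JobHistory Server (it listens on port 10020 by default)
    $HADOOP_HOME/sbin/mr-jobhistory-daemon.sh start historyserver

    # and point clients at its real host in mapred-site.xml, e.g.
    #   <property>
    #     <name>mapreduce.jobhistory.address</name>
    #     <value>history-host:10020</value>
    #   </property>
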
>> >>
>> 
>> >> >>>>>
>> >> >>>>>________________________________________
>> >> >>>>>From: Shi, Shaofeng [shaoshi@ebay.com]
>> >> >>>>>Sent: Friday, February 27, 2015 8:01 AM
>> >> >>>>>To: dev@kylin.incubator.apache.org
>> >> >>>>>Subject: Re: Error while making cube & Measure option is not
>> >> >>>>>responding
>> >> >>>>>on GUI
>> >> >>>>>
>> >> >>>>>In 0.6.x, only tables in the default database are supported; this is
>> >> >>>>>a known limitation. Support for non-default databases will be
>> >> >>>>>released in 0.7.
>> >> >>>>>
>> >> >>>>>To bypass this issue for now, please copy the tables to the default
>> >> >>>>>database as a workaround.
>> >> >>>>>
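
A minimal sketch of that copy-to-default workaround, using the retail tables
named later in this thread (plain CTAS copies; partitioning and storage
format, if they matter, would need extra handling):

    hive -e "
    CREATE TABLE default.fact_sales   AS SELECT * FROM retail.fact_sales;
    CREATE TABLE default.dim_store    AS SELECT * FROM retail.dim_store;
    CREATE TABLE default.dim_item     AS SELECT * FROM retail.dim_item;
    CREATE TABLE default.dim_customer AS SELECT * FROM retail.dim_customer;
    "
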
>> >> >>>>>On 2/27/15, 10:16 AM, "Santosh Akhilesh" wrote:
>> >> >>>>>
>> >> >>>>>>@Jason
>> >> >>>>>>Thanks, but as suggested by Shaofeng I am no longer using the
>> >> >>>>>>inverted-index branch, since it is not stable.
>> >> >>>>>>I have switched back to the 0.6 branch; on this branch, last night I
>> >> >>>>>>could create the cube successfully, but there is an issue while
>> >> >>>>>>building it. At step 1 of the cube build, when the command to create
>> >> >>>>>>the flat table is issued to hive, the creation fails if the tables
>> >> >>>>>>are not under the default database, and the cube build fails with
>> >> >>>>>>it. My fact and dimension tables are under a database called retail.
>> >> >>>>>>
>> >> >>>>>>@Shaofeng - Can you please confirm this behavior? Do I need to
>> >> >>>>>>create the hive tables under the default database?
>> >> >>>>>>
>> >> >>>>>>On Fri, Feb 27, 2015 at 7:32 AM, jason zhong wrote:
>> >> >>>>>>
>> >> >>>>>>> @Santoshakhilesh
>> >> >>>>>>>
>> >> >>>>>>> 1. When I go to the measure section and click on the measure
>> >> >>>>>>> option, there is no response; I want to add measures on qty and
>> >> >>>>>>> price with sum
>> >> >>>>>>>          -- bug fixed on inverted-index branch
>> >> >>>>>>>
>> >> >>>>>>>
>> >> >>>>>>> On Fri, Feb 27, 2015 at 3:03 AM, Santosh Akhilesh <
>> >> >>>>>>> santoshakhilesh@gmail.com
>> >> >>>>>>> > wrote:
>> >> >>>>>>>
>> >> >>>>>>> > Hi Shaofeng ,
>> >> >>>>>>> >      I have built the 0.6 version and am now able to create the
>> >> >>>>>>> > cube successfully.
>> >> >>>>>>> >      While building the cube, it fails at step 1 with the
>> >> >>>>>>> > following error: Table not found 'DIM_ITEM'
>> >> >>>>>>> >      The table exists, but it is under the retail database and
>> >> >>>>>>> > not under the default database.
>> >> >>>>>>> >      Does kylin require hive tables to be under the default
>> >> >>>>>>> > database? I see the flat table being created under the default
>> >> >>>>>>> > database.
>> >> >>>>>>> >
>> >> >>>>>>> > Logging initialized using configuration in
>> >> >>>>>>> > jar:file:/home/santosh/work/frameworks/apache-hive-1.0.0/lib/hive-common-1.0.0.jar!/hive-log4j.properties
>> >> >>>>>>> > SLF4J: Class path contains multiple SLF4J bindings.
>> >> >>>>>>> > SLF4J: Found binding in
>> >> >>>>>>> > [jar:file:/home/santosh/work/frameworks/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> >> >>>>>>> > SLF4J: Found binding in
>> >> >>>>>>> > [jar:file:/home/santosh/work/frameworks/apache-hive-1.0.0/lib/hive-jdbc-1.0.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> >> >>>>>>> > SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for
>> >> >>>>>>> > an explanation.
>> >> >>>>>>> > SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>> >> >>>>>>> > OK
>> >> >>>>>>> > Time taken: 0.964 seconds
>> >> >>>>>>> > OK
>> >> >>>>>>> > Time taken: 0.948 seconds
>> >> >>>>>>> > FAILED: SemanticException [Error 10001]: Line 12:11 Table not
>> >> >>>>>>> > found 'DIM_ITEM'
>> >> >>>>>>> >
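
The SemanticException above falls out of the same database mismatch: the
generated statement references DIM_ITEM unqualified, so hive resolves it in
the default database. A quick, hedged way to confirm where the tables
actually live:

    hive -e "SHOW TABLES IN retail;"    # the fact/dim tables should appear here
    hive -e "SHOW TABLES IN default;"   # ...and not here
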
>> >> >>>>>>> >
>> >> >>>>>>> >
>> >> >>>>>>> > Command is as below.
>> >> >>>>>>> >
>> >> >>>>>>> > hive -e "DROP TABLE IF EXISTS
>> >> >>>>>>> > kylin_intermediate_test_FULL_BUILD_8b30b29b_5f2c_4b63_8c0f_07d1f559dd44;
>> >> >>>>>>> > CREATE EXTERNAL TABLE IF NOT EXISTS
>> >> >>>>>>> > kylin_intermediate_test_FULL_BUILD_8b30b29b_5f2c_4b63_8c0f_07d1f559dd44
>> >> >>>>>>> > (
>> >> >>>>>>> > STOREID int
>> >> >>>>>>> > ,ITEMID int
>> >> >>>>>>> > ,CUSTID int
>> >> >>>>>>> > ,QTY int
>> >> >>>>>>> > ,AMOUNT double
>> >> >>>>>>> > )
>> >> >>>>>>> > ROW FORMAT DELIMITED FIELDS TERMINATED BY '\177'
>> >> >>>>>>> > STORED AS SEQUENCEFILE
>> >> >>>>>>> > LOCATION
>> >> >>>>>>> > '/tmp/kylin-8b30b29b-5f2c-4b63-8c0f-07d1f559dd44/kylin_intermediate_test_FULL_BUILD_8b30b29b_5f2c_4b63_8c0f_07d1f559dd44';
>> >> >>>>>>> > SET hive.exec.compress.output=true;
>> >> >>>>>>> > SET hive.auto.convert.join.noconditionaltask = true;
>> >> >>>>>>> > SET hive.auto.convert.join.noconditionaltask.size = 300000000;
>> >> >>>>>>> > INSERT OVERWRITE TABLE
>> >> >>>>>>> > kylin_intermediate_test_FULL_BUILD_8b30b29b_5f2c_4b63_8c0f_07d1f559dd44
>> >> >>>>>>> > SELECT
>> >> >>>>>>> > FACT_SALES.STOREID
>> >> >>>>>>> > ,FACT_SALES.ITEMID
>> >> >>>>>>> > ,FACT_SALES.CUSTID
>> >> >>>>>>> > ,FACT_SALES.QTY
>> >> >>>>>>> > ,FACT_SALES.AMOUNT
>> >> >>>>>>> > FROM FACT_SALES
>> >> >>>>>>> > INNER JOIN DIM_STORE
>> >> >>>>>>> > ON FACT_SALES.STOREID = DIM_STORE.SROREID
>> >> >>>>>>> > INNER JOIN DIM_ITEM
>> >> >>>>>>> > ON FACT_SALES.ITEMID = DIM_ITEM.ITEMID
>> >> >>>>>>> > INNER JOIN DIM_CUSTOMER
>> >> >>>>>>> > ON FACT_SALES.CUSTID = DIM_CUSTOMER.CUSTID
>> >> >>>>>>> > ;
>> >> >>>>>>> > "
>> >> >>>>>>> >
>> >> >>>>>>> >
>> >> >>>>>>> >
>> >> >>>>>>> > On Thu, Feb 26, 2015 at 8:11 PM, Shi, Shaofeng wrote:
>> >> >>>>>>> >
>> >> >>>>>>> > > The 0.7.1 is a test version; its package name contains the
>> >> >>>>>>> > > "snapshot" suffix. We will upload a new package there, and
>> >> >>>>>>> > > Luke will also add a message there to avoid this confusion.
>> >> >>>>>>> > >
>> >> >>>>>>> > > Regarding the problem that you encountered, could you please
>> >> >>>>>>> > > open a JIRA ticket for tracking? Here is the link to Apache
>> >> >>>>>>> > > JIRA:
>> >> >>>>>>> > >
>> >> >>>>>>> > > https://issues.apache.org/jira/secure/Dashboard.jspa
>> >> >>>>>>> > >
>> >> >>>>>>> > >
>> >> >>>>>>> > > Thanks for the feedback!
>> >> >>>>>>> > >
>> >> >>>>>>> > > On 2/26/15, 10:21 PM, "Santosh Akhilesh" wrote:
>> >> >>>>>>> > >
>> >> >>>>>>> > > >Actually I see this being published on the kylin webpage:
>> >> >>>>>>> > > >http://kylin.incubator.apache.org/download/
>> >> >>>>>>> > > >I am using the 0.7.1 inverted-index branch binary
>> >> >>>>>>> > > >distribution. If this is not stable, please give me the link
>> >> >>>>>>> > > >to the stable branch; I would try building and testing
>> >> >>>>>>> > > >tonight.
>> >> >>>>>>> > > >On Thu, 26 Feb 2015 at 7:30 pm, Shi, Shaofeng wrote:
>> >> >>>>>>> > > >
>> >> >>>>>>> > > >> Hi Santosh, it is not recommended to use the dev code
>> >> >>>>>>> > > >> branch (actually I don't know how you got the v0.7.x build
>> >> >>>>>>> > > >> or what its exact version is; each day we submit many
>> >> >>>>>>> > > >> changes to it);
>> >> >>>>>>> > > >>
>> >> >>>>>>> > > >> The options are 1) switch back to the latest release,
>> >> >>>>>>> > > >> v0.6.5; or 2) wait for the formal release of 0.7, which
>> >> >>>>>>> > > >> should be in March; otherwise, we couldn't ensure that no
>> >> >>>>>>> > > >> new problems come out in your next steps;
>> >> >>>>>>> > > >>
>> >> >>>>>>> > > >> On 2/26/15, 5:39 PM, "Santosh Akhilesh" wrote:
>> >> >>>>>>> > > >>
>> >> >>>>>>> > > >> >Hi Shaofeng
>> >> >>>>>>> > > >> >So what do you suggest; how should I proceed further with
>> >> >>>>>>> > > >> >this release? Will there be a patch? Is there an alternate
>> >> >>>>>>> > > >> >way I can create the cube?
>> >> >>>>>>> > > >> >Please suggest.
>> >> >>>>>>> > > >> >Regards
>> >> >>>>>>> > > >> >Santosh
>> >> >>>>>>> > > >> >On Thu, 26 Feb 2015 at 3:04 pm, Shi, Shaofeng wrote:
>> >> >>>>>>> > > >> >
>> >> >>>>>>> > > >> >> Hi Santosh,
>> >> >>>>>>> > > >> >>
>> >> >>>>>>> > > >> >> 0.7.1 hasn't been formally released; from 0.6.x to
>> >> >>>>>>> > > >> >> 0.7.x there is a metadata structure change, and the web
>> >> >>>>>>> > > >> >> UI (cube wizard) for this change hasn't been stabilized
>> >> >>>>>>> > > >> >> yet; so it is not strange that you ran into trouble when
>> >> >>>>>>> > > >> >> saving the cube;
>> >> >>>>>>> > > >> >>
>> >> >>>>>>> > > >> >> @Jason, any idea about the JS error?
>> >> >>>>>>> > > >> >>
>> >> >>>>>>> > > >> >> On 2/26/15, 5:08 PM, "Santosh Akhilesh"
>> >> >>>>>>> > > >> >> <santoshakhilesh@gmail.com> wrote:
>> >> >>>>>>> > > >> >>
>> >> >>>>>>> > > >> >> >Hi Shaofeng,
>> >> >>>>>>> > > >> >> >
>> >> >>>>>>> > > >> >> >I am using the binary distribution 0.7.1. I have not
>> >> >>>>>>> > > >> >> >been able to save the cube even once. I have tried
>> >> >>>>>>> > > >> >> >creating a new project, both from my local machine and
>> >> >>>>>>> > > >> >> >from the server machine, but I am always stuck with
>> >> >>>>>>> > > >> >> >this error: I am never allowed to add measures and have
>> >> >>>>>>> > > >> >> >never been able to save the cube. I also checked
>> >> >>>>>>> > > >> >> >kylin.log, and it always tries to save the cube with
>> >> >>>>>>> > > >> >> >append mode. One thing I need to mention: since I don't
>> >> >>>>>>> > > >> >> >have a big fact table yet, I have not partitioned the
>> >> >>>>>>> > > >> >> >fact table and I skip the partition step. Does this
>> >> >>>>>>> > > >> >> >affect saving the cube? Is this because some metadata
>> >> >>>>>>> > > >> >> >is available and it tries to modify the cube? I am
>> >> >>>>>>> > > >> >> >using the latest Hadoop, 2.6.0. Yes, regarding the
>> >> >>>>>>> > > >> >> >kylin property: I have not added the jar; I will add it
>> >> >>>>>>> > > >> >> >and check. But the cube creation failure is really
>> >> >>>>>>> > > >> >> >puzzling me. I could see no error logs in kylin.log.
>> >> >>>>>>> > > >> >> >Regards
>> >> >>>>>>> > > >> >> >Santosh
>> >> >>>>>>> > > >> >> >On Thu, 26 Feb 2015 at 1:40 pm, Shi, Shaofeng wrote:
>> >> >>>>>>> > > >> >> >
>> >> >>>>>>> > > >> >> >> Which version or code branch are you using? I assume
>> >> >>>>>>> > > >> >> >> you're using the stable version from master. It seems
>> >> >>>>>>> > > >> >> >> you're trying to edit an existing cube to add a new
>> >> >>>>>>> > > >> >> >> measure; try refreshing your browser's cache, and if
>> >> >>>>>>> > > >> >> >> it still can't be saved, try to create a new cube.
>> >> >>>>>>> > > >> >> >>
>> >> >>>>>>> > > >> >> >> The two error traces in tomcat need to be taken care of:
>> >> >>>>>>> > > >> >> >>
>> >> >>>>>>> > > >> >> >> 1) java.lang.NoClassDefFoundError:
>> >> >>>>>>> > > >> >> >> org/apache/kylin/common/mr/KylinMapper
>> >> >>>>>>> > > >> >> >>         Please check the kylin.properties file,
>> >> >>>>>>> > > >> >> >> making sure "kylin.job.jar" points to the right jar
>> >> >>>>>>> > > >> >> >> file; it will be loaded in Map-Reduce;
>> >> >>>>>>> > > >> >> >>
>> >> >>>>>>> > > >> >> >> 2) java.lang.IllegalArgumentException: No enum constant
>> >> >>>>>>> > > >> >> >> org.apache.hadoop.mapreduce.JobCounter.MB_MILLIS_MAPS
>> >> >>>>>>> > > >> >> >>         This indicates your hadoop version might be
>> >> >>>>>>> > > >> >> >> old; please check and ensure the hadoop version is
>> >> >>>>>>> > > >> >> >> 2.2 or above.
>> >> >>>>>>> > > >> >> >>
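
Two quick checks matching the advice above. First, a sketch of the
kylin.properties line in question; the path and version here are illustrative
placeholders, not values taken from this installation:

    # kylin.properties
    kylin.job.jar=/usr/local/kylin/lib/kylin-job-0.6.5-job.jar

Second, the cluster's Hadoop version can be confirmed from the shell (worth
running on the nodes that execute tasks, too):

    hadoop version
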
>> >> >>>>>>> > > >> >> >> On 2/26/15, 3:21 PM, "Santoshakhilesh" wrote:
>> >> >>>>>>> > > >> >> >>
>> >> >>>>>>> > > >> >> >> >Hi Shaofeng ,
>> >> >>>>>>> > > >> >> >> >
>> >> >>>>>>> > > >> >> >> >   I am using chrome. When I click on the button to
>> >> >>>>>>> > > >> >> >> >add measures, the following error appears on the
>> >> >>>>>>> > > >> >> >> >chrome console. When I try to save the cube there is
>> >> >>>>>>> > > >> >> >> >no error in the console; I just get an error dialog
>> >> >>>>>>> > > >> >> >> >saying "failed to take action", which gives me the
>> >> >>>>>>> > > >> >> >> >JSON cube schema.
>> >> >>>>>>> > > >> >> >> >
>> >> >>>>>>> > > >> >> >> >Error on the chrome debug console is as below;
>> >> >>>>>>> > > >> >> >> >
>> >> >>>>>>> > > >> >> >> > ReferenceError: CubeDescModel is not defined
>> >> >>>>>>> > > >> >> >> >    at h.$scope.addNewMeasure (scripts.min.0.js:15984)
>> >> >>>>>>> > > >> >> >> >    at scripts.min.0.js:180
>> >> >>>>>>> > > >> >> >> >    at scripts.min.0.js:197
>> >> >>>>>>> > > >> >> >> >    at h.$eval (scripts.min.0.js:119)
>> >> >>>>>>> > > >> >> >> >    at h.$apply (scripts.min.0.js:119)
>> >> >>>>>>> > > >> >> >> >    at HTMLButtonElement.<anonymous> (scripts.min.0.js:197)
>> >> >>>>>>> > > >> >> >> >    at HTMLButtonElement.m.event.dispatch (scripts.min.0.js:3)
>> >> >>>>>>> > > >> >> >> >    at HTMLButtonElement.r.handle (scripts.min.0.js:3)
>> >> >>>>>>> > > >> >> >> >scripts.min.0.js:100 (anonymous function)
>> >> >>>>>>> > > >> >> >> >scripts.min.0.js:77 (anonymous function)
>> >> >>>>>>> > > >> >> >> >scripts.min.0.js:119 h.$apply
>> >> >>>>>>> > > >> >> >> >scripts.min.0.js:197 (anonymous function)
>> >> >>>>>>> > > >> >> >> >scripts.min.0.js:3 m.event.dispatch
>> >> >>>>>>> > > >> >> >> >scripts.min.0.js:3 r.handle
>> >> >>>>>>> > > >> >> >> >
>> >> >>>>>>> > > >> >> >> >   About the hive table import: I got past the "run
>> >> >>>>>>> > > >> >> >> >shell command" exception, but it still fails; the
>> >> >>>>>>> > > >> >> >> >hadoop log is:
>> >> >>>>>>> > > >> >> >> >2015-02-26 20:46:48,332 INFO [main] org.apache.hadoop.mapred.YarnChild:
>> >> >>>>>>> > > >> >> >> >mapreduce.cluster.local.dir for child:
>> >> >>>>>>> > > >> >> >> >/tmp/hadoop-root/nm-local-dir/usercache/root/appcache/application_1424953091340_0002
>> >> >>>>>>> > > >> >> >> >2015-02-26 20:46:48,776 INFO [main]
>> >> >>>>>>> > > >> >> >> >org.apache.hadoop.conf.Configuration.deprecation:
>> >> >>>>>>> > > >> >> >> >session.id is deprecated. Instead, use dfs.metrics.session-id
>> >> >>>>>>> > > >> >> >> >2015-02-26 20:46:49,310 INFO [main] org.apache.hadoop.mapred.Task:
>> >> >>>>>>> > > >> >> >> >Using ResourceCalculatorProcessTree : [ ]
>> >> >>>>>>> > > >> >> >> >2015-02-26 20:46:49,386 FATAL [main] org.apache.hadoop.mapred.YarnChild:
>> >> >>>>>>> > > >> >> >> >Error running child : java.lang.NoClassDefFoundError:
>> >> >>>>>>> > > >> >> >> >org/apache/kylin/common/mr/KylinMapper
>> >> >>>>>>> > > >> >> >> > at java.lang.ClassLoader.defineClass1(Native Method)
>> >> >>>>>>> > > >> >> >> > at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
>> >> >>>>>>> > > >> >> >> > at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>> >> >>>>>>> > > >> >> >> > at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
>> >> >>>>>>> > > >> >> >> > at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
>> >> >>>>>>> > > >> >> >> > at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
>> >> >>>>>>> > > >> >> >> > at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>> >> >>>>>>> > > >> >> >> > at java.security.AccessController.doPrivileged(Native Method)
>> >> >>>>>>> > > >> >> >> > at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>> >> >>>>>>> > > >> >> >> > at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>> >> >>>>>>> > > >> >> >> > at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
>> >> >>>>>>> > > >> >> >> >
>> >> >>>>>>> > > >> >> >> >tomcat logs:
>> >> >>>>>>> > > >> >> >> >usage: HiveColumnCardinalityJob
>> >> >>>>>>> > > >> >> >> > -output         Output path
>> >> >>>>>>> > > >> >> >> > -table
>> >> >>>>>>> > > >> >> >> >wrote:
>> >> >>>>>>> > > >> >> >> >
>> >> >>>>>>> > > >> >> >> >>Hi Shaofeng ,
>> >> >>>>>>> > > >> >> >> >>   Thanks for replying.
>> >> >>>>>>> > > >> >> >> >>   Yes, I am checking the yarn exception, but I
>> >> >>>>>>> > > >> >> >> >>find that this error comes while importing the hive
>> >> >>>>>>> > > >> >> >> >>table into kylin.
>> >> >>>>>>> > > >> >> >> >>   Even when this error comes, the hive table is
>> >> >>>>>>> > > >> >> >> >>still imported successfully into kylin. Is this the
>> >> >>>>>>> > > >> >> >> >>reason why saving the cube fails?
>> >> >>>>>>> > > >> >> >> >>
>> >> >>>>>>> > > >> >> >> >>   Next, when I go on creating the cube for the
>> >> >>>>>>> > > >> >> >> >>following schema, I get an error at the last step
>> >> >>>>>>> > > >> >> >> >>while saving, and I am unable to add any measures;
>> >> >>>>>>> > > >> >> >> >>clicking on the measure option just doesn't pop up
>> >> >>>>>>> > > >> >> >> >>any dialog.
>> >> >>>>>>> > > >> >> >> >>
>> >> >>>>>>> > > >> >> >> >>I am using a star schema, with fact_sales as the
>> >> >>>>>>> > > >> >> >> >>fact table and the dim_* tables as dimension tables.
>> >> >>>>>>> > > >> >> >> >>
>> >> >>>>>>> > > >> >> >> >> fact_sales:
>> >> >>>>>>> > > >> >> >> >> storeid                 int
>> >> >>>>>>> > > >> >> >> >> itemid                  int
>> >> >>>>>>> > > >> >> >> >> custid                  int
>> >> >>>>>>> > > >> >> >> >> qty                     int
>> >> >>>>>>> > > >> >> >> >> price                   double
>> >> >>>>>>> > > >> >> >> >>
>> >> >>>>>>> > > >> >> >> >>dim_customer
>> >> >>>>>>> > > >> >> >> >> custid                  int
>> >> >>>>>>> > > >> >> >> >> name                    string
>> >> >>>>>>> > > >> >> >> >>
>> >> >>>>>>> > > >> >> >> >> dim_item
>> >> >>>>>>> > > >> >> >> >> itemid                  int
>> >> >>>>>>> > > >> >> >> >> category                string
>> >> >>>>>>> > > >> >> >> >> brand                   string
>> >> >>>>>>> > > >> >> >> >> color                   string
>> >> >>>>>>> > > >> >> >> >>
>> >> >>>>>>> > > >> >> >> >>dim_store
>> >> >>>>>>> > > >> >> >> >> storeid                 int
>> >> >>>>>>> > > >> >> >> >> city                    string
>> >> >>>>>>> > > >> >> >> >> state                   string
>> >> >>>>>>> > > >> >> >> >>
>> >> >>>>>>> > > >> >> >> >> The JSON is as below.
>> >> >>>>>>> > > >> >> >> >>
>> >> >>>>>>> > > >> >> >> >> {
>> >> >>>>>>> > > >> >> >> >>   "name": "Retail_Cube",
>> >> >>>>>>> > > >> >> >> >>   "description": "",
>> >> >>>>>>> > > >> >> >> >>   "dimensions": [
>> >> >>>>>>> > > >> >> >> >>     {
>> >> >>>>>>> > > >> >> >> >>       "name": "RETAIL.FACT_SALES.STOREID",
>> >> >>>>>>> > > >> >> >> >>       "table": "RETAIL.FACT_SALES",
>> >> >>>>>>> > > >> >> >> >>       "hierarchy": false,
>> >> >>>>>>> > > >> >> >> >>       "derived": null,
>> >> >>>>>>> > > >> >> >> >>       "column": [
>> >> >>>>>>> > > >> >> >> >>         "STOREID"
>> >> >>>>>>> > > >> >> >> >>       ],
>> >> >>>>>>> > > >> >> >> >>       "id": 1
>> >> >>>>>>> > > >> >> >> >>     },
>> >> >>>>>>> > > >> >> >> >>     {
>> >> >>>>>>> > > >> >> >> >>       "name": "RETAIL.FACT_SALES.ITEMID",
>> >> >>>>>>> > > >> >> >> >>       "table": "RETAIL.FACT_SALES",
>> >> >>>>>>> > > >> >> >> >>       "hierarchy": false,
>> >> >>>>>>> > > >> >> >> >>       "derived": null,
>> >> >>>>>>> > > >> >> >> >>       "column": [
>> >> >>>>>>> > > >> >> >> >>         "ITEMID"
>> >> >>>>>>> > > >> >> >> >>       ],
>> >> >>>>>>> > > >> >> >> >>       "id": 2
>> >> >>>>>>> > > >> >> >> >>     },
>> >> >>>>>>> > > >> >> >> >>     {
>> >> >>>>>>> > > >> >> >> >>       "name": "RETAIL.FACT_SALES.CUSTID",
>> >> >>>>>>> > > >> >> >> >>       "table": "RETAIL.FACT_SALES",
>> >> >>>>>>> > > >> >> >> >>       "hierarchy": false,
>> >> >>>>>>> > > >> >> >> >>       "derived": null,
>> >> >>>>>>> > > >> >> >> >>       "column": [
>> >> >>>>>>> > > >> >> >> >>         "CUSTID"
>> >> >>>>>>> > > >> >> >> >>       ],
>> >> >>>>>>> > > >> >> >> >>       "id": 3
>> >> >>>>>>> > > >> >> >> >>     }
>> >> >>>>>>> > > >> >> >> >>   ],
>> >> >>>>>>> > > >> >> >> >>   "measures": [
>> >> >>>>>>> > > >> >> >> >>     {
>> >> >>>>>>> > > >> >> >> >>       "id": 1,
>> >> >>>>>>> > > >> >> >> >>       "name": "_COUNT_",
>> >> >>>>>>> > > >> >> >> >>       "function": {
>> >> >>>>>>> > > >> >> >> >>         "expression": "COUNT",
>> >> >>>>>>> > > >> >> >> >>         "returntype": "bigint",
>> >> >>>>>>> > > >> >> >> >>         "parameter": {
>> >> >>>>>>> > > >> >> >> >>           "type": "constant",
>> >> >>>>>>> > > >> >> >> >>           "value": 1
>> >> >>>>>>> > > >> >> >> >>         }
>> >> >>>>>>> > > >> >> >> >>       }
>> >> >>>>>>> > > >> >> >> >>     }
>> >> >>>>>>> > > >> >> >> >>   ],
>> >> >>>>>>> > > >> >> >> >>   "rowkey": {
>> >> >>>>>>> > > >> >> >> >>     "rowkey_columns": [
>> >> >>>>>>> > > >> >> >> >>       {
>> >> >>>>>>> > > >> >> >> >>         "column": "STOREID",
>> >> >>>>>>> > > >> >> >> >>         "length": 0,
>> >> >>>>>>> > > >> >> >> >>         "dictionary": "true",
>> >> >>>>>>> > > >> >> >> >>         "mandatory": false
>> >> >>>>>>> > > >> >> >> >>       },
>> >> >>>>>>> > > >> >> >> >>       {
>> >> >>>>>>> > > >> >> >> >>         "column": "ITEMID",
>> >> >>>>>>> > > >> >> >> >>         "length": 0,
>> >> >>>>>>> > > >> >> >> >>         "dictionary": "true",
>> >> >>>>>>> > > >> >> >> >>         "mandatory": false
>> >> >>>>>>> > > >> >> >> >>       },
>> >> >>>>>>> > > >> >> >> >>       {
>> >> >>>>>>> > > >> >> >> >>         "column": "CUSTID",
>> >> >>>>>>> > > >> >> >> >>         "length": 0,
>> >> >>>>>>> > > >> >> >> >>         "dictionary": "true",
>> >> >>>>>>> > > >> >> >> >>         "mandatory": false
>> >> >>>>>>> > > >> >> >> >>       }
>> >> >>>>>>> > > >> >> >> >>     ],
>> >> >>>>>>> > > >> >> >> >>     "aggregation_groups": [
>> >> >>>>>>> > > >> >> >> >>       [
>> >> >>>>>>> > > >> >> >> >>         "STOREID",
>> >> >>>>>>> > > >> >> >> >>         "ITEMID",
>> >> >>>>>>> > > >> >> >> >>         "CUSTID"
>> >> >>>>>>> > > >> >> >> >>       ]
>> >> >>>>>>> > > >> >> >> >>     ]
>> >> >>>>>>> > > >> >> >> >>   },
>> >> >>>>>>> > > >> >> >> >>   "notify_list": [],
>> >> >>>>>>> > > >> >> >> >>   "capacity": "",
>> >> >>>>>>> > > >> >> >> >>   "hbase_mapping": {
>> >> >>>>>>> > > >> >> >> >>     "column_family": [
>> >> >>>>>>> > > >> >> >> >>       {
>> >> >>>>>>> > > >> >> >> >>         "name": "f1",
>> >> >>>>>>> > > >> >> >> >>         "columns": [
>> >> >>>>>>> > > >> >> >> >>           {
>> >> >>>>>>> > > >> >> >> >>             "qualifier": "m",
>> >> >>>>>>> > > >> >> >> >>             "measure_refs": [
>> >> >>>>>>> > > >> >> >> >>               "_COUNT_"
>> >> >>>>>>> > > >> >> >> >>             ]
>> >> >>>>>>> > > >> >> >> >>           }
>> >> >>>>>>> > > >> >> >> >>         ]
>> >> >>>>>>> > > >> >> >> >>       }
>> >> >>>>>>> > > >> >> >> >>     ]
>> >> >>>>>>> > > >> >> >> >>   },
>> >> >>>>>>> > > >> >> >> >>   "project": "RetailProject",
>> >> >>>>>>> > > >> >> >> >>   "model_name": "Retail_Cube"
>> >> >>>>>>> > > >> >> >> >> }
>> >> >>>>>>> > > >> >> >> >>
>> >> >>>>>>> > > >> >> >> >>Regards,
>> >> >>>>>>> > > >> >> >> >>Santosh Akhilesh
>> >> >>>>>>> > > >> >> >> >>Bangalore R&D
>> >> >>>>>>> > > >> >> >> >>HUAWEI TECHNOLOGIES CO.,LTD.
>> >> >>>>>>> > > >> >> >> >>
>> >> >>>>>>> > > >> >> >> >>www.huawei.com
>> >> >>>>>>> > > >> >> >>
>> >> >>>>>>> > > >> >> >> >>
>> >> >>>>>>> > > >> >> >> >>________________________________________
>> >> >>>>>>> > > >> >> >> >>From: Shi, Shaofeng [shaoshi@ebay.com]
>> >> >>>>>>> > > >> >> >> >>Sent: Thursday, February 26, 2015 7:01 AM
>> >> >>>>>>> > > >> >> >> >>To: dev@kylin.incubator.apache.org
>> >> >>>>>>> > > >> >> >> >>Subject: Re: Error while making cube & Measure
>> >> >>>>>>>option
>> >> >>>>>>>is
>> >> >>>>>>>not
>> >> >>>>>>> > > >> >> >>responding
>> >> >>>>>>> > > >> >> >> >>on GUI
>> >> >>>>>>> > > >> >> >> >>
>> >> >>>>>>> > > >> >> >> >>Hi Santosh,
>> >> >>>>>>> > > >> >> >> >>
>> >> >>>>>>> > > >> >> >> >>It looks like hadoop failed to execute some shell
>> >> >>>>>>> > > >> >> >> >>command in the container; you need to dive into
>> >> >>>>>>> > > >> >> >> >>hadoop to see what the concrete error is. You can
>> >> >>>>>>> > > >> >> >> >>use the yarn logs command to fetch all logs:
>> >> >>>>>>> > > >> >> >> >>
>> >> >>>>>>> > > >> >> >> >>yarn logs -applicationId <application id>
>> >> >>>>>>> > > >> >> >> >>
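
For example, with the failing application id quoted later in this thread, the
aggregated container logs could be pulled like this (the redirect is just a
convenience):

    yarn logs -applicationId application_1424791969399_0008 > app_0008.log
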
>> >> >>>>>>> > > >> >> >> >>
>> >> >>>>>>> > > >> >> >> >>On 2/25/15, 7:39 PM, "Santosh Akhilesh" wrote:
>> >> >>>>>>> > > >> >> >> >>
>> >> >>>>>>> > > >> >> >> >>>Hi Luke / Shaofeng ,
>> >> >>>>>>> > > >> >> >> >>>           Can you please help me to check 
>>this
>> >> >>>>>>>issue.
>> >> >>>>>>> > > >> >> >> >>>Regards,
>> >> >>>>>>> > > >> >> >> >>>Santosh Akhilesh
>> >> >>>>>>> > > >> >> >> >>>
>> >> >>>>>>> > > >> >> >> >>>On Tue, Feb 24, 2015 at 10:41 PM, Santosh Akhilesh
>> >> >>>>>>> > > >> >> >> >>><santoshakhilesh@gmail.com> wrote:
>> >> >>>>>>> > > >> >> >> >>>
>> >> >>>>>>> > > >> >> >> >>>> Hi All ,
>> >> >>>>>>> > > >> >> >> >>>>         Is it because of the following error in
>> >> >>>>>>> > > >> >> >> >>>> the map reduce job? What could be the way to
>> >> >>>>>>> > > >> >> >> >>>> resolve this? A google search says it is an
>> >> >>>>>>> > > >> >> >> >>>> issue with the Yarn class path, but I am not
>> >> >>>>>>> > > >> >> >> >>>> sure what it is.
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>> Kylin Hive Column Cardinality Job table=RETAIL.FACT_SALES
>> >> >>>>>>> > > >> >> >> >>>> output=/tmp/cardinality/RETAIL.FACT_SALES
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>> Application application_1424791969399_0008 failed
>> >> >>>>>>> > > >> >> >> >>>> 2 times due to AM Container for
>> >> >>>>>>> > > >> >> >> >>>> appattempt_1424791969399_0008_000002 exited with
>> >> >>>>>>> > > >> >> >> >>>> exitCode: 1
>> >> >>>>>>> > > >> >> >> >>>> For more detailed output, check the application
>> >> >>>>>>> > > >> >> >> >>>> tracking page:
>> >> >>>>>>> > > >> >> >> >>>> http://santosh:8088/proxy/application_1424791969399_0008/
>> >> >>>>>>> > > >> >> >> >>>> Then, click on links to logs of each attempt.
>> >> >>>>>>> > > >> >> >> >>>> Diagnostics: Exception from container-launch.
>> >> >>>>>>> > > >> >> >> >>>> Container id: container_1424791969399_0008_02_000001
>> >> >>>>>>> > > >> >> >> >>>> Exit code: 1
>> >> >>>>>>> > > >> >> >> >>>> Stack trace: ExitCodeException exitCode=1:
>> >> >>>>>>> > > >> >> >> >>>> at org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
>> >> >>>>>>> > > >> >> >> >>>> at org.apache.hadoop.util.Shell.run(Shell.java:455)
>> >> >>>>>>> > > >> >> >> >>>> at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
>> >> >>>>>>> > > >> >> >> >>>> at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:211)
>> >> >>>>>>> > > >> >> >> >>>> at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
>> >> >>>>>>> > > >> >> >> >>>> at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
>> >> >>>>>>> > > >> >> >> >>>> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>> >> >>>>>>> > > >> >> >> >>>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> >> >>>>>>> > > >> >> >> >>>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> >> >>>>>>> > > >> >> >> >>>> at java.lang.Thread.run(Thread.java:745)
>> >> >>>>>>> > > >> >> >> >>>> Container exited with a non-zero exit code 1
>> >> >>>>>>> > > >> >> >> >>>> Failing this attempt. Failing the application.
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>> ---------- Forwarded message ----------
>> >> >>>>>>> > > >> >> >> >>>> From: Santoshakhilesh
>> >> >>>>>>> > > >> >> >> >>>> Date: Tue, Feb 24, 2015 at 7:41 PM
>> >> >>>>>>> > > >> >> >> >>>> Subject: FW: Error while making cube & Measure
>> >> >>>>>>> > > >> >> >> >>>> option is not responding on GUI
>> >> >>>>>>> > > >> >> >> >>>> To: "dev@kylin.incubator.apache.org"
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>> Hi ,
>> >> >>>>>>> > > >> >> >> >>>>    Could someone please give me a hand resolving
>> >> >>>>>>> > > >> >> >> >>>> this issue? Thanks.
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>> Regards,
>> >> >>>>>>> > > >> >> >> >>>> Santosh Akhilesh
>> >> >>>>>>> > > >> >> >> >>>> Bangalore R&D
>> >> >>>>>>> > > >> >> >> >>>> HUAWEI TECHNOLOGIES CO.,LTD.
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>> www.huawei.com
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >>
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>> ________________________________________
>> >> >>>>>>> > > >> >> >> >>>> From: Santoshakhilesh
>> >> >>>>>>>[santosh.akhilesh@huawei.com]
>> >> >>>>>>> > > >> >> >> >>>> Sent: Tuesday, February 24, 2015 12:55 PM
>> >> >>>>>>> > > >> >> >> >>>> To: dev@kylin.incubator.apache.org
>> >> >>>>>>> > > >> >> >> >>>> Cc: Kulbhushan Rana
>> >> >>>>>>> > > >> >> >> >>>> Subject: FW: Error while making cube & Measure
>> >> >>>>>>> > > >> >> >> >>>> option is not responding on GUI
>> >> >>>>>>> > > >> >> >> >>>> GUI
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>> 2. If I ignore this and continue, and try to
>> >> >>>>>>> > > >> >> >> >>>> save the cube, I get an exception in Kylin.log.
>> >> >>>>>>> > > >> >> >> >>>> I have checked that the path is set correctly
>> >> >>>>>>> > > >> >> >> >>>> and that HCatInputFormat is present in
>> >> >>>>>>> > > >> >> >> >>>> hive-hcatalog-core-0.14.0.jar. Please let me
>> >> >>>>>>> > > >> >> >> >>>> know what I can do to resolve this.
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>>  -- This was a path issue; now there are no
>> >> >>>>>>> > > >> >> >> >>>> more exceptions in kylin.log
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>> But saving the cube still fails with an error,
>> >> >>>>>>> > > >> >> >> >>>> and I still can't add measures.
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>> Error Message
>> >> >>>>>>> > > >> >> >> >>>> Failed to take action.
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>> In the log I can find no exception. The
>> >> >>>>>>> > > >> >> >> >>>> following is the last entry in kylin.log
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>> [pool-3-thread-1]:[2015-02-24 20:47:15,613][INFO][org.apache.kylin.job.impl.threadpool.DefaultScheduler$FetcherRunner.run(DefaultScheduler.java:117)]
>> >> >>>>>>> > > >> >> >> >>>> - Job Fetcher: 0 running, 0 actual running, 0 ready, 6 others
>> >> >>>>>>> > > >> >> >> >>>> [http-bio-7070-exec-2]:[2015-02-24 20:47:51,610][DEBUG][org.apache.kylin.rest.controller.CubeController.deserializeDataModelDesc(CubeController.java:459)]
>> >> >>>>>>> > > >> >> >> >>>> - Saving cube {
>> >> >>>>>>> > > >> >> >> >>>>   "name": "",
>> >> >>>>>>> > > >> >> >> >>>>   "fact_table": "RETAIL.FACT_SALES",
>> >> >>>>>>> > > >> >> >> >>>>   "lookups": [],
>> >> >>>>>>> > > >> >> >> >>>>   "filter_condition": "",
>> >> >>>>>>> > > >> >> >> >>>>   "capacity": "SMALL",
>> >> >>>>>>> > > >> >> >> >>>>   "partition_desc": {
>> >> >>>>>>> > > >> >> >> >>>>     "partition_date_column": "",
>> >> >>>>>>> > > >> >> >> >>>>     "partition_date_start": 0,
>> >> >>>>>>> > > >> >> >> >>>>     "partition_type": "APPEND"
>> >> >>>>>>> > > >> >> >> >>>>   },
>> >> >>>>>>> > > >> >> >> >>>>   "last_modified": 0
>> >> >>>>>>> > > >> >> >> >>>> }
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>> The local access logs are all 200, so that seems ok.
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>> 10.18.146.105 - - [24/Feb/2015:20:46:56 +0800] "GET /kylin/api/user/authentication HTTP/1.1" 200 246
>> >> >>>>>>> > > >> >> >> >>>> 10.18.146.105 - - [24/Feb/2015:20:47:07 +0800] "GET /kylin/api/user/authentication HTTP/1.1" 200 246
>> >> >>>>>>> > > >> >> >> >>>> 10.18.146.105 - - [24/Feb/2015:20:47:27 +0800] "GET /kylin/api/user/authentication HTTP/1.1" 200 246
>> >> >>>>>>> > > >> >> >> >>>> 10.18.146.105 - - [24/Feb/2015:20:47:28 +0800] "GET /kylin/api/user/authentication HTTP/1.1" 200 246
>> >> >>>>>>> > > >> >> >> >>>> 10.18.146.105 - - [24/Feb/2015:20:47:34 +0800] "GET /kylin/api/user/authentication HTTP/1.1" 200 246
>> >> >>>>>>> > > >> >> >> >>>> 10.18.146.105 - - [24/Feb/2015:20:47:48 +0800] "GET /kylin/api/user/authentication HTTP/1.1" 200 246
>> >> >>>>>>> > > >> >> >> >>>> 10.18.146.105 - - [24/Feb/2015:20:47:51 +0800] "POST /kylin/api/cubes HTTP/1.1" 200 701
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>> Regards,
>> >> >>>>>>> > > >> >> >> >>>> Santosh Akhilesh
>> >> >>>>>>> > > >> >> >> >>>> Bangalore R&D
>> >> >>>>>>> > > >> >> >> >>>> HUAWEI TECHNOLOGIES CO.,LTD.
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>> www.huawei.com
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >>
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>> ________________________________________
>> >> >>>>>>> > > >> >> >> >>>> From: Santoshakhilesh
>> >> >>>>>>>[santosh.akhilesh@huawei.com]
>> >> >>>>>>> > > >> >> >> >>>> Sent: Tuesday, February 24, 2015 12:09 PM
>> >> >>>>>>> > > >> >> >> >>>> To: dev@kylin.incubator.apache.org
>> >> >>>>>>> > > >> >> >> >>>> Cc: Kulbhushan Rana
>> >> >>>>>>> > > >> >> >> >>>> Subject: Error while making cube & Measure
>> >>option
>> >> >>>>>>>is
>> >> >>>>>>>not
>> >> >>>>>>> > > >> >> >>responding on
>> >> >>>>>>> > > >> >> >> >>>>GUI
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>> Hi All ,
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>>     I am building a simple cube for testing,
>> >> >>>>>>> > > >> >> >> >>>> using the binary build 0.7.1. I have the
>> >> >>>>>>> > > >> >> >> >>>> following hive tables with these columns:
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>> fact_sales:
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>> storeid                 int
>> >> >>>>>>> > > >> >> >> >>>> itemid                  int
>> >> >>>>>>> > > >> >> >> >>>> custid                  int
>> >> >>>>>>> > > >> >> >> >>>> qty                     int
>> >> >>>>>>> > > >> >> >> >>>> price                   double
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>> dim_customer
>> >> >>>>>>> > > >> >> >> >>>> custid                  int
>> >> >>>>>>> > > >> >> >> >>>> name                    string
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>> dim_item
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>> itemid                  int
>> >> >>>>>>> > > >> >> >> >>>> category                string
>> >> >>>>>>> > > >> >> >> >>>> brand                   string
>> >> >>>>>>> > > >> >> >> >>>> color                   string
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>> dim_store
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>> storeid                 int
>> >> >>>>>>> > > >> >> >> >>>> city                    string
>> >> >>>>>>> > > >> >> >> >>>> state                   string
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>> Please help me with the following issues:
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>> 1. When I go to the measure section and click
>> >> >>>>>>> > > >> >> >> >>>> on the measure option, there is no response; I
>> >> >>>>>>> > > >> >> >> >>>> want to add measures on qty and price with sum.
>> >> >>>>>>>with
>> >> >>>>>>>sum
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>> 2. If I ignore this and continue, and try to
>> >> >>>>>>> > > >> >> >> >>>> save the cube, I get an exception in Kylin.log.
>> >> >>>>>>> > > >> >> >> >>>> I have checked that the path is set correctly
>> >> >>>>>>> > > >> >> >> >>>> and that HCatInputFormat is present in
>> >> >>>>>>> > > >> >> >> >>>> hive-hcatalog-core-0.14.0.jar (a verification
>> >> >>>>>>> > > >> >> >> >>>> sketch follows question 3 below). Please let me
>> >> >>>>>>> > > >> >> >> >>>> know what I can do to resolve this.
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>> 3. Also, I have another question: since this is
>> >> >>>>>>> > > >> >> >> >>>> a test and the data is small, I have not
>> >> >>>>>>> > > >> >> >> >>>> partitioned the fact table. Is it ok to skip the
>> >> >>>>>>> > > >> >> >> >>>> partition stage while building the cube?
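
A quick, hedged way to make the jar check from question 2 above; the jar
location follows the hive layout quoted elsewhere in this thread, and
$HIVE_HOME and the exact version are placeholders:

    # confirm the class really is inside the hcatalog jar
    jar tf $HIVE_HOME/hcatalog/share/hcatalog/hive-hcatalog-core-*.jar \
      | grep HCatInputFormat
    # expected: org/apache/hive/hcatalog/mapreduce/HCatInputFormat.class
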
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>> Exception
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>> [pool-4-thread-4]:[2015-02-24 19:26:32,577][ERROR][org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:134)]
>> >> >>>>>>> > > >> >> >> >>>> - ExecuteException job:c3532a6f-97ea-474a-b36a-218dd517cedb
>> >> >>>>>>> > > >> >> >> >>>> org.apache.kylin.job.exception.ExecuteException:
>> >> >>>>>>> > > >> >> >> >>>> org.apache.kylin.job.exception.ExecuteException:
>> >> >>>>>>> > > >> >> >> >>>> java.lang.NoClassDefFoundError:
>> >> >>>>>>> > > >> >> >> >>>> org/apache/hive/hcatalog/mapreduce/HCatInputFormat
>> >> >>>>>>> > > >> >> >> >>>>  at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:102)
>> >> >>>>>>> > > >> >> >> >>>>  at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:132)
>> >> >>>>>>> > > >> >> >> >>>>  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> >> >>>>>>> > > >> >> >> >>>>  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> >> >>>>>>> > > >> >> >> >>>>  at java.lang.Thread.run(Thread.java:745)
>> >> >>>>>>> > > >> >> >> >>>> Caused by: org.apache.kylin.job.exception.ExecuteException:
>> >> >>>>>>> > > >> >> >> >>>> java.lang.NoClassDefFoundError:
>> >> >>>>>>> > > >> >> >> >>>> org/apache/hive/hcatalog/mapreduce/HCatInputFormat
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>> The JSON is as below.
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>>
>> >> >>>>>>> > > >> >> >> >>>> {
  "name": "Retail_Cube",
  "description": "",
  "dimensions": [
    {
      "name": "RETAIL.FACT_SALES.STOREID",
      "table": "RETAIL.FACT_SALES",
      "hierarchy": false,
      "derived": null,
      "column": [
        "STOREID"
      ],
      "id": 1
    },
    {
      "name": "RETAIL.FACT_SALES.ITEMID",
      "table": "RETAIL.FACT_SALES",
      "hierarchy": false,
      "derived": null,
      "column": [
        "ITEMID"
      ],
      "id": 2
    },
    {
      "name": "RETAIL.FACT_SALES.CUSTID",
      "table": "RETAIL.FACT_SALES",
      "hierarchy": false,
      "derived": null,
      "column": [
        "CUSTID"
      ],
      "id": 3
    }
  ],
  "measures": [
    {
      "id": 1,
      "name": "_COUNT_",
      "function": {
        "expression": "COUNT",
        "returntype": "bigint",
        "parameter": {
          "type": "constant",
          "value": 1
        }
      }
    }
  ],
  "rowkey": {
    "rowkey_columns": [
      {
        "column": "STOREID",
        "length": 0,
        "dictionary": "true",
        "mandatory": false
      },
      {
        "column": "ITEMID",
        "length": 0,
        "dictionary": "true",
        "mandatory": false
      },
      {
        "column": "CUSTID",
        "length": 0,
        "dictionary": "true",
        "mandatory": false
      }
    ],
    "aggregation_groups": [
      [
        "STOREID",
        "ITEMID",
        "CUSTID"
      ]
    ]
  },
  "notify_list": [],
  "capacity": "",
  "hbase_mapping": {
    "column_family": [
      {
        "name": "f1",
        "columns": [
          {
            "qualifier": "m",
            "measure_refs": [
              "_COUNT_"
            ]
          }
        ]
      }
    ]
  },
  "project": "RetailProject",
  "model_name": "Retail_Cube"
}
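Since the descriptor is plain JSON, a standalone sanity check can flag the most common authoring mistakes before attempting to save the cube. Below is a minimal sketch using Jackson; the file name cube_desc.json and the class name CubeDescCheck are illustrative, not part of Kylin, and it assumes the complete descriptor (including the opening brace) has been saved locally.

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

import java.io.File;
import java.util.HashSet;
import java.util.Set;

public class CubeDescCheck {
    public static void main(String[] args) throws Exception {
        // Parse the cube descriptor (hypothetical local copy of the JSON above).
        JsonNode root = new ObjectMapper().readTree(new File("cube_desc.json"));

        // Collect every column declared by the dimensions.
        Set<String> dimCols = new HashSet<>();
        for (JsonNode dim : root.get("dimensions")) {
            for (JsonNode col : dim.get("column")) {
                dimCols.add(col.asText());
            }
        }

        // Every rowkey column should be backed by a dimension column.
        for (JsonNode rk : root.get("rowkey").get("rowkey_columns")) {
            String col = rk.get("column").asText();
            if (!dimCols.contains(col)) {
                System.out.println("rowkey column not in dimensions: " + col);
            }
        }

        // Every measure_ref in hbase_mapping should name a declared measure.
        Set<String> measures = new HashSet<>();
        for (JsonNode m : root.get("measures")) {
            measures.add(m.get("name").asText());
        }
        for (JsonNode cf : root.get("hbase_mapping").get("column_family")) {
            for (JsonNode c : cf.get("columns")) {
                for (JsonNode ref : c.get("measure_refs")) {
                    if (!measures.contains(ref.asText())) {
                        System.out.println("unknown measure_ref: " + ref.asText());
                    }
                }
            }
        }
        System.out.println("check finished");
    }
}

This only verifies internal consistency (rowkey columns versus dimension columns, measure_refs versus measures); it does not replicate Kylin's server-side validation.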
Regards,
Santosh Akhilesh
Bangalore R&D
HUAWEI TECHNOLOGIES CO.,LTD.

www.huawei.com

The hive table name

[pool-4-thread-2]:[2015-02-26 20:47:49,936][ERROR][org.apache.kylin.job.common.HadoopShellExecutable.doWork(HadoopShellExecutable.java:64)] - error execute HadoopShellExecutable{id=d4730d26-7fe6-412e-9841-3288ab362c5b-00, name=null, state=RUNNING}
java.lang.IllegalArgumentException: No enum constant org.apache.hadoop.mapreduce.JobCounter.MB_MILLIS_MAPS
	at java.lang.Enum.valueOf(Enum.java:236)
	at org.apache.hadoop.mapreduce.counters.FrameworkCounterGroup.valueOf(FrameworkCounterGroup.java:148)
	at org.apache.hadoop.mapreduce.counters.FrameworkCounterGroup.findCounter(FrameworkCounterGroup.java:182)
	at org.apache.hadoop.mapreduce.counters.AbstractCounters.findCounter(AbstractCounters.java:154)
	at org.apache.hadoop.mapreduce.TypeConverter.fromYarn(TypeConverter.java:240)
	at org.apache.hadoop.mapred.ClientServiceDelegate.getJobCounters(ClientServiceDelegate.java:370)
	at org.apache.hadoop.mapred.YARNRunner.getJobCounters(YARNRunner.java:511)
	at org.apache.hadoop.mapreduce.Job$7.run(Job.java:756)
	at org.apache.hadoop.mapreduce.Job$7.run(Job.java:753)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
	at org.apache.hadoop.mapreduce.Job.getCounters(Job.java:753)
	at org.apache.hadoop.mapreduce.Job.monitorAndPrintJob(Job.java:1361)
	at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1289)
	at org.apache.kylin.job.hadoop.AbstractHadoopJob.waitForCompletion(AbstractHadoopJob.java:134)
	at org.apache.kylin.job.hadoop.cardinality.HiveColumnCardinalityJob.run(HiveColumnCardinalityJob.java:114)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
	at org.apache.kylin.job.common.HadoopShellExecutable.doWork(HadoopShellExecutable.java:62)
	at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:99)
	at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:50)
	at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:99)
	at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:132)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)
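The "No enum constant ... JobCounter.MB_MILLIS_MAPS" failure is the signature of a client/cluster version mismatch: the Hadoop 2.6.0 cluster reports a job counter that an older hadoop-mapreduce client jar on the submitting classpath (for example, one bundled with HBase) does not define, so the enum lookup throws. A minimal sketch for checking which jar actually serves the JobCounter class and whether it knows the counter; the class name CounterProbe is illustrative, and it assumes a Hadoop client jar is on the classpath when run:

import org.apache.hadoop.mapreduce.JobCounter;

public class CounterProbe {
    public static void main(String[] args) {
        // Which jar on the classpath actually provides JobCounter?
        System.out.println("JobCounter loaded from: "
                + JobCounter.class.getProtectionDomain().getCodeSource().getLocation());
        try {
            // Defined in Hadoop 2.6.0; older client jars (e.g., 2.2.0) lack it.
            JobCounter.valueOf("MB_MILLIS_MAPS");
            System.out.println("Client jar defines MB_MILLIS_MAPS; versions look consistent.");
        } catch (IllegalArgumentException e) {
            // Same exception as in the stack trace above: an old jar shadows the newer one.
            System.out.println("Old mapreduce client jar on classpath: " + e.getMessage());
        }
    }
}

If the printed location points at a jar other than the cluster's own Hadoop 2.6.0 libraries, the classpath ordering is pulling in a stale client jar ahead of the correct one.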

Regards,
Santosh Akhilesh
Bangalore R&D
HUAWEI TECHNOLOGIES CO.,LTD.

www.huawei.com
________________________________________
From: Shi, Shaofeng [shaoshi@ebay.com]
Sent: Thursday, February 26, 2015 11:32 AM
To: dev@kylin.incubator.apache.org
Cc: Kulbhushan Rana
Subject: Re: Error while making cube & Measure option is not responding on GUI

Hi Santosh, the hive table importing issue should not affect cube saving.

If you couldn't save the cube, first check whether there is an error in Tomcat's log; if not, check your web browser. We suggest Firefox (with the Firebug add-on) or Chrome: open the JS console (press F12), then operate the web UI and check whether any error is reported in the browser.

On 2/26/15, 1:08 PM, "Santoshakhilesh"
