falcon-dev mailing list archives

From Margus Roo <mar...@roo.ee>
Subject Re: Unable to schedule workflow after enabled kerberos
Date Thu, 04 Feb 2016 07:34:11 GMT
Yes :) it was just a snapshot from the manual

Margus (margusja) Roo
http://margus.roo.ee
skype: margusja
+372 51 48 780

On 04/02/16 09:23, Peeyush Bishnoi wrote:
> Margus,
> In addition, instead of the "EXAMPLE.COM" realm, use your configured realm (e.g. "TESTHADOOP.COM") in the Kerberos principals in the properties section.
> ---
>
>      On Thursday, 4 February 2016 12:44 PM, Margus Roo <margus@roo.ee> wrote:
>   
>
>   I think my problem might be here:
>
> https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.3.4/bk_installing_manually_book/content/configuring_for_secure_clusters_falcon.html
> I did not do step 7:
>
> <properties>
>      <property name="dfs.namenode.kerberos.principal" value="nn/$my.internal@EXAMPLE.COM"/>
>      <property name="hive.metastore.kerberos.principal" value="hive/$my.internal@EXAMPLE.COM"/>
>      <property name="hive.metastore.uris" value="thrift://$my.internal:9083"/>
>      <property name="hive.metastore.sasl.enabled" value="true"/>
> </properties>
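Filled in, that properties section would look something like the sketch below; the TESTHADOOP.COM realm and the hadoopnn1 metastore host are assumptions for illustration, so substitute your cluster's actual realm and host:

```xml
<properties>
    <!-- NameNode principal for the cluster's configured realm -->
    <property name="dfs.namenode.kerberos.principal" value="nn/hadoopnn1@TESTHADOOP.COM"/>
    <!-- Hive metastore principal; must not be left null in a secure cluster -->
    <property name="hive.metastore.kerberos.principal" value="hive/hadoopnn1@TESTHADOOP.COM"/>
    <property name="hive.metastore.uris" value="thrift://hadoopnn1:9083"/>
    <property name="hive.metastore.sasl.enabled" value="true"/>
</properties>
```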
>
>
>
> Margus (margusja) Roo
> http://margus.roo.ee
> skype: margusja
> +372 51 48 780
>
> On 04/02/16 09:04, Margus Roo wrote:
>> Hi
>>
>> Here is the full set of XMLs:
>>
>> Cluster:
>> <?xml version="1.0" encoding="UTF-8" standalone="yes"?>
>> <cluster xmlns="uri:falcon:cluster:0.1" name="Cluster2" description=""
>> colo="telekomarendus">
>>      <interfaces>
>>          <interface type="readonly" endpoint="hftp://hadoopnn2:50070"
>> version="2.3.0"/>
>>          <interface type="write" endpoint="hdfs://mycluster"
>> version="2.3.0"/>
>>          <interface type="execute" endpoint="hadoopnn1:8050"
>> version="2.3.0"/>
>>          <interface type="workflow"
>> endpoint="http://hadoopnn1:11000/oozie/" version="4.2.0"/>
>>          <interface type="messaging"
>> endpoint="tcp://hadoopnn2:61616?daemon=true" version="5.1.6"/>
>>          <interface type="registry" endpoint="thrift://hadoopnn1:9083"
>> version=""/>
>>      </interfaces>
>>      <locations>
>>          <location name="staging" path="/apps/falcon/Cluster2/staging"/>
>>          <location name="temp" path="/tmp"/>
>>          <location name="working" path="/apps/falcon/Cluster2/working"/>
>>      </locations>
>>      <ACL owner="margusja" group="users" permission="0755"/>
>> </cluster>
>>
>> falcon-input-table.xml:
>> <feed xmlns="uri:falcon:feed:0.1" name="falcon-input-table"
>> description="input table">
>>      <frequency>days(1)</frequency>
>>      <timezone>UTC</timezone>
>>      <late-arrival cut-off="hours(3)"/>
>>      <clusters>
>>          <cluster name="Cluster2" type="source">
>>              <validity start="2016-01-01T00:00Z" end="2030-01-01T00:00Z"/>
>>              <retention limit="months(99999999)" action="archive"/>
>>          </cluster>
>>      </clusters>
>>      <table
>> uri="catalog:default:falcon1#feed_date=${YEAR}-${MONTH}-${DAY}"/>
>>      <ACL owner="margusja" group="users" permission="0770"/>
>>      <schema location="hcat" provider="hcat"/>
>> </feed>
>>
>> falcon-output-table.xml:
>> <?xml version="1.0" encoding="UTF-8" standalone="yes"?>
>> <feed xmlns="uri:falcon:feed:0.1" name="falcon-output-table"
>> description="output table">
>>      <frequency>days(1)</frequency>
>>      <timezone>UTC</timezone>
>>      <clusters>
>>          <cluster name="Cluster2" type="source">
>>              <validity start="2016-01-01T00:00Z" end="2030-01-01T00:00Z"/>
>>              <retention limit="months(9999)" action="archive"/>
>>          </cluster>
>>      </clusters>
>>      <table
>> uri="catalog:default:falcon2#feed_date=${YEAR}-${MONTH}-${DAY}"/>
>>      <ACL owner="margusja" group="users" permission="0770"/>
>>      <schema location="hcat" provider="hcat"/>
>> </feed>
>>
>> and finally process.xml:
>> <?xml version="1.0" encoding="UTF-8" standalone="yes"?>
>> <process xmlns="uri:falcon:process:0.1" name="hive-proc5">
>> <tags>level=devel,owner=margusja,department=bigdata</tags>
>>      <pipelines>test_Pipeline, dataReplication,
>> clickStream_pipieline</pipelines>
>>      <clusters>
>>          <cluster name="Cluster2">
>>              <validity start="2016-01-26T00:10Z" end="2099-03-09T12:00Z"/>
>>          </cluster>
>>      </clusters>
>>      <parallel>1</parallel>
>>      <order>FIFO</order>
>>      <frequency>days(1)</frequency>
>>      <timezone>UTC</timezone>
>>      <inputs>
>>          <input name="in" feed="falcon-input-table"
>> start="yesterday(0,0)" end="yesterday(0,0)"/>
>>      </inputs>
>>      <outputs>
>>          <output name="out" feed="falcon-output-table"
>> instance="yesterday(0,0)"/>
>>      </outputs>
>>      <properties>
>>          <property name="margusja-demo-input"
>> value="margusja-demo-sisend"/>
>>      </properties>
>>      <workflow engine="hive" path="/user/margusja/hive.hql"/>
>>      <retry policy="periodic" delay="minutes(5)" attempts="1"/>
>>      <ACL owner="margusja" group="margusja"/>
>> </process>
>>
>>
>> What I tried now is:
>> 1. Deleted all feeds and processes
>> 2. Tried to submit feed:
>> [margusja@hadoopnn2 ~]$ falcon entity -submit -type feed  -file
>> hive-in.xml
>> And got:
>> ERROR: Bad
>> Request;default/org.apache.falcon.FalconWebException::org.apache.falcon.FalconException:
>> Exception checking if the table exists:Exception creating Proxied
>> HiveMetaStoreClient: The value of property
>> hive.metastore.kerberos.principal must not be null
>>
>> So now I have something to work with :)
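The "must not be null" error above points at a property that is simply absent from the cluster entity. A small sketch of how one could check a cluster entity for the required Kerberos properties before submitting; the property list follows the step-7 docs linked earlier, and the helper name and the embedded example entity are illustrative, not from this thread:

```python
# Sketch: scan a Falcon cluster entity XML for the Kerberos-related
# properties that the proxied HiveMetaStoreClient needs on a secure
# cluster, and report which ones are missing.
import xml.etree.ElementTree as ET

# Property names taken from the HDP secure-cluster Falcon docs (step 7).
REQUIRED = [
    "dfs.namenode.kerberos.principal",
    "hive.metastore.kerberos.principal",
    "hive.metastore.uris",
    "hive.metastore.sasl.enabled",
]

# Falcon cluster entities use this default XML namespace.
NS = {"f": "uri:falcon:cluster:0.1"}

def missing_kerberos_properties(cluster_xml: str) -> list:
    """Return the required property names absent from the entity."""
    root = ET.fromstring(cluster_xml)
    present = {
        prop.get("name")
        for prop in root.findall(".//f:properties/f:property", NS)
    }
    return [name for name in REQUIRED if name not in present]

# A cut-down example entity that, like the one in this thread, lacks the
# metastore principal (values here are placeholders).
example = """<?xml version="1.0" encoding="UTF-8"?>
<cluster xmlns="uri:falcon:cluster:0.1" name="Cluster2" colo="c1">
    <properties>
        <property name="hive.metastore.uris" value="thrift://host:9083"/>
    </properties>
</cluster>"""
```

Running the check on such an entity would flag hive.metastore.kerberos.principal, the exact property named in the submission error.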
>>
>>
>> Margus (margusja) Roo
>> http://margus.roo.ee
>> skype: margusja
>> +372 51 48 780
>>
>> On 04/02/16 05:46, Peeyush Bishnoi wrote:
>>> Hi Margus,
>>> Could you also please share the cluster entity and feed entity XML files
>>> as well, so we can analyze further why it is failing in the secure
>>> environment.
>>> Thanks,
>>> ---Peeyush
>>>
>>>        On Wednesday, 3 February 2016 9:36 PM, Margus Roo
>>> <margus@roo.ee> wrote:
>>>
>>>    Hi
>>>
>>> Falcon 0.6.1.2.3 from HDP 2.3
>>>
>>> Before I enabled Kerberos via Ambari I was able to submit a process and
>>> schedule it via the Falcon GUI.
>>> After I enabled Kerberos I can still submit a process from the command line:
>>>    >falcon entity -submit -type process  -file hive-proc.xml
>>> and getting:
>>> falcon/default/Submit successful (process) hive-proc5
>>>
>>> But when I try to schedule it via the Falcon GUI or the command line I
>>> get this in the log:
>>> 2016-02-03 17:58:40,439 INFO  - [1064730004@qtp-525968792-4 -
>>> fe585770-aab9-4cff-b5b3-c5929ce5ec46:margusja:POST//entities/schedule/PROCESS/hive-proc5]
>>>
>>> ~ Creating Oozie client object for
>>> http://hadoopnn1.estpak.ee:11000/oozie/ (OozieClientFactory:50)
>>> 2016-02-03 17:58:40,474 INFO  - [1064730004@qtp-525968792-4 -
>>> fe585770-aab9-4cff-b5b3-c5929ce5ec46:margusja:POST//entities/schedule/PROCESS/hive-proc5]
>>>
>>> ~ Creating FS impersonating user margusja (HadoopClientFactory:196)
>>> 2016-02-03 17:58:40,519 INFO  - [1064730004@qtp-525968792-4 -
>>> fe585770-aab9-4cff-b5b3-c5929ce5ec46:margusja:POST//entities/schedule/PROCESS/hive-proc5]
>>>
>>> ~ Creating FS impersonating user margusja (HadoopClientFactory:196)
>>> 2016-02-03 17:58:40,520 DEBUG - [1064730004@qtp-525968792-4 -
>>> fe585770-aab9-4cff-b5b3-c5929ce5ec46:margusja:POST//entities/schedule/PROCESS/hive-proc5]
>>>
>>> ~ Copying libs from
>>> /usr/hdp/current/falcon-server/server/webapp/falcon/WEB-INF/lib
>>> (SharedLibraryHostingService:117)
>>> 2016-02-03 17:58:40,563 INFO  - [1064730004@qtp-525968792-4 -
>>> fe585770-aab9-4cff-b5b3-c5929ce5ec46:margusja:POST//entities/schedule/PROCESS/hive-proc5]
>>>
>>> ~ Copied
>>> /usr/hdp/current/falcon-server/server/webapp/falcon/WEB-INF/lib/falcon-client-0.6.1.2.3.4.0-3485.jar
>>>
>>> to
>>> /apps/falcon/Cluster2/staging/falcon/workflows/process/hive-proc5/a22bcab3f50782dcaaa168cf014cdad3_1454515120504/DEFAULT/lib/falcon-client-0.6.1.2.3.4.0-3485.jar
>>>
>>> in hdfs://mycluster (SharedLibraryHostingService:146)
>>> 2016-02-03 17:58:40,602 INFO  - [1064730004@qtp-525968792-4 -
>>> fe585770-aab9-4cff-b5b3-c5929ce5ec46:margusja:POST//entities/schedule/PROCESS/hive-proc5]
>>>
>>> ~ Copied
>>> /usr/hdp/current/falcon-server/server/webapp/falcon/WEB-INF/lib/falcon-prism-0.6.1.2.3.4.0-3485-classes.jar
>>>
>>> to
>>> /apps/falcon/Cluster2/staging/falcon/workflows/process/hive-proc5/a22bcab3f50782dcaaa168cf014cdad3_1454515120504/DEFAULT/lib/falcon-prism-0.6.1.2.3.4.0-3485-classes.jar
>>>
>>> in hdfs://mycluster (SharedLibraryHostingService:146)
>>> 2016-02-03 17:58:40,636 INFO  - [1064730004@qtp-525968792-4 -
>>> fe585770-aab9-4cff-b5b3-c5929ce5ec46:margusja:POST//entities/schedule/PROCESS/hive-proc5]
>>>
>>> ~ Copied
>>> /usr/hdp/current/falcon-server/server/webapp/falcon/WEB-INF/lib/falcon-hadoop-dependencies-0.6.1.2.3.4.0-3485.jar
>>>
>>> to
>>> /apps/falcon/Cluster2/staging/falcon/workflows/process/hive-proc5/a22bcab3f50782dcaaa168cf014cdad3_1454515120504/DEFAULT/lib/falcon-hadoop-dependencies-0.6.1.2.3.4.0-3485.jar
>>>
>>> in hdfs://mycluster (SharedLibraryHostingService:146)
>>> 2016-02-03 17:58:40,669 INFO  - [1064730004@qtp-525968792-4 -
>>> fe585770-aab9-4cff-b5b3-c5929ce5ec46:margusja:POST//entities/schedule/PROCESS/hive-proc5]
>>>
>>> ~ Copied
>>> /usr/hdp/current/falcon-server/server/webapp/falcon/WEB-INF/lib/falcon-rerun-0.6.1.2.3.4.0-3485.jar
>>>
>>> to
>>> /apps/falcon/Cluster2/staging/falcon/workflows/process/hive-proc5/a22bcab3f50782dcaaa168cf014cdad3_1454515120504/DEFAULT/lib/falcon-rerun-0.6.1.2.3.4.0-3485.jar
>>>
>>> in hdfs://mycluster (SharedLibraryHostingService:146)
>>> 2016-02-03 17:58:40,703 INFO  - [1064730004@qtp-525968792-4 -
>>> fe585770-aab9-4cff-b5b3-c5929ce5ec46:margusja:POST//entities/schedule/PROCESS/hive-proc5]
>>>
>>> ~ Copied
>>> /usr/hdp/current/falcon-server/server/webapp/falcon/WEB-INF/lib/falcon-retention-0.6.1.2.3.4.0-3485.jar
>>>
>>> to
>>> /apps/falcon/Cluster2/staging/falcon/workflows/process/hive-proc5/a22bcab3f50782dcaaa168cf014cdad3_1454515120504/DEFAULT/lib/falcon-retention-0.6.1.2.3.4.0-3485.jar
>>>
>>> in hdfs://mycluster (SharedLibraryHostingService:146)
>>> 2016-02-03 17:58:40,734 INFO  - [1064730004@qtp-525968792-4 -
>>> fe585770-aab9-4cff-b5b3-c5929ce5ec46:margusja:POST//entities/schedule/PROCESS/hive-proc5]
>>>
>>> ~ Copied
>>> /usr/hdp/current/falcon-server/server/webapp/falcon/WEB-INF/lib/falcon-metrics-0.6.1.2.3.4.0-3485.jar
>>>
>>> to
>>> /apps/falcon/Cluster2/staging/falcon/workflows/process/hive-proc5/a22bcab3f50782dcaaa168cf014cdad3_1454515120504/DEFAULT/lib/falcon-metrics-0.6.1.2.3.4.0-3485.jar
>>>
>>> in hdfs://mycluster (SharedLibraryHostingService:146)
>>> 2016-02-03 17:58:40,767 INFO  - [1064730004@qtp-525968792-4 -
>>> fe585770-aab9-4cff-b5b3-c5929ce5ec46:margusja:POST//entities/schedule/PROCESS/hive-proc5]
>>>
>>> ~ Copied
>>> /usr/hdp/current/falcon-server/server/webapp/falcon/WEB-INF/lib/falcon-common-0.6.1.2.3.4.0-3485.jar
>>>
>>> to
>>> /apps/falcon/Cluster2/staging/falcon/workflows/process/hive-proc5/a22bcab3f50782dcaaa168cf014cdad3_1454515120504/DEFAULT/lib/falcon-common-0.6.1.2.3.4.0-3485.jar
>>>
>>> in hdfs://mycluster (SharedLibraryHostingService:146)
>>> 2016-02-03 17:58:40,801 INFO  - [1064730004@qtp-525968792-4 -
>>> fe585770-aab9-4cff-b5b3-c5929ce5ec46:margusja:POST//entities/schedule/PROCESS/hive-proc5]
>>>
>>> ~ Copied
>>> /usr/hdp/current/falcon-server/server/webapp/falcon/WEB-INF/lib/falcon-messaging-0.6.1.2.3.4.0-3485.jar
>>>
>>> to
>>> /apps/falcon/Cluster2/staging/falcon/workflows/process/hive-proc5/a22bcab3f50782dcaaa168cf014cdad3_1454515120504/DEFAULT/lib/falcon-messaging-0.6.1.2.3.4.0-3485.jar
>>>
>>> in hdfs://mycluster (SharedLibraryHostingService:146)
>>> 2016-02-03 17:58:40,843 INFO  - [1064730004@qtp-525968792-4 -
>>> fe585770-aab9-4cff-b5b3-c5929ce5ec46:margusja:POST//entities/schedule/PROCESS/hive-proc5]
>>>
>>> ~ Copied
>>> /usr/hdp/current/falcon-server/server/webapp/falcon/WEB-INF/lib/falcon-distcp-replication-0.6.1.2.3.4.0-3485.jar
>>>
>>> to
>>> /apps/falcon/Cluster2/staging/falcon/workflows/process/hive-proc5/a22bcab3f50782dcaaa168cf014cdad3_1454515120504/DEFAULT/lib/falcon-distcp-replication-0.6.1.2.3.4.0-3485.jar
>>>
>>> in hdfs://mycluster (SharedLibraryHostingService:146)
>>> 2016-02-03 17:58:40,884 INFO  - [1064730004@qtp-525968792-4 -
>>> fe585770-aab9-4cff-b5b3-c5929ce5ec46:margusja:POST//entities/schedule/PROCESS/hive-proc5]
>>>
>>> ~ Copied
>>> /usr/hdp/current/falcon-server/server/webapp/falcon/WEB-INF/lib/falcon-hive-replication-0.6.1.2.3.4.0-3485.jar
>>>
>>> to
>>> /apps/falcon/Cluster2/staging/falcon/workflows/process/hive-proc5/a22bcab3f50782dcaaa168cf014cdad3_1454515120504/DEFAULT/lib/falcon-hive-replication-0.6.1.2.3.4.0-3485.jar
>>>
>>> in hdfs://mycluster (SharedLibraryHostingService:146)
>>> 2016-02-03 17:58:40,929 INFO  - [1064730004@qtp-525968792-4 -
>>> fe585770-aab9-4cff-b5b3-c5929ce5ec46:margusja:POST//entities/schedule/PROCESS/hive-proc5]
>>>
>>> ~ Copied
>>> /usr/hdp/current/falcon-server/server/webapp/falcon/WEB-INF/lib/falcon-oozie-adaptor-0.6.1.2.3.4.0-3485.jar
>>>
>>> to
>>> /apps/falcon/Cluster2/staging/falcon/workflows/process/hive-proc5/a22bcab3f50782dcaaa168cf014cdad3_1454515120504/DEFAULT/lib/falcon-oozie-adaptor-0.6.1.2.3.4.0-3485.jar
>>>
>>> in hdfs://mycluster (SharedLibraryHostingService:146)
>>> 2016-02-03 17:58:40,952 INFO  - [1064730004@qtp-525968792-4 -
>>> fe585770-aab9-4cff-b5b3-c5929ce5ec46:margusja:POST//entities/schedule/PROCESS/hive-proc5]
>>>
>>> ~ Creating FS impersonating user margusja (HadoopClientFactory:196)
>>> 2016-02-03 17:58:40,954 INFO  - [1064730004@qtp-525968792-4 -
>>> fe585770-aab9-4cff-b5b3-c5929ce5ec46:margusja:POST//entities/schedule/PROCESS/hive-proc5]
>>>
>>> ~ Successfully released lock for (process) hive-proc5 by
>>> 1064730004@qtp-525968792-4 - fe585770-aab9-4cff-b5b3-c5929ce5ec46
>>> (MemoryLocks:70)
>>> 2016-02-03 17:58:40,954 INFO  - [1064730004@qtp-525968792-4 -
>>> fe585770-aab9-4cff-b5b3-c5929ce5ec46:margusja:POST//entities/schedule/PROCESS/hive-proc5]
>>>
>>> ~ Memory lock released for (process) hive-proc5
>>> (AbstractSchedulableEntityManager:100)
>>> 2016-02-03 17:58:40,954 ERROR - [1064730004@qtp-525968792-4 -
>>> fe585770-aab9-4cff-b5b3-c5929ce5ec46:margusja:POST//entities/schedule/PROCESS/hive-proc5]
>>>
>>> ~ Unable to schedule workflow (AbstractSchedulableEntityManager:76)
>>> org.apache.falcon.FalconException: Entity schedule failed for PROCESS:
>>> hive-proc5
>>>            at
>>> org.apache.falcon.resource.AbstractSchedulableEntityManager.scheduleInternal(AbstractSchedulableEntityManager.java:96)
>>>
>>>            at
>>> org.apache.falcon.resource.AbstractSchedulableEntityManager.schedule(AbstractSchedulableEntityManager.java:73)
>>>
>>>            at
>>> org.apache.falcon.resource.SchedulableEntityManager.schedule(SchedulableEntityManager.java:133)
>>>
>>>            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>            at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>>
>>>            at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>
>>>            at java.lang.reflect.Method.invoke(Method.java:497)
>>>            at
>>> org.apache.falcon.resource.channel.IPCChannel.invoke(IPCChannel.java:49)
>>>            at
>>> org.apache.falcon.resource.proxy.SchedulableEntityManagerProxy$9.doExecute(SchedulableEntityManagerProxy.java:403)
>>>
>>>            at
>>> org.apache.falcon.resource.proxy.SchedulableEntityManagerProxy$EntityProxy.execute(SchedulableEntityManagerProxy.java:577)
>>>
>>>            at
>>> org.apache.falcon.resource.proxy.SchedulableEntityManagerProxy.schedule_aroundBody12(SchedulableEntityManagerProxy.java:405)
>>>
>>>            at
>>> org.apache.falcon.resource.proxy.SchedulableEntityManagerProxy$AjcClosure13.run(SchedulableEntityManagerProxy.java:1)
>>>
>>>            at
>>> org.aspectj.runtime.reflect.JoinPointImpl.proceed(JoinPointImpl.java:149)
>>>
>>>            at
>>> org.apache.falcon.aspect.AbstractFalconAspect.logAroundMonitored(AbstractFalconAspect.java:51)
>>>
>>>            at
>>> org.apache.falcon.resource.proxy.SchedulableEntityManagerProxy.schedule(SchedulableEntityManagerProxy.java:388)
>>>
>>>            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>            at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>>
>>>            at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>
>>>            at java.lang.reflect.Method.invoke(Method.java:497)
>>>            at
>>> com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
>>>
>>>            at
>>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:185)
>>>
>>>            at
>>> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
>>>
>>>            at
>>> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:288)
>>>
>>>            at
>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
>>>
>>>            at
>>> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
>>>
>>>            at
>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
>>>
>>>          at
>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
>>>
>>>            at
>>> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
>>>
>>>            at
>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1469)
>>>
>>>            at
>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1400)
>>>
>>>            at
>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1349)
>>>
>>>            at
>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1339)
>>>
>>>            at
>>> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:416)
>>>
>>>            at
>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:537)
>>>
>>>            at
>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:699)
>>>
>>>            at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
>>>            at
>>> org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:511)
>>>            at
>>> org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1221)
>>>
>>>            at
>>> org.apache.falcon.security.FalconAuthorizationFilter.doFilter(FalconAuthorizationFilter.java:108)
>>>
>>>            at
>>> org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1212)
>>>
>>>            at
>>> org.apache.falcon.security.FalconAuthenticationFilter$2.doFilter(FalconAuthenticationFilter.java:188)
>>>
>>>            at
>>> org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:615)
>>>
>>>            at
>>> org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:574)
>>>
>>>            at
>>> org.apache.falcon.security.FalconAuthenticationFilter.doFilter(FalconAuthenticationFilter.java:197)
>>>
>>>            at
>>> org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1212)
>>>
>>>            at
>>> org.apache.falcon.security.FalconAuditFilter.doFilter(FalconAuditFilter.java:64)
>>>
>>>            at
>>> org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1212)
>>>
>>>            at
>>> org.apache.falcon.security.HostnameFilter.doFilter(HostnameFilter.java:82)
>>>
>>>            at
>>> org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1212)
>>>
>>>            at
>>> org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:399)
>>>            at
>>> org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
>>>
>>>            at
>>> org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182)
>>>            at
>>> org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:767)
>>>            at
>>> org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:450)
>>>            at
>>> org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
>>>            at org.mortbay.jetty.Server.handle(Server.java:326)
>>>            at
>>> org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:542)
>>>            at
>>> org.mortbay.jetty.HttpConnection$RequestHandler.headerComplete(HttpConnection.java:928)
>>>
>>>            at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:549)
>>>            at
>>> org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:212)
>>>            at
>>> org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404)
>>>            at
>>> org.mortbay.jetty.bio.SocketConnector$Connection.run(SocketConnector.java:228)
>>>
>>>            at
>>> org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:582)
>>>
>>> Caused by: java.lang.NullPointerException
>>>            at java.util.Hashtable.put(Hashtable.java:459)
>>>            at
>>> org.apache.falcon.oozie.OozieEntityBuilder.getHiveCredentials(OozieEntityBuilder.java:201)
>>>
>>>            at
>>> org.apache.falcon.oozie.OozieEntityBuilder.getHiveCredentialsAsConf(OozieEntityBuilder.java:210)
>>>
>>>            at
>>> org.apache.falcon.oozie.OozieOrchestrationWorkflowBuilder.createHiveConfiguration(OozieOrchestrationWorkflowBuilder.java:310)
>>>
>>>            at
>>> org.apache.falcon.oozie.process.ProcessExecutionWorkflowBuilder.setupHiveCredentials(ProcessExecutionWorkflowBuilder.java:139)
>>>
>>>            at
>>> org.apache.falcon.oozie.process.ProcessExecutionWorkflowBuilder.build(ProcessExecutionWorkflowBuilder.java:105)
>>>
>>>            at
>>> org.apache.falcon.oozie.process.ProcessExecutionCoordinatorBuilder.buildCoords(ProcessExecutionCoordinatorBuilder.java:89)
>>>
>>>            at
>>> org.apache.falcon.oozie.process.ProcessBundleBuilder.buildCoords(ProcessBundleBuilder.java:103)
>>>
>>>            at
>>> org.apache.falcon.oozie.OozieBundleBuilder.build(OozieBundleBuilder.java:69)
>>>
>>>            at
>>> org.apache.falcon.workflow.engine.OozieWorkflowEngine.schedule(OozieWorkflowEngine.java:165)
>>>
>>>            at
>>> org.apache.falcon.resource.AbstractSchedulableEntityManager.scheduleInternal(AbstractSchedulableEntityManager.java:94)
>>>
>>>            ... 61 more
>>> 2016-02-03 17:58:40,955 ERROR - [1064730004@qtp-525968792-4 -
>>> fe585770-aab9-4cff-b5b3-c5929ce5ec46:margusja:POST//entities/schedule/PROCESS/hive-proc5]
>>>
>>> ~ Action failed: Bad Request
>>> Error: Entity schedule failed for PROCESS: hive-proc5
>>> (FalconWebException:83)
>>> 2016-02-03 17:58:40,955 ERROR - [1064730004@qtp-525968792-4 -
>>> fe585770-aab9-4cff-b5b3-c5929ce5ec46:margusja:POST//entities/schedule/PROCESS/hive-proc5]
>>>
>>> ~ Action failed: Bad Request
>>> Error:
>>> default/org.apache.falcon.FalconWebException::org.apache.falcon.FalconException:
>>>
>>> Entity schedule failed for PROCESS: hive-proc5
>>>      (FalconWebException:83)
>>> 2016-02-03 17:58:40,955 INFO  - [1064730004@qtp-525968792-4 -
>>> fe585770-aab9-4cff-b5b3-c5929ce5ec46:margusja:POST//entities/schedule/PROCESS/hive-proc5]
>>>
>>> ~ {Action:schedule, Dimensions:{colo=NULL, entityType=PROCESS,
>>> entityName=hive-proc5}, Status: FAILED, Time-taken:517003440 ns}
>>> (METRIC:38)
>>> 2016-02-03 17:58:40,956 DEBUG - [1064730004@qtp-525968792-4 -
>>> fe585770-aab9-4cff-b5b3-c5929ce5ec46:] ~ Audit: margusja/10.65.104.45
>>> performed request
>>> http://hadoopnn2.estpak.ee:15000/api/entities/schedule/PROCESS/hive-proc5
>>> (88.196.164.43)
>>> at time 2016-02-03T15:58Z (FalconAuditFilter:86)
>>>
>>> Process xml:
>>> <?xml version="1.0" encoding="UTF-8" standalone="yes"?>
>>> <process xmlns="uri:falcon:process:0.1" name="hive-proc5">
>>> <tags>level=devel,owner=margusja,department=bigdata</tags>
>>>        <pipelines>test_Pipeline, dataReplication,
>>> clickStream_pipieline</pipelines>
>>>        <clusters>
>>>            <cluster name="Cluster2">
>>>                <validity start="2016-01-26T00:10Z"
>>> end="2099-03-09T12:00Z"/>
>>>            </cluster>
>>>        </clusters>
>>>        <parallel>1</parallel>
>>>        <order>FIFO</order>
>>>        <frequency>days(1)</frequency>
>>>        <timezone>UTC</timezone>
>>>        <inputs>
>>>            <input name="in" feed="falcon-input-table"
>>> start="yesterday(0,0)" end="yesterday(0,0)"/>
>>>        </inputs>
>>>        <outputs>
>>>            <output name="out" feed="falcon-output-table"
>>> instance="yesterday(0,0)"/>
>>>        </outputs>
>>>        <properties>
>>>            <property name="margusja-demo-input"
>>> value="margusja-demo-sisend"/>
>>>        </properties>
>>>        <workflow engine="hive" path="/user/margusja/hive.hql"/>
>>>        <retry policy="periodic" delay="minutes(5)" attempts="1"/>
>>>        <ACL owner="margusja" group="margusja"/>
>>> </process>
>>>
>>>
>>> Any hints on how to debug this?
>>>
>>>
>>>

