flink-user mailing list archives

From Flavio Pompermaier <pomperma...@okkam.it>
Subject Re: Error running an hadoop job from web interface
Date Mon, 26 Oct 2015 14:06:48 GMT
No, I just use the default parallelism

On Mon, Oct 26, 2015 at 3:05 PM, Maximilian Michels <mxm@apache.org> wrote:

> Did you set the default parallelism of the cluster to 36? The plan only gets
> optimized against the cluster configuration when you try to run the uploaded
> program; before that, no optimization is done. This might not be very
> intuitive. We should probably change that.
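For reference, the parallelism can also be pinned explicitly instead of relying on the cluster default, either per submission or cluster-wide. The flag and config key below match the CLI of that era, but treat them as an assumption to verify with `bin/flink run --help` on your version:

```shell
# Submit with an explicit parallelism instead of the cluster default
./bin/flink run -p 36 /path/to/jar <arguments>

# Or set the cluster-wide default in conf/flink-conf.yaml:
# parallelism.default: 36
```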
>
> On Mon, Oct 26, 2015 at 2:03 PM, Flavio Pompermaier <pompermaier@okkam.it>
> wrote:
>
>> Now that I've recompiled flink and restarted the web-client everything
>> works fine.
>>
>> However, when I select the job I want to run, the right panel shows
>> parallelism 1, but when I click the "Run Job" button with "show optimizer
>> plan" checked I see parallelism 36. Is that a bug of the first preview?
>>
>>
>> On Mon, Oct 26, 2015 at 10:01 AM, Maximilian Michels <mxm@apache.org>
>> wrote:
>>
>>> Correct. I'll fix it today.
>>>
>>> Cheers,
>>> Max
>>>
>>> On Mon, Oct 26, 2015 at 9:08 AM, Flavio Pompermaier <
>>> pompermaier@okkam.it> wrote:
>>>
>>>> Running from the shell everything works... is it a problem of
>>>> classloader hierarchy in the webapp?
>>>>
>>>> On Fri, Oct 23, 2015 at 5:53 PM, Maximilian Michels <mxm@apache.org>
>>>> wrote:
>>>>
>>>>> ./bin/flink run /path/to/jar arguments
>>>>>
>>>>> or
>>>>>
>>>>> ./bin/flink run -c MainClass /path/to/jar arguments
>>>>>
>>>>> On Fri, Oct 23, 2015 at 5:50 PM, Stefano Bortoli <s.bortoli@gmail.com>
>>>>> wrote:
>>>>>
>>>>>> What I normally do is to
>>>>>>
>>>>>> java -cp MYUBERJAR.jar my.package.mainclass
>>>>>>
>>>>>> does it make sense?
>>>>>>
>>>>>> 2015-10-23 17:22 GMT+02:00 Flavio Pompermaier <pompermaier@okkam.it>:
>>>>>>
>>>>>>> could you write me the command please? I'm not in the office right
>>>>>>> now.
>>>>>>> On 23 Oct 2015 17:10, "Maximilian Michels" <mxm@apache.org> wrote:
>>>>>>>
>>>>>>>> Could you try submitting the job from the command-line and see if
>>>>>>>> it works?
>>>>>>>>
>>>>>>>> Thanks,
>>>>>>>> Max
>>>>>>>>
>>>>>>>> On Fri, Oct 23, 2015 at 4:42 PM, Flavio Pompermaier <
>>>>>>>> pompermaier@okkam.it> wrote:
>>>>>>>>
>>>>>>>>> 0.10-snapshot
>>>>>>>>> On 23 Oct 2015 16:09, "Maximilian Michels" <mxm@apache.org> wrote:
>>>>>>>>>
>>>>>>>>>> Hi Flavio,
>>>>>>>>>>
>>>>>>>>>> Which version of Flink are you using?
>>>>>>>>>>
>>>>>>>>>> Cheers,
>>>>>>>>>> Max
>>>>>>>>>>
>>>>>>>>>> On Fri, Oct 23, 2015 at 2:45 PM, Flavio Pompermaier <
>>>>>>>>>> pompermaier@okkam.it> wrote:
>>>>>>>>>>
>>>>>>>>>>> Hi to all,
>>>>>>>>>>> I'm trying to run a job from the web interface but I get this
>>>>>>>>>>> error:
>>>>>>>>>>>
>>>>>>>>>>> java.lang.RuntimeException: java.io.FileNotFoundException: JAR entry core-site.xml not found in /tmp/webclient-jobs/EntitonsJsonizer.jar
>>>>>>>>>>> 	at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2334)
>>>>>>>>>>> 	at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2187)
>>>>>>>>>>> 	at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2104)
>>>>>>>>>>> 	at org.apache.hadoop.conf.Configuration.get(Configuration.java:853)
>>>>>>>>>>> 	at org.apache.hadoop.mapred.JobConf.checkAndWarnDeprecation(JobConf.java:2088)
>>>>>>>>>>> 	at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:446)
>>>>>>>>>>> 	at org.apache.hadoop.mapreduce.Job.getInstance(Job.java:175)
>>>>>>>>>>> 	at org.apache.hadoop.mapreduce.Job.getInstance(Job.java:156)
>>>>>>>>>>> 	at it.okkam.flink.entitons.io.utils.ParquetThriftEntitons.readEntitons(ParquetThriftEntitons.java:42)
>>>>>>>>>>> 	at it.okkam.flink.entitons.io.utils.ParquetThriftEntitons.readEntitonsWithId(ParquetThriftEntitons.java:73)
>>>>>>>>>>> 	at org.okkam.entitons.EntitonsJsonizer.readAtomQuads(EntitonsJsonizer.java:235)
>>>>>>>>>>> 	at org.okkam.entitons.EntitonsJsonizer.main(EntitonsJsonizer.java:119)
>>>>>>>>>>> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>>>>>> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>>>>>>>> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>>>>>>> 	at java.lang.reflect.Method.invoke(Method.java:606)
>>>>>>>>>>> 	at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:497)
>>>>>>>>>>> 	at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:395)
>>>>>>>>>>> 	at org.apache.flink.client.program.OptimizerPlanEnvironment.getOptimizedPlan(OptimizerPlanEnvironment.java:80)
>>>>>>>>>>> 	at org.apache.flink.client.program.Client.getOptimizedPlan(Client.java:220)
>>>>>>>>>>> 	at org.apache.flink.client.CliFrontend.info(CliFrontend.java:412)
>>>>>>>>>>> 	at org.apache.flink.client.CliFrontend.parseParameters(CliFrontend.java:977)
>>>>>>>>>>> 	at org.apache.flink.client.web.JobSubmissionServlet.doGet(JobSubmissionServlet.java:171)
>>>>>>>>>>> 	at javax.servlet.http.HttpServlet.service(HttpServlet.java:734)
>>>>>>>>>>> 	at javax.servlet.http.HttpServlet.service(HttpServlet.java:847)
>>>>>>>>>>> 	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:532)
>>>>>>>>>>> 	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:453)
>>>>>>>>>>> 	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:227)
>>>>>>>>>>> 	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:965)
>>>>>>>>>>> 	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:388)
>>>>>>>>>>> 	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:187)
>>>>>>>>>>> 	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:901)
>>>>>>>>>>> 	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:117)
>>>>>>>>>>> 	at org.eclipse.jetty.server.handler.HandlerList.handle(HandlerList.java:47)
>>>>>>>>>>> 	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:113)
>>>>>>>>>>> 	at org.eclipse.jetty.server.Server.handle(Server.java:352)
>>>>>>>>>>> 	at org.eclipse.jetty.server.HttpConnection.handleRequest(HttpConnection.java:596)
>>>>>>>>>>> 	at org.eclipse.jetty.server.HttpConnection$RequestHandler.headerComplete(HttpConnection.java:1048)
>>>>>>>>>>> 	at org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:549)
>>>>>>>>>>> 	at org.eclipse.jetty.http.HttpParser.parseAvailable(HttpParser.java:211)
>>>>>>>>>>> 	at org.eclipse.jetty.server.HttpConnection.handle(HttpConnection.java:425)
>>>>>>>>>>> 	at org.eclipse.jetty.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:489)
>>>>>>>>>>> 	at org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:436)
>>>>>>>>>>> 	at java.lang.Thread.run(Thread.java:745)
>>>>>>>>>>> Caused by: java.io.FileNotFoundException: JAR entry core-site.xml not found in /tmp/webclient-jobs/EntitonsJsonizer.jar
>>>>>>>>>>> 	at sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:140)
>>>>>>>>>>> 	at sun.net.www.protocol.jar.JarURLConnection.getInputStream(JarURLConnection.java:150)
>>>>>>>>>>> 	at java.net.URL.openStream(URL.java:1037)
>>>>>>>>>>> 	at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2163)
>>>>>>>>>>> 	at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2234)
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> I checked the jar and it contains the core-site.xml file... Am I
>>>>>>>>>>> forced to configure the Hadoop classpath in my Flink cluster
>>>>>>>>>>> config files?
>>>>>>>>>>>
>>>>>>>>>>> Best,
>>>>>>>>>>> Flavio
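Since the stack trace shows Hadoop's `Configuration` resolving `core-site.xml` by name through a classpath URL, one thing worth checking is whether the entry really sits at the root of the uploaded jar rather than under a subdirectory such as `conf/`. A minimal JDK-only sketch of that check (the class name and paths are illustrative, not part of the thread):

```java
import java.io.File;
import java.io.FileOutputStream;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;
import java.util.jar.JarOutputStream;

public class JarEntryCheck {

    // Returns true if the jar contains an entry with exactly this name.
    // A resource lookup by name will not find "conf/core-site.xml" when
    // asked for "core-site.xml", so the exact path inside the jar matters.
    static boolean hasEntry(File jar, String name) throws Exception {
        try (JarFile jf = new JarFile(jar)) {
            return jf.getJarEntry(name) != null;
        }
    }

    public static void main(String[] args) throws Exception {
        // Build a tiny throwaway jar with core-site.xml at the root to
        // demonstrate the check; with a real job jar you would pass its
        // path instead of creating one.
        File jar = File.createTempFile("demo", ".jar");
        try (JarOutputStream out = new JarOutputStream(new FileOutputStream(jar))) {
            out.putNextEntry(new JarEntry("core-site.xml"));
            out.write("<configuration/>".getBytes());
            out.closeEntry();
        }
        System.out.println(hasEntry(jar, "core-site.xml"));      // true
        System.out.println(hasEntry(jar, "conf/core-site.xml")); // false
        jar.delete();
    }
}
```

If the entry turns out to be nested, repackaging the jar with the file at its root, or putting the Hadoop config directory on the cluster classpath, would be the usual ways to make the lookup succeed.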
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>>
>
