livy-user mailing list archives

From Saisai Shao <sai.sai.s...@gmail.com>
Subject Re: Livy on OpenJDK 8
Date Wed, 16 Aug 2017 09:33:56 GMT
Did you set "livy.spark.scala-version"? If you don't set this configuration,
Livy will use the repl_2.11 jars by default for Spark 2.1.1, since the default
Scala version of Spark 2.1.1 is 2.11. For a Scala 2.10 build you have to
explicitly set "livy.spark.scala-version" to "2.10.x".
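
A minimal sketch of the setting, assuming the default conf/livy.conf in your
Livy install (2.10.6 below is just an example 2.10.x version; use the one your
Spark build was compiled against, and restart the Livy server afterwards so new
sessions pick it up):

    # conf/livy.conf -- ship the repl_2.10 jars instead of the default repl_2.11
    livy.spark.scala-version = 2.10.6

If you are not sure which Scala version your Spark build uses, running
"spark-submit --version" prints a "Using Scala version ..." line in its banner.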

On Wed, Aug 16, 2017 at 5:21 PM, Alex Rodrigues <arodrigues@actificent.com>
wrote:

> Hi
>
>
> Thanks for your response.
>
> The command executed by LivyServer when initializing a session is:
>
> /docker-java-home/bin/java -cp /apps/spark-2.1.1_2.10/conf/:/apps/spark-2.1.1_2.10/jars/*:/etc/hadoop/conf -Xmx1g org.apache.spark.deploy.SparkSubmit --properties-file /tmp/livyConf5628211771459249745.properties --class com.cloudera.livy.rsc.driver.RSCDriverBootstrapper spark-internal
>
>
> Livy server command:
>
> /docker-java-home/bin/java -cp /apps/livy/jars/*:/apps/livy/conf:/etc/hadoop/conf: com.cloudera.livy.server.LivyServer
>
> Please see attached the list of files in /apps/spark-2.1.1_2.10/jars/*,
> and another with the Livy jars (present in /apps/livy/jars/*).
>
> The behaviour seems pretty consistent, and the issue cannot be reproduced
> with the HotSpot JVM.
>
>
> On Wed, Aug 16, 2017 at 9:51 AM, Saisai Shao <sai.sai.shao@gmail.com>
> wrote:
>
>> Does it fail consistently? Can you please check if Livy picks the right
>> repl jar (2.10) for Spark 2.1.1 + Scala 2.10?
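>>
>> A quick sketch of one way to check, assuming the repl jars sit in the Livy
>> jars directory of your install (adjust the path to your layout):
>>
>>     ls $LIVY_HOME/jars | grep -i repl
>>
>> For a Scala 2.10 Spark the livy-repl jar should carry the _2.10 suffix; a
>> _2.11 jar there would explain a mismatch like this.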
>>
>> On Wed, Aug 16, 2017 at 2:45 AM, Alex Rodrigues <
>> arodrigues@actificent.com> wrote:
>>
>>> Hi Livyers,
>>>
>>> I'm hitting a rather nasty error when attempting to initialize a session.
>>>
>>> Livy is running on Docker
>>> Livy branch 0.3.0 is compiled with Scala 2.10
>>> Spark 2.1.1 compiled with Scala 2.10
>>> Spark Master = local[*]
>>>
>>>
>>> If running with OpenJDK 1.8.0_141, the server spills this to the log:
>>>
>>> 17/08/15 00:13:21.242 qtp2001223946-27 INFO InteractiveSession$: Creating LivyClient for sessionId: 6
>>> 17/08/15 00:13:21.246 qtp2001223946-27 WARN RSCConf: Your hostname, 394f110eb785, resolves to a loopback address, but we couldn't find any external IP address!
>>> 17/08/15 00:13:21.246 qtp2001223946-27 WARN RSCConf: Set livy.rsc.rpc.server.address if you need to bind to another address.
>>> 17/08/15 00:13:21.251 qtp2001223946-27 DEBUG RSCClient: Sending JobRequest[5cd32687-53b0-4d29-8436-a1ea38faab62].
>>> 17/08/15 00:13:21.251 qtp2001223946-27 INFO InteractiveSessionManager: Registering new session 6
>>> 17/08/15 00:13:21.934 stderr-redir-7 INFO ContextLauncher: log4j:WARN No appenders could be found for logger (com.cloudera.livy.rsc.driver.RSCDriver).
>>> 17/08/15 00:13:21.934 stderr-redir-7 INFO ContextLauncher: log4j:WARN Please initialize the log4j system properly.
>>> 17/08/15 00:13:21.934 stderr-redir-7 INFO ContextLauncher: log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
>>> 17/08/15 00:13:22.279 RPC-Handler-4 DEBUG KryoMessageCodec: Decoded message of type com.cloudera.livy.rsc.rpc.Rpc$SaslMessage (41 bytes)
>>> 17/08/15 00:13:22.279 RPC-Handler-4 DEBUG RpcServer$SaslServerHandler: Handling SASL challenge message...
>>> 17/08/15 00:13:22.279 RPC-Handler-4 DEBUG RpcServer$SaslServerHandler: Sending SASL challenge response...
>>> 17/08/15 00:13:22.279 RPC-Handler-4 DEBUG KryoMessageCodec: Encoded message of type com.cloudera.livy.rsc.rpc.Rpc$SaslMessage (98 bytes)
>>> 17/08/15 00:13:22.287 RPC-Handler-4 DEBUG KryoMessageCodec: Decoded message of type com.cloudera.livy.rsc.rpc.Rpc$SaslMessage (275 bytes)
>>> 17/08/15 00:13:22.287 RPC-Handler-4 DEBUG RpcServer$SaslServerHandler: Handling SASL challenge message...
>>> 17/08/15 00:13:22.288 RPC-Handler-4 DEBUG RpcServer$SaslServerHandler: Sending SASL challenge response...
>>> 17/08/15 00:13:22.288 RPC-Handler-4 DEBUG KryoMessageCodec: Encoded message of type com.cloudera.livy.rsc.rpc.Rpc$SaslMessage (45 bytes)
>>> 17/08/15 00:13:22.288 RPC-Handler-4 DEBUG RpcServer$SaslServerHandler: SASL negotiation finished with QOP auth.
>>> 17/08/15 00:13:22.289 RPC-Handler-4 DEBUG ContextLauncher: New RPC client connected from [id: 0xd53993c6, /10.0.1.5:60310 => /10.0.1.5:39450].
>>> 17/08/15 00:13:22.295 RPC-Handler-4 DEBUG KryoMessageCodec: Decoded message of type com.cloudera.livy.rsc.rpc.Rpc$MessageHeader (5 bytes)
>>> 17/08/15 00:13:22.295 RPC-Handler-4 DEBUG KryoMessageCodec: Decoded message of type com.cloudera.livy.rsc.BaseProtocol$RemoteDriverAddress (73 bytes)
>>> 17/08/15 00:13:22.296 RPC-Handler-4 DEBUG RpcDispatcher: [RegistrationHandler] Received RPC message: type=CALL id=0 payload=com.cloudera.livy.rsc.BaseProtocol$RemoteDriverAddress
>>> 17/08/15 00:13:22.296 RPC-Handler-4 DEBUG ContextLauncher: Received driver info for client [id: 0xd53993c6, /10.0.1.5:60310 => /10.0.1.5:39450]: 394f110eb785/35784.
>>> 17/08/15 00:13:22.296 RPC-Handler-4 DEBUG KryoMessageCodec: Encoded message of type com.cloudera.livy.rsc.rpc.Rpc$MessageHeader (5 bytes)
>>> 17/08/15 00:13:22.297 RPC-Handler-4 DEBUG KryoMessageCodec: Encoded message of type com.cloudera.livy.rsc.rpc.Rpc$NullMessage (2 bytes)
>>> 17/08/15 00:13:22.322 RSCClient-7-3 DEBUG KryoMessageCodec: Encoded message of type com.cloudera.livy.rsc.rpc.Rpc$SaslMessage (41 bytes)
>>> 17/08/15 00:13:22.333 stderr-redir-7 INFO ContextLauncher: Exception in thread "main" java.lang.VerifyError: Bad type on operand stack
>>> 17/08/15 00:13:22.333 stderr-redir-7 INFO ContextLauncher: Exception Details:
>>> 17/08/15 00:13:22.333 stderr-redir-7 INFO ContextLauncher:   Location:
>>> 17/08/15 00:13:22.333 stderr-redir-7 INFO ContextLauncher:     com/cloudera/livy/repl/SparkInterpreter.bind(Ljava/lang/String;Ljava/lang/String;Ljava/lang/Object;Lscala/collection/immutable/List;)V @7: invokevirtual
>>> 17/08/15 00:13:22.333 stderr-redir-7 INFO ContextLauncher:   Reason:
>>> 17/08/15 00:13:22.333 stderr-redir-7 INFO ContextLauncher:     Type 'org/apache/spark/repl/SparkILoop' (current frame, stack[1]) is not assignable to 'scala/tools/nsc/interpreter/ILoop'
>>> 17/08/15 00:13:22.333 stderr-redir-7 INFO ContextLauncher:   Current Frame:
>>> 17/08/15 00:13:22.333 stderr-redir-7 INFO ContextLauncher:     bci: @7
>>> 17/08/15 00:13:22.333 stderr-redir-7 INFO ContextLauncher:     flags: { }
>>> 17/08/15 00:13:22.333 stderr-redir-7 INFO ContextLauncher:     locals: { 'com/cloudera/livy/repl/SparkInterpreter', 'java/lang/String', 'java/lang/String', 'java/lang/Object', 'scala/collection/immutable/List' }
>>> 17/08/15 00:13:22.333 stderr-redir-7 INFO ContextLauncher:     stack: { 'scala/tools/nsc/interpreter/ILoop$', 'org/apache/spark/repl/SparkILoop' }
>>> 17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:   Bytecode:
>>> 17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:     0x0000000: b201 052a b600 52b6 0109 bb00 2359 2a2b
>>> 17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:     0x0000010: 2c2d 1904 b701 63b6 0166 57b1
>>> 17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:
>>> 17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:  at com.cloudera.livy.repl.ReplDriver.initializeContext(ReplDriver.scala:49)
>>> 17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:  at com.cloudera.livy.rsc.driver.RSCDriver.run(RSCDriver.java:328)
>>> 17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:  at com.cloudera.livy.rsc.driver.RSCDriverBootstrapper.main(RSCDriverBootstrapper.java:86)
>>> 17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> 17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>> 17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> 17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:  at java.lang.reflect.Method.invoke(Method.java:498)
>>> 17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:  at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:743)
>>> 17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:  at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
>>> 17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:  at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
>>> 17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:  at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
>>> 17/08/15 00:13:22.334 stderr-redir-7 INFO ContextLauncher:  at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>> 17/08/15 00:13:22.338 RSCClient-7-4 ERROR RSCClient: Failed to connect to context.
>>> javax.security.sasl.SaslException: Client closed before SASL negotiation finished.
>>> at com.cloudera.livy.rsc.rpc.Rpc$SaslClientHandler.dispose(Rpc.java:416)
>>> at com.cloudera.livy.rsc.rpc.SaslHandler.channelInactive(SaslHandler.java:92)
>>> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:208)
>>> at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:194)
>>> at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
>>> at com.cloudera.livy.rsc.rpc.KryoMessageCodec.channelInactive(KryoMessageCodec.java:104)
>>> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:208)
>>> at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:194)
>>> at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
>>> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:208)
>>> at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:194)
>>> at io.netty.channel.DefaultChannelPipeline.fireChannelInactive(DefaultChannelPipeline.java:828)
>>> at io.netty.channel.AbstractChannel$AbstractUnsafe$7.run(AbstractChannel.java:621)
>>> at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
>>> at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
>>> at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
>>> at java.lang.Thread.run(Thread.java:748)
>>> 17/08/15 00:13:22.339 RSCClient-7-4 INFO RSCClient: Failing pending job 5cd32687-53b0-4d29-8436-a1ea38faab62 due to shutdown.
>>> 17/08/15 00:13:22.340 RSCClient-7-4 DEBUG InteractiveSession: InteractiveSession 6 session state change from starting to error
>>> 17/08/15 00:13:22.341 ForkJoinPool-1-worker-11 INFO InteractiveSession: Stopping InteractiveSession 6...
>>>
>>>
>>> With HotSpot (1.8.0_144), it runs fine.
>>>
>>> It seems to be JDK-8077810
>>> <https://bugs.openjdk.java.net/browse/JDK-8077810>, but it would be great
>>> if someone could confirm this behaviour and whether this is a known Livy
>>> problem.
>>>
>>> What JVM do you use to run Livy?
>>>
>>>
>>> Many thanks,
>>> Alex
>>>
>>>
>>>
>>
>
