spark-issues mailing list archives

From "Chandra Hasan (JIRA)" <j...@apache.org>
Subject [jira] [Comment Edited] (SPARK-24086) Exception while executing spark streaming examples
Date Fri, 27 Apr 2018 06:48:00 GMT

    [ https://issues.apache.org/jira/browse/SPARK-24086?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16456003#comment-16456003 ]

Chandra Hasan edited comment on SPARK-24086 at 4/27/18 6:47 AM:
----------------------------------------------------------------

[~hyukjin.kwon] Thanks mate, I included the necessary dependencies while executing and it's working now.
 If someone is facing the same issue, here is the solution:
{code:bash}
spark-submit --jars kafka-clients-1.1.0.jar,spark-streaming_2.11-2.3.0.jar,spark-streaming-kafka-0-10_2.11-2.3.0.jar \
  --class org.apache.spark.examples.streaming.JavaDirectKafkaWordCount \
  target/original-spark-examples_2.11-2.4.0-SNAPSHOT.jar \
  <brokerip:port> <topic>{code}
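An alternative to listing the jars by hand (a sketch, assuming a Maven build against Spark 2.3.0 / Scala 2.11) is to declare the Kafka integration module as a dependency so it ends up on the classpath:
{code:xml}
<!-- Kafka 0.10 integration for Spark Streaming; version must match the Spark build -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming-kafka-0-10_2.11</artifactId>
  <version>2.3.0</version>
</dependency>
{code}
spark-submit's {{--packages}} flag can also resolve the same artifact ({{org.apache.spark:spark-streaming-kafka-0-10_2.11:2.3.0}}) from Maven Central at submit time.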
 

Also [~hyukjin.kwon], I would like to point out that the consumer properties in the JavaDirectKafkaWordCount example are outdated, which causes a missing-configuration error, so I had to rewrite that part as below:
{code:java}
kafkaParams.put("bootstrap.servers", brokers);
kafkaParams.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
kafkaParams.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
kafkaParams.put("group.id", "<group_id>");{code}
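For context, here is a self-contained sketch of that consumer-properties map as a reusable helper (the class name, broker address, and group id below are illustrative assumptions, not part of the example):
{code:java}
import java.util.HashMap;
import java.util.Map;

public class KafkaParamsSketch {

    // Builds the consumer config expected by the kafka010 integration:
    // the new consumer API wants "bootstrap.servers" plus explicit
    // deserializer classes and a group id, instead of the old
    // "metadata.broker.list"-style properties.
    public static Map<String, Object> kafkaParams(String brokers, String groupId) {
        Map<String, Object> kafkaParams = new HashMap<>();
        kafkaParams.put("bootstrap.servers", brokers);
        kafkaParams.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        kafkaParams.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        kafkaParams.put("group.id", groupId);
        return kafkaParams;
    }

    public static void main(String[] args) {
        Map<String, Object> params = kafkaParams("localhost:9092", "wordcount-group");
        System.out.println(params.get("bootstrap.servers"));
        System.out.println(params.get("group.id"));
    }
}
{code}
This is the map the example hands to the kafka010 direct stream when subscribing to topics.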
 

What do you say, is this fine or should I open a bug for it?

 



> Exception while executing spark streaming examples
> --------------------------------------------------
>
>                 Key: SPARK-24086
>                 URL: https://issues.apache.org/jira/browse/SPARK-24086
>             Project: Spark
>          Issue Type: Bug
>          Components: Examples
>    Affects Versions: 2.3.0
>            Reporter: Chandra Hasan
>            Priority: Major
>
> After running mvn clean package, I tried to execute one of the Spark example programs, JavaDirectKafkaWordCount.java, but it throws the following exception.
> {code:java}
> [cloud-user@server-2 examples]$ run-example streaming.JavaDirectKafkaWordCount 192.168.0.4:9092
msu
> 2018-04-25 09:39:22 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for
your platform... using builtin-java classes where applicable
> 2018-04-25 09:39:22 INFO SparkContext:54 - Running Spark version 2.3.0
> 2018-04-25 09:39:22 INFO SparkContext:54 - Submitted application: JavaDirectKafkaWordCount
> 2018-04-25 09:39:22 INFO SecurityManager:54 - Changing view acls to: cloud-user
> 2018-04-25 09:39:22 INFO SecurityManager:54 - Changing modify acls to: cloud-user
> 2018-04-25 09:39:22 INFO SecurityManager:54 - Changing view acls groups to:
> 2018-04-25 09:39:22 INFO SecurityManager:54 - Changing modify acls groups to:
> 2018-04-25 09:39:22 INFO SecurityManager:54 - SecurityManager: authentication disabled;
ui acls disabled; users with view permissions: Set(cloud-user); groups with view permissions:
Set(); users with modify permissions: Set(cloud-user); groups with modify permissions: Set()
> 2018-04-25 09:39:23 INFO Utils:54 - Successfully started service 'sparkDriver' on port
59333.
> 2018-04-25 09:39:23 INFO SparkEnv:54 - Registering MapOutputTracker
> 2018-04-25 09:39:23 INFO SparkEnv:54 - Registering BlockManagerMaster
> 2018-04-25 09:39:23 INFO BlockManagerMasterEndpoint:54 - Using org.apache.spark.storage.DefaultTopologyMapper
for getting topology information
> 2018-04-25 09:39:23 INFO BlockManagerMasterEndpoint:54 - BlockManagerMasterEndpoint up
> 2018-04-25 09:39:23 INFO DiskBlockManager:54 - Created local directory at /tmp/blockmgr-6fc11fc1-f638-42ea-a9df-dc01fb81b7b6
> 2018-04-25 09:39:23 INFO MemoryStore:54 - MemoryStore started with capacity 366.3 MB
> 2018-04-25 09:39:23 INFO SparkEnv:54 - Registering OutputCommitCoordinator
> 2018-04-25 09:39:23 INFO log:192 - Logging initialized @1825ms
> 2018-04-25 09:39:23 INFO Server:346 - jetty-9.3.z-SNAPSHOT
> 2018-04-25 09:39:23 INFO Server:414 - Started @1900ms
> 2018-04-25 09:39:23 INFO AbstractConnector:278 - Started ServerConnector@6813a331{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
> 2018-04-25 09:39:23 INFO Utils:54 - Successfully started service 'SparkUI' on port 4040.
> 2018-04-25 09:39:23 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4f7c0be3{/jobs,null,AVAILABLE,@Spark}
> 2018-04-25 09:39:23 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4cfbaf4{/jobs/json,null,AVAILABLE,@Spark}
> 2018-04-25 09:39:23 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@58faa93b{/jobs/job,null,AVAILABLE,@Spark}
> 2018-04-25 09:39:23 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@127d7908{/jobs/job/json,null,AVAILABLE,@Spark}
> 2018-04-25 09:39:23 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@6b9c69a9{/stages,null,AVAILABLE,@Spark}
> 2018-04-25 09:39:23 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@6622a690{/stages/json,null,AVAILABLE,@Spark}
> 2018-04-25 09:39:23 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@30b9eadd{/stages/stage,null,AVAILABLE,@Spark}
> 2018-04-25 09:39:23 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3249a1ce{/stages/stage/json,null,AVAILABLE,@Spark}
> 2018-04-25 09:39:23 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4dd94a58{/stages/pool,null,AVAILABLE,@Spark}
> 2018-04-25 09:39:23 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@2f4919b0{/stages/pool/json,null,AVAILABLE,@Spark}
> 2018-04-25 09:39:23 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@a8a8b75{/storage,null,AVAILABLE,@Spark}
> 2018-04-25 09:39:23 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@75b21c3b{/storage/json,null,AVAILABLE,@Spark}
> 2018-04-25 09:39:23 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@72be135f{/storage/rdd,null,AVAILABLE,@Spark}
> 2018-04-25 09:39:23 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@155d1021{/storage/rdd/json,null,AVAILABLE,@Spark}
> 2018-04-25 09:39:23 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4bd2f0dc{/environment,null,AVAILABLE,@Spark}
> 2018-04-25 09:39:23 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@2e647e59{/environment/json,null,AVAILABLE,@Spark}
> 2018-04-25 09:39:23 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@2c42b421{/executors,null,AVAILABLE,@Spark}
> 2018-04-25 09:39:23 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@51e37590{/executors/json,null,AVAILABLE,@Spark}
> 2018-04-25 09:39:23 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@deb3b60{/executors/threadDump,null,AVAILABLE,@Spark}
> 2018-04-25 09:39:23 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@701a32{/executors/threadDump/json,null,AVAILABLE,@Spark}
> 2018-04-25 09:39:23 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@39aa45a1{/static,null,AVAILABLE,@Spark}
> 2018-04-25 09:39:23 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@294bdeb4{/,null,AVAILABLE,@Spark}
> 2018-04-25 09:39:23 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5300f14a{/api,null,AVAILABLE,@Spark}
> 2018-04-25 09:39:23 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@54acff7d{/jobs/job/kill,null,AVAILABLE,@Spark}
> 2018-04-25 09:39:23 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@7bc9e6ab{/stages/stage/kill,null,AVAILABLE,@Spark}
> 2018-04-25 09:39:23 INFO SparkUI:54 - Bound SparkUI to 0.0.0.0, and started at http://server-2:4040
> 2018-04-25 09:39:23 INFO SparkContext:54 - Added JAR file:///home/cloud-user/spark-2.3.0-bin-hadoop2.7/examples/jars/spark-examples_2.11-2.3.0.jar
at spark://server-2:59333/jars/spark-examples_2.11-2.3.0.jar with timestamp 1524663563504
> 2018-04-25 09:39:23 INFO SparkContext:54 - Added JAR file:///home/cloud-user/spark-2.3.0-bin-hadoop2.7/examples/jars/scopt_2.11-3.7.0.jar
at spark://server-2:59333/jars/scopt_2.11-3.7.0.jar with timestamp 1524663563505
> 2018-04-25 09:39:23 INFO Executor:54 - Starting executor ID driver on host localhost
> 2018-04-25 09:39:23 INFO Utils:54 - Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService'
on port 56363.
> 2018-04-25 09:39:23 INFO NettyBlockTransferService:54 - Server created on server-2:56363
> 2018-04-25 09:39:23 INFO BlockManager:54 - Using org.apache.spark.storage.RandomBlockReplicationPolicy
for block replication policy
> 2018-04-25 09:39:23 INFO BlockManagerMaster:54 - Registering BlockManager BlockManagerId(driver,
server-2, 56363, None)
> 2018-04-25 09:39:23 INFO BlockManagerMasterEndpoint:54 - Registering block manager server-2:56363
with 366.3 MB RAM, BlockManagerId(driver, server-2, 56363, None)
> 2018-04-25 09:39:23 INFO BlockManagerMaster:54 - Registered BlockManager BlockManagerId(driver,
server-2, 56363, None)
> 2018-04-25 09:39:23 INFO BlockManager:54 - Initialized BlockManager: BlockManagerId(driver,
server-2, 56363, None)
> 2018-04-25 09:39:23 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4e2916c3{/metrics/json,null,AVAILABLE,@Spark}
> Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/streaming/kafka010/LocationStrategies
> at org.apache.spark.examples.streaming.JavaDirectKafkaWordCount.main(JavaDirectKafkaWordCount.java:76)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
> at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:879)
> at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:197)
> at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:227)
> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:136)
> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> Caused by: java.lang.ClassNotFoundException: org.apache.spark.streaming.kafka010.LocationStrategies
> at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
> ... 11 more
> 2018-04-25 09:39:23 INFO SparkContext:54 - Invoking stop() from shutdown hook
> 2018-04-25 09:39:23 INFO AbstractConnector:318 - Stopped Spark@6813a331{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
> 2018-04-25 09:39:23 INFO SparkUI:54 - Stopped Spark web UI at http://server-2:4040
> 2018-04-25 09:39:23 INFO MapOutputTrackerMasterEndpoint:54 - MapOutputTrackerMasterEndpoint
stopped!
> 2018-04-25 09:39:23 INFO MemoryStore:54 - MemoryStore cleared
> 2018-04-25 09:39:23 INFO BlockManager:54 - BlockManager stopped
> 2018-04-25 09:39:23 INFO BlockManagerMaster:54 - BlockManagerMaster stopped
> 2018-04-25 09:39:23 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:54 -
OutputCommitCoordinator stopped!
> 2018-04-25 09:39:23 INFO SparkContext:54 - Successfully stopped SparkContext
> 2018-04-25 09:39:23 INFO ShutdownHookManager:54 - Shutdown hook called
> 2018-04-25 09:39:23 INFO ShutdownHookManager:54 - Deleting directory /tmp/spark-edc94694-ab74-4b66-9ef2-10d28b3f5359
> 2018-04-25 09:39:23 INFO ShutdownHookManager:54 - Deleting directory /tmp/spark-d57ca1de-b096-4036-ad4b-ed97295443c4
> {code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

