spark-user mailing list archives

From Ted Yu <yuzhih...@gmail.com>
Subject Re: No logs from my cluster / worker ... (running DSE 4.6.1)
Date Mon, 04 May 2015 20:07:29 GMT
Looks like you're using Spark 1.1.0.

Support for Kafka 0.8.2 was added by
https://issues.apache.org/jira/browse/SPARK-2808

which is slated for Spark 1.4.0.

FYI
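A note on where those logs end up: with --deploy-mode cluster on a standalone master, the driver itself runs on one of the workers, so its stdout/stderr are written under that worker's work directory (SPARK_WORKER_DIR, default $SPARK_HOME/work) in a driver-<timestamp>-<seq> subdirectory, with executor logs in app-* directories alongside it. The sketch below builds a mock work directory purely to show the layout to search; the actual location on a DSE 4.6 node may differ (an assumption - check spark-env.sh there).

```shell
# Mock standalone-worker work directory (stand-in for $SPARK_HOME/work)
WORK_DIR=$(mktemp -d)/work
mkdir -p "$WORK_DIR/driver-20150504200729-0000"
touch "$WORK_DIR/driver-20150504200729-0000/stdout" \
      "$WORK_DIR/driver-20150504200729-0000/stderr"
# On each worker, the driver's logs can be located like this:
find "$WORK_DIR" -path '*driver-*' -name 'std*'
```

Running that find on each worker should turn up the driver's stdout/stderr, including any exception that killed it before your own logging ever started.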

On Mon, May 4, 2015 at 12:22 PM, Eric Ho <eric.ho@intel.com> wrote:

> I'm submitting this via 'dse spark-submit', but somehow I don't see any
> logging on my cluster or worker machines...
>
> How can I find out where the logs went?
>
> My cluster is running DSE 4.6.1 with Spark enabled.
> My source is running Kafka 0.8.2.0
>
> I'm launching my program on one of my DSE machines.
>
> Any insights much appreciated.
>
> Thanks.
>
> ---------
> cas1.dev% dse spark-submit --verbose --deploy-mode cluster \
>     --master spark://cas1.dev.kno.com:7077 \
>     --class com.kno.highlights.counting.service.HighlightConsumer \
>     --driver-class-path /opt/kno/kno-highlights-counting-service/kno-highlights-counting-service-0.1/lib \
>     --driver-library-path /opt/kno/kno-highlights-counting-service/kno-highlights-counting-service-0.1/lib \
>     --properties-file /tmp/highlights-counting.properties \
>     /opt/kno/kno-highlights-counting-service/kno-highlights-counting-service-0.1/lib/kno-highlights-counting-service.kno-highlights-counting-service-0.1.jar \
>     --name HighlightConsumer
> Using properties file: /tmp/highlights-counting.properties
> Warning: Ignoring non-spark config property: checkpoint_directory=checkpointForHighlights
> Warning: Ignoring non-spark config property: zookeeper_port=2181
> Warning: Ignoring non-spark config property: default_num_of_cores_per_topic=1
> Warning: Ignoring non-spark config property: num_of_concurrent_streams=2
> Warning: Ignoring non-spark config property: kafka_consumer_group=highlight_consumer_group
> Warning: Ignoring non-spark config property: app_name=HighlightConsumer
> Warning: Ignoring non-spark config property: cassandra_keyspace=bookevents
> Warning: Ignoring non-spark config property: scheduler_mode=FIFO
> Warning: Ignoring non-spark config property: highlight_topic=highlight_topic
> Warning: Ignoring non-spark config property: cassandra_host=cas1.dev.kno.com
> Warning: Ignoring non-spark config property: checkpoint_interval=3
> Warning: Ignoring non-spark config property: zookeeper_host=cas1.dev.kno.com
> Adding default property: spark_master=spark://cas1.dev.kno.com:7077
> Warning: Ignoring non-spark config property: streaming_window=10
> Using properties file: /tmp/highlights-counting.properties
> Warning: Ignoring non-spark config property: checkpoint_directory=checkpointForHighlights
> Warning: Ignoring non-spark config property: zookeeper_port=2181
> Warning: Ignoring non-spark config property: default_num_of_cores_per_topic=1
> Warning: Ignoring non-spark config property: num_of_concurrent_streams=2
> Warning: Ignoring non-spark config property: kafka_consumer_group=highlight_consumer_group
> Warning: Ignoring non-spark config property: app_name=HighlightConsumer
> Warning: Ignoring non-spark config property: cassandra_keyspace=bookevents
> Warning: Ignoring non-spark config property: scheduler_mode=FIFO
> Warning: Ignoring non-spark config property: highlight_topic=highlight_topic
> Warning: Ignoring non-spark config property: cassandra_host=cas1.dev.kno.com
> Warning: Ignoring non-spark config property: checkpoint_interval=3
> Warning: Ignoring non-spark config property: zookeeper_host=cas1.dev.kno.com
> Adding default property: spark_master=spark://cas1.dev.kno.com:7077
> Warning: Ignoring non-spark config property: streaming_window=10
> Parsed arguments:
>   master                  spark://cas1.dev.kno.com:7077
>   deployMode              cluster
>   executorMemory          null
>   executorCores           null
>   totalExecutorCores      null
>   propertiesFile          /tmp/highlights-counting.properties
>   extraSparkProperties    Map()
>   driverMemory            null
>   driverCores             null
>   driverExtraClassPath    /opt/kno/kno-highlights-counting-service/kno-highlights-counting-service-0.1/lib
>   driverExtraLibraryPath  /opt/kno/kno-highlights-counting-service/kno-highlights-counting-service-0.1/lib
>   driverExtraJavaOptions  null
>   supervise               false
>   queue                   null
>   numExecutors            null
>   files                   null
>   pyFiles                 null
>   archives                null
>   mainClass               com.kno.highlights.counting.service.HighlightConsumer
>   primaryResource         file:/opt/kno/kno-highlights-counting-service/kno-highlights-counting-service-0.1/lib/kno-highlights-counting-service.kno-highlights-counting-service-0.1.jar
>   name                    com.kno.highlights.counting.service.HighlightConsumer
>   childArgs               [--name HighlightConsumer]
>   jars                    null
>   verbose                 true
>
> Default properties from /tmp/highlights-counting.properties:
>   spark_master -> spark://cas1.dev.kno.com:7077
>
>
> Using properties file: /tmp/highlights-counting.properties
> Warning: Ignoring non-spark config property: checkpoint_directory=checkpointForHighlights
> Warning: Ignoring non-spark config property: zookeeper_port=2181
> Warning: Ignoring non-spark config property: default_num_of_cores_per_topic=1
> Warning: Ignoring non-spark config property: num_of_concurrent_streams=2
> Warning: Ignoring non-spark config property: kafka_consumer_group=highlight_consumer_group
> Warning: Ignoring non-spark config property: app_name=HighlightConsumer
> Warning: Ignoring non-spark config property: cassandra_keyspace=bookevents
> Warning: Ignoring non-spark config property: scheduler_mode=FIFO
> Warning: Ignoring non-spark config property: highlight_topic=highlight_topic
> Warning: Ignoring non-spark config property: cassandra_host=cas1.dev.kno.com
> Warning: Ignoring non-spark config property: checkpoint_interval=3
> Warning: Ignoring non-spark config property: zookeeper_host=cas1.dev.kno.com
> Adding default property: spark_master=spark://cas1.dev.kno.com:7077
> Warning: Ignoring non-spark config property: streaming_window=10
> Main class:
> org.apache.spark.deploy.Client
> Arguments:
> launch
> spark://cas1.dev.kno.com:7077
> file:/opt/kno/kno-highlights-counting-service/kno-highlights-counting-service-0.1/lib/kno-highlights-counting-service.kno-highlights-counting-service-0.1.jar
> com.kno.highlights.counting.service.HighlightConsumer
> --name
> HighlightConsumer
> System properties:
> spark_master -> spark://cas1.dev.kno.com:7077
> spark.driver.extraLibraryPath -> /opt/kno/kno-highlights-counting-service/kno-highlights-counting-service-0.1/lib
> SPARK_SUBMIT -> true
> spark.app.name -> com.kno.highlights.counting.service.HighlightConsumer
> spark.jars -> file:/opt/kno/kno-highlights-counting-service/kno-highlights-counting-service-0.1/lib/kno-highlights-counting-service.kno-highlights-counting-service-0.1.jar
> spark.master -> spark://cas1.dev.kno.com:7077
> spark.driver.extraClassPath -> /opt/kno/kno-highlights-counting-service/kno-highlights-counting-service-0.1/lib
> Classpath elements:
>
> Sending launch command to spark://cas1.dev.kno.com:7077
>
> ------------
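Two things stand out in the verbose output above. First, every "Ignoring non-spark config property" warning is spark-submit saying that --properties-file only forwards keys beginning with spark.; the application settings (zookeeper_host, highlight_topic, and so on) never reach the driver that way and have to be read by the application itself (e.g. via the Typesafe config dependency already in build.sbt). Second, childArgs [--name HighlightConsumer] shows that --name was placed after the application jar, so it was handed to HighlightConsumer as a program argument instead of being an option to spark-submit. A minimal illustration of the key filtering, with a throwaway file and keys standing in for the real properties file:

```shell
# Stand-in properties file mirroring the mix of keys seen above
cat > /tmp/highlights-demo.properties <<'EOF'
spark.master=spark://cas1.dev.kno.com:7077
zookeeper_port=2181
app_name=HighlightConsumer
EOF
# Only spark.*-prefixed keys survive; the rest trigger the warnings above
grep '^spark\.' /tmp/highlights-demo.properties
```

Moving --name in front of the jar path would make it a spark-submit option again.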
>
> Here's my build.sbt:
> ---------
> import AssemblyKeys._
> import NativePackagerHelper._
>
> assemblySettings
>
> jarName in assembly := "kno-highlights-counting-service.jar"
>
> name := "kno-highlights-counting-service"
>
> version := "0.1"
>
> scalaVersion := "2.10.4"
>
> exportJars := true
>
> enablePlugins(JavaServerAppPackaging)
>
> resolvers ++= Seq(
>   "spray repo" at "http://repo.spray.io/",
>   "Typesafe Repo" at "http://repo.typesafe.com/typesafe/releases/"
> )
>
> packageOptions in (Compile, packageBin) +=
>   Package.ManifestAttributes( java.util.jar.Attributes.Name.SEALED -> "true" )
>
> libraryDependencies ++= {
>   Seq(
>     // "org.apache.kafka" % "kafka_2.10" % "0.8.0" exclude("org.jboss.netty", "netty"),
>     "org.apache.spark" % "spark-core_2.10" % "1.1.0" exclude("org.jboss.netty", "netty"),
>     "org.apache.spark" % "spark-streaming_2.10" % "1.1.0" exclude("org.jboss.netty", "netty"),
>     "org.apache.spark" % "spark-streaming-kafka_2.10" % "1.1.0" exclude("org.jboss.netty", "netty"),
>     "com.datastax.spark" %% "spark-cassandra-connector" % "1.1.0" exclude("org.jboss.netty", "netty"),
>     "commons-io" % "commons-io" % "2.4",
>     "org.apache.commons" % "commons-pool2" % "2.3",
>     "ch.qos.logback" % "logback-classic" % "1.1.2",
>     "io.spray" %% "spray-json" % "1.3.1",
>     "com.typesafe" % "config" % "1.2.1"
>   )
> }
>
> seq(Revolver.settings: _*)
>
> atmosSettings
> -------
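On Ted's point about versions: spark-streaming-kafka_2.10 1.1.0 transitively pulls in a Kafka 0.8.0-line client (the commented-out kafka_2.10 0.8.0 line above matches that), while the brokers here run 0.8.2.0. One way to confirm which client sbt actually resolved is to look at the jar names in the Ivy cache; the sketch below uses a mock cache directory so the check itself is runnable as shown (the real location is ~/.ivy2/cache, and the resolved version is worth double-checking with sbt's own dependency report):

```shell
# Mock Ivy cache layout (stand-in for ~/.ivy2/cache)
CACHE=$(mktemp -d)/cache/org.apache.kafka/kafka_2.10/jars
mkdir -p "$CACHE"
touch "$CACHE/kafka_2.10-0.8.0.jar"   # pretend sbt resolved this client
# The resolved Kafka client version is visible in the jar file name:
find "$CACHE" -name 'kafka_*.jar'
```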
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/No-logs-from-my-cluster-worker-running-DSE-4-6-1-tp22759.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
> For additional commands, e-mail: user-help@spark.apache.org
>
>
