spark-user mailing list archives

From Ashok Kumar <>
Subject Re: Basic question on using one's own classes in the Scala app
Date Sun, 05 Jun 2016 19:01:25 GMT
Hello,

For 1, I read the doc as:

libraryDependencies += groupID % artifactID % revision

Now I have added this
libraryDependencies += "com.databricks" %  "apps.twitter_classifier"

However, I am getting an error:
error: No implicit for Append.Value[Seq[sbt.ModuleID], sbt.impl.GroupArtifactID] found,
  so sbt.impl.GroupArtifactID cannot be appended to Seq[sbt.ModuleID]
libraryDependencies += "com.databricks" %  "apps.twitter_classifier"
[error] Type error in expression
Project loading failed: (r)etry, (q)uit, (l)ast, or (i)gnore?
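[A hedged editor's note: sbt's += on libraryDependencies needs a complete ModuleID, i.e. all three of groupID % artifactID % revision. With only two parts the expression is a GroupArtifactID, which is exactly what the error says cannot be appended to Seq[sbt.ModuleID]. A sketch of a complete declaration, assuming the assembly's 0.1-SNAPSHOT version is the revision and the artifact has been published to a resolvable repository:]

```scala
// Hypothetical: group/artifact/revision are guesses based on the jar name;
// the fix being illustrated is supplying all three parts to %.
libraryDependencies += "com.databricks" % "apps.twitter_classifier" % "0.1-SNAPSHOT"
```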

Any ideas very much appreciated.
Thank you


    On Sunday, 5 June 2016, 17:39, Ted Yu <> wrote:

For #1, please find examples on the net, e.g.

For #2,
import <package-name>.getCheckpointDirectory
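[Editor's note, a self-contained sketch: spelled out against the package path visible in the jar listing quoted below (com/databricks/apps/twitter_classifier/getCheckpointDirectory.class), the import would look like this. The class body here is a stand-in for the real one in the assembly jar, so the snippet compiles on its own:]

```scala
// Stand-in for the class inside utilities-assembly-0.1-SNAPSHOT.jar;
// the package name is inferred from the directory path in the jar listing.
package com.databricks.apps.twitter_classifier {
  class getCheckpointDirectory {
    def CheckpointDirectory(ProgramName: String): String =
      "hdfs://host:9000/user/user/checkpoint/" + ProgramName
  }
}

package myapp {
  // This is the line the calling app needs at the top of its source file.
  import com.databricks.apps.twitter_classifier.getCheckpointDirectory

  object Demo {
    def main(args: Array[String]): Unit = {
      val hdfsDir = new getCheckpointDirectory().CheckpointDirectory("Demo")
      println(hdfsDir) // hdfs://host:9000/user/user/checkpoint/Demo
    }
  }
}
```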
On Sun, Jun 5, 2016 at 8:36 AM, Ashok Kumar <> wrote:

Thank you sir.
At compile time can I do something similar to
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.1"

I have these
name := "scala"
version := "1.0"
scalaVersion := "2.10.4"
And if I look at the jar file I have:

jar tvf utilities-assembly-0.1-SNAPSHOT.jar|grep Check
  1180 Sun Jun 05 10:14:36 BST 2016 com/databricks/apps/twitter_classifier/getCheckpointDirectory.class
  1043 Sun Jun 05 10:14:36 BST 2016 getCheckpointDirectory.class
  1216 Fri Sep 18 09:12:40 BST 2015 scala/collection/parallel/ParIterableLike$StrictSplitterCheckTask$class.class
   615 Fri Sep 18 09:12:40 BST 2015 scala/collection/parallel/ParIterableLike$StrictSplitterCheckTask.class
Two questions please:
1. What do I need to put in the libraryDependencies line?
2. What do I need to add to the top of the Scala app, like

import org.apache.log4j.Logger
import org.apache.log4j.Level
import ?


    On Sunday, 5 June 2016, 15:21, Ted Yu <> wrote:

At compilation time, you need to declare the dependency on getCheckpointDirectory.
At runtime, you can use '--jars utilities-assembly-0.1-SNAPSHOT.jar' to pass the jar.
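[Editor's note, a sketch of the runtime side: the application jar name myapp.jar and main class MyApp below are hypothetical placeholders; only the --jars flag is from the message above.]

```shell
# Ship the utilities assembly to the driver and executors at runtime.
# Note --jars affects runtime only; it does not help at compile time.
spark-submit \
  --class MyApp \
  --jars utilities-assembly-0.1-SNAPSHOT.jar \
  myapp.jar
```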
On Sun, Jun 5, 2016 at 3:06 AM, Ashok Kumar <> wrote:

Hi all,
Appreciate any advice on this. It is about Scala.
I have created a very basic Utilities.scala that contains a test class and method. I intend to add my own classes and methods as I expand, and to make references to these classes and methods in my other apps.
class getCheckpointDirectory {
  def CheckpointDirectory(ProgramName: String): String = {
    var hdfsDir = "hdfs://host:9000/user/user/checkpoint/" + ProgramName
    return hdfsDir
  }
}

I have used sbt to create a jar file for it.

Now I want to make a call to that method CheckpointDirectory in my app code myapp.scala to return the value for hdfsDir:
val ProgramName = this.getClass.getSimpleName.trim
val getCheckpointDirectory = new getCheckpointDirectory
val hdfsDir = getCheckpointDirectory.CheckpointDirectory(ProgramName)
However, I am getting a compilation error, as expected:

not found: type getCheckpointDirectory
[error]     val getCheckpointDirectory =  new getCheckpointDirectory
[error]                                       ^
[error] one error found
[error] (compile:compileIncremental) Compilation failed
So a basic question: in order for compilation to work, do I need to create a package for my jar file, or add a dependency like the following, as I do in sbt?

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.1"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.5.1"
libraryDependencies += "org.apache.spark" %% "spark-hive" % "1.5.1"
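[Editor's note, a hedged aside: besides a libraryDependencies entry, sbt also picks up unmanaged jars placed in the project's lib/ directory, which is often the simplest way to compile against a locally built assembly. A sketch, assuming the default sbt project layout; the source path of the jar is a placeholder:]

```shell
# sbt adds every jar under lib/ to the compile classpath automatically,
# with no libraryDependencies entry needed for it.
mkdir -p lib
cp /path/to/utilities-assembly-0.1-SNAPSHOT.jar lib/
```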

Any advice will be appreciated.

