spark-dev mailing list archives

From "Mao, Wei" <>
Subject RE: Unable to compile and test Spark in IntelliJ
Date Wed, 27 Jan 2016 03:14:34 GMT
I used to hit the same compile error in IntelliJ, and resolved it by clicking:

View --> Tool Windows --> Maven Projects --> Spark Project Catalyst --> Plugins
--> antlr3, then remake project
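For those who prefer the terminal, the same generation step can be triggered with Maven instead of the IDE's plugin view. This is a sketch, assuming a Spark checkout of that era where the antlr3 plugin is bound to the `generate-sources` phase and Catalyst lives under `sql/catalyst`:

```shell
# Run code generation (including antlr3) for the Catalyst module and the
# modules it depends on; the generated parser sources then appear under
# sql/catalyst/target/generated-sources/antlr3 for the IDE to pick up.
build/mvn -pl sql/catalyst -am generate-sources
```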

William Mao

From: Iulian Dragoș []
Sent: Wednesday, January 27, 2016 12:12 AM
To: Hyukjin Kwon
Subject: Re: Unable to compile and test Spark in IntelliJ

On Tue, Jan 19, 2016 at 6:06 AM, Hyukjin Kwon <> wrote:
Hi all,

I have usually been working with Spark in IntelliJ.
Before the PR for `[SPARK-12575][SQL] Grammar parity with existing SQL parser`, I was able to
just open the project and then run some tests with the IntelliJ Run button.

However, it looks like that PR adds some ANTLR files for parsing, and I can no longer run the
tests as I did. So I ended up running `mvn compile` first and then running the tests.

I can still run the tests with sbt or Maven on the command line, but this is a bit inconvenient.
I just want to run the tests in IntelliJ as I did before.

I followed this several times, but it still emits exceptions such as:

Error:(779, 34) not found: value SparkSqlParser
    case ast if ast.tokenType == SparkSqlParser.TinyintLiteral =>

and I still have to run `mvn compile` or `mvn test` first for them.

Is there any good way to run some Spark tests within IntelliJ as I did before?

I'm using Eclipse, but all I had to do in order to build in the IDE was to add `target/generated-sources/antlr3`
to the project sources, after building once with sbt. You probably have the sources there already.
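An alternative to adding the directory to the IDE's source folders by hand is to let the build register it, so any IDE that imports the Maven model sees it automatically. A sketch (not something Spark's pom necessarily does) using the stock build-helper-maven-plugin:

```xml
<!-- Register the ANTLR output directory as an extra source root,
     right after the generate-sources phase has produced it. -->
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>build-helper-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>add-antlr-sources</id>
      <phase>generate-sources</phase>
      <goals>
        <goal>add-source</goal>
      </goals>
      <configuration>
        <sources>
          <source>${project.build.directory}/generated-sources/antlr3</source>
        </sources>
      </configuration>
    </execution>
  </executions>
</plugin>
```

With this in place, re-importing the Maven project should make `SparkSqlParser` and the other generated classes resolvable without manual IDE configuration.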




Iulian Dragos

Reactive Apps on the JVM<>
