spark-dev mailing list archives

From Cheng Lian <lian.cs....@gmail.com>
Subject Re: HiveShim not found when building in Intellij
Date Wed, 29 Oct 2014 05:19:53 GMT
Hm, some time ago the shim source folder used to be recognized automatically, 
although at the wrong directory level (sql/hive/v0.12.0/src instead of 
sql/hive/v0.12.0/src/main/scala), and it still compiled.

Just tried against a fresh checkout: indeed the shim source folder needs to 
be added manually. Sorry for the confusion.
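
For reference, the shim sources sit in per-version directories next to the 
regular source root, roughly like this (the v0.13.1 entry is by analogy, and 
I'm going from memory on the exact layout):

    sql/hive/src/main/scala/           <- shared sources
    sql/hive/v0.12.0/src/main/scala/   <- Hive 0.12.0 shim (defines HiveShim)
    sql/hive/v0.13.1/src/main/scala/   <- Hive 0.13.1 shim

so it is the .../v0.12.0/src/main/scala directory, not .../v0.12.0/src, that 
needs to be marked as a source root in IDEA.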

Cheng

On 10/29/14 1:05 PM, Patrick Wendell wrote:
> Cheng - to make it recognize the new HiveShim for 0.12 I had to click
> on spark-hive under "packages" in the left pane, then go to "Open
> Module Settings" - then explicitly add the v0.12.0/src/main/scala
> folder to the sources by navigating to it and then <ctrl>+click to add
> it as a source. Did you have to do this?
>
> On Tue, Oct 28, 2014 at 9:57 PM, Patrick Wendell <pwendell@gmail.com> wrote:
>> I just started a totally fresh IntelliJ project importing from our
>> root pom. I used all the default options and I added "hadoop-2.4,
>> hive, hive-0.13.1" profiles. I was able to run spark core tests from
>> within IntelliJ. Didn't try anything beyond that, but FWIW this
>> worked.
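>>
>> For comparison, roughly the same build from the command line would be
>> something like this (exact goals and flags from memory, so treat it as a
>> sketch):
>>
>>   mvn -Phadoop-2.4 -Phive -Phive-0.13.1 -DskipTests clean package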
>>
>> - Patrick
>>
>> On Tue, Oct 28, 2014 at 9:54 PM, Cheng Lian <lian.cs.zju@gmail.com> wrote:
>>> You may first open the root pom.xml file in IDEA, then go to the menu View
>>> / Tool Windows / Maven Projects, and choose the desired Maven profile
>>> combination under the "Profiles" node (e.g. I usually use hadoop-2.4 + hive
>>> + hive-0.12.0). IDEA will ask you to re-import the Maven projects; confirm,
>>> and then it should be OK.
>>>
>>> I can debug within IDEA with this approach. However, you have to clean the
>>> whole project before debugging Spark within IDEA if you compiled the project
>>> outside IDEA. I haven't had time to investigate this annoying issue.
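>>>
>>> Concretely, that means something like running "mvn clean" from the project
>>> root (or Build / Rebuild Project inside IDEA) before starting the debugger;
>>> I haven't pinned down which stale output actually causes the problem.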
>>>
>>> Also, you can remove sub-projects unrelated to your tasks to speed up
>>> compilation and/or avoid other IDEA build issues (e.g. the Avro-related
>>> Spark Streaming build failure in IDEA).
>>>
>>>
>>> On 10/29/14 12:42 PM, Stephen Boesch wrote:
>>>
>>> I am interested specifically in how to build (and hopefully run/debug)
>>> under IntelliJ. Your posts sound like command-line Maven, which has always
>>> been working already.
>>>
>>> Do you have instructions for building in IJ?
>>>
>>> 2014-10-28 21:38 GMT-07:00 Cheng Lian <lian.cs.zju@gmail.com>:
>>>> Yes, these two combinations work for me.
>>>>
>>>>
>>>> On 10/29/14 12:32 PM, Zhan Zhang wrote:
>>>>> -Phive alone enables hive-0.13.1, and "-Phive -Phive-0.12.0" enables
>>>>> hive-0.12.0. Note that the Thrift server is not supported yet with Hive
>>>>> 0.13, but that is expected to land upstream soon (SPARK-3720).
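>>>>>
>>>>> For example (roughly; adjust the other profiles and goals as needed):
>>>>>
>>>>>   mvn -Phive -DskipTests package                  # Hive 0.13.1
>>>>>   mvn -Phive -Phive-0.12.0 -DskipTests package    # Hive 0.12.0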
>>>>>
>>>>> Thanks.
>>>>>
>>>>> Zhan Zhang
>>>>>
>>>>>
>>>>>    On Oct 28, 2014, at 9:09 PM, Stephen Boesch <javadba@gmail.com> wrote:
>>>>>
>>>>>> Thanks Patrick for the heads up.
>>>>>>
>>>>>> I have not been successful in finding a combination of profiles (i.e.
>>>>>> enabling hive, hive-0.12.0, or hive-0.13.1) that works in IntelliJ with
>>>>>> Maven. Anyone who knows how to handle this - a quick note here would be
>>>>>> appreciated.
>>>>>>
>>>>>>
>>>>>>
>>>>>> 2014-10-28 20:20 GMT-07:00 Patrick Wendell <pwendell@gmail.com>:
>>>>>>
>>>>>>> Hey Stephen,
>>>>>>>
>>>>>>> In some cases in the Maven build we now have pluggable source
>>>>>>> directories based on profiles, using the Maven build helper plug-in.
>>>>>>> This is necessary to support cross-building against different Hive
>>>>>>> versions, and there will be additional instances of this due to
>>>>>>> supporting Scala 2.11 and 2.10.
>>>>>>>
>>>>>>> In these cases, you may need to add source locations explicitly to
>>>>>>> IntelliJ if you want the entire project to compile there.
>>>>>>>
>>>>>>> Unfortunately, as long as we support cross-building like this, it will
>>>>>>> be an issue. IntelliJ's Maven support does not correctly detect our
>>>>>>> use of the build helper plugin to add source directories.
>>>>>>>
>>>>>>> We should come up with a good set of instructions on how to import the
>>>>>>> pom files and add the few extra source directories. Offhand I am not
>>>>>>> sure exactly what the correct sequence is.
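>>>>>>>
>>>>>>> For reference, the configuration involved looks roughly like the
>>>>>>> following in sql/hive/pom.xml (a sketch from memory; execution ids,
>>>>>>> phases and profile wiring may differ in the actual pom):
>>>>>>>
>>>>>>>   <plugin>
>>>>>>>     <groupId>org.codehaus.mojo</groupId>
>>>>>>>     <artifactId>build-helper-maven-plugin</artifactId>
>>>>>>>     <executions>
>>>>>>>       <execution>
>>>>>>>         <id>add-shim-sources</id>
>>>>>>>         <phase>generate-sources</phase>
>>>>>>>         <goals><goal>add-source</goal></goals>
>>>>>>>         <configuration>
>>>>>>>           <sources>
>>>>>>>             <!-- v0.13.1/src/main/scala when hive-0.13.1 is active -->
>>>>>>>             <source>v0.12.0/src/main/scala</source>
>>>>>>>           </sources>
>>>>>>>         </configuration>
>>>>>>>       </execution>
>>>>>>>     </executions>
>>>>>>>   </plugin>
>>>>>>>
>>>>>>> IntelliJ does not apply this add-source execution when importing the
>>>>>>> poms, which is why the extra directory has to be added by hand.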
>>>>>>>
>>>>>>> - Patrick
>>>>>>>
>>>>>>> On Tue, Oct 28, 2014 at 7:57 PM, Stephen Boesch <javadba@gmail.com>
>>>>>>> wrote:
>>>>>>>> Hi Matei,
>>>>>>>>    Until my latest pull from upstream/master it had not been necessary
>>>>>>>> to add the hive profile: is it now?
>>>>>>>>
>>>>>>>> I am not using sbt gen-idea. The way to open in IntelliJ has been to
>>>>>>>> open the parent directory; IJ recognizes it as a Maven project.
>>>>>>>>
>>>>>>>> There are several steps to do surgery on the yarn-parent / yarn
>>>>>>>> projects, then do a full rebuild. That was working until one week ago.
>>>>>>>> IntelliJ/Maven is presently broken in two ways: (1) this Hive shim
>>>>>>>> issue (which may yet hopefully be a small/simple fix - let us see) and
>>>>>>>> (2) the "NoClassDefFoundError on ThreadFactoryBuilder" from my prior
>>>>>>>> emails, which is quite a serious problem.
>>>>>>>>
>>>>>>>> 2014-10-28 19:46 GMT-07:00 Matei Zaharia <matei.zaharia@gmail.com>:
>>>>>>>>
>>>>>>>>> Hi Stephen,
>>>>>>>>>
>>>>>>>>> How did you generate your Maven workspace? You need to make sure the
>>>>>>>>> Hive profile is enabled for it. For example, sbt/sbt -Phive gen-idea.
>>>>>>>>>
>>>>>>>>> Matei
>>>>>>>>>
>>>>>>>>>> On Oct 28, 2014, at 7:42 PM, Stephen Boesch <javadba@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> I have run on the command line via Maven and it is fine:
>>>>>>>>>>
>>>>>>>>>>   mvn -Dscalastyle.failOnViolation=false -DskipTests -Pyarn -Phadoop-2.3 compile package install
>>>>>>>>>>
>>>>>>>>>> But with the latest code IntelliJ builds do not work. Following is
>>>>>>>>>> one of 26 similar errors:
>>>>>>>>>>
>>>>>>>>>> Error:(173, 38) not found: value HiveShim
>>>>>>>>>>   Option(tableParameters.get(HiveShim.getStatsSetupConstTotalSize))
>>>>>>>>>>                              ^
>>>>>>>>>
>>>


---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
For additional commands, e-mail: dev-help@spark.apache.org

