hive-dev mailing list archives

From "Xuefu Zhang (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (HIVE-8836) Enable automatic tests with remote spark client [Spark Branch]
Date Wed, 26 Nov 2014 14:57:13 GMT

    [ https://issues.apache.org/jira/browse/HIVE-8836?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14226270#comment-14226270 ]

Xuefu Zhang commented on HIVE-8836:
-----------------------------------

[~ruili], I think the number of reducers changed because of the cluster changes. Previously
the plan was generated on one node with 4 cores (local[4]); now the cluster has 2 nodes with
one core each, and the memory configuration is also different. I suspect it's hard to tweak
the cluster configuration so that it produces the same number of reducers.
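
For reference, the difference boils down to the Spark master setting. The exact place our test
configuration sets this may differ, so treat the following as a sketch:

    -- before: one local node with 4 cores
    set spark.master=local[4];
    -- now: 2 workers, 1 core and 2048 MB each
    set spark.master=local-cluster[2,1,2048];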

For now, I think we have to go through the list and analyze the failures one by one. It's a long
list, so maybe it can be divided among people so that each person only takes a slice of it.

Briefly checking the results, it seems the failures are caused by one of the following:
1. The reducer number changed, which is okay.
2. The result differs. It could just be a matter of ordering, but it could also be a genuinely different result (see the note after this list).
3. The test failed to run.
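
For the ordering-only diffs, if I remember correctly the qfile framework supports a directive
that sorts the output before comparison, so a query without an explicit ORDER BY could be
marked along these lines (illustrative only):

    -- SORT_QUERY_RESULTS
    select key, value from src;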

I noticed that we are using local-cluster[2,1,2048]. Maybe we should use a more general case
where each node has more than one core. Also, we may need to adjust the memory settings. Once
we have a representative small cluster, we will probably stay with it for some time.
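
For example, something like the following might be more representative; the numbers here are
only a starting point, not a recommendation, and would need tuning:

    -- 2 workers, 2 cores each, 1024 MB per worker
    set spark.master=local-cluster[2,2,1024];
    -- keep executor memory within each worker's allocation
    set spark.executor.memory=512m;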




> Enable automatic tests with remote spark client [Spark Branch]
> --------------------------------------------------------------
>
>                 Key: HIVE-8836
>                 URL: https://issues.apache.org/jira/browse/HIVE-8836
>             Project: Hive
>          Issue Type: Sub-task
>          Components: Spark
>    Affects Versions: spark-branch
>            Reporter: Chengxiang Li
>            Assignee: Rui Li
>              Labels: Spark-M3
>             Fix For: spark-branch
>
>         Attachments: HIVE-8836.1-spark.patch, HIVE-8836.2-spark.patch, HIVE-8836.3-spark.patch,
> HIVE-8836.4-spark.patch, HIVE-8836.5-spark.patch, HIVE-8836.6-spark.patch, HIVE-8836.6-spark.patch,
> HIVE-8836.6-spark.patch
>
>
> In a real production environment, the remote spark client will mostly be used to submit Spark
> jobs for Hive, so we should enable automatic tests with the remote spark client to make sure
> Hive features work with it.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
