beam-commits mailing list archives

From "liyuntian (JIRA)" <>
Subject [jira] [Commented] (BEAM-375) HadoopIO and runners-spark conflict with hadoop.version
Date Thu, 13 Apr 2017 12:40:41 GMT


liyuntian commented on BEAM-375:

Must HdfsIO use Hadoop 2.7.0 or above? We use Hadoop 2.6.0 in our system, but execution blocks
when I run "" with the Spark runner. If I change the Hadoop version to 2.7.0, it
runs well with the Spark runner. So I must use Hadoop 2.7.0, correct?

> HadoopIO and runners-spark conflict with hadoop.version
> -------------------------------------------------------
>                 Key: BEAM-375
>                 URL:
>             Project: Beam
>          Issue Type: Bug
>          Components: sdk-java-extensions
>            Reporter: Pei He
>            Assignee: Pei He
> HadoopIO currently uses 2.7.0 and runners-spark uses 2.2.0 for hadoop-client, hadoop-common.
> From [~amitsela]
> "Spark can be built against different hadoop versions, but the release in maven central
is a 2.2.0 build (latest). ''
> For HadoopIO, I don't know why 2.7.0 is picked at the beginning. I can check if it will
work with 2.2.0.
> I am creating this issue, since I think it there is a general question.
> In principle, HadoopIO and other sdks Sources should work with any runners. But, when
one set of runners require version A, but the other set of runners require version B, we will
need a general solution for it.
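
A common Maven approach to this kind of conflict is to expose the Hadoop version as a build property and pin the conflicting artifacts in `dependencyManagement`, so a single version wins across all modules and users can override it from the command line. The fragment below is only an illustrative sketch, not taken from the actual Beam build; the property name and versions are assumptions.

```xml
<!-- Hypothetical pom.xml fragment: force one hadoop-client / hadoop-common
     version across modules such as HadoopIO and runners-spark.
     The property name and 2.7.0 default are illustrative assumptions. -->
<properties>
  <hadoop.version>2.7.0</hadoop.version>
</properties>

<dependencyManagement>
  <dependencies>
    <!-- All transitive references to these artifacts resolve to ${hadoop.version} -->
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-client</artifactId>
      <version>${hadoop.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-common</artifactId>
      <version>${hadoop.version}</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

With a setup like this, a user on an older cluster could build with `mvn package -Dhadoop.version=2.6.0`, and `mvn dependency:tree -Dincludes=org.apache.hadoop` shows which Hadoop version actually gets resolved.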

This message was sent by Atlassian JIRA
