flink-issues mailing list archives

From steveloughran <...@git.apache.org>
Subject [GitHub] flink issue #4926: [FLINK-7951] Load YarnConfiguration with default Hadoop c...
Date Thu, 02 Nov 2017 11:42:00 GMT
Github user steveloughran commented on the issue:

    Creating `YarnConfiguration` and `HdfsConfiguration` through some dynamic classloading is
enough to force those files and configs in underneath your own `Configuration`s. You shouldn't
be reading in all their values and sending them over the wire with your job; send only the
values set directly on your config, which you can enumerate straight off the `Configuration`
(which is `Writable`, BTW). I have some Scala code to send it around
[using Java serialization](https://github.com/hortonworks-spark/cloud-integration/blob/master/spark-cloud-integration/src/main/scala/com/hortonworks/spark/cloud/utils/ConfigSerDeser.scala)
if that helps.
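
    A minimal Java sketch of the ideas above (not taken from the linked `ConfigSerDeser`, and assuming `hadoop-common` and `hadoop-yarn-api` are on the classpath): dynamic classloading to register the YARN default resources, then round-tripping a `Configuration` through its `Writable` methods. The class and key names here are illustrative only.

    ```java
    import org.apache.hadoop.conf.Configuration;

    import java.io.ByteArrayInputStream;
    import java.io.ByteArrayOutputStream;
    import java.io.DataInputStream;
    import java.io.DataOutputStream;
    import java.io.IOException;
    import java.util.Map;

    public class ConfRoundTrip {

        // Serialize via Configuration.write() -- Configuration implements Writable.
        static byte[] toBytes(Configuration conf) throws IOException {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            try (DataOutputStream out = new DataOutputStream(bos)) {
                conf.write(out);
            }
            return bos.toByteArray();
        }

        // Deserialize into a Configuration with no default resources loaded,
        // so only the transmitted values are present.
        static Configuration fromBytes(byte[] bytes) throws IOException {
            Configuration conf = new Configuration(false);
            try (DataInputStream in =
                     new DataInputStream(new ByteArrayInputStream(bytes))) {
                conf.readFields(in);
            }
            return conf;
        }

        public static void main(String[] args) throws Exception {
            // Loading YarnConfiguration is enough to register yarn-site.xml etc.
            // as default resources for subsequently created Configurations.
            Class.forName("org.apache.hadoop.yarn.conf.YarnConfiguration");

            Configuration conf = new Configuration();
            conf.set("example.key", "example.value");  // hypothetical key

            // Configuration is Iterable<Map.Entry<String, String>>, so you can
            // enumerate the effective values directly off it.
            for (Map.Entry<String, String> e : conf) {
                if (e.getKey().startsWith("example.")) {
                    System.out.println(e.getKey() + "=" + e.getValue());
                }
            }

            Configuration copy = fromBytes(toBytes(conf));
            System.out.println(copy.get("example.key"));
        }
    }
    ```

    The `new Configuration(false)` on the receiving side matters: it stops the deserialized copy from re-merging whatever default XML resources happen to be on that JVM's classpath.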

