bahir-reviews mailing list archives

From emlaver <...@git.apache.org>
Subject [GitHub] bahir pull request #59: [BAHIR-138] fix deprecated warnings in sql-cloudant
Date Fri, 15 Dec 2017 18:10:13 GMT
GitHub user emlaver opened a pull request:

    https://github.com/apache/bahir/pull/59

    [BAHIR-138] fix deprecated warnings in sql-cloudant

    _What_
    Fix deprecation warnings in the `DefaultSource` class and in the `CloudantStreaming` and `CloudantStreamingSelector` examples.
    
    _How_
    - Imported `spark.implicits._` to convert a Spark RDD to a Dataset
    - Replaced the deprecated `json(RDD[String])` with `json(Dataset[String])` (see the sketch below)
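
    A minimal, self-contained sketch of the `json(Dataset[String])` change (the object name and sample data are illustrative, not the actual sql-cloudant code):

        // Sketch: replacing the deprecated json(RDD[String]) with json(Dataset[String]).
        import org.apache.spark.sql.{Dataset, SparkSession}

        object JsonDatasetSketch {
          def main(args: Array[String]): Unit = {
            val spark = SparkSession.builder()
              .appName("json-dataset-sketch")
              .master("local[*]")
              .getOrCreate()

            // spark.implicits._ supplies the Encoder[String] needed by toDS()
            import spark.implicits._

            val lines = Seq("""{"product":"widget","amount":1}""",
                            """{"product":"gadget","amount":2}""")
            val rdd = spark.sparkContext.parallelize(lines)

            // Deprecated: spark.read.json(rdd)
            // Preferred: convert the RDD[String] to a Dataset[String] first
            val ds: Dataset[String] = rdd.toDS()
            val df = spark.read.json(ds)
            df.show()

            spark.stop()
          }
        }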
    
    Improved streaming examples:
    - Replaced the deprecated `registerTempTable` with the preferred `createOrReplaceTempView`
    - Replaced `!isEmpty` with `nonEmpty` (both changes are shown in the sketch after this list)
    - Used an accessible 'sales' database so users can run the example without any setup
    - Fixed the error message seen when stopping the tests by adding logic to the streaming receiver so that it does not store documents in Spark memory after the stream has stopped (see the receiver sketch at the end of this description)
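
    A minimal sketch of the first two changes, using a generic queue-backed stream in place of the Cloudant receiver (the stream source, view name, and query are illustrative):

        // Sketch: createOrReplaceTempView instead of the deprecated registerTempTable,
        // and a nonEmpty check instead of !isEmpty.
        import scala.collection.mutable
        import org.apache.spark.rdd.RDD
        import org.apache.spark.sql.SparkSession
        import org.apache.spark.streaming.{Seconds, StreamingContext}

        object StreamingViewSketch {
          def main(args: Array[String]): Unit = {
            val spark = SparkSession.builder()
              .appName("streaming-view-sketch")
              .master("local[*]")
              .getOrCreate()
            import spark.implicits._

            val ssc = new StreamingContext(spark.sparkContext, Seconds(5))

            // Stand-in for the Cloudant changes feed: a queue of JSON documents
            val queue = mutable.Queue[RDD[String]]()
            queue += spark.sparkContext.parallelize(Seq("""{"product":"widget","amount":1}"""))
            val changes = ssc.queueStream(queue)

            changes.foreachRDD { rdd =>
              // nonEmpty replaces the old !isEmpty check
              if (rdd.take(1).nonEmpty) {
                val df = spark.read.json(rdd.toDS())
                // createOrReplaceTempView replaces the deprecated registerTempTable
                df.createOrReplaceTempView("sales")
                spark.sql("SELECT product, amount FROM sales").show()
              }
            }

            ssc.start()
            ssc.awaitTerminationOrTimeout(15000)
            ssc.stop(stopSparkContext = true, stopGracefully = false)
          }
        }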
    
    See [BAHIR-138](https://issues.apache.org/jira/browse/BAHIR-138)
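
    One way to implement the receiver guard described above is to check `isStopped()` before calling `store()`; a minimal sketch with a hypothetical receiver class (not the actual sql-cloudant `CloudantReceiver`):

        // Sketch: stop pushing documents into Spark memory once the stream has stopped.
        // ChangesFeedReceiver is a hypothetical stand-in for illustration only.
        import org.apache.spark.storage.StorageLevel
        import org.apache.spark.streaming.receiver.Receiver

        class ChangesFeedReceiver(docs: Seq[String])
          extends Receiver[String](StorageLevel.MEMORY_ONLY) {

          override def onStart(): Unit = {
            new Thread("changes-feed-receiver") {
              override def run(): Unit = receive()
            }.start()
          }

          override def onStop(): Unit = { /* nothing to release in this sketch */ }

          private def receive(): Unit = {
            val it = docs.iterator
            // Guard store() with isStopped() so no documents are stored after the stream stops
            while (!isStopped() && it.hasNext) {
              store(it.next())
            }
          }
        }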

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/emlaver/bahir 138-fix-deprecated-warnings-sql-cloudant

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/bahir/pull/59.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #59
    
----
commit 88e239c8ab1a24ef39ca969cd11c1834ccaa1931
Author: Esteban Laver <emlaver@us.ibm.com>
Date:   2017-10-02T20:18:40Z

    Fix deprecation warning messages
    Streaming example fixes:
    - Replaced registerTempTable with preferred createOrReplaceTempView
    - Replaced !isEmpty with nonEmpty

commit 19a0e15328c6b3026320b4cf6ff9a9255cc1ac9c
Author: Esteban Laver <emlaver@us.ibm.com>
Date:   2017-12-15T17:44:50Z

    Improved streaming tests by using an accessible 'sales' database
    - Added logic to streaming receiver to not store doc in Spark memory when stream has stopped

----


---
