hudi-commits mailing list archives

From GitBox <...@apache.org>
Subject [GitHub] [incubator-hudi] OpenOpened commented on a change in pull request #1200: [HUDI-514] A schema provider to get metadata through Jdbc
Date Wed, 12 Feb 2020 06:06:57 GMT
OpenOpened commented on a change in pull request #1200: [HUDI-514] A schema provider to get metadata through Jdbc
URL: https://github.com/apache/incubator-hudi/pull/1200#discussion_r378055288
 
 

 ##########
 File path: hudi-utilities/src/main/java/org/apache/hudi/utilities/UtilHelpers.java
 ##########
 @@ -235,4 +248,57 @@ public static TypedProperties readConfig(InputStream in) throws IOException {
     defaults.load(in);
     return defaults;
   }
+
+  /**
+   * Calls Spark's JDBC utilities to fetch the table schema through JDBC.
+   * The implementation follows the corresponding logic in Spark 2.4.x and Spark 3.x.
+   * @param options JDBC options (url, table, connection properties, etc.)
+   * @return the Avro schema derived from the JDBC table schema
+   * @throws Exception if the connection or schema lookup fails
+   */
+  public static Schema getJDBCSchema(Map<String, String> options) throws Exception {
+    scala.collection.immutable.Map<String, String> ioptions = toScalaImmutableMap(options);
+    JDBCOptions jdbcOptions = new JDBCOptions(ioptions);
+    Connection conn = JdbcUtils.createConnectionFactory(jdbcOptions).apply();
+    String url = jdbcOptions.url();
+    String table = jdbcOptions.tableOrQuery();
+    JdbcOptionsInWrite jdbcOptionsInWrite = new JdbcOptionsInWrite(ioptions);
+    boolean tableExists = JdbcUtils.tableExists(conn, jdbcOptionsInWrite);
+
+    if (tableExists) {
+      JdbcDialect dialect = JdbcDialects.get(url);
+      try (PreparedStatement statement = conn.prepareStatement(dialect.getSchemaQuery(table))) {
+        statement.setQueryTimeout(Integer.parseInt(options.get("timeout")));
+        try (ResultSet rs = statement.executeQuery()) {
+          boolean nullable = Boolean.parseBoolean(ioptions.get("nullable").get());
+          StructType structType = JdbcUtils.getSchema(rs, dialect, nullable);
+          return AvroConversionUtils.convertStructTypeToAvroSchema(structType, table, "hoodie." + table);
+        }
+      }
+    } else {
+      throw new HoodieException(String.format("%s table does not exist!", table));
+    }
+  }
+
+  /**
+   * Converts a java.util.Map into a scala.collection.immutable.Map.
+   * Refers to: https://stackoverflow.com/questions/11903167/convert-java-util-hashmap-to-scala-collection-immutable-map-in-java/11903737#11903737
 
 Review comment:
   Do you have a better way? I looked into this, and if you want to do the conversion elegantly there are really only two approaches. First, do the conversion in Scala, which requires implementing a Scala conversion class. Second, rewrite the related Spark method in Java; the disadvantage is that this may cause compatibility problems in the future.
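
   For illustration, here is a minimal sketch of the Java-side approach, following the StackOverflow answer linked in the Javadoc. The helper name mirrors toScalaImmutableMap from this PR, but the body below is only an assumption for discussion, not the code under review:

```java
// Sketch only (an assumption for discussion, not the PR's actual implementation):
// convert a java.util.Map into a scala.collection.immutable.Map from Java code.
import java.util.List;
import java.util.stream.Collectors;

import scala.Tuple2;
import scala.collection.JavaConverters;
import scala.collection.Seq;
import scala.collection.immutable.Map$;

public class ScalaMapConversionSketch {

  @SuppressWarnings("unchecked")
  public static <K, V> scala.collection.immutable.Map<K, V> toScalaImmutableMap(java.util.Map<K, V> javaMap) {
    // Turn each entry into a scala.Tuple2 so it can be passed to Map.apply(...).
    List<Tuple2<K, V>> tuples = javaMap.entrySet().stream()
        .map(e -> new Tuple2<>(e.getKey(), e.getValue()))
        .collect(Collectors.toList());
    // Convert the Java list into a Scala Seq, then build the immutable Map from it.
    Seq<Tuple2<K, V>> seq = JavaConverters.asScalaBufferConverter(tuples).asScala().toSeq();
    return (scala.collection.immutable.Map<K, V>) Map$.MODULE$.apply(seq);
  }
}
```

   The other option mentioned above would be to keep the conversion in a small Scala helper class and call it from Java, which avoids the unchecked cast at the cost of adding a Scala source file to the module.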

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services
