Date: Fri, 26 Jan 2018 17:55:04 +0000 (UTC)
From: "ASF GitHub Bot (JIRA)"
To: issues@flink.apache.org
Subject: [jira] [Commented] (FLINK-8240) Create unified interfaces to configure and instantiate TableSources

    [ https://issues.apache.org/jira/browse/FLINK-8240?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16341357#comment-16341357 ]

ASF GitHub Bot commented on FLINK-8240:
---------------------------------------

Github user fhueske commented on a diff in the pull request:

    https://github.com/apache/flink/pull/5240#discussion_r164168781

--- Diff: flink-libraries/flink-table/src/main/scala/org/apache/flink/table/descriptors/Rowtime.scala ---
@@ -0,0 +1,131 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.descriptors
+
+import org.apache.flink.table.api.Types
+import org.apache.flink.table.descriptors.NormalizedProperties.{normalizeTimestampExtractor, normalizeWatermarkStrategy}
+import org.apache.flink.table.sources.tsextractors.{ExistingField, StreamRecordTimestamp, TimestampExtractor}
+import org.apache.flink.table.sources.wmstrategies.{AscendingTimestamps, BoundedOutOfOrderTimestamps, PreserveWatermarks, WatermarkStrategy}
+
+import scala.collection.mutable
+
+/**
+  * Rowtime descriptor for describing an event time attribute in the schema.
+  */
+class Rowtime extends Descriptor {
+
+  private var timestampExtractor: Option[TimestampExtractor] = None
+  private var watermarkStrategy: Option[WatermarkStrategy] = None
+
+  /**
+    * Sets a built-in timestamp extractor that converts an existing [[Long]] or
+    * [[Types.SQL_TIMESTAMP]] field into the rowtime attribute.
+    *
+    * @param fieldName The field to convert into a rowtime attribute.
+    */
+  def timestampFromField(fieldName: String): Rowtime = {
+    timestampExtractor = Some(new ExistingField(fieldName))
+    this
+  }
+
+  /**
+    * Sets a built-in timestamp extractor that converts the assigned timestamp from
+    * a DataStream API record into the rowtime attribute.
+    *
+    * Note: This extractor only works in streaming environments.
+    */
+  def timestampFromDataStream(): Rowtime = {
--- End diff --

`preserveSourceTimestamps()`
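
A minimal usage sketch of the descriptor above, restricted to the two setters visible in this excerpt; the field name "eventTime" is invented for illustration, and the watermark-strategy setters of the class are not shown in the diff:

{code}
// Sketch only, based on the methods shown in the diff above.
// "eventTime" is a hypothetical Long / SQL_TIMESTAMP field of the source schema.
val rowtimeFromField = new Rowtime()
  .timestampFromField("eventTime")   // convert an existing field into the rowtime attribute

// Alternatively, reuse the timestamps already assigned by the DataStream source
// (the method this review suggests renaming to preserveSourceTimestamps()):
val rowtimeFromSource = new Rowtime()
  .timestampFromDataStream()
{code}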

> Create unified interfaces to configure and instantiate TableSources
> --------------------------------------------------------------------
>
>          Key: FLINK-8240
>          URL: https://issues.apache.org/jira/browse/FLINK-8240
>      Project: Flink
>   Issue Type: New Feature
>   Components: Table API & SQL
>     Reporter: Timo Walther
>     Assignee: Timo Walther
>     Priority: Major
>
> At the moment every table source has different ways for configuration and instantiation. Some table sources are tailored to a specific encoding (e.g., {{KafkaAvroTableSource}}, {{KafkaJsonTableSource}}) or only support one encoding for reading (e.g., {{CsvTableSource}}). Each of them might implement a builder or support table source converters for external catalogs.
> The table sources should have a unified interface for discovery, defining common properties, and instantiation. The {{TableSourceConverters}} provide similar functionality but use an external catalog. We might generalize this interface.
> In general, a table source declaration depends on the following parts:
> {code}
> - Source
>   - Type (e.g. Kafka, Custom)
>   - Properties (e.g. topic, connection info)
> - Encoding
>   - Type (e.g. Avro, JSON, CSV)
>   - Schema (e.g. Avro class, JSON field names/types)
> - Rowtime descriptor/Proctime
>   - Watermark strategy and Watermark properties
>   - Time attribute info
> - Bucketization
> {code}
> This issue needs a design document before implementation. Any discussion is very welcome.

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
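
To make the parts listed in the quoted issue concrete, here is a purely hypothetical sketch of a property-based declaration; the keys below are invented to mirror the Source/Encoding/Rowtime structure and are not defined by the issue or the pull request:

{code}
// Hypothetical sketch only: these property keys merely mirror the structure
// listed in the issue (Source, Encoding, Rowtime); they are not an actual API.
val declaration = Map(
  "source.type" -> "kafka",                     // Source: type
  "source.topic" -> "clicks",                   // Source: properties (connection info)
  "encoding.type" -> "json",                    // Encoding: type
  "encoding.schema.fields" -> "user,eventTime", // Encoding: schema
  "rowtime.timestamp.field" -> "eventTime",     // Rowtime: time attribute info
  "rowtime.watermark.strategy" -> "ascending"   // Rowtime: watermark strategy
)
{code}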