From: "ASF GitHub Bot (JIRA)"
To: issues@flink.apache.org
Reply-To: dev@flink.apache.org
Date: Wed, 1 Aug 2018 12:18:00 +0000 (UTC)
Subject: [jira] [Commented] (FLINK-9833) End-to-end test: SQL Client with unified source/sink/format

    [ https://issues.apache.org/jira/browse/FLINK-9833?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16565203#comment-16565203 ]

ASF GitHub Bot commented on FLINK-9833:
---------------------------------------

zentol commented on a change in pull request #6422: [FLINK-9833] [e2e] Add a SQL Client end-to-end test with unified source/sink/format
URL: https://github.com/apache/flink/pull/6422#discussion_r206855249

 ##########
 File path: flink-end-to-end-tests/test-scripts/test_sql_client.sh
 ##########
 @@ -0,0 +1,283 @@
+#!/usr/bin/env bash
+################################################################################
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+################################################################################
+
+source "$(dirname "$0")"/common.sh
+source "$(dirname "$0")"/kafka-common.sh
+
+SQL_TOOLBOX_JAR=$END_TO_END_DIR/flink-sql-client-test/target/SqlToolbox.jar
+SQL_JARS_DIR=$END_TO_END_DIR/flink-sql-client-test/target/sql-jars
+
+################################################################################
+# Verify existing SQL jars
+################################################################################
+
+EXTRACTED_JAR=$TEST_DATA_DIR/extracted
+
+mkdir -p $EXTRACTED_JAR
+
+for SQL_JAR in $SQL_JARS_DIR/*.jar; do
+  echo "Checking SQL JAR: $SQL_JAR"
+  unzip $SQL_JAR -d $EXTRACTED_JAR > /dev/null
+
+  # check for proper shading
+  for EXTRACTED_FILE in $(find $EXTRACTED_JAR -type f); do
+
+    if ! [[ $EXTRACTED_FILE = "$EXTRACTED_JAR/org/apache/flink"* ]] && \
+       ! [[ $EXTRACTED_FILE = "$EXTRACTED_JAR/META-INF"* ]] && \
+       ! [[ $EXTRACTED_FILE = "$EXTRACTED_JAR/LICENSE"* ]] && \
+       ! [[ $EXTRACTED_FILE = "$EXTRACTED_JAR/NOTICE"* ]] ; then
+      echo "Bad file in JAR: $EXTRACTED_FILE"
+      exit 1
+    fi
+  done
+
+  # check for proper factory
+  if [ ! -f $EXTRACTED_JAR/META-INF/services/org.apache.flink.table.factories.TableFactory ]; then
+    echo "No table factory found in JAR: $SQL_JAR"
+    exit 1
+  fi
+
+  # clean up
+  rm -r $EXTRACTED_JAR/*
+done
+
+################################################################################
+# Run a SQL statement
+################################################################################
+
+echo "Testing SQL statement..."
+
+function sql_cleanup() {
+  # don't call ourselves again for another signal interruption
+  trap "exit -1" INT
+  # don't call ourselves again for normal exit
+  trap "" EXIT
+
+  stop_kafka_cluster
+}
+trap sql_cleanup INT
+trap sql_cleanup EXIT
+
+# prepare Kafka
+echo "Preparing Kafka..."
+
+setup_kafka_dist
+
+start_kafka_cluster
+
+create_kafka_topic 1 1 test-json
+create_kafka_topic 1 1 test-avro
+
+# put JSON data into Kafka
+echo "Sending messages to Kafka..."
+send_messages_to_kafka '{"timestamp": "2018-03-12 08:00:00", "user": "Alice", "event": { "type": "WARNING", "message": "This is a warning."}}' test-json
+send_messages_to_kafka '{"timestamp": "2018-03-12 08:10:00", "user": "Alice", "event": { "type": "WARNING", "message": "This is a warning."}}' test-json # duplicate
+send_messages_to_kafka '{"timestamp": "2018-03-12 09:00:00", "user": "Bob", "event": { "type": "WARNING", "message": "This is another warning."}}' test-json
+send_messages_to_kafka '{"timestamp": "2018-03-12 09:10:00", "user": "Alice", "event": { "type": "INFO", "message": "This is a info."}}' test-json
+send_messages_to_kafka '{"timestamp": "2018-03-12 09:20:00", "user": "Steve", "event": { "type": "INFO", "message": "This is another info."}}' test-json
+send_messages_to_kafka '{"timestamp": "2018-03-12 09:30:00", "user": "Steve", "event": { "type": "INFO", "message": "This is another info."}}' test-json # duplicate

 Review comment:
   move comments to separate line

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


> End-to-end test: SQL Client with unified source/sink/format
> ------------------------------------------------------------
>
>                 Key: FLINK-9833
>                 URL: https://issues.apache.org/jira/browse/FLINK-9833
>             Project: Flink
>          Issue Type: Sub-task
>          Components: Table API & SQL
>            Reporter: Timo Walther
>            Assignee: Timo Walther
>            Priority: Blocker
>              Labels: pull-request-available
>             Fix For: 1.6.0
>
>
> After FLINK-8858 is resolved, we can add an end-to-end test for the SQL Client. The test should perform the following steps:
> - Put JSON data into Kafka
> - Submit and execute an {{INSERT INTO}} statement that reads from a Kafka connector with JSON format, does some ETL, and writes to Kafka with Avro format
> - Validate Avro data



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
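As a rough illustration of the flow described in the issue above, the {{INSERT INTO}} step could be submitted through the SQL Client along the following lines. This is a sketch only: the table names (JsonSourceTable, AvroSinkTable), the environment file name, and the exact SELECT clause are assumptions for illustration and are not taken from the pull request; the environment file is assumed to define the Kafka/JSON source table and the Kafka/Avro sink table.

    #!/usr/bin/env bash
    # Sketch only: table names, environment file, and query are illustrative
    # assumptions; FLINK_DIR and SQL_TOOLBOX_JAR are expected to be provided by
    # the surrounding test scripts (e.g. common.sh and the build).

    # The ETL query reads the JSON events and projects a few fields; `user` and
    # `type` are quoted with backticks since they may clash with reserved SQL
    # keywords.
    SQL_STATEMENT='INSERT INTO AvroSinkTable
      SELECT `user`, event.`type` AS event_type, event.message AS event_message
      FROM JsonSourceTable'

    # Submit the statement with the SQL Client in embedded mode: -e points to the
    # YAML environment file defining source and sink tables, --jar attaches the
    # test jar, and --update submits the INSERT INTO statement.
    $FLINK_DIR/bin/sql-client.sh embedded \
      -e sql-client-env.yaml \
      --jar "$SQL_TOOLBOX_JAR" \
      --update "$SQL_STATEMENT"

The last step from the issue, validating the Avro data, would then read the sink topic back and compare the records against the expected output.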