From: viirya
To: reviews@spark.apache.org
Reply-To: reviews@spark.apache.org
Subject: [GitHub] spark pull request #16944: [SPARK-19611][SQL] Introduce configurable table s...
Date: Fri, 3 Mar 2017 03:08:46 +0000 (UTC)

Github user viirya commented on a diff in the pull request:

    https://github.com/apache/spark/pull/16944#discussion_r104081425

    --- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala ---
    @@ -597,6 +597,16 @@ private[spark] class HiveExternalCatalog(conf: SparkConf, hadoopConf: Configurat
           }
         }
     
    +  override def alterTableSchema(db: String, table: String, schema: StructType): Unit = withClient {
    +    requireTableExists(db, table)
    +    val rawTable = getRawTable(db, table)
    +    val withNewSchema = rawTable.copy(schema = schema)
    --- End diff --
    
    Why do we do two `copy` calls?
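For context on the question, the sketch below is not Spark code; it is a minimal, self-contained Scala illustration (the `Table` case class and `CopyExample` object are hypothetical) of how chained case-class `copy` calls that set different fields can usually be collapsed into a single `copy`, which appears to be what the reviewer is getting at. The second `copy` in the actual pull request is not shown in the quoted diff.

    // Hypothetical example, not HiveExternalCatalog: case-class `copy` returns a
    // new instance with only the named fields replaced, so two chained copies
    // touching different fields are equivalent to one copy naming both fields.
    case class Table(name: String, schema: Seq[String], properties: Map[String, String])

    object CopyExample {
      def main(args: Array[String]): Unit = {
        val raw = Table("t1", Seq("a INT"), Map.empty)

        // Two separate copies (the pattern the reviewer questions):
        val step1 = raw.copy(schema = Seq("a INT", "b STRING"))
        val step2 = step1.copy(properties = Map("touched" -> "true"))

        // Equivalent single copy:
        val combined = raw.copy(
          schema = Seq("a INT", "b STRING"),
          properties = Map("touched" -> "true"))

        assert(step2 == combined)
        println(combined)
      }
    }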