Date: Wed, 13 Nov 2019 10:00:00 +0000 (UTC)
From: "Paul De (Jira)"
To: commits@airflow.incubator.apache.org
Subject: [jira] [Comment Edited] (AIRFLOW-5249) BigQueryCheckOperator fails for datasets outside of 'US' region

[ https://issues.apache.org/jira/browse/AIRFLOW-5249?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16973189#comment-16973189 ]

Paul De edited comment on AIRFLOW-5249 at 11/13/19 9:59 AM:
------------------------------------------------------------

The latest version available on GCP (the Cloud Composer service) is 1.10.3, and I get the same error (404 Job not found) in europe-west2 for BigQueryIntervalCheckOperator using 1.10.3.

This is a significant limitation, and GCP does patch recent versions ([https://cloud.google.com/composer/docs/release-notes]) - if possible, a fix would be good.
was (Author: pauld):
The latest version available on GCP (the Cloud Composer service) is 1.10.3, and I get the same error (404 Job not found) in europe-west2 for BigQueryIntervalCheckOperator using 1.10.3.

This is a significant limitation - if possible, a fix would be good.


> BigQueryCheckOperator fails for datasets outside of 'US' region
> ---------------------------------------------------------------
>
>                 Key: AIRFLOW-5249
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-5249
>             Project: Apache Airflow
>          Issue Type: Bug
>          Components: operators
>    Affects Versions: 1.10.2
>            Reporter: Michael
>            Assignee: Thomas Pilewicz
>            Priority: Blocker
>
> When I try to use the BigQueryCheckOperator or BigQueryValueCheckOperator on a dataset that is not in the 'US' location, my task fails with the following error:
> {code:java}
> [2019-08-15 07:26:19,378] {__init__.py:1580} ERROR - BigQuery job status check failed. Final error was: 404
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.6/site-packages/airflow/contrib/hooks/bigquery_hook.py", line 1241, in run_with_configuration
>     jobId=self.running_job_id).execute()
>   File "/usr/local/lib/python3.6/site-packages/googleapiclient/_helpers.py", line 130, in positional_wrapper
>     return wrapped(*args, **kwargs)
>   File "/usr/local/lib/python3.6/site-packages/googleapiclient/http.py", line 855, in execute
>     raise HttpError(resp, content, uri=self.uri)
> googleapiclient.errors.HttpError:
> During handling of the above exception, another exception occurred:
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.6/site-packages/airflow/models/__init__.py", line 1441, in _run_raw_task
>     result = task_copy.execute(context=context)
>   File "/usr/local/lib/python3.6/site-packages/airflow/operators/check_operator.py", line 81, in execute
>     records = self.get_db_hook().get_first(self.sql)
>   File "/usr/local/lib/python3.6/site-packages/airflow/hooks/dbapi_hook.py", line 138, in get_first
>     cur.execute(sql)
>   File "/usr/local/lib/python3.6/site-packages/airflow/contrib/hooks/bigquery_hook.py", line 1821, in execute
>     self.job_id = self.run_query(sql)
>   File "/usr/local/lib/python3.6/site-packages/airflow/contrib/hooks/bigquery_hook.py", line 849, in run_query
>     return self.run_with_configuration(configuration)
>   File "/usr/local/lib/python3.6/site-packages/airflow/contrib/hooks/bigquery_hook.py", line 1263, in run_with_configuration
>     format(err.resp.status))
> Exception: BigQuery job status check failed. Final error was: 404
> [2019-08-15 07:26:19,388] {__init__.py:1611} INFO - Marking task as FAILED.
> {code}
> This is the same error I get when I try to run the BigQuery operator without specifying a location. When I run the same operator on a dataset that is in the US region, it succeeds.
> The BigQueryCheckOperator does not accept a location as one of its arguments and does not pass a location to the BigQueryHook; I believe this is the source of the problem.
>
> I realise a task (AIRFLOW-3601) was already created to fix a similar issue to this one, but the referenced task calls out the two operators I'm having an issue with as out of scope, and after commenting on that task I have not received a response.

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
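[Editor's note] The failure mode the reporter describes - the operator never accepts a dataset location, so the hook's job-status lookup defaults to 'US' and 404s for jobs that ran anywhere else - can be illustrated with a minimal, Airflow-free sketch. All class and method names below are simplified stand-ins for the real Airflow/BigQuery classes, not the actual API:

```python
# Minimal sketch of the bug in AIRFLOW-5249: a job-status check that does
# not carry the dataset's location defaults to 'US' and cannot find jobs
# that ran in another region. Names are hypothetical stand-ins.

class FakeBigQueryJobs:
    """Pretends to be the BigQuery jobs API: a job only resolves in its own region."""
    def __init__(self):
        self._jobs = {("job-1", "europe-west2")}

    def get_job(self, job_id, location="US"):
        if (job_id, location) not in self._jobs:
            raise RuntimeError("404 Job not found")  # the error in the traceback above
        return {"id": job_id, "location": location, "status": "DONE"}


class Hook:
    """Stand-in for the hook: polls job status after submitting a query."""
    def __init__(self, client, location=None):
        self.client = client
        self.location = location

    def check_job(self, job_id):
        # Without an explicit location the lookup falls back to 'US',
        # which is exactly why non-US datasets see the 404.
        return self.client.get_job(job_id, location=self.location or "US")


class CheckOperator:
    """Stand-in for the check operator, extended with the missing argument."""
    def __init__(self, client, location=None):
        self.client = client
        self.location = location  # the parameter the 1.10.x operator lacked

    def get_db_hook(self):
        # The fix is simply to thread the location through to the hook.
        return Hook(self.client, location=self.location)


jobs = FakeBigQueryJobs()

# With the dataset's region threaded through, the status check succeeds:
ok = CheckOperator(jobs, location="europe-west2").get_db_hook().check_job("job-1")

# Without a location, the same check reproduces the reported 404:
try:
    CheckOperator(jobs).get_db_hook().check_job("job-1")
    got_404 = False
except RuntimeError:
    got_404 = True
```

This mirrors the reporter's diagnosis: nothing is wrong with the query itself; the status lookup simply searches the wrong region unless the operator forwards a location to the hook.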