Date: Mon, 14 May 2018 07:59:00 +0000 (UTC)
From: "Thomas Nys (JIRA)"
To: issues@hive.apache.org
Subject: [jira] [Commented] (HIVE-19475) Issue when streaming data to Azure Data Lake Store

[ https://issues.apache.org/jira/browse/HIVE-19475?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16473884#comment-16473884 ]

Thomas Nys commented on HIVE-19475:
-----------------------------------

Hi, I had to add them to my application dependencies (sbt) to have them on my class-path. Is this not the correct way to go?
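For reference, a minimal sketch of what such an sbt addition could look like, assuming the classes in question are the Hadoop module that implements the adl:// scheme; the artifact and version below are illustrative (chosen to match the hadoop-client 2.8.0 already listed in the quoted build further down), not a confirmed fix for this issue:

{code:java}
// Illustrative build.sbt line only -- a hedged sketch, not the reporter's actual change.
// hadoop-azure-datalake ships org.apache.hadoop.fs.adl.AdlFileSystem, the
// implementation behind the adl:// scheme that the "No FileSystem for scheme: adl"
// error below complains about, and it should pull in the Azure Data Lake Store SDK
// transitively.
libraryDependencies += "org.apache.hadoop" % "hadoop-azure-datalake" % "2.8.0"
{code}

If the jar is on the class-path but the scheme is still not resolved, the Hadoop configuration visible to the application may additionally need fs.adl.impl set to org.apache.hadoop.fs.adl.AdlFileSystem (and fs.AbstractFileSystem.adl.impl set to org.apache.hadoop.fs.adl.Adl), as described in Hadoop's Azure Data Lake Store documentation.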
> Issue when streaming data to Azure Data Lake Store
> --------------------------------------------------
>
>                 Key: HIVE-19475
>                 URL: https://issues.apache.org/jira/browse/HIVE-19475
>             Project: Hive
>          Issue Type: Bug
>          Components: Streaming
>    Affects Versions: 2.2.0
>        Environment: HDInsight 3.6 on Ubuntu 16.04.4 LTS (GNU/Linux 4.13.0-1012-azure x86_64)
> Used java libraries:
> {code:java}
> libraryDependencies += "org.apache.hive.hcatalog" % "hive-hcatalog-streaming" % "2.2.0"
> libraryDependencies += "org.apache.hive.hcatalog" % "hive-hcatalog-core" % "2.2.0"
> libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.8.0"
> {code}
> Please let me know if more details are needed.
>            Reporter: Thomas Nys
>            Priority: Major
>
> I am trying to stream data from a Java (Play2 API) application to HDInsight Hive interactive query with Azure Data Lake Store as the storage back-end. The following code is run on one of the head nodes of the cluster.
> When fetching a transaction batch:
> {code:java}
> TransactionBatch txnBatch = this.connection.fetchTransactionBatch(10, (RecordWriter) writer);
> {code}
> I receive the following error:
> {code:java}
> play.api.UnexpectedException: Unexpected exception[StreamingIOFailure: Failed creating RecordUpdaterS for adl://home/hive/warehouse/raw_telemetry_data/ingest_date=2018-05-07 txnIds[506,515]]
>     at play.api.http.HttpErrorHandlerExceptions$.throwableToUsefulException(HttpErrorHandler.scala:251)
>     at play.api.http.DefaultHttpErrorHandler.onServerError(HttpErrorHandler.scala:182)
>     at play.core.server.AkkaHttpServer$$anonfun$2.applyOrElse(AkkaHttpServer.scala:343)
>     at play.core.server.AkkaHttpServer$$anonfun$2.applyOrElse(AkkaHttpServer.scala:341)
>     at scala.concurrent.Future.$anonfun$recoverWith$1(Future.scala:414)
>     at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:37)
>     at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:60)
>     at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:55)
>     at akka.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:91)
>     at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:12)
> Caused by: org.apache.hive.hcatalog.streaming.StreamingIOFailure: Failed creating RecordUpdaterS for adl://home/hive/warehouse/raw_telemetry_data/ingest_date=2018-05-07 txnIds[506,515]
>     at org.apache.hive.hcatalog.streaming.AbstractRecordWriter.newBatch(AbstractRecordWriter.java:208)
>     at org.apache.hive.hcatalog.streaming.HiveEndPoint$TransactionBatchImpl.<init>(HiveEndPoint.java:608)
>     at org.apache.hive.hcatalog.streaming.HiveEndPoint$TransactionBatchImpl.<init>(HiveEndPoint.java:556)
>     at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.fetchTransactionBatchImpl(HiveEndPoint.java:442)
>     at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.fetchTransactionBatch(HiveEndPoint.java:422)
>     at hive.HiveRepository.createMany(HiveRepository.java:76)
>     at controllers.HiveController.create(HiveController.java:40)
>     at router.Routes$$anonfun$routes$1.$anonfun$applyOrElse$2(Routes.scala:70)
>     at play.core.routing.HandlerInvokerFactory$$anon$4.resultCall(HandlerInvoker.scala:137)
>     at play.core.routing.HandlerInvokerFactory$JavaActionInvokerFactory$$anon$8$$anon$2$$anon$1.invocation(HandlerInvoker.scala:108)
> Caused by: java.io.IOException: No FileSystem for scheme: adl
>     at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2798)
>     at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2809)
>     at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:100)
>     at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2848)
>     at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2830)
>     at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:389)
>     at org.apache.hadoop.fs.Path.getFileSystem(Path.java:356)
>     at org.apache.hadoop.hive.ql.io.orc.OrcRecordUpdater.<init>(OrcRecordUpdater.java:187)
>     at org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat.getRecordUpdater(OrcOutputFormat.java:278)
>     at org.apache.hive.hcatalog.streaming.AbstractRecordWriter.createRecordUpdater(AbstractRecordWriter.java:268)
> {code}
>
> Any help would be greatly appreciated.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)