From: Ted Yu <yuzhihong@gmail.com>
Subject: Re: way to add custom udf jar in hadoop 2.x version
Date: Tue, 30 Dec 2014 23:12:28 -0800
To: user@hadoop.apache.org

Have you seen this thread?
http://search-hadoop.com/m/8er9TcALc/Hive+udf+custom+jar&subj=Best+way+to+add+custom+UDF+jar+in+HiveServer2

On Dec 30, 2014, at 10:56 PM, reena upadhyay wrote:

> Hi,
>
> I am using Hadoop 2.4.0 and have created a custom UDF jar. I am trying to execute a simple select query that uses the UDF through a Java Hive JDBC client program. When Hive executes the query as a MapReduce job, the execution fails because the mapper cannot locate the UDF class.
> I would therefore like to add the UDF jar to the Hadoop environment permanently. Please suggest a way to add this external jar for both single-node and multi-node Hadoop clusters.
>
> PS: I am using Hive 0.13.1, and I have already added this custom UDF jar to the HIVE_HOME/lib directory.
>
>
> Thanks
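As background to the question above: Hive 0.13 added permanent functions, which let a UDF jar be registered once from a shared location (such as HDFS) so that every session and every MapReduce task can resolve it, rather than relying on per-session `ADD JAR` or on jars dropped into local lib directories. A minimal sketch, assuming hypothetical paths and a hypothetical UDF class name (none of these identifiers come from this thread):

```sql
-- First, upload the jar to HDFS so all cluster nodes can fetch it, e.g.:
--   hadoop fs -mkdir -p /user/hive/udfs
--   hadoop fs -put my-udf.jar /user/hive/udfs/

-- Register a permanent function backed by the HDFS jar (Hive 0.13+ syntax).
-- 'my_upper' and 'com.example.MyUpperUDF' are placeholder names.
CREATE FUNCTION my_upper AS 'com.example.MyUpperUDF'
  USING JAR 'hdfs:///user/hive/udfs/my-udf.jar';

-- The function is now visible to JDBC clients without any ADD JAR step:
SELECT my_upper(name) FROM employees LIMIT 10;
```

Alternatively, setting `hive.aux.jars.path` in hive-site.xml (or the `HIVE_AUX_JARS_PATH` environment variable) to a directory containing the jar makes it available to HiveServer2 and the jobs it launches; either approach avoids editing Hadoop's own classpath on every node.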