Subject: Re: hadoop 3 scripts & classpath setup
From: Allen Wittenauer
Date: Tue, 22 Aug 2017 09:24:56 -0700
To: Steve Loughran
Cc: Hadoop Common

> On Aug 22, 2017, at 6:00 AM, Steve Loughran wrote:
>
> I'm having problems getting the s3 classpath set up on the CLI & am trying to work out what I'm doing wrong.
>
> Without setting things up, you can't expect to talk to blobstores:
>
> hadoop fs -ls wasb://something/
> hadoop fs -ls s3a://landsat-pds/
>
> That's expected.

Yup.

> But what I can't do is get the aws bits on the CP via HADOOP_OPTIONAL_TOOLS:
>
> export HADOOP_OPTIONAL_TOOLS="hadoop-azure,hadoop-aws,hadoop-adl,hadoop-openstack"
>
> Once I do that, the wasb:// ls works (or at least doesn't throw a CNFE), but the s3a URL still fails.

Hmm. So HOT is getting processed at least somewhat then...

> If I add this line to ~/.hadooprc, all becomes well:
>
> hadoop_add_to_classpath_tools hadoop-aws
>
> Any ideas?

Setting HOT should be calling the equivalent of hadoop_add_to_classpath_tools hadoop-aws in the code path.
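Roughly, that call boils down to the following (a simplified sketch of hadoop_add_to_classpath_tools, paraphrasing libexec/hadoop-functions.sh from memory rather than quoting it, so check the real source):

=======================
# Simplified sketch, not the verbatim hadoop-functions.sh source.
# Each optional tool ships a helper script that knows how to put
# its jars onto the classpath.
hadoop_add_to_classpath_tools ()
{
  declare module=$1

  # pull in the per-module helper, e.g. libexec/tools/hadoop-aws.sh
  if [[ -f "${HADOOP_LIBEXEC_DIR}/tools/${module}.sh" ]]; then
    . "${HADOOP_LIBEXEC_DIR}/tools/${module}.sh"
  fi

  # run the module's classpath function, if the helper defined one
  if declare -f "hadoop_classpath_tools_${module}" > /dev/null 2>&1; then
    "hadoop_classpath_tools_${module}"
  fi
}
=======================

So when your ~/.hadooprc calls it directly, you're just invoking the same hook the hadoop-aws shell profile should have triggered for you.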
Luckily, we have debugging tools in 3.x[1]. First, let's duplicate the failure conditions, but only activate hadoop-aws, since it should be standalone and cuts our output down:

=======================
$ cat ~/.hadooprc
cat: /Users/aw/.hadooprc: No such file or directory
$ bin/hadoop envvars | grep CONF
HADOOP_CONF_DIR='/Users/aw/H/hadoop-3.0.0-beta1-SNAPSHOT/etc/hadoop'
$ pwd
/Users/aw/H/hadoop-3.0.0-beta1-SNAPSHOT
$ grep OPTIONAL_TOOLS etc/hadoop/hadoop-env.sh
# export HADOOP_OPTIONAL_TOOLS="hadoop-aliyun,hadoop-aws,hadoop-azure,hadoop-azure-datalake,hadoop-kafka,hadoop-openstack"
export HADOOP_OPTIONAL_TOOLS="hadoop-aws"
=======================

Using --debug, let's see what happens (output wrapped for readability):

=======================
$ bin/hadoop --debug classpath 2>&1 | egrep '(tools|hadoop-aws)'
DEBUG: shellprofiles:
  /Users/aw/H/hadoop-3.0.0-beta1-SNAPSHOT/bin/../libexec/shellprofile.d/hadoop-aliyun.sh
  /Users/aw/H/hadoop-3.0.0-beta1-SNAPSHOT/bin/../libexec/shellprofile.d/hadoop-archive-logs.sh
  /Users/aw/H/hadoop-3.0.0-beta1-SNAPSHOT/bin/../libexec/shellprofile.d/hadoop-archives.sh
  /Users/aw/H/hadoop-3.0.0-beta1-SNAPSHOT/bin/../libexec/shellprofile.d/hadoop-aws.sh
  /Users/aw/H/hadoop-3.0.0-beta1-SNAPSHOT/bin/../libexec/shellprofile.d/hadoop-azure-datalake.sh
  /Users/aw/H/hadoop-3.0.0-beta1-SNAPSHOT/bin/../libexec/shellprofile.d/hadoop-azure.sh
  /Users/aw/H/hadoop-3.0.0-beta1-SNAPSHOT/bin/../libexec/shellprofile.d/hadoop-distcp.sh
  /Users/aw/H/hadoop-3.0.0-beta1-SNAPSHOT/bin/../libexec/shellprofile.d/hadoop-extras.sh
  /Users/aw/H/hadoop-3.0.0-beta1-SNAPSHOT/bin/../libexec/shellprofile.d/hadoop-gridmix.sh
  /Users/aw/H/hadoop-3.0.0-beta1-SNAPSHOT/bin/../libexec/shellprofile.d/hadoop-hdfs.sh
  /Users/aw/H/hadoop-3.0.0-beta1-SNAPSHOT/bin/../libexec/shellprofile.d/hadoop-httpfs.sh
  /Users/aw/H/hadoop-3.0.0-beta1-SNAPSHOT/bin/../libexec/shellprofile.d/hadoop-kafka.sh
  /Users/aw/H/hadoop-3.0.0-beta1-SNAPSHOT/bin/../libexec/shellprofile.d/hadoop-kms.sh
  /Users/aw/H/hadoop-3.0.0-beta1-SNAPSHOT/bin/../libexec/shellprofile.d/hadoop-mapreduce.sh
  /Users/aw/H/hadoop-3.0.0-beta1-SNAPSHOT/bin/../libexec/shellprofile.d/hadoop-openstack.sh
  /Users/aw/H/hadoop-3.0.0-beta1-SNAPSHOT/bin/../libexec/shellprofile.d/hadoop-rumen.sh
  /Users/aw/H/hadoop-3.0.0-beta1-SNAPSHOT/bin/../libexec/shellprofile.d/hadoop-streaming.sh
  /Users/aw/H/hadoop-3.0.0-beta1-SNAPSHOT/bin/../libexec/shellprofile.d/hadoop-yarn.sh
DEBUG: Adding hadoop-aws to HADOOP_TOOLS_OPTIONS
DEBUG: Profiles: importing /Users/aw/H/hadoop-3.0.0-beta1-SNAPSHOT/bin/../libexec/shellprofile.d/hadoop-aws.sh
DEBUG: HADOOP_SHELL_PROFILES accepted hadoop-aws
DEBUG: Profiles: hadoop-aws classpath
DEBUG: Append CLASSPATH: /Users/aw/H/hadoop-3.0.0-beta1-SNAPSHOT/share/hadoop/tools/lib/aws-java-sdk-bundle-1.11.134.jar
DEBUG: Append CLASSPATH: /Users/aw/H/hadoop-3.0.0-beta1-SNAPSHOT/share/hadoop/tools/lib/java-xmlbuilder-0.4.jar
DEBUG: Append CLASSPATH: /Users/aw/H/hadoop-3.0.0-beta1-SNAPSHOT/share/hadoop/tools/lib/jets3t-0.9.0.jar
DEBUG: Append CLASSPATH: /Users/aw/H/hadoop-3.0.0-beta1-SNAPSHOT/share/hadoop/tools/lib/hadoop-aws-3.0.0-beta1-SNAPSHOT.jar
/Users/aw/H/hadoop-3.0.0-beta1-SNAPSHOT/etc/hadoop:/Users/aw/H/hadoop-3.0.0-beta1-SNAPSHOT/share/hadoop/common/lib/*:/Users/aw/H/hadoop-3.0.0-beta1-SNAPSHOT/share/hadoop/common/*:/Users/aw/H/hadoop-3.0.0-beta1-SNAPSHOT/share/hadoop/tools/lib/aws-java-sdk-bundle-1.11.134.jar:/Users/aw/H/hadoop-3.0.0-beta1-SNAPSHOT/share/hadoop/tools/lib/java-xmlbuilder-0.4.jar:/Users/aw/H/hadoop-3.0.0-beta1-SNAPSHOT/share/hadoop/tools/lib/jets3t-0.9.0.jar:/Users/aw/H/hadoop-3.0.0-beta1-SNAPSHOT/share/hadoop/tools/lib/hadoop-aws-3.0.0-beta1-SNAPSHOT.jar:/Users/aw/H/hadoop-3.0.0-beta1-SNAPSHOT/share/hadoop/hdfs:/Users/aw/H/hadoop-3.0.0-beta1-SNAPSHOT/share/hadoop/hdfs/lib/*:/Users/aw/H/hadoop-3.0.0-beta1-SNAPSHOT/share/hadoop/hdfs/*:/Users/aw/H/hadoop-3.0.0-beta1-SNAPSHOT/share/hadoop/mapreduce/*:/Users/aw/H/hadoop-3.0.0-beta1-SNAPSHOT/share/hadoop/yarn/lib/*:/Users/aw/H/hadoop-3.0.0-beta1-SNAPSHOT/share/hadoop/yarn/*
=======================

OK, the "extra" bits are definitely getting added. Going by the debug lines:

* the hadoop-aws profile and tools hooks are getting executed
* the hadoop-aws classpath function (aka hadoop_add_to_classpath_tools hadoop-aws) is getting executed
* the classpath isn't rejecting any jars
* the final line definitely has the AWS jars in it

So we should be good to go, assuming the profile and supplemental tools code is correct.

=======================
$ bin/hadoop fs -ls s3a://landsat-pds/
ls: Interrupted
=======================

Umm, ok? No CNFE though.
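(If you just want to confirm the jars landed on the final classpath without touching the network at all, splitting the classpath output is enough; nothing Hadoop-specific here beyond the classpath subcommand:)

=======================
# list classpath entries one per line and pick out the tools jars;
# given the DEBUG output above, this should show the four jars that
# hadoop-aws pulled in (aws-java-sdk-bundle, java-xmlbuilder,
# jets3t, hadoop-aws)
$ bin/hadoop classpath | tr ':' '\n' | grep -E 'aws|jets3t|xmlbuilder'
=======================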
If I disable the network:

=======================
$ bin/hadoop fs -ls s3a://landsat-pds/
ls: doesBucketExist on landsat-pds: com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider InstanceProfileCredentialsProvider : com.amazonaws.SdkClientException: Unable to load credentials from service endpoint: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider InstanceProfileCredentialsProvider : com.amazonaws.SdkClientException: Unable to load credentials from service endpoint
=======================

Ugly error, but still no CNFE. So, at least out of the box with a build from last week, I guess this is working? At this point, it'd probably be worthwhile to make sure that the libexec/shellprofile.d/hadoop-aws.sh on your system is in working order. In particular...

=======================
if hadoop_verify_entry HADOOP_TOOLS_OPTIONS "hadoop-aws"; then
  hadoop_add_profile "hadoop-aws"
fi
=======================

... is the magic code. It (effectively[2]) says that if HADOOP_OPTIONAL_TOOLS has hadoop-aws in it, then activate the hadoop-aws profile, which should end up calling hadoop_add_to_classpath_tools hadoop-aws. Might also be worthwhile to check simple stuff like file permissions (quick check in the P.S. below).

[1] It's tempting to say "now", but given that debug support was added several years ago, it's more like branch-2 is just really ancient rather than 3.x being "current".

[2] Yes, that variable is supposed to be HADOOP_TOOLS_OPTIONS: HOT (HADOOP_OPTIONAL_TOOLS) gets transformed into HADOOP_TOOLS_OPTIONS internally, for "reasons". It's a longer discussion that most people aren't interested in.
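P.S. For the permissions check, something like this is enough (plain shell; bash -n parses the file without executing it; the path assumes the default layout from my transcript above):

=======================
$ ls -l libexec/shellprofile.d/hadoop-aws.sh
$ bash -n libexec/shellprofile.d/hadoop-aws.sh && echo "parses ok"
=======================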