hive-user mailing list archives

From Terry Siu <terry....@dev9.com>
Subject Hive table over S3 bucket with s3a
Date Tue, 02 Feb 2016 15:52:27 GMT
Hi,

I’m wondering if anyone has found a workaround for defining a Hive table over an S3 bucket
when the secret access key contains '/' characters. I’m using Hive 0.14 on HDP 2.2.4, and
the statement I used is:


CREATE EXTERNAL TABLE IF NOT EXISTS s3_foo (

  key INT, value STRING

)

ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'

LOCATION 's3a://<access key>:<secret key>@<bucket>/<folder>';


The following error is returned:


FAILED: IllegalArgumentException The bucketName parameter must be specified.


A workaround was to set the fs.s3a.access.key and fs.s3a.secret.key configuration
properties and change the LOCATION URL to s3a://<bucket>/<folder>. However, this
produces the following error:


FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:com.amazonaws.AmazonClientException:
Unable to load AWS credentials from any provider in the chain)
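
For concreteness, the configuration workaround I tried looked roughly like this (the access/secret values are placeholders, and I set them at the session level):

```sql
-- Set the s3a credentials for the session instead of embedding them in the URL
SET fs.s3a.access.key=<access key>;
SET fs.s3a.secret.key=<secret key>;

CREATE EXTERNAL TABLE IF NOT EXISTS s3_foo (
  key INT, value STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LOCATION 's3a://<bucket>/<folder>';
```

My guess (unverified) is that the metastore service performs its own S3 access check and never sees session-level settings, which would explain the credential-chain error; putting the keys in core-site.xml instead might behave differently.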


Has anyone found a way to create a Hive table over S3 when the secret key contains '/'
characters, or is it just standard practice to regenerate the keys until IAM returns one
without the offending characters?
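
One idea I have considered but not verified on this stack: since '/' is reserved in a URI, percent-encoding the secret key before embedding it in the LOCATION might let it parse. A quick sketch of the encoding (the key and bucket values below are made up):

```python
from urllib.parse import quote

# Hypothetical secret key containing the problematic '/' character.
secret = "abc/def+ghi"

# safe="" forces '/' (and '+') to be percent-encoded as well.
encoded = quote(secret, safe="")
print(encoded)  # abc%2Fdef%2Bghi

# Hypothetical LOCATION URL built from the encoded key.
location = f"s3a://MYACCESSKEY:{encoded}@mybucket/myfolder"
```

Whether the s3a connector actually decodes the key on its end is exactly what I’m unsure about, so treat this as a guess rather than a known fix.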


Thanks,

-Terry