hadoop-common-issues mailing list archives

From "Tom White (JIRA)" <j...@apache.org>
Subject [jira] Commented: (HADOOP-6681) Fill AWS credentials when configuring Hadoop on EC2 instances
Date Tue, 06 Apr 2010 18:15:33 GMT

    [ https://issues.apache.org/jira/browse/HADOOP-6681?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12854098#action_12854098 ]

Tom White commented on HADOOP-6681:

Passing these credentials automatically may not be what the user wants, and may in fact be
a security liability, since the credentials are passed to the cluster, which may be shared
with other users. You can achieve the same result explicitly with the following command line:


Does this solve your issue?

> Fill AWS credentials when configuring Hadoop on EC2 instances
> -------------------------------------------------------------
>                 Key: HADOOP-6681
>                 URL: https://issues.apache.org/jira/browse/HADOOP-6681
>             Project: Hadoop Common
>          Issue Type: Improvement
>          Components: contrib/cloud
>            Reporter: Andrew Klochkov
>         Attachments: HADOOP-6681.patch
> There's a function "configure_hadoop" in the hadoop-ec2-init-remote.sh script used to
> configure EC2 nodes for Hadoop. The function actually uses the AWS_ACCESS_KEY_ID and
> AWS_SECRET_ACCESS_KEY environment variables, but they are never passed to it. This can be
> fixed in service.py by passing those variables through.
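
The fix described above can be sketched roughly as follows. This is a minimal illustration, not the actual HADOOP-6681.patch: it assumes service.py forwards the caller's AWS credentials by prepending shell `export` lines to the hadoop-ec2-init-remote.sh user-data script, and the function and variable names here are hypothetical.

```python
import os

# Hypothetical sketch: forward the launching user's AWS credentials into the
# environment block prepended to hadoop-ec2-init-remote.sh, so that the
# configure_hadoop function in that script can read them.
CREDENTIAL_VARS = ("AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY")

def build_env_exports(env):
    """Return shell `export` lines for any credential variables set in env."""
    lines = []
    for name in CREDENTIAL_VARS:
        value = env.get(name)
        if value:
            # Single quotes keep the secret intact if it contains
            # shell metacharacters.
            lines.append("export %s='%s'" % (name, value))
    return "\n".join(lines)

if __name__ == "__main__":
    # In a real launcher this would draw on os.environ of the user running
    # the hadoop-ec2 command, then be spliced ahead of the init script.
    print(build_env_exports(os.environ))
```

Note that this approach inherits the liability Tom describes: the exported secrets become visible on the launched instances, so forwarding them should be an explicit opt-in rather than the default.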

This message is automatically generated by JIRA.
You can reply to this email to add a comment to the issue online.
