Subject: svn commit: r713872 - in /hadoop/core/trunk: CHANGES.txt src/contrib/ec2/bin/delete-hadoop-cluster src/contrib/ec2/bin/hadoop-ec2 src/contrib/ec2/bin/launch-hadoop-master src/contrib/ec2/bin/list-hadoop-clusters
Date: Thu, 13 Nov 2008 23:56:25 -0000
To: core-commits@hadoop.apache.org
From: omalley@apache.org
Reply-To: core-dev@hadoop.apache.org
Message-Id: <20081113235625.DD8B0238899D@eris.apache.org>

Author: omalley
Date: Thu Nov 13 15:56:25 2008
New Revision: 713872

URL: http://svn.apache.org/viewvc?rev=713872&view=rev
Log:
HADOOP-4126. Allow access to HDFS web UI on EC2 (tomwhite via omalley)

Added:
    hadoop/core/trunk/src/contrib/ec2/bin/delete-hadoop-cluster
Modified:
    hadoop/core/trunk/CHANGES.txt
    hadoop/core/trunk/src/contrib/ec2/bin/hadoop-ec2
    hadoop/core/trunk/src/contrib/ec2/bin/launch-hadoop-master
    hadoop/core/trunk/src/contrib/ec2/bin/list-hadoop-clusters

Modified: hadoop/core/trunk/CHANGES.txt
URL: http://svn.apache.org/viewvc/hadoop/core/trunk/CHANGES.txt?rev=713872&r1=713871&r2=713872&view=diff
==============================================================================
--- hadoop/core/trunk/CHANGES.txt (original)
+++ hadoop/core/trunk/CHANGES.txt Thu Nov 13 15:56:25 2008
@@ -103,6 +103,8 @@
     HADOOP-4645. Package HdfsProxy contrib project without the extra level
     of directories. (Kan Zhang via omalley)
 
+    HADOOP-4126. Allow access to HDFS web UI on EC2 (tomwhite via omalley)
+
   OPTIMIZATIONS
 
   BUG FIXES

Added: hadoop/core/trunk/src/contrib/ec2/bin/delete-hadoop-cluster
URL: http://svn.apache.org/viewvc/hadoop/core/trunk/src/contrib/ec2/bin/delete-hadoop-cluster?rev=713872&view=auto
==============================================================================
--- hadoop/core/trunk/src/contrib/ec2/bin/delete-hadoop-cluster (added)
+++ hadoop/core/trunk/src/contrib/ec2/bin/delete-hadoop-cluster Thu Nov 13 15:56:25 2008
@@ -0,0 +1,58 @@
+#!/usr/bin/env bash
+
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+# Delete the groups and local files associated with a cluster.
+
+if [ -z $1 ]; then
+  echo "Cluster name required!"
+  exit -1
+fi
+
+CLUSTER=$1
+
+# Finding Hadoop clusters
+CLUSTERS=`ec2-describe-instances | \
+  awk '"RESERVATION" == $1 && $4 ~ /-master$/, "INSTANCE" == $1' | tr '\n' '\t' | \
+  grep "$CLUSTER" | grep running | cut -f4 | rev | cut -d'-' -f2- | rev`
+
+if [ -n "$CLUSTERS" ]; then
+  echo "Cluster $CLUSTER has running instances. Please terminate them first."
+  exit 0
+fi
+
+# Import variables
+bin=`dirname "$0"`
+bin=`cd "$bin"; pwd`
+. "$bin"/hadoop-ec2-env.sh
+
+rm -f $MASTER_IP_PATH
+rm -f $MASTER_PRIVATE_IP_PATH
+
+ec2-describe-group | egrep "[[:space:]]$CLUSTER_MASTER[[:space:]]" > /dev/null
+if [ $? -eq 0 ]; then
+  echo "Deleting group $CLUSTER_MASTER"
+  ec2-revoke $CLUSTER_MASTER -o $CLUSTER -u $AWS_ACCOUNT_ID
+fi
+
+ec2-describe-group | egrep "[[:space:]]$CLUSTER[[:space:]]" > /dev/null
+if [ $? -eq 0 ]; then
+  echo "Deleting group $CLUSTER"
+  ec2-revoke $CLUSTER -o $CLUSTER_MASTER -u $AWS_ACCOUNT_ID
+fi
+
+ec2-delete-group $CLUSTER_MASTER
+ec2-delete-group $CLUSTER

Modified: hadoop/core/trunk/src/contrib/ec2/bin/hadoop-ec2
URL: http://svn.apache.org/viewvc/hadoop/core/trunk/src/contrib/ec2/bin/hadoop-ec2?rev=713872&r1=713871&r2=713872&view=diff
==============================================================================
--- hadoop/core/trunk/src/contrib/ec2/bin/hadoop-ec2 (original)
+++ hadoop/core/trunk/src/contrib/ec2/bin/hadoop-ec2 Thu Nov 13 15:56:25 2008
@@ -27,6 +27,7 @@
   echo "  launch-master      launch or find a cluster master"
   echo "  launch-slaves      launch the cluster slaves"
   echo "  terminate-cluster  terminate all Hadoop EC2 instances"
+  echo "  delete-cluster     delete the group information for a terminated cluster"
   echo "  login              login to the master node of the Hadoop EC2 cluster"
   echo "  screen             start or attach 'screen' on the master node of the Hadoop EC2 cluster"
   echo "  proxy              start a socks proxy on localhost:6666 (use w/foxyproxy)"
@@ -48,6 +49,8 @@
   . "$bin"/launch-hadoop-master $*
 elif [ "$COMMAND" = "launch-slaves" ] ; then
   . "$bin"/launch-hadoop-slaves $*
+elif [ "$COMMAND" = "delete-cluster" ] ; then
+  . "$bin"/delete-hadoop-cluster $*
 elif [ "$COMMAND" = "terminate-cluster" ] ; then
   . "$bin"/terminate-hadoop-cluster $*
 elif [ "$COMMAND" = "list" ] ; then

Modified: hadoop/core/trunk/src/contrib/ec2/bin/launch-hadoop-master
URL: http://svn.apache.org/viewvc/hadoop/core/trunk/src/contrib/ec2/bin/launch-hadoop-master?rev=713872&r1=713871&r2=713872&view=diff
==============================================================================
--- hadoop/core/trunk/src/contrib/ec2/bin/launch-hadoop-master (original)
+++ hadoop/core/trunk/src/contrib/ec2/bin/launch-hadoop-master Thu Nov 13 15:56:25 2008
@@ -56,6 +56,8 @@
   if [ $ENABLE_WEB_PORTS == "true" ]; then
     ec2-authorize $CLUSTER_MASTER -p 50030 # JobTracker web interface
     ec2-authorize $CLUSTER_MASTER -p 50060 # TaskTracker web interface
+    ec2-authorize $CLUSTER_MASTER -p 50070 # NameNode web interface
+    ec2-authorize $CLUSTER_MASTER -p 50075 # DataNode web interface
   fi
 fi
@@ -69,6 +71,8 @@
   if [ $ENABLE_WEB_PORTS == "true" ]; then
     ec2-authorize $CLUSTER -p 50030 # JobTracker web interface
     ec2-authorize $CLUSTER -p 50060 # TaskTracker web interface
+    ec2-authorize $CLUSTER -p 50070 # NameNode web interface
+    ec2-authorize $CLUSTER -p 50075 # DataNode web interface
   fi
 
 ec2-authorize $CLUSTER_MASTER -o $CLUSTER -u $AWS_ACCOUNT_ID

Modified: hadoop/core/trunk/src/contrib/ec2/bin/list-hadoop-clusters
URL: http://svn.apache.org/viewvc/hadoop/core/trunk/src/contrib/ec2/bin/list-hadoop-clusters?rev=713872&r1=713871&r2=713872&view=diff
==============================================================================
--- hadoop/core/trunk/src/contrib/ec2/bin/list-hadoop-clusters (original)
+++ hadoop/core/trunk/src/contrib/ec2/bin/list-hadoop-clusters Thu Nov 13 15:56:25 2008
@@ -15,7 +15,7 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-# Terminate a cluster.
+# List running clusters.
 
 # Import variables
 bin=`dirname "$0"`
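
The `rev | cut -d'-' -f2- | rev` pipeline in the new delete-hadoop-cluster script recovers the cluster name from a "<cluster>-master" security-group name by reversing the string, dropping the first '-'-separated field, and reversing back, which strips the last field even when the cluster name itself contains hyphens. A minimal, standalone sketch of just that step (the function name is ours, not the script's):

```shell
#!/usr/bin/env bash

# Drop the last '-'-separated field from a name, as delete-hadoop-cluster
# does to turn "my-test-cluster-master" back into "my-test-cluster".
strip_last_field() {
  echo "$1" | rev | cut -d'-' -f2- | rev
}

strip_last_field "my-test-cluster-master"
```

A plain `cut -d'-' -f1` would not work here, since it keeps only the first field; operating on the reversed string is what makes "last field" addressable.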
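
The launch-hadoop-master change now authorizes the same four web ports on both the master and worker groups; the repeated calls could equally be written as a loop over the port list. A sketch of that shape, where `ec2_authorize` is a hypothetical stub standing in for the EC2 API tools' `ec2-authorize` command so the loop can run without AWS credentials:

```shell
#!/usr/bin/env bash

# Hypothetical stub for the EC2 API tools' ec2-authorize; it only
# echoes what would be authorized.
ec2_authorize() { echo "authorized group=$1 port=$3"; }

# 50030 JobTracker, 50060 TaskTracker, 50070 NameNode, 50075 DataNode
WEB_PORTS="50030 50060 50070 50075"

# Open every Hadoop web-UI port on one security group.
open_web_ports() {
  local group=$1
  for port in $WEB_PORTS; do
    ec2_authorize "$group" -p "$port"
  done
}

open_web_ports my-cluster-master
```

Keeping the ports in one list would mean a future daemon UI (as happened here with 50070/50075) is a one-token change applied to both groups at once.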
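
The ordering in delete-hadoop-cluster matters: the master and worker groups grant access to each other, and (to our understanding of EC2 security groups) a group cannot be deleted while another group still references it, so both cross-group grants are revoked before either group is deleted. A sketch of that ordering, with hypothetical `ec2_*` stubs in place of the real EC2 API tools:

```shell
#!/usr/bin/env bash

# Hypothetical stubs for the EC2 API tools, echoing the action only.
ec2_revoke()       { echo "revoke $1"; }
ec2_delete_group() { echo "delete $1"; }

delete_cluster_groups() {
  local cluster=$1
  local master="$1-master"
  # 1. Drop the group-to-group grants in both directions first ...
  ec2_revoke "$master"
  ec2_revoke "$cluster"
  # 2. ... then the now-unreferenced groups can be deleted.
  ec2_delete_group "$master"
  ec2_delete_group "$cluster"
}

delete_cluster_groups test-cluster
```

Deleting in the opposite order would leave the first delete failing with a dependency error while the other group still holds a grant referencing it.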