From: aajisaka@apache.org
To: common-commits@hadoop.apache.org
Date: Wed, 04 May 2016 13:54:52 -0000
Message-Id: <17c77301f84e4cce9f41396531e3a115@git.apache.org>
Mailing-List: contact common-commits-help@hadoop.apache.org; run by ezmlm
Subject: [3/3] hadoop git commit: HADOOP-12504. Remove metrics v1. (aajisaka)
archived-at: Wed, 04 May 2016 13:54:55 -0000

HADOOP-12504. Remove metrics v1.
(aajisaka)

Project: http://git-wip-us.apache.org/repos/asf/hadoop/repo
Commit: http://git-wip-us.apache.org/repos/asf/hadoop/commit/36972d61
Tree: http://git-wip-us.apache.org/repos/asf/hadoop/tree/36972d61
Diff: http://git-wip-us.apache.org/repos/asf/hadoop/diff/36972d61

Branch: refs/heads/trunk
Commit: 36972d61d1cf0b8248a4887df780d9863b27b7c1
Parents: 355325b
Author: Akira Ajisaka
Authored: Wed May 4 22:52:23 2016 +0900
Committer: Akira Ajisaka
Committed: Wed May 4 22:52:23 2016 +0900

----------------------------------------------------------------------
 .../src/main/conf/hadoop-metrics.properties     |  75 ---
 .../org/apache/hadoop/http/HttpServer2.java     |   4 -
 .../apache/hadoop/metrics/ContextFactory.java   | 214 --------
 .../apache/hadoop/metrics/MetricsContext.java   | 125 -----
 .../apache/hadoop/metrics/MetricsException.java |  49 --
 .../apache/hadoop/metrics/MetricsRecord.java    | 254 ----------
 .../apache/hadoop/metrics/MetricsServlet.java   | 188 -------
 .../org/apache/hadoop/metrics/MetricsUtil.java  | 104 ----
 .../java/org/apache/hadoop/metrics/Updater.java |  41 --
 .../org/apache/hadoop/metrics/file/package.html |  43 --
 .../hadoop/metrics/ganglia/GangliaContext.java  | 276 -----------
 .../metrics/ganglia/GangliaContext31.java       | 147 ------
 .../apache/hadoop/metrics/ganglia/package.html  |  80 ---
 .../apache/hadoop/metrics/jvm/EventCounter.java |  36 --
 .../apache/hadoop/metrics/jvm/JvmMetrics.java   | 203 --------
 .../apache/hadoop/metrics/jvm/package-info.java |  22 -
 .../java/org/apache/hadoop/metrics/package.html | 159 ------
 .../metrics/spi/AbstractMetricsContext.java     | 494 -------------------
 .../hadoop/metrics/spi/CompositeContext.java    | 206 --------
 .../apache/hadoop/metrics/spi/MetricValue.java  |  58 ---
 .../hadoop/metrics/spi/MetricsRecordImpl.java   | 304 ------------
 .../metrics/spi/NoEmitMetricsContext.java       |  61 ---
 .../apache/hadoop/metrics/spi/NullContext.java  |  74 ---
 .../spi/NullContextWithUpdateThread.java        |  82 ---
 .../apache/hadoop/metrics/spi/OutputRecord.java |  93 ----
 .../org/apache/hadoop/metrics/spi/Util.java     |  68 ---
 .../org/apache/hadoop/metrics/spi/package.html  |  36 --
 .../apache/hadoop/metrics/util/MBeanUtil.java   |  92 ----
 .../apache/hadoop/metrics/util/MetricsBase.java |  51 --
 .../metrics/util/MetricsDynamicMBeanBase.java   | 229 ---------
 .../hadoop/metrics/util/MetricsIntValue.java    | 106 ----
 .../hadoop/metrics/util/MetricsLongValue.java   |  93 ----
 .../hadoop/metrics/util/MetricsRegistry.java    |  90 ----
 .../metrics/util/MetricsTimeVaryingInt.java     | 129 -----
 .../metrics/util/MetricsTimeVaryingLong.java    | 125 -----
 .../metrics/util/MetricsTimeVaryingRate.java    | 198 --------
 .../hadoop/metrics/util/package-info.java       |  22 -
 .../org/apache/hadoop/http/TestHttpServer.java  |  10 +-
 .../hadoop/metrics/TestMetricsServlet.java      | 112 -----
 .../metrics/ganglia/TestGangliaContext.java     |  84 ----
 .../hadoop/metrics/spi/TestOutputRecord.java    |  39 --
 41 files changed, 5 insertions(+), 4871 deletions(-)
----------------------------------------------------------------------

http://git-wip-us.apache.org/repos/asf/hadoop/blob/36972d61/hadoop-common-project/hadoop-common/src/main/conf/hadoop-metrics.properties
----------------------------------------------------------------------
diff --git a/hadoop-common-project/hadoop-common/src/main/conf/hadoop-metrics.properties b/hadoop-common-project/hadoop-common/src/main/conf/hadoop-metrics.properties
deleted file mode 100644
index c1b2eb7..0000000
--- a/hadoop-common-project/hadoop-common/src/main/conf/hadoop-metrics.properties
+++ /dev/null
@@ -1,75 +0,0 @@
-# Configuration of the "dfs" context for null
-dfs.class=org.apache.hadoop.metrics.spi.NullContext
-
-# Configuration of the "dfs" context for file
-#dfs.class=org.apache.hadoop.metrics.file.FileContext
-#dfs.period=10
-#dfs.fileName=/tmp/dfsmetrics.log
-
-# Configuration of the "dfs" context for ganglia
-# Pick one: Ganglia 3.0 (former) or Ganglia 3.1 (latter)
-# dfs.class=org.apache.hadoop.metrics.ganglia.GangliaContext
-# dfs.class=org.apache.hadoop.metrics.ganglia.GangliaContext31
-# dfs.period=10
-# dfs.servers=localhost:8649
-
-
-# Configuration of the "mapred" context for null
-mapred.class=org.apache.hadoop.metrics.spi.NullContext
-
-# Configuration of the "mapred" context for file
-#mapred.class=org.apache.hadoop.metrics.file.FileContext
-#mapred.period=10
-#mapred.fileName=/tmp/mrmetrics.log
-
-# Configuration of the "mapred" context for ganglia
-# Pick one: Ganglia 3.0 (former) or Ganglia 3.1 (latter)
-# mapred.class=org.apache.hadoop.metrics.ganglia.GangliaContext
-# mapred.class=org.apache.hadoop.metrics.ganglia.GangliaContext31
-# mapred.period=10
-# mapred.servers=localhost:8649
-
-
-# Configuration of the "jvm" context for null
-#jvm.class=org.apache.hadoop.metrics.spi.NullContext
-
-# Configuration of the "jvm" context for file
-#jvm.class=org.apache.hadoop.metrics.file.FileContext
-#jvm.period=10
-#jvm.fileName=/tmp/jvmmetrics.log
-
-# Configuration of the "jvm" context for ganglia
-# jvm.class=org.apache.hadoop.metrics.ganglia.GangliaContext
-# jvm.class=org.apache.hadoop.metrics.ganglia.GangliaContext31
-# jvm.period=10
-# jvm.servers=localhost:8649
-
-# Configuration of the "rpc" context for null
-rpc.class=org.apache.hadoop.metrics.spi.NullContext
-
-# Configuration of the "rpc" context for file
-#rpc.class=org.apache.hadoop.metrics.file.FileContext
-#rpc.period=10
-#rpc.fileName=/tmp/rpcmetrics.log
-
-# Configuration of the "rpc" context for ganglia
-# rpc.class=org.apache.hadoop.metrics.ganglia.GangliaContext
-# rpc.class=org.apache.hadoop.metrics.ganglia.GangliaContext31
-# rpc.period=10
-# rpc.servers=localhost:8649
-
-
-# Configuration of the "ugi" context for null
-ugi.class=org.apache.hadoop.metrics.spi.NullContext
-
-# Configuration of the "ugi" context for file
-#ugi.class=org.apache.hadoop.metrics.file.FileContext
-#ugi.period=10
-#ugi.fileName=/tmp/ugimetrics.log
-
-# Configuration of the "ugi" context for ganglia
-# ugi.class=org.apache.hadoop.metrics.ganglia.GangliaContext
-# ugi.class=org.apache.hadoop.metrics.ganglia.GangliaContext31
-# ugi.period=10
-# ugi.servers=localhost:8649
-

http://git-wip-us.apache.org/repos/asf/hadoop/blob/36972d61/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
----------------------------------------------------------------------
diff --git a/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java b/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
index 8ba67dd..9d2fae6 100644
--- a/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
+++ b/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
@@ -59,7 +59,6 @@ import org.apache.hadoop.security.authentication.util.SignerSecretProvider;
 import org.apache.hadoop.security.ssl.SslSocketConnectorSecure;
 import org.apache.hadoop.jmx.JMXJsonServlet;
 import org.apache.hadoop.log.LogLevel;
-import org.apache.hadoop.metrics.MetricsServlet;
 import org.apache.hadoop.security.SecurityUtil;
 import org.apache.hadoop.security.UserGroupInformation;
 import org.apache.hadoop.security.authentication.server.AuthenticationFilter;
@@ -572,14 +571,11 @@ public final class HttpServer2 implements FilterContainer {
   /**
    * Add default servlets.
-   * Note: /metrics servlet will be removed in 3.X release.
    */
-  @SuppressWarnings("deprecation")
   protected void addDefaultServlets() {
     // set up default servlets
     addServlet("stacks", "/stacks", StackServlet.class);
     addServlet("logLevel", "/logLevel", LogLevel.Servlet.class);
-    addServlet("metrics", "/metrics", MetricsServlet.class);
     addServlet("jmx", "/jmx", JMXJsonServlet.class);
     addServlet("conf", "/conf", ConfServlet.class);
   }

http://git-wip-us.apache.org/repos/asf/hadoop/blob/36972d61/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/ContextFactory.java
----------------------------------------------------------------------
diff --git a/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/ContextFactory.java b/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/ContextFactory.java
deleted file mode 100644
index 15ecd61..0000000
--- a/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/ContextFactory.java
+++ /dev/null
@@ -1,214 +0,0 @@
-/*
- * ContextFactory.java
- *
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *     http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */ - -package org.apache.hadoop.metrics; - -import java.io.IOException; -import java.io.InputStream; -import java.util.ArrayList; -import java.util.Collection; -import java.util.HashMap; -import java.util.Iterator; -import java.util.Map; -import java.util.Properties; - -import org.apache.hadoop.classification.InterfaceAudience; -import org.apache.hadoop.classification.InterfaceStability; -import org.apache.hadoop.metrics.spi.NullContext; - -/** - * Factory class for creating MetricsContext objects. To obtain an instance - * of this class, use the static getFactory() method. - * - * @deprecated Use org.apache.hadoop.metrics2 package instead. - */ -@Deprecated -@InterfaceAudience.LimitedPrivate({"HDFS", "MapReduce"}) -@InterfaceStability.Evolving -public class ContextFactory { - - private static final String PROPERTIES_FILE = - "/hadoop-metrics.properties"; - private static final String CONTEXT_CLASS_SUFFIX = - ".class"; - private static final String DEFAULT_CONTEXT_CLASSNAME = - "org.apache.hadoop.metrics.spi.NullContext"; - - private static ContextFactory theFactory = null; - - private Map attributeMap = new HashMap(); - private Map contextMap = - new HashMap(); - - // Used only when contexts, or the ContextFactory itself, cannot be - // created. - private static Map nullContextMap = - new HashMap(); - - /** Creates a new instance of ContextFactory */ - protected ContextFactory() { - } - - /** - * Returns the value of the named attribute, or null if there is no - * attribute of that name. - * - * @param attributeName the attribute name - * @return the attribute value - */ - public Object getAttribute(String attributeName) { - return attributeMap.get(attributeName); - } - - /** - * Returns the names of all the factory's attributes. 
- * - * @return the attribute names - */ - public String[] getAttributeNames() { - String[] result = new String[attributeMap.size()]; - int i = 0; - // for (String attributeName : attributeMap.keySet()) { - Iterator it = attributeMap.keySet().iterator(); - while (it.hasNext()) { - result[i++] = (String) it.next(); - } - return result; - } - - /** - * Sets the named factory attribute to the specified value, creating it - * if it did not already exist. If the value is null, this is the same as - * calling removeAttribute. - * - * @param attributeName the attribute name - * @param value the new attribute value - */ - public void setAttribute(String attributeName, Object value) { - attributeMap.put(attributeName, value); - } - - /** - * Removes the named attribute if it exists. - * - * @param attributeName the attribute name - */ - public void removeAttribute(String attributeName) { - attributeMap.remove(attributeName); - } - - /** - * Returns the named MetricsContext instance, constructing it if necessary - * using the factory's current configuration attributes.

- * - * When constructing the instance, if the factory property - * contextName.class exists, - * its value is taken to be the name of the class to instantiate. Otherwise, - * the default is to create an instance of - * org.apache.hadoop.metrics.spi.NullContext, which is a - * dummy "no-op" context which will cause all metric data to be discarded. - * - * @param contextName the name of the context - * @return the named MetricsContext - */ - public synchronized MetricsContext getContext(String refName, String contextName) - throws IOException, ClassNotFoundException, - InstantiationException, IllegalAccessException { - MetricsContext metricsContext = contextMap.get(refName); - if (metricsContext == null) { - String classNameAttribute = refName + CONTEXT_CLASS_SUFFIX; - String className = (String) getAttribute(classNameAttribute); - if (className == null) { - className = DEFAULT_CONTEXT_CLASSNAME; - } - Class contextClass = Class.forName(className); - metricsContext = (MetricsContext) contextClass.newInstance(); - metricsContext.init(contextName, this); - contextMap.put(contextName, metricsContext); - } - return metricsContext; - } - - public synchronized MetricsContext getContext(String contextName) - throws IOException, ClassNotFoundException, InstantiationException, - IllegalAccessException { - return getContext(contextName, contextName); - } - - /** - * Returns all MetricsContexts built by this factory. - */ - public synchronized Collection getAllContexts() { - // Make a copy to avoid race conditions with creating new contexts. - return new ArrayList(contextMap.values()); - } - - /** - * Returns a "null" context - one which does nothing. 
- */ - public static synchronized MetricsContext getNullContext(String contextName) { - MetricsContext nullContext = nullContextMap.get(contextName); - if (nullContext == null) { - nullContext = new NullContext(); - nullContextMap.put(contextName, nullContext); - } - return nullContext; - } - - /** - * Returns the singleton ContextFactory instance, constructing it if - * necessary.

- * - * When the instance is constructed, this method checks if the file - * hadoop-metrics.properties exists on the class path. If it - * exists, it must be in the format defined by java.util.Properties, and all - * the properties in the file are set as attributes on the newly created - * ContextFactory instance. - * - * @return the singleton ContextFactory instance - */ - public static synchronized ContextFactory getFactory() throws IOException { - if (theFactory == null) { - theFactory = new ContextFactory(); - theFactory.setAttributes(); - } - return theFactory; - } - - private void setAttributes() throws IOException { - InputStream is = getClass().getResourceAsStream(PROPERTIES_FILE); - if (is != null) { - try { - Properties properties = new Properties(); - properties.load(is); - //for (Object propertyNameObj : properties.keySet()) { - Iterator it = properties.keySet().iterator(); - while (it.hasNext()) { - String propertyName = (String) it.next(); - String propertyValue = properties.getProperty(propertyName); - setAttribute(propertyName, propertyValue); - } - } finally { - is.close(); - } - } - } - -} http://git-wip-us.apache.org/repos/asf/hadoop/blob/36972d61/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/MetricsContext.java ---------------------------------------------------------------------- diff --git a/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/MetricsContext.java b/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/MetricsContext.java deleted file mode 100644 index aa08641..0000000 --- a/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/MetricsContext.java +++ /dev/null @@ -1,125 +0,0 @@ -/* - * MetricsContext.java - * - * Licensed to the Apache Software Foundation (ASF) under one - * or more contributor license agreements. See the NOTICE file - * distributed with this work for additional information - * regarding copyright ownership. 
The ASF licenses this file - * to you under the Apache License, Version 2.0 (the - * "License"); you may not use this file except in compliance - * with the License. You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ - -package org.apache.hadoop.metrics; - -import java.io.IOException; -import java.util.Collection; -import java.util.Map; - -import org.apache.hadoop.classification.InterfaceAudience; -import org.apache.hadoop.classification.InterfaceStability; -import org.apache.hadoop.metrics.spi.OutputRecord; - -/** - * The main interface to the metrics package. - * - * @deprecated Use org.apache.hadoop.metrics2 package instead. - */ -@Deprecated -@InterfaceAudience.Private -@InterfaceStability.Evolving -public interface MetricsContext { - - /** - * Default period in seconds at which data is sent to the metrics system. - */ - public static final int DEFAULT_PERIOD = 5; - - /** - * Initialize this context. - * @param contextName The given name for this context - * @param factory The creator of this context - */ - public void init(String contextName, ContextFactory factory); - - /** - * Returns the context name. - * - * @return the context name - */ - public abstract String getContextName(); - - /** - * Starts or restarts monitoring, the emitting of metrics records as they are - * updated. - */ - public abstract void startMonitoring() - throws IOException; - - /** - * Stops monitoring. This does not free any data that the implementation - * may have buffered for sending at the next timer event. It - * is OK to call startMonitoring() again after calling - * this. 
- * @see #close() - */ - public abstract void stopMonitoring(); - - /** - * Returns true if monitoring is currently in progress. - */ - public abstract boolean isMonitoring(); - - /** - * Stops monitoring and also frees any buffered data, returning this - * object to its initial state. - */ - public abstract void close(); - - /** - * Creates a new MetricsRecord instance with the given recordName. - * Throws an exception if the metrics implementation is configured with a fixed - * set of record names and recordName is not in that set. - * - * @param recordName the name of the record - * @throws MetricsException if recordName conflicts with configuration data - */ - public abstract MetricsRecord createRecord(String recordName); - - /** - * Registers a callback to be called at regular time intervals, as - * determined by the implementation-class specific configuration. - * - * @param updater object to be run periodically; it should updated - * some metrics records and then return - */ - public abstract void registerUpdater(Updater updater); - - /** - * Removes a callback, if it exists. - * - * @param updater object to be removed from the callback list - */ - public abstract void unregisterUpdater(Updater updater); - - /** - * Returns the timer period. - */ - public abstract int getPeriod(); - - /** - * Retrieves all the records managed by this MetricsContext. - * Useful for monitoring systems that are polling-based. - * - * @return A non-null map from all record names to the records managed. 
-   */
-  Map<String, Collection<OutputRecord>> getAllRecords();
-}

http://git-wip-us.apache.org/repos/asf/hadoop/blob/36972d61/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/MetricsException.java
----------------------------------------------------------------------
diff --git a/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/MetricsException.java b/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/MetricsException.java
deleted file mode 100644
index 5fcf751..0000000
--- a/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/MetricsException.java
+++ /dev/null
@@ -1,49 +0,0 @@
-/*
- * MetricsException.java
- *
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *     http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.hadoop.metrics;
-
-import org.apache.hadoop.classification.InterfaceAudience;
-import org.apache.hadoop.classification.InterfaceStability;
-
-/**
- * General-purpose, unchecked metrics exception.
- * @deprecated Use {@link org.apache.hadoop.metrics2.MetricsException} instead.
- */
-@Deprecated
-@InterfaceAudience.LimitedPrivate({"HDFS", "MapReduce"})
-@InterfaceStability.Evolving
-public class MetricsException extends RuntimeException {
-
-  private static final long serialVersionUID = -1643257498540498497L;
-
-  /** Creates a new instance of MetricsException */
-  public MetricsException() {
-  }
-
-  /** Creates a new instance of MetricsException
-   *
-   * @param message an error message
-   */
-  public MetricsException(String message) {
-    super(message);
-  }
-
-}

http://git-wip-us.apache.org/repos/asf/hadoop/blob/36972d61/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/MetricsRecord.java
----------------------------------------------------------------------
diff --git a/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/MetricsRecord.java b/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/MetricsRecord.java
deleted file mode 100644
index d252a17..0000000
--- a/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/MetricsRecord.java
+++ /dev/null
@@ -1,254 +0,0 @@
-/*
- * MetricsRecord.java
- *
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *     http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */ - -package org.apache.hadoop.metrics; - -import org.apache.hadoop.classification.InterfaceAudience; -import org.apache.hadoop.classification.InterfaceStability; - -/** - * A named and optionally tagged set of records to be sent to the metrics - * system.

- * - * A record name identifies the kind of data to be reported. For example, a - * program reporting statistics relating to the disks on a computer might use - * a record name "diskStats".

- * - * A record has zero or more tags. A tag has a name and a value. To - * continue the example, the "diskStats" record might use a tag named - * "diskName" to identify a particular disk. Sometimes it is useful to have - * more than one tag, so there might also be a "diskType" with value "ide" or - * "scsi" or whatever.

- * - * A record also has zero or more metrics. These are the named - * values that are to be reported to the metrics system. In the "diskStats" - * example, possible metric names would be "diskPercentFull", "diskPercentBusy", - * "kbReadPerSecond", etc.

- * - * The general procedure for using a MetricsRecord is to fill in its tag and - * metric values, and then call update() to pass the record to the - * client library. - * Metric data is not immediately sent to the metrics system - * each time that update() is called. - * An internal table is maintained, identified by the record name. This - * table has columns - * corresponding to the tag and the metric names, and rows - * corresponding to each unique set of tag values. An update - * either modifies an existing row in the table, or adds a new row with a set of - * tag values that are different from all the other rows. Note that if there - * are no tags, then there can be at most one row in the table.

- * - * Once a row is added to the table, its data will be sent to the metrics system - * on every timer period, whether or not it has been updated since the previous - * timer period. If this is inappropriate, for example if metrics were being - * reported by some transient object in an application, the remove() - * method can be used to remove the row and thus stop the data from being - * sent.

- * - * Note that the update() method is atomic. This means that it is - * safe for different threads to be updating the same metric. More precisely, - * it is OK for different threads to call update() on MetricsRecord instances - * with the same set of tag names and tag values. Different threads should - * not use the same MetricsRecord instance at the same time. - * - * @deprecated Use {@link org.apache.hadoop.metrics2.MetricsRecord} instead. - */ -@Deprecated -@InterfaceAudience.Private -@InterfaceStability.Evolving -public interface MetricsRecord { - - /** - * Returns the record name. - * - * @return the record name - */ - public abstract String getRecordName(); - - /** - * Sets the named tag to the specified value. The tagValue may be null, - * which is treated the same as an empty String. - * - * @param tagName name of the tag - * @param tagValue new value of the tag - * @throws MetricsException if the tagName conflicts with the configuration - */ - public abstract void setTag(String tagName, String tagValue); - - /** - * Sets the named tag to the specified value. - * - * @param tagName name of the tag - * @param tagValue new value of the tag - * @throws MetricsException if the tagName conflicts with the configuration - */ - public abstract void setTag(String tagName, int tagValue); - - /** - * Sets the named tag to the specified value. - * - * @param tagName name of the tag - * @param tagValue new value of the tag - * @throws MetricsException if the tagName conflicts with the configuration - */ - public abstract void setTag(String tagName, long tagValue); - - /** - * Sets the named tag to the specified value. - * - * @param tagName name of the tag - * @param tagValue new value of the tag - * @throws MetricsException if the tagName conflicts with the configuration - */ - public abstract void setTag(String tagName, short tagValue); - - /** - * Sets the named tag to the specified value. 
- * - * @param tagName name of the tag - * @param tagValue new value of the tag - * @throws MetricsException if the tagName conflicts with the configuration - */ - public abstract void setTag(String tagName, byte tagValue); - - /** - * Removes any tag of the specified name. - * - * @param tagName name of a tag - */ - public abstract void removeTag(String tagName); - - /** - * Sets the named metric to the specified value. - * - * @param metricName name of the metric - * @param metricValue new value of the metric - * @throws MetricsException if the metricName or the type of the metricValue - * conflicts with the configuration - */ - public abstract void setMetric(String metricName, int metricValue); - - /** - * Sets the named metric to the specified value. - * - * @param metricName name of the metric - * @param metricValue new value of the metric - * @throws MetricsException if the metricName or the type of the metricValue - * conflicts with the configuration - */ - public abstract void setMetric(String metricName, long metricValue); - - /** - * Sets the named metric to the specified value. - * - * @param metricName name of the metric - * @param metricValue new value of the metric - * @throws MetricsException if the metricName or the type of the metricValue - * conflicts with the configuration - */ - public abstract void setMetric(String metricName, short metricValue); - - /** - * Sets the named metric to the specified value. - * - * @param metricName name of the metric - * @param metricValue new value of the metric - * @throws MetricsException if the metricName or the type of the metricValue - * conflicts with the configuration - */ - public abstract void setMetric(String metricName, byte metricValue); - - /** - * Sets the named metric to the specified value. 
- * - * @param metricName name of the metric - * @param metricValue new value of the metric - * @throws MetricsException if the metricName or the type of the metricValue - * conflicts with the configuration - */ - public abstract void setMetric(String metricName, float metricValue); - - /** - * Increments the named metric by the specified value. - * - * @param metricName name of the metric - * @param metricValue incremental value - * @throws MetricsException if the metricName or the type of the metricValue - * conflicts with the configuration - */ - public abstract void incrMetric(String metricName, int metricValue); - - /** - * Increments the named metric by the specified value. - * - * @param metricName name of the metric - * @param metricValue incremental value - * @throws MetricsException if the metricName or the type of the metricValue - * conflicts with the configuration - */ - public abstract void incrMetric(String metricName, long metricValue); - - /** - * Increments the named metric by the specified value. - * - * @param metricName name of the metric - * @param metricValue incremental value - * @throws MetricsException if the metricName or the type of the metricValue - * conflicts with the configuration - */ - public abstract void incrMetric(String metricName, short metricValue); - - /** - * Increments the named metric by the specified value. - * - * @param metricName name of the metric - * @param metricValue incremental value - * @throws MetricsException if the metricName or the type of the metricValue - * conflicts with the configuration - */ - public abstract void incrMetric(String metricName, byte metricValue); - - /** - * Increments the named metric by the specified value. 
- * - * @param metricName name of the metric - * @param metricValue incremental value - * @throws MetricsException if the metricName or the type of the metricValue - * conflicts with the configuration - */ - public abstract void incrMetric(String metricName, float metricValue); - - /** - * Updates the table of buffered data which is to be sent periodically. - * If the tag values match an existing row, that row is updated; - * otherwise, a new row is added. - */ - public abstract void update(); - - /** - * Removes, from the buffered data table, all rows having tags - * that equal the tags that have been set on this record. For example, - * if there are no tags on this record, all rows for this record name - * would be removed. Or, if there is a single tag on this record, then - * just rows containing a tag with the same name and value would be removed. - */ - public abstract void remove(); - -} http://git-wip-us.apache.org/repos/asf/hadoop/blob/36972d61/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/MetricsServlet.java ---------------------------------------------------------------------- diff --git a/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/MetricsServlet.java b/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/MetricsServlet.java deleted file mode 100644 index 8e592ad..0000000 --- a/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/MetricsServlet.java +++ /dev/null @@ -1,188 +0,0 @@ -/** - * Licensed to the Apache Software Foundation (ASF) under one - * or more contributor license agreements. See the NOTICE file - * distributed with this work for additional information - * regarding copyright ownership. The ASF licenses this file - * to you under the Apache License, Version 2.0 (the - * "License"); you may not use this file except in compliance - * with the License. 
You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ -package org.apache.hadoop.metrics; - -import java.io.IOException; -import java.io.PrintWriter; -import java.util.ArrayList; -import java.util.Collection; -import java.util.List; -import java.util.Map; -import java.util.TreeMap; - -import javax.servlet.ServletException; -import javax.servlet.http.HttpServlet; -import javax.servlet.http.HttpServletRequest; -import javax.servlet.http.HttpServletResponse; - -import org.apache.hadoop.classification.InterfaceAudience; -import org.apache.hadoop.classification.InterfaceStability; -import org.apache.hadoop.http.HttpServer2; -import org.apache.hadoop.metrics.spi.OutputRecord; -import org.apache.hadoop.metrics.spi.AbstractMetricsContext.MetricMap; -import org.apache.hadoop.metrics.spi.AbstractMetricsContext.TagMap; -import org.mortbay.util.ajax.JSON; -import org.mortbay.util.ajax.JSON.Output; - -/** - * A servlet to print out metrics data. By default, the servlet returns a - * textual representation (no promises are made for parseability), and - * users can use "?format=json" for parseable output. - * - * @deprecated Use org.apache.hadoop.metrics2 package instead. - */ -@Deprecated -@InterfaceAudience.Private -@InterfaceStability.Evolving -public class MetricsServlet extends HttpServlet { - - /** - * A helper class to hold a TagMap and MetricMap. 
- */
-  static class TagsMetricsPair implements JSON.Convertible {
-    final TagMap tagMap;
-    final MetricMap metricMap;
-
-    public TagsMetricsPair(TagMap tagMap, MetricMap metricMap) {
-      this.tagMap = tagMap;
-      this.metricMap = metricMap;
-    }
-
-    @Override
-    @SuppressWarnings("unchecked")
-    public void fromJSON(Map map) {
-      throw new UnsupportedOperationException();
-    }
-
-    /** Converts to JSON by providing an array. */
-    @Override
-    public void toJSON(Output out) {
-      out.add(new Object[] { tagMap, metricMap });
-    }
-  }
-
-  /**
-   * Collects all metric data, and returns a map:
-   *   contextName -> recordName -> [ (tag->tagValue), (metric->metricValue) ].
-   * The values are either String or Number.  The final value is implemented
-   * as a list of TagsMetricsPair.
-   */
-  Map<String, Map<String, List<TagsMetricsPair>>> makeMap(
-      Collection<MetricsContext> contexts) throws IOException {
-    Map<String, Map<String, List<TagsMetricsPair>>> map =
-      new TreeMap<String, Map<String, List<TagsMetricsPair>>>();
-
-    for (MetricsContext context : contexts) {
-      Map<String, List<TagsMetricsPair>> records =
-        new TreeMap<String, List<TagsMetricsPair>>();
-      map.put(context.getContextName(), records);
-
-      for (Map.Entry<String, Collection<OutputRecord>> r :
-          context.getAllRecords().entrySet()) {
-        List<TagsMetricsPair> metricsAndTags =
-          new ArrayList<TagsMetricsPair>();
-        records.put(r.getKey(), metricsAndTags);
-        for (OutputRecord outputRecord : r.getValue()) {
-          TagMap tagMap = outputRecord.getTagsCopy();
-          MetricMap metricMap = outputRecord.getMetricsCopy();
-          metricsAndTags.add(new TagsMetricsPair(tagMap, metricMap));
-        }
-      }
-    }
-    return map;
-  }
-
-  @Override
-  public void doGet(HttpServletRequest request, HttpServletResponse response)
-      throws ServletException, IOException {
-
-    if (!HttpServer2.isInstrumentationAccessAllowed(getServletContext(),
-        request, response)) {
-      return;
-    }
-
-    String format = request.getParameter("format");
-    Collection<MetricsContext> allContexts =
-      ContextFactory.getFactory().getAllContexts();
-    if ("json".equals(format)) {
-      response.setContentType("application/json; charset=utf-8");
-      PrintWriter out = response.getWriter();
-      try {
-        // Uses Jetty's built-in JSON support to convert the map into JSON.
-        out.print(new JSON().toJSON(makeMap(allContexts)));
-      } finally {
-        out.close();
-      }
-    } else {
-      PrintWriter out = response.getWriter();
-      try {
-        printMap(out, makeMap(allContexts));
-      } finally {
-        out.close();
-      }
-    }
-  }
-
-  /**
-   * Prints metrics data in a multi-line text form.
-   */
-  void printMap(PrintWriter out,
-      Map<String, Map<String, List<TagsMetricsPair>>> map) {
-    for (Map.Entry<String, Map<String, List<TagsMetricsPair>>> context :
-        map.entrySet()) {
-      out.print(context.getKey());
-      out.print("\n");
-      for (Map.Entry<String, List<TagsMetricsPair>> record :
-          context.getValue().entrySet()) {
-        indent(out, 1);
-        out.print(record.getKey());
-        out.print("\n");
-        for (TagsMetricsPair pair : record.getValue()) {
-          indent(out, 2);
-          // Prints tag values in the form "{key=value,key=value}:"
-          out.print("{");
-          boolean first = true;
-          for (Map.Entry<String, Object> tagValue : pair.tagMap.entrySet()) {
-            if (first) {
-              first = false;
-            } else {
-              out.print(",");
-            }
-            out.print(tagValue.getKey());
-            out.print("=");
-            out.print(tagValue.getValue().toString());
-          }
-          out.print("}:\n");
-
-          // Now print metric values, one per line
-          for (Map.Entry<String, Number> metricValue :
-              pair.metricMap.entrySet()) {
-            indent(out, 3);
-            out.print(metricValue.getKey());
-            out.print("=");
-            out.print(metricValue.getValue().toString());
-            out.print("\n");
-          }
-        }
-      }
-    }
-  }
-
-  private void indent(PrintWriter out, int indent) {
-    for (int i = 0; i < indent; ++i) {
-      out.append("  ");
-    }
-  }
-}
 http://git-wip-us.apache.org/repos/asf/hadoop/blob/36972d61/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/MetricsUtil.java
----------------------------------------------------------------------
diff --git a/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/MetricsUtil.java b/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/MetricsUtil.java
deleted file mode 100644
index 934a758..0000000
--- a/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/MetricsUtil.java
+++ /dev/null
@@ -1,104 +0,0 @@
-/**
- * Licensed to the Apache
Software Foundation (ASF) under one - * or more contributor license agreements. See the NOTICE file - * distributed with this work for additional information - * regarding copyright ownership. The ASF licenses this file - * to you under the Apache License, Version 2.0 (the - * "License"); you may not use this file except in compliance - * with the License. You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ -package org.apache.hadoop.metrics; - -import java.net.InetAddress; -import java.net.UnknownHostException; - -import org.apache.commons.logging.Log; -import org.apache.commons.logging.LogFactory; -import org.apache.hadoop.classification.InterfaceAudience; -import org.apache.hadoop.classification.InterfaceStability; - -/** - * Utility class to simplify creation and reporting of hadoop metrics. - * - * For examples of usage, see NameNodeMetrics. - * @see org.apache.hadoop.metrics.MetricsRecord - * @see org.apache.hadoop.metrics.MetricsContext - * @see org.apache.hadoop.metrics.ContextFactory - * @deprecated Use org.apache.hadoop.metrics2 package instead. - */ -@Deprecated -@InterfaceAudience.LimitedPrivate({"HDFS", "MapReduce"}) -@InterfaceStability.Evolving -public class MetricsUtil { - - public static final Log LOG = - LogFactory.getLog(MetricsUtil.class); - - /** - * Don't allow creation of a new instance of Metrics - */ - private MetricsUtil() {} - - public static MetricsContext getContext(String contextName) { - return getContext(contextName, contextName); - } - - /** - * Utility method to return the named context. 
- * If the desired context cannot be created for any reason, the exception - * is logged, and a null context is returned. - */ - public static MetricsContext getContext(String refName, String contextName) { - MetricsContext metricsContext; - try { - metricsContext = - ContextFactory.getFactory().getContext(refName, contextName); - if (!metricsContext.isMonitoring()) { - metricsContext.startMonitoring(); - } - } catch (Exception ex) { - LOG.error("Unable to create metrics context " + contextName, ex); - metricsContext = ContextFactory.getNullContext(contextName); - } - return metricsContext; - } - - /** - * Utility method to create and return new metrics record instance within the - * given context. This record is tagged with the host name. - * - * @param context the context - * @param recordName name of the record - * @return newly created metrics record - */ - public static MetricsRecord createRecord(MetricsContext context, - String recordName) - { - MetricsRecord metricsRecord = context.createRecord(recordName); - metricsRecord.setTag("hostName", getHostName()); - return metricsRecord; - } - - /** - * Returns the host name. If the host name is unobtainable, logs the - * exception and returns "unknown". 
- */ - private static String getHostName() { - String hostName = null; - try { - hostName = InetAddress.getLocalHost().getHostName(); - } catch (UnknownHostException ex) { - LOG.info("Unable to obtain hostName", ex); - hostName = "unknown"; - } - return hostName; - } -} http://git-wip-us.apache.org/repos/asf/hadoop/blob/36972d61/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/Updater.java ---------------------------------------------------------------------- diff --git a/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/Updater.java b/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/Updater.java deleted file mode 100644 index c1a8017..0000000 --- a/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/Updater.java +++ /dev/null @@ -1,41 +0,0 @@ -/* - * Updater.java - * - * Licensed to the Apache Software Foundation (ASF) under one - * or more contributor license agreements. See the NOTICE file - * distributed with this work for additional information - * regarding copyright ownership. The ASF licenses this file - * to you under the Apache License, Version 2.0 (the - * "License"); you may not use this file except in compliance - * with the License. You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ - -package org.apache.hadoop.metrics; - -import org.apache.hadoop.classification.InterfaceAudience; -import org.apache.hadoop.classification.InterfaceStability; - -/** - * Call-back interface. See MetricsContext.registerUpdater(). - * - * @deprecated Use org.apache.hadoop.metrics2 package instead. 
- */ -@Deprecated -@InterfaceAudience.LimitedPrivate({"HDFS", "MapReduce"}) -@InterfaceStability.Evolving -public interface Updater { - - /** - * Timer-based call-back from the metric library. - */ - public abstract void doUpdates(MetricsContext context); - -} http://git-wip-us.apache.org/repos/asf/hadoop/blob/36972d61/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/file/package.html ---------------------------------------------------------------------- diff --git a/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/file/package.html b/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/file/package.html deleted file mode 100644 index 7358486..0000000 --- a/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/file/package.html +++ /dev/null @@ -1,43 +0,0 @@ - - - - - -Implementation of the metrics package that writes the metrics to a file. -Programmers should not normally need to use this package directly. Instead -they should use org.hadoop.metrics. - -
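The Updater interface removed above was the heart of the metrics v1 push model: application code registered a callback with a MetricsContext, and the context's timer thread invoked doUpdates() once per emission period. A self-contained sketch of that register-then-poll pattern (MiniContext, MiniRecord, and UpdaterSketch are illustrative stand-ins written for this example, not Hadoop classes):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Stand-in for the removed org.apache.hadoop.metrics.Updater call-back.
interface Updater {
  void doUpdates(MiniContext context);
}

// Stand-in for a buffered metrics record.
class MiniRecord {
  final Map<String, Number> metrics = new HashMap<>();
  void incrMetric(String name, long delta) {
    metrics.merge(name, delta, (a, b) -> a.longValue() + b.longValue());
  }
}

// Stand-in for a metrics context that drives registered updaters.
class MiniContext {
  private final List<Updater> updaters = new ArrayList<>();
  final MiniRecord record = new MiniRecord();

  void registerUpdater(Updater u) { updaters.add(u); }

  // In the real AbstractMetricsContext a java.util.Timer thread drives
  // this on the configured period; here we call it directly.
  void timerEvent() {
    for (Updater u : updaters) {
      u.doUpdates(this);
    }
  }
}

public class UpdaterSketch {
  public static void main(String[] args) {
    MiniContext context = new MiniContext();
    // The application registers its call-back once...
    context.registerUpdater(ctx -> ctx.record.incrMetric("requests", 5));
    // ...and the context invokes it on every emission period.
    context.timerEvent();
    context.timerEvent();
    System.out.println(context.record.metrics.get("requests")); // prints 10
  }
}
```

The real AbstractMetricsContext followed the same shape, then flushed the buffered records to its configured sink (file, Ganglia, etc.) after each round of doUpdates() calls.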

-These are the implementation specific factory attributes
-(See ContextFactory.getFactory()):
-
-<dl>
-    <dt><i>contextName</i>.fileName</dt>
-    <dd>The path of the file to which metrics in context <i>contextName</i>
-    are to be appended.  If this attribute is not specified, the metrics
-    are written to standard output by default.</dd>
-
-    <dt><i>contextName</i>.period</dt>
-    <dd>The period in seconds on which the metric data is written to the
-    file.</dd>
-</dl>
-</body>
-</html>
 http://git-wip-us.apache.org/repos/asf/hadoop/blob/36972d61/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/ganglia/GangliaContext.java
----------------------------------------------------------------------
diff --git a/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/ganglia/GangliaContext.java b/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/ganglia/GangliaContext.java
deleted file mode 100644
index 7e75d2d..0000000
--- a/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/ganglia/GangliaContext.java
+++ /dev/null
@@ -1,276 +0,0 @@
-/*
- * GangliaContext.java
- *
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *     http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.hadoop.metrics.ganglia;
-
-import java.io.IOException;
-import java.net.*;
-import java.util.HashMap;
-import java.util.List;
-import java.util.Map;
-
-import org.apache.commons.io.Charsets;
-import org.apache.commons.logging.Log;
-import org.apache.commons.logging.LogFactory;
-
-import org.apache.hadoop.classification.InterfaceAudience;
-import org.apache.hadoop.classification.InterfaceStability;
-import org.apache.hadoop.metrics.ContextFactory;
-import org.apache.hadoop.metrics.spi.AbstractMetricsContext;
-import org.apache.hadoop.metrics.spi.OutputRecord;
-import org.apache.hadoop.metrics.spi.Util;
-
-/**
- * Context for sending metrics to Ganglia.
- *
- * @deprecated Use {@link org.apache.hadoop.metrics2.sink.ganglia.GangliaSink30}
- * instead.
- */
-@Deprecated
-@InterfaceAudience.Public
-@InterfaceStability.Evolving
-public class GangliaContext extends AbstractMetricsContext {
-
-  private static final String PERIOD_PROPERTY = "period";
-  private static final String SERVERS_PROPERTY = "servers";
-  private static final String UNITS_PROPERTY = "units";
-  private static final String SLOPE_PROPERTY = "slope";
-  private static final String TMAX_PROPERTY = "tmax";
-  private static final String DMAX_PROPERTY = "dmax";
-  private static final String MULTICAST_PROPERTY = "multicast";
-  private static final String MULTICAST_TTL_PROPERTY = "multicast.ttl";
-
-  private static final String DEFAULT_UNITS = "";
-  private static final String DEFAULT_SLOPE = "both";
-  private static final int DEFAULT_TMAX = 60;
-  private static final int DEFAULT_DMAX = 0;
-  private static final int DEFAULT_PORT = 8649;
-  private static final int BUFFER_SIZE = 1500; // as per libgmond.c
-  private static final int DEFAULT_MULTICAST_TTL = 1;
-
-  private final Log LOG = LogFactory.getLog(this.getClass());
-
-  private static final Map<Class, String> typeTable =
-      new HashMap<Class, String>(5);
-
-  static {
-    typeTable.put(String.class, "string");
-    typeTable.put(Byte.class, "int8");
-    typeTable.put(Short.class, "int16");
-    typeTable.put(Integer.class, "int32");
-    typeTable.put(Long.class, "float");
-    typeTable.put(Float.class, "float");
-  }
-
-  protected byte[] buffer = new byte[BUFFER_SIZE];
-  protected int offset;
-
-  protected List<? extends SocketAddress> metricsServers;
-  private Map<String, String> unitsTable;
-  private Map<String, String> slopeTable;
-  private Map<String, String> tmaxTable;
-  private Map<String, String> dmaxTable;
-  private boolean multicastEnabled;
-  private int multicastTtl;
-
-  protected DatagramSocket datagramSocket;
-
-  /** Creates a new instance of GangliaContext */
-  @InterfaceAudience.Private
-  public GangliaContext() {
-  }
-
-  @Override
-  @InterfaceAudience.Private
-  public void init(String contextName, ContextFactory factory) {
-    super.init(contextName, factory);
-    parseAndSetPeriod(PERIOD_PROPERTY);
-
-    metricsServers =
-      Util.parse(getAttribute(SERVERS_PROPERTY), DEFAULT_PORT);
-
-    unitsTable = getAttributeTable(UNITS_PROPERTY);
-    slopeTable = getAttributeTable(SLOPE_PROPERTY);
-    tmaxTable = getAttributeTable(TMAX_PROPERTY);
-    dmaxTable = getAttributeTable(DMAX_PROPERTY);
-    multicastEnabled = Boolean.parseBoolean(getAttribute(MULTICAST_PROPERTY));
-    String multicastTtlValue = getAttribute(MULTICAST_TTL_PROPERTY);
-    if (multicastEnabled) {
-      if (multicastTtlValue == null) {
-        multicastTtl = DEFAULT_MULTICAST_TTL;
-      } else {
-        multicastTtl = Integer.parseInt(multicastTtlValue);
-      }
-    }
-
-    try {
-      if (multicastEnabled) {
-        LOG.info("Enabling multicast for Ganglia with TTL " + multicastTtl);
-        datagramSocket = new MulticastSocket();
-        ((MulticastSocket) datagramSocket).setTimeToLive(multicastTtl);
-      } else {
-        datagramSocket = new DatagramSocket();
-      }
-    } catch (IOException e) {
-      LOG.error(e);
-    }
-  }
-
-  /**
-   * method to close the datagram socket
-   */
-  @Override
-  public void close() {
-    super.close();
-    if (datagramSocket != null) {
-      datagramSocket.close();
-    }
-  }
-
-  @Override
-  @InterfaceAudience.Private
-  public void emitRecord(String contextName, String recordName,
-    OutputRecord outRec)
- throws IOException { - // Setup so that the records have the proper leader names so they are - // unambiguous at the ganglia level, and this prevents a lot of rework - StringBuilder sb = new StringBuilder(); - sb.append(contextName); - sb.append('.'); - - if (contextName.equals("jvm") && outRec.getTag("processName") != null) { - sb.append(outRec.getTag("processName")); - sb.append('.'); - } - - sb.append(recordName); - sb.append('.'); - int sbBaseLen = sb.length(); - - // emit each metric in turn - for (String metricName : outRec.getMetricNames()) { - Object metric = outRec.getMetric(metricName); - String type = typeTable.get(metric.getClass()); - if (type != null) { - sb.append(metricName); - emitMetric(sb.toString(), type, metric.toString()); - sb.setLength(sbBaseLen); - } else { - LOG.warn("Unknown metrics type: " + metric.getClass()); - } - } - } - - protected void emitMetric(String name, String type, String value) - throws IOException { - String units = getUnits(name); - int slope = getSlope(name); - int tmax = getTmax(name); - int dmax = getDmax(name); - - offset = 0; - xdr_int(0); // metric_user_defined - xdr_string(type); - xdr_string(name); - xdr_string(value); - xdr_string(units); - xdr_int(slope); - xdr_int(tmax); - xdr_int(dmax); - - for (SocketAddress socketAddress : metricsServers) { - DatagramPacket packet = - new DatagramPacket(buffer, offset, socketAddress); - datagramSocket.send(packet); - } - } - - protected String getUnits(String metricName) { - String result = unitsTable.get(metricName); - if (result == null) { - result = DEFAULT_UNITS; - } - return result; - } - - protected int getSlope(String metricName) { - String slopeString = slopeTable.get(metricName); - if (slopeString == null) { - slopeString = DEFAULT_SLOPE; - } - return ("zero".equals(slopeString) ? 
0 : 3); // see gmetric.c - } - - protected int getTmax(String metricName) { - if (tmaxTable == null) { - return DEFAULT_TMAX; - } - String tmaxString = tmaxTable.get(metricName); - if (tmaxString == null) { - return DEFAULT_TMAX; - } - else { - return Integer.parseInt(tmaxString); - } - } - - protected int getDmax(String metricName) { - String dmaxString = dmaxTable.get(metricName); - if (dmaxString == null) { - return DEFAULT_DMAX; - } - else { - return Integer.parseInt(dmaxString); - } - } - - /** - * Puts a string into the buffer by first writing the size of the string - * as an int, followed by the bytes of the string, padded if necessary to - * a multiple of 4. - */ - protected void xdr_string(String s) { - byte[] bytes = s.getBytes(Charsets.UTF_8); - int len = bytes.length; - xdr_int(len); - System.arraycopy(bytes, 0, buffer, offset, len); - offset += len; - pad(); - } - - /** - * Pads the buffer with zero bytes up to the nearest multiple of 4. - */ - private void pad() { - int newOffset = ((offset + 3) / 4) * 4; - while (offset < newOffset) { - buffer[offset++] = 0; - } - } - - /** - * Puts an integer into the buffer as 4 bytes, big-endian. 
- */ - protected void xdr_int(int i) { - buffer[offset++] = (byte)((i >> 24) & 0xff); - buffer[offset++] = (byte)((i >> 16) & 0xff); - buffer[offset++] = (byte)((i >> 8) & 0xff); - buffer[offset++] = (byte)(i & 0xff); - } -} http://git-wip-us.apache.org/repos/asf/hadoop/blob/36972d61/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/ganglia/GangliaContext31.java ---------------------------------------------------------------------- diff --git a/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/ganglia/GangliaContext31.java b/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/ganglia/GangliaContext31.java deleted file mode 100644 index 5a53d1b..0000000 --- a/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/ganglia/GangliaContext31.java +++ /dev/null @@ -1,147 +0,0 @@ -/* - * GangliaContext.java - * - * Licensed to the Apache Software Foundation (ASF) under one - * or more contributor license agreements. See the NOTICE file - * distributed with this work for additional information - * regarding copyright ownership. The ASF licenses this file - * to you under the Apache License, Version 2.0 (the - * "License"); you may not use this file except in compliance - * with the License. You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. 
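The xdr_string/xdr_int helpers above implement just enough of XDR (RFC 4506) for gmond's wire format: 4-byte big-endian integers, and length-prefixed UTF-8 strings zero-padded to a 4-byte boundary. A standalone sketch of the same byte layout (XdrSketch and its camelCase method names are illustrative, not the Hadoop code):

```java
import java.nio.charset.StandardCharsets;

public class XdrSketch {
  final byte[] buffer = new byte[1500]; // gmond-style packet buffer
  int offset = 0;

  // Writes an int as 4 big-endian bytes, as xdr_int did.
  void xdrInt(int i) {
    buffer[offset++] = (byte) ((i >> 24) & 0xff);
    buffer[offset++] = (byte) ((i >> 16) & 0xff);
    buffer[offset++] = (byte) ((i >> 8) & 0xff);
    buffer[offset++] = (byte) (i & 0xff);
  }

  // Writes a length-prefixed UTF-8 string, zero-padded to a multiple of 4,
  // as xdr_string plus pad() did.
  void xdrString(String s) {
    byte[] bytes = s.getBytes(StandardCharsets.UTF_8);
    xdrInt(bytes.length);
    System.arraycopy(bytes, 0, buffer, offset, bytes.length);
    offset += bytes.length;
    while (offset % 4 != 0) {
      buffer[offset++] = 0;
    }
  }

  public static void main(String[] args) {
    XdrSketch x = new XdrSketch();
    x.xdrString("abcde");         // 4-byte length + 5 bytes + 3 pad bytes
    System.out.println(x.offset); // prints 12
  }
}
```

The same primitives are enough to build every Ganglia message emitted by these contexts, which is why the removed classes carry no XDR library dependency.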
- */
-
-package org.apache.hadoop.metrics.ganglia;
-
-import java.io.IOException;
-import java.net.DatagramPacket;
-import java.net.SocketAddress;
-import java.net.UnknownHostException;
-
-import org.apache.commons.logging.Log;
-import org.apache.commons.logging.LogFactory;
-import org.apache.hadoop.conf.Configuration;
-import org.apache.hadoop.metrics.ContextFactory;
-import org.apache.hadoop.net.DNS;
-
-/**
- * Context for sending metrics to Ganglia version 3.1.x.
- *
- * 3.1.1 has a slightly different wire protocol compared to 3.0.x.
- *
- * @deprecated Use {@link org.apache.hadoop.metrics2.sink.ganglia.GangliaSink31}
- * instead.
- */
-@Deprecated
-public class GangliaContext31 extends GangliaContext {
-
-  String hostName = "UNKNOWN.example.com";
-
-  private static final Log LOG =
-    LogFactory.getLog("org.apache.hadoop.util.GangliaContext31");
-
-  @Override
-  public void init(String contextName, ContextFactory factory) {
-    super.init(contextName, factory);
-
-    LOG.debug("Initializing the GangliaContext31 for Ganglia 3.1 metrics.");
-
-    // Take the hostname from the DNS class.
-
-    Configuration conf = new Configuration();
-
-    if (conf.get("slave.host.name") != null) {
-      hostName = conf.get("slave.host.name");
-    } else {
-      try {
-        hostName = DNS.getDefaultHost(
-          conf.get("dfs.datanode.dns.interface", "default"),
-          conf.get("dfs.datanode.dns.nameserver", "default"));
-      } catch (UnknownHostException uhe) {
-        LOG.error(uhe);
-        hostName = "UNKNOWN.example.com";
-      }
-    }
-  }
-
-  @Override
-  protected void emitMetric(String name, String type, String value)
-      throws IOException {
-    if (name == null) {
-      LOG.warn("Metric was emitted with no name.");
-      return;
-    } else if (value == null) {
-      LOG.warn("Metric name " + name + " was emitted with a null value.");
-      return;
-    } else if (type == null) {
-      LOG.warn("Metric name " + name + ", value " + value + " has no type.");
-      return;
-    }
-
-    if (LOG.isDebugEnabled()) {
-      LOG.debug("Emitting metric " + name + ", type " + type + ", value " +
-        value + " from hostname " + hostName);
-    }
-
-    String units = getUnits(name);
-    int slope = getSlope(name);
-    int tmax = getTmax(name);
-    int dmax = getDmax(name);
-    offset = 0;
-    String groupName = name.substring(0, name.lastIndexOf("."));
-
-    // The following XDR recipe was done through a careful reading of
-    // gm_protocol.x in Ganglia 3.1 and carefully examining the output of
-    // the gmetric utility with strace.
- - // First we send out a metadata message - xdr_int(128); // metric_id = metadata_msg - xdr_string(hostName); // hostname - xdr_string(name); // metric name - xdr_int(0); // spoof = False - xdr_string(type); // metric type - xdr_string(name); // metric name - xdr_string(units); // units - xdr_int(slope); // slope - xdr_int(tmax); // tmax, the maximum time between metrics - xdr_int(dmax); // dmax, the maximum data value - - xdr_int(1); /*Num of the entries in extra_value field for - Ganglia 3.1.x*/ - xdr_string("GROUP"); /*Group attribute*/ - xdr_string(groupName); /*Group value*/ - - for (SocketAddress socketAddress : metricsServers) { - DatagramPacket packet = - new DatagramPacket(buffer, offset, socketAddress); - datagramSocket.send(packet); - } - - // Now we send out a message with the actual value. - // Technically, we only need to send out the metadata message once for - // each metric, but I don't want to have to record which metrics we did and - // did not send. - offset = 0; - xdr_int(133); // we are sending a string value - xdr_string(hostName); // hostName - xdr_string(name); // metric name - xdr_int(0); // spoof = False - xdr_string("%s"); // format field - xdr_string(value); // metric value - - for (SocketAddress socketAddress : metricsServers) { - DatagramPacket packet = - new DatagramPacket(buffer, offset, socketAddress); - datagramSocket.send(packet); - } - } - -} http://git-wip-us.apache.org/repos/asf/hadoop/blob/36972d61/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/ganglia/package.html ---------------------------------------------------------------------- diff --git a/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/ganglia/package.html b/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/ganglia/package.html deleted file mode 100644 index b9acfae..0000000 --- a/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/ganglia/package.html +++ 
/dev/null @@ -1,80 +0,0 @@ - - - - - - - - -Implementation of the metrics package that sends metric data to -Ganglia. -Programmers should not normally need to use this package directly. Instead -they should use org.hadoop.metrics. - -
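In practice this context was enabled through hadoop-metrics.properties. A representative snippet is sketched below; the context class and attribute names come from this package, but the host, port, and period values are examples only, not defaults from the source:

```properties
# Send DFS metrics to a gmond collector every 10 seconds
# (host/port and period are illustrative values).
dfs.class=org.apache.hadoop.metrics.ganglia.GangliaContext31
dfs.period=10
dfs.servers=ganglia-host.example.com:8649
```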

-These are the implementation specific factory attributes
-(See ContextFactory.getFactory()):
-
-<dl>
-    <dt><i>contextName</i>.servers</dt>
-    <dd>Space and/or comma separated sequence of servers to which UDP
-    messages should be sent.</dd>
-
-    <dt><i>contextName</i>.period</dt>
-    <dd>The period in seconds on which the metric data is sent to the
-    server(s).</dd>
-
-    <dt><i>contextName</i>.multicast</dt>
-    <dd>Enable multicast for Ganglia</dd>
-
-    <dt><i>contextName</i>.multicast.ttl</dt>
-    <dd>TTL for multicast packets</dd>
-
-    <dt><i>contextName</i>.units.<i>recordName</i>.<i>metricName</i></dt>
-    <dd>The units for the specified metric in the specified record.</dd>
-
-    <dt><i>contextName</i>.slope.<i>recordName</i>.<i>metricName</i></dt>
-    <dd>The slope for the specified metric in the specified record.</dd>
-
-    <dt><i>contextName</i>.tmax.<i>recordName</i>.<i>metricName</i></dt>
-    <dd>The tmax for the specified metric in the specified record.</dd>
-
-    <dt><i>contextName</i>.dmax.<i>recordName</i>.<i>metricName</i></dt>
-    <dd>The dmax for the specified metric in the specified record.</dd>
-</dl>
- - - - http://git-wip-us.apache.org/repos/asf/hadoop/blob/36972d61/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/jvm/EventCounter.java ---------------------------------------------------------------------- diff --git a/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/jvm/EventCounter.java b/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/jvm/EventCounter.java deleted file mode 100644 index c6d5ca7..0000000 --- a/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/jvm/EventCounter.java +++ /dev/null @@ -1,36 +0,0 @@ -/** - * Licensed to the Apache Software Foundation (ASF) under one - * or more contributor license agreements. See the NOTICE file - * distributed with this work for additional information - * regarding copyright ownership. The ASF licenses this file - * to you under the Apache License, Version 2.0 (the - * "License"); you may not use this file except in compliance - * with the License. You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ -package org.apache.hadoop.metrics.jvm; - -/** - * A log4J Appender that simply counts logging events in three levels: - * fatal, error and warn. - * - * @deprecated Use org.apache.hadoop.metrics2 package instead. - */ -@Deprecated -public class EventCounter extends org.apache.hadoop.log.metrics.EventCounter { - - static { - // The logging system is not started yet. - System.err.println("WARNING: "+ EventCounter.class.getName() + - " is deprecated. 
Please use "+ - org.apache.hadoop.log.metrics.EventCounter.class.getName() + - " in all the log4j.properties files."); - } -} http://git-wip-us.apache.org/repos/asf/hadoop/blob/36972d61/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/jvm/JvmMetrics.java ---------------------------------------------------------------------- diff --git a/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/jvm/JvmMetrics.java b/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/jvm/JvmMetrics.java deleted file mode 100644 index f9ab94a..0000000 --- a/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/jvm/JvmMetrics.java +++ /dev/null @@ -1,203 +0,0 @@ -/** - * Licensed to the Apache Software Foundation (ASF) under one - * or more contributor license agreements. See the NOTICE file - * distributed with this work for additional information - * regarding copyright ownership. The ASF licenses this file - * to you under the Apache License, Version 2.0 (the - * "License"); you may not use this file except in compliance - * with the License. You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. 
- */ -package org.apache.hadoop.metrics.jvm; - -import java.lang.management.ManagementFactory; -import java.lang.management.MemoryMXBean; -import java.lang.management.MemoryUsage; -import java.lang.management.ThreadInfo; -import java.lang.management.ThreadMXBean; - -import org.apache.hadoop.classification.InterfaceAudience; -import org.apache.hadoop.classification.InterfaceStability; -import org.apache.hadoop.metrics.MetricsContext; -import org.apache.hadoop.metrics.MetricsRecord; -import org.apache.hadoop.metrics.MetricsUtil; -import org.apache.hadoop.metrics.Updater; - -import static java.lang.Thread.State.*; -import java.lang.management.GarbageCollectorMXBean; -import java.util.List; -import org.apache.commons.logging.Log; -import org.apache.commons.logging.LogFactory; - -/** - * Singleton class which reports Java Virtual Machine metrics to the metrics API. - * Any application can create an instance of this class in order to emit - * Java VM metrics. - * - * @deprecated Use {@link org.apache.hadoop.metrics2.source.JvmMetrics} instead. 
- */ -@Deprecated -@InterfaceAudience.Private -@InterfaceStability.Evolving -public class JvmMetrics implements Updater { - - private static final float M = 1024*1024; - private static JvmMetrics theInstance = null; - private static Log log = LogFactory.getLog(JvmMetrics.class); - - private MetricsRecord metrics; - - // garbage collection counters - private long gcCount = 0; - private long gcTimeMillis = 0; - - // logging event counters - private long fatalCount = 0; - private long errorCount = 0; - private long warnCount = 0; - private long infoCount = 0; - - public synchronized static JvmMetrics init(String processName, String sessionId) { - return init(processName, sessionId, "metrics"); - } - - public synchronized static JvmMetrics init(String processName, String sessionId, - String recordName) { - if (theInstance != null) { - log.info("Cannot initialize JVM Metrics with processName=" + - processName + ", sessionId=" + sessionId + - " - already initialized"); - } - else { - log.info("Initializing JVM Metrics with processName=" - + processName + ", sessionId=" + sessionId); - theInstance = new JvmMetrics(processName, sessionId, recordName); - } - return theInstance; - } - - /** Creates a new instance of JvmMetrics */ - private JvmMetrics(String processName, String sessionId, - String recordName) { - MetricsContext context = MetricsUtil.getContext("jvm"); - metrics = MetricsUtil.createRecord(context, recordName); - metrics.setTag("processName", processName); - metrics.setTag("sessionId", sessionId); - context.registerUpdater(this); - } - - /** - * This will be called periodically (with the period being configuration - * dependent). 
- */ - @Override - public void doUpdates(MetricsContext context) { - doMemoryUpdates(); - doGarbageCollectionUpdates(); - doThreadUpdates(); - doEventCountUpdates(); - metrics.update(); - } - - private void doMemoryUpdates() { - MemoryMXBean memoryMXBean = - ManagementFactory.getMemoryMXBean(); - MemoryUsage memNonHeap = - memoryMXBean.getNonHeapMemoryUsage(); - MemoryUsage memHeap = - memoryMXBean.getHeapMemoryUsage(); - Runtime runtime = Runtime.getRuntime(); - - metrics.setMetric("memNonHeapUsedM", memNonHeap.getUsed()/M); - metrics.setMetric("memNonHeapCommittedM", memNonHeap.getCommitted()/M); - metrics.setMetric("memHeapUsedM", memHeap.getUsed()/M); - metrics.setMetric("memHeapCommittedM", memHeap.getCommitted()/M); - metrics.setMetric("maxMemoryM", runtime.maxMemory()/M); - } - - private void doGarbageCollectionUpdates() { - List gcBeans = - ManagementFactory.getGarbageCollectorMXBeans(); - long count = 0; - long timeMillis = 0; - for (GarbageCollectorMXBean gcBean : gcBeans) { - count += gcBean.getCollectionCount(); - timeMillis += gcBean.getCollectionTime(); - } - metrics.incrMetric("gcCount", (int)(count - gcCount)); - metrics.incrMetric("gcTimeMillis", (int)(timeMillis - gcTimeMillis)); - - gcCount = count; - gcTimeMillis = timeMillis; - } - - private void doThreadUpdates() { - ThreadMXBean threadMXBean = - ManagementFactory.getThreadMXBean(); - long threadIds[] = - threadMXBean.getAllThreadIds(); - ThreadInfo[] threadInfos = - threadMXBean.getThreadInfo(threadIds, 0); - - int threadsNew = 0; - int threadsRunnable = 0; - int threadsBlocked = 0; - int threadsWaiting = 0; - int threadsTimedWaiting = 0; - int threadsTerminated = 0; - - for (ThreadInfo threadInfo : threadInfos) { - // threadInfo is null if the thread is not alive or doesn't exist - if (threadInfo == null) continue; - Thread.State state = threadInfo.getThreadState(); - if (state == NEW) { - threadsNew++; - } - else if (state == RUNNABLE) { - threadsRunnable++; - } - else if (state == BLOCKED) 
{ - threadsBlocked++; - } - else if (state == WAITING) { - threadsWaiting++; - } - else if (state == TIMED_WAITING) { - threadsTimedWaiting++; - } - else if (state == TERMINATED) { - threadsTerminated++; - } - } - metrics.setMetric("threadsNew", threadsNew); - metrics.setMetric("threadsRunnable", threadsRunnable); - metrics.setMetric("threadsBlocked", threadsBlocked); - metrics.setMetric("threadsWaiting", threadsWaiting); - metrics.setMetric("threadsTimedWaiting", threadsTimedWaiting); - metrics.setMetric("threadsTerminated", threadsTerminated); - } - - private void doEventCountUpdates() { - long newFatal = EventCounter.getFatal(); - long newError = EventCounter.getError(); - long newWarn = EventCounter.getWarn(); - long newInfo = EventCounter.getInfo(); - - metrics.incrMetric("logFatal", (int)(newFatal - fatalCount)); - metrics.incrMetric("logError", (int)(newError - errorCount)); - metrics.incrMetric("logWarn", (int)(newWarn - warnCount)); - metrics.incrMetric("logInfo", (int)(newInfo - infoCount)); - - fatalCount = newFatal; - errorCount = newError; - warnCount = newWarn; - infoCount = newInfo; - } -} http://git-wip-us.apache.org/repos/asf/hadoop/blob/36972d61/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/jvm/package-info.java ---------------------------------------------------------------------- diff --git a/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/jvm/package-info.java b/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/jvm/package-info.java deleted file mode 100644 index d4661e3..0000000 --- a/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/jvm/package-info.java +++ /dev/null @@ -1,22 +0,0 @@ -/* - * Licensed to the Apache Software Foundation (ASF) under one - * or more contributor license agreements. See the NOTICE file - * distributed with this work for additional information - * regarding copyright ownership. 
The ASF licenses this file - * to you under the Apache License, Version 2.0 (the - * "License"); you may not use this file except in compliance - * with the License. You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ -@InterfaceAudience.Private -@InterfaceStability.Evolving -package org.apache.hadoop.metrics.jvm; -import org.apache.hadoop.classification.InterfaceAudience; -import org.apache.hadoop.classification.InterfaceStability; http://git-wip-us.apache.org/repos/asf/hadoop/blob/36972d61/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/package.html ---------------------------------------------------------------------- diff --git a/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/package.html b/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/package.html deleted file mode 100644 index dd16e38..0000000 --- a/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics/package.html +++ /dev/null @@ -1,159 +0,0 @@ - - - - - - org.apache.hadoop.metrics - - -This package defines an API for reporting performance metric information. -

-The API is abstract so that it can be implemented on top of -a variety of metrics client libraries. The choice of -client library is a configuration option, and different -modules within the same application can use -different metrics implementation libraries. -

-Sub-packages: -

-
org.apache.hadoop.metrics.spi
-
The abstract Service Provider Interface package. Those wishing to integrate the metrics API with a particular metrics client library should extend this package.
- -
org.apache.hadoop.metrics.file
-
An implementation package which writes the metric data to - a file, or sends it to the standard output stream.
- -
org.apache.hadoop.metrics.ganglia
-
An implementation package which sends metric data to - Ganglia.
-
- -

Introduction to the Metrics API

- -Here is a simple example of how to use this package to report a single -metric value: -
-    private ContextFactory contextFactory = ContextFactory.getFactory();
-    
-    void reportMyMetric(float myMetric) {
-        MetricsContext myContext = contextFactory.getContext("myContext");
-        MetricsRecord myRecord = myContext.getRecord("myRecord");
-        myRecord.setMetric("myMetric", myMetric);
-        myRecord.update();
-    }
-
- -In this example there are three names: -
-
myContext
-
The context name will typically identify either the application, or else a - module within an application or library.
- -
myRecord
-
The record name generally identifies some entity for which a set of - metrics are to be reported. For example, you could have a record named - "cacheStats" for reporting a number of statistics relating to the usage of - some cache in your application.
- -
myMetric
-
This identifies a particular metric. For example, you might have metrics - named "cache_hits" and "cache_misses". -
-
- -

Tags

- -In some cases it is useful to have multiple records with the same name. For -example, suppose that you want to report statistics about each disk on a computer. -In this case, the record name would be something like "diskStats", but you also -need to identify the disk which is done by adding a tag to the record. -The code could look something like this: -
-    private MetricsRecord diskStats =
-            contextFactory.getContext("myContext").getRecord("diskStats");
-            
-    void reportDiskMetrics(String diskName, float diskBusy, float diskUsed) {
-        diskStats.setTag("diskName", diskName);
-        diskStats.setMetric("diskBusy", diskBusy);
-        diskStats.setMetric("diskUsed", diskUsed);
-        diskStats.update();
-    }
-
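One way to read the tag mechanism above: a record name like "diskStats" identifies a table, and the tag value selects a row within it, so each disk keeps its own metric values between updates. A standalone sketch of that bookkeeping (the class and method names here are illustrative stand-ins, not part of the removed metrics v1 API):

```java
import java.util.LinkedHashMap;
import java.util.Map;

/** Sketch: rows of one record (e.g. "diskStats") keyed by a tag value. */
public class TaggedRecord {
    // One metric map per distinct tag value, e.g. per disk name.
    private final Map<String, Map<String, Float>> rowsByTag = new LinkedHashMap<>();

    /** Merges the given metrics into the row identified by the tag value. */
    public void update(String tagValue, Map<String, Float> metrics) {
        rowsByTag.computeIfAbsent(tagValue, k -> new LinkedHashMap<>()).putAll(metrics);
    }

    /** Returns the row for a tag value, or an empty map if none was reported. */
    public Map<String, Float> row(String tagValue) {
        return rowsByTag.getOrDefault(tagValue, Map.of());
    }

    public int rowCount() {
        return rowsByTag.size();
    }
}
```

With this shape, reporting "sda" and "sdb" produces two rows under the single record name rather than two records.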
- -

Buffering and Callbacks

- -Data is not sent immediately to the metrics system when -MetricsRecord.update() is called. Instead it is stored in an -internal table, and the contents of the table are sent periodically. -This can be important for two reasons: -
-
  1. It means that a programmer is free to put calls to this API in an inner loop, since updates can be very frequent without slowing down the application significantly.
  2. Some implementations can gain efficiency by combining many metrics into a single UDP message.
-
-The API provides a timer-based callback via the registerUpdater() method. The benefit of this versus using java.util.Timer is that the callbacks will be done immediately before sending the data, making the data as current as possible.
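The buffer-then-flush behaviour plus the registerUpdater() callback can be sketched in a few lines of plain Java. The names below are illustrative stand-ins, not the removed metrics v1 classes; the point to notice is that updaters run inside flush(), immediately before the buffered table is emitted:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

/** Sketch of a metrics context that buffers values and flushes periodically. */
public class BufferedMetricsContext {
    /** Callback invoked just before the buffered table is emitted. */
    public interface Updater {
        void doUpdates(BufferedMetricsContext context);
    }

    private final Map<String, Number> table = new LinkedHashMap<>();
    private final List<Updater> updaters = new ArrayList<>();

    public void registerUpdater(Updater u) {
        updaters.add(u);
    }

    /** Cheap enough for an inner loop: just overwrites a map entry, no I/O. */
    public void setMetric(String name, Number value) {
        table.put(name, value);
    }

    /**
     * Driven by a timer in the real API; running the updaters here, rather
     * than on an independent java.util.Timer, keeps the data as current as
     * possible at the moment it is sent.
     */
    public Map<String, Number> flush() {
        for (Updater u : updaters) {
            u.doUpdates(this);
        }
        Map<String, Number> emitted = new LinkedHashMap<>(table);
        table.clear();
        return emitted; // a real context would send e.g. one UDP message here
    }
}
```

Each flush re-polls the registered updaters, so values they set reappear every period while one-off setMetric() calls are emitted once.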

Configuration

- -It is possible to programmatically examine and modify configuration data -before creating a context, like this: -
-    ContextFactory factory = ContextFactory.getFactory();
-    ... examine and/or modify factory attributes ...
-    MetricsContext context = factory.getContext("myContext");
-
-The factory attributes can be examined and modified using the following ContextFactory methods:
-
  • Object getAttribute(String attributeName)
  • String[] getAttributeNames()
  • void setAttribute(String name, Object value)
  • void removeAttribute(String attributeName)
-
- -

-ContextFactory.getFactory() initializes the factory attributes by reading the properties file hadoop-metrics.properties if it exists on the class path.

-A factory attribute named: -

-contextName.class
-
-should have as its value the fully qualified name of the class to be instantiated by a call of the ContextFactory method getContext(contextName). If this factory attribute is not specified, the default is to instantiate org.apache.hadoop.metrics.file.FileContext.
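The lookup described above boils down to reading a "contextName.class" property and instantiating the named class reflectively, falling back to a default when the property is absent. A hypothetical standalone sketch (the nested stand-in classes are illustrative; the real default is org.apache.hadoop.metrics.file.FileContext):

```java
import java.util.Properties;

/** Sketch of the "contextName.class" lookup with a default implementation. */
public class ContextLookup {
    // Stand-ins for concrete context implementations.
    public static class NullContext {}
    public static class FileContext {}

    static final String DEFAULT_CONTEXT = NullContext.class.getName();

    /** Instantiates the class named by "<contextName>.class", or the default. */
    public static Object getContext(Properties props, String contextName) {
        String className = props.getProperty(contextName + ".class", DEFAULT_CONTEXT);
        try {
            return Class.forName(className).getDeclaredConstructor().newInstance();
        } catch (ReflectiveOperationException e) {
            throw new IllegalArgumentException("cannot instantiate " + className, e);
        }
    }
}
```

In the real API the Properties would come from hadoop-metrics.properties on the class path rather than being built in code.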

-Other factory attributes are specific to a particular implementation of this API and are documented elsewhere. For example, configuration attributes for the file and Ganglia implementations can be found in the javadoc for their respective packages.