eagle-commits mailing list archives

From yonzhang2...@apache.org
Subject [7/7] incubator-eagle git commit: EAGLE-120 EAGLE-100 initial system and hadoop metric
Date Wed, 13 Jan 2016 01:08:14 GMT
EAGLE-120 EAGLE-100 initial system and hadoop metric
initial system and hadoop metric
https://issues.apache.org/jira/browse/EAGLE-120
Author: qingwen220 qingwzhao@ebay.com
Reviewer: yonzhang2012 yonzhang2012@apache.org
Closes: 62


Project: http://git-wip-us.apache.org/repos/asf/incubator-eagle/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-eagle/commit/db2bbf91
Tree: http://git-wip-us.apache.org/repos/asf/incubator-eagle/tree/db2bbf91
Diff: http://git-wip-us.apache.org/repos/asf/incubator-eagle/diff/db2bbf91

Branch: refs/heads/master
Commit: db2bbf91b30e8cc00adaadb64affb863dd41dc44
Parents: 178d066
Author: yonzhang <yonzhang@ebay.com>
Authored: Tue Jan 12 17:07:32 2016 -0800
Committer: yonzhang <yonzhang@ebay.com>
Committed: Tue Jan 12 17:07:32 2016 -0800

----------------------------------------------------------------------
 eagle-assembly/src/assembly/eagle-bin.xml       |   8 +
 eagle-external/hadoop_jmx_collector/README.md   |  75 ++
 .../add_extended_metrics.py                     |  39 +
 .../add_extended_metrics.pyc                    | Bin 0 -> 1432 bytes
 .../hadoop_jmx_collector/eagle-collector.conf   |  15 +
 .../hadoop_jmx_collector/hadoop_jmx_kafka.py    | 187 +++++
 .../lib/kafka-python/.gitignore                 |  11 +
 .../lib/kafka-python/.gitmodules                |   0
 .../lib/kafka-python/.travis.yml                |  44 ++
 .../lib/kafka-python/AUTHORS.md                 |  16 +
 .../lib/kafka-python/CHANGES.md                 |  73 ++
 .../lib/kafka-python/LICENSE                    | 202 +++++
 .../lib/kafka-python/MANIFEST.in                |   2 +
 .../lib/kafka-python/POWERED-BY.md              |   6 +
 .../lib/kafka-python/README.rst                 |  53 ++
 .../lib/kafka-python/VERSION                    |   1 +
 .../lib/kafka-python/build_integration.sh       |  67 ++
 .../lib/kafka-python/docs/Makefile              | 177 +++++
 .../kafka-python/docs/apidoc/kafka.consumer.rst |  46 ++
 .../docs/apidoc/kafka.partitioner.rst           |  38 +
 .../kafka-python/docs/apidoc/kafka.producer.rst |  38 +
 .../lib/kafka-python/docs/apidoc/kafka.rst      |  79 ++
 .../lib/kafka-python/docs/apidoc/modules.rst    |   7 +
 .../lib/kafka-python/docs/conf.py               | 272 +++++++
 .../lib/kafka-python/docs/index.rst             |  58 ++
 .../lib/kafka-python/docs/install.rst           |  79 ++
 .../lib/kafka-python/docs/make.bat              | 242 ++++++
 .../lib/kafka-python/docs/requirements.txt      |   7 +
 .../lib/kafka-python/docs/tests.rst             |  59 ++
 .../lib/kafka-python/docs/usage.rst             | 124 +++
 .../lib/kafka-python/example.py                 |  48 ++
 .../lib/kafka-python/kafka/NOTES.md             |  32 +
 .../lib/kafka-python/kafka/__init__.py          |  27 +
 .../lib/kafka-python/kafka/client.py            | 503 ++++++++++++
 .../lib/kafka-python/kafka/codec.py             | 143 ++++
 .../lib/kafka-python/kafka/common.py            | 215 ++++++
 .../lib/kafka-python/kafka/conn.py              | 208 +++++
 .../lib/kafka-python/kafka/consumer/__init__.py |   7 +
 .../lib/kafka-python/kafka/consumer/base.py     | 173 +++++
 .../lib/kafka-python/kafka/consumer/kafka.py    | 758 ++++++++++++++++++
 .../kafka-python/kafka/consumer/multiprocess.py | 251 ++++++
 .../lib/kafka-python/kafka/consumer/simple.py   | 330 ++++++++
 .../lib/kafka-python/kafka/context.py           | 175 +++++
 .../kafka-python/kafka/partitioner/__init__.py  |   6 +
 .../lib/kafka-python/kafka/partitioner/base.py  |  24 +
 .../kafka-python/kafka/partitioner/hashed.py    |  14 +
 .../kafka/partitioner/roundrobin.py             |  23 +
 .../lib/kafka-python/kafka/producer/__init__.py |   6 +
 .../lib/kafka-python/kafka/producer/base.py     | 214 ++++++
 .../lib/kafka-python/kafka/producer/keyed.py    |  68 ++
 .../lib/kafka-python/kafka/producer/simple.py   |  81 ++
 .../lib/kafka-python/kafka/protocol.py          | 604 +++++++++++++++
 .../lib/kafka-python/kafka/util.py              | 153 ++++
 .../lib/kafka-python/load_example.py            |  60 ++
 .../servers/0.8.0/resources/kafka.properties    |  59 ++
 .../servers/0.8.0/resources/log4j.properties    |  24 +
 .../0.8.0/resources/zookeeper.properties        |  19 +
 .../servers/0.8.1.1/resources/kafka.properties  | 118 +++
 .../servers/0.8.1.1/resources/log4j.properties  |  24 +
 .../0.8.1.1/resources/zookeeper.properties      |  21 +
 .../servers/0.8.1/resources/kafka.properties    |  59 ++
 .../servers/0.8.1/resources/log4j.properties    |  24 +
 .../0.8.1/resources/zookeeper.properties        |  19 +
 .../servers/0.8.2.0/resources/kafka.properties  | 118 +++
 .../servers/0.8.2.0/resources/log4j.properties  |  24 +
 .../0.8.2.0/resources/zookeeper.properties      |  21 +
 .../servers/trunk/resources/kafka.properties    | 118 +++
 .../servers/trunk/resources/log4j.properties    |  24 +
 .../trunk/resources/zookeeper.properties        |  21 +
 .../lib/kafka-python/setup.py                   |  70 ++
 .../lib/kafka-python/test/__init__.py           |   6 +
 .../lib/kafka-python/test/fixtures.py           | 236 ++++++
 .../lib/kafka-python/test/service.py            | 111 +++
 .../lib/kafka-python/test/test_client.py        | 403 ++++++++++
 .../test/test_client_integration.py             |  67 ++
 .../lib/kafka-python/test/test_codec.py         |  72 ++
 .../lib/kafka-python/test/test_conn.py          | 164 ++++
 .../lib/kafka-python/test/test_consumer.py      |  15 +
 .../test/test_consumer_integration.py           | 401 ++++++++++
 .../lib/kafka-python/test/test_context.py       | 117 +++
 .../test/test_failover_integration.py           | 201 +++++
 .../lib/kafka-python/test/test_package.py       |  29 +
 .../lib/kafka-python/test/test_producer.py      |  42 +
 .../test/test_producer_integration.py           | 481 ++++++++++++
 .../lib/kafka-python/test/test_protocol.py      | 732 ++++++++++++++++++
 .../lib/kafka-python/test/test_util.py          | 126 +++
 .../lib/kafka-python/test/testutil.py           | 105 +++
 .../lib/kafka-python/tox.ini                    |  50 ++
 .../lib/kafka-python/travis_selector.sh         |  16 +
 .../hadoop_jmx_collector/lib/six/.gitignore     |  23 +
 .../hadoop_jmx_collector/lib/six/CHANGES        | 231 ++++++
 .../hadoop_jmx_collector/lib/six/CONTRIBUTORS   |  21 +
 .../hadoop_jmx_collector/lib/six/LICENSE        |  18 +
 .../hadoop_jmx_collector/lib/six/MANIFEST.in    |   6 +
 .../hadoop_jmx_collector/lib/six/README         |  16 +
 .../lib/six/documentation/Makefile              | 130 ++++
 .../lib/six/documentation/conf.py               | 217 ++++++
 .../lib/six/documentation/index.rst             | 757 ++++++++++++++++++
 .../hadoop_jmx_collector/lib/six/setup.cfg      |   2 +
 .../hadoop_jmx_collector/lib/six/setup.py       |  32 +
 .../hadoop_jmx_collector/lib/six/six.py         | 762 +++++++++++++++++++
 .../hadoop_jmx_collector/lib/six/test_six.py    | 736 ++++++++++++++++++
 .../hadoop_jmx_collector/lib/six/tox.ini        |  12 +
 .../hadoop_jmx_collector/system_metric_kafka.py | 378 +++++++++
 .../hadoop_jmx_collector/util_func.py           |  85 +++
 .../hadoop_jmx_collector/util_func.pyc          | Bin 0 -> 2619 bytes
 eagle-gc/src/main/resources/application.conf    |   2 +-
 .../src/main/resources/application.conf         |   2 +-
 .../src/main/resources/application.conf         |   2 +-
 109 files changed, 13714 insertions(+), 3 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/incubator-eagle/blob/db2bbf91/eagle-assembly/src/assembly/eagle-bin.xml
----------------------------------------------------------------------
diff --git a/eagle-assembly/src/assembly/eagle-bin.xml b/eagle-assembly/src/assembly/eagle-bin.xml
index d03af36..05fd0d2 100644
--- a/eagle-assembly/src/assembly/eagle-bin.xml
+++ b/eagle-assembly/src/assembly/eagle-bin.xml
@@ -59,6 +59,14 @@
             <lineEnding>unix</lineEnding>
         </fileSet>
         <fileSet>
+            <directory>${project.basedir}/../eagle-external/hadoop_jmx_collector</directory>
+            <outputDirectory>tools/hadoop_jmx_collector/</outputDirectory>
+            <includes>
+                <include>**</include>
+            </includes>
+            <lineEnding>unix</lineEnding>
+        </fileSet>
+        <fileSet>
             <directory>${project.basedir}/src/main/examples</directory>
             <outputDirectory>examples/</outputDirectory>
             <includes>

http://git-wip-us.apache.org/repos/asf/incubator-eagle/blob/db2bbf91/eagle-external/hadoop_jmx_collector/README.md
----------------------------------------------------------------------
diff --git a/eagle-external/hadoop_jmx_collector/README.md b/eagle-external/hadoop_jmx_collector/README.md
new file mode 100644
index 0000000..cd8d887
--- /dev/null
+++ b/eagle-external/hadoop_jmx_collector/README.md
@@ -0,0 +1,75 @@
+<!--
+{% comment %}
+Licensed to the Apache Software Foundation (ASF) under one or more
+contributor license agreements.  See the NOTICE file distributed with
+this work for additional information regarding copyright ownership.
+The ASF licenses this file to you under the Apache License, Version 2.0
+(the "License"); you may not use this file except in compliance with
+the License.  You may obtain a copy of the License at
+
+http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+{% endcomment %}
+-->
+
+
+# Hadoop JMX Collector
+
+These scripts collect Hadoop JMX metrics and send them to stdout or Kafka. Tested with Python 2.7.
+
+### How to use it
+
+  1. Edit the configuration file (json file). For example:
+  
+            {
+             "env": {
+              "site": "sandbox"
+             },
+             "input": {
+              "component": "namenode",
+              "port": "50070",
+              "https": false
+             },
+             "filter": {
+              "monitoring.group.selected": ["hadoop", "java.lang"]
+             },
+             "output": {
+             }
+            }
+
+  2. Run the scripts
+  
+        # for general use
+        python hadoop_jmx_kafka.py > 1.txt
+
+
+### Edit `eagle-collector.conf`
+
+* input
+
+  "port" defines the Hadoop service port, e.g. 50070 => "namenode", 60010 => "hbase master".
+
+* filter
+
+  "monitoring.group.selected" selects the mbean groups to monitor; beans outside these groups are filtered out.
+
+* output 
+  
+  If left empty, the output defaults to stdout.
+
+        "output": {}
+        
+  It also supports Kafka as its output. 
+
+        "output": {
+          "kafka": {
+            "topic": "test_topic",
+            "brokerList": [ "sandbox.hortonworks.com:6667"]
+          }
+        }
+      
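With either output, the collector emits one JSON record per metric. A minimal sketch of what such a record might look like, using the field set of the `dataMap` built in `hadoop_jmx_kafka.py` (the helper name and sample values here are illustrative, not part of the commit):

```python
import json

def build_metric_message(site, host, component, metric, value, timestamp):
    # Assemble one collector output record; field names mirror the
    # dataMap in hadoop_jmx_kafka.py (sketch, not the committed code).
    return {
        "site": site,
        "host": host,
        "timestamp": timestamp,
        "component": component,
        "metric": metric,
        "value": value,
    }

# Hypothetical sample values for a sandbox namenode:
msg = build_metric_message(
    site="sandbox",
    host="sandbox.hortonworks.com",
    component="namenode",
    metric="hadoop.namenode.jvm.MemHeapUsedUsage",
    value=47.5,
    timestamp=1452646800000,
)
print(json.dumps(msg, sort_keys=True))
```

When Kafka output is configured, each such record is serialized to JSON and sent to the configured topic; with empty output it is printed to stdout.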

http://git-wip-us.apache.org/repos/asf/incubator-eagle/blob/db2bbf91/eagle-external/hadoop_jmx_collector/add_extended_metrics.py
----------------------------------------------------------------------
diff --git a/eagle-external/hadoop_jmx_collector/add_extended_metrics.py b/eagle-external/hadoop_jmx_collector/add_extended_metrics.py
new file mode 100644
index 0000000..9e0f105
--- /dev/null
+++ b/eagle-external/hadoop_jmx_collector/add_extended_metrics.py
@@ -0,0 +1,39 @@
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+
+#!/usr/bin/python
+
+from util_func import *
+
+
+def cal_mem_usage(producer, topic, bean, metricMap, metric_prefix_name):
+    kafka_dict = metricMap.copy()
+    PercentVal = None
+    PercentVal = round(float(bean['MemNonHeapUsedM']) / float(bean['MemNonHeapMaxM']) * 100.0, 2)
+    send_output_message(producer, topic, kafka_dict, metric_prefix_name + ".MemNonHeapUsedUsage", PercentVal)
+
+    PercentVal = round(float(bean['MemNonHeapCommittedM']) / float(bean['MemNonHeapMaxM']) * 100, 2)
+    send_output_message(producer, topic, kafka_dict, metric_prefix_name + ".MemNonHeapCommittedUsage", PercentVal)
+
+    PercentVal = round(float(bean['MemHeapUsedM']) / float(bean['MemHeapMaxM']) * 100, 2)
+    send_output_message(producer, topic, kafka_dict, metric_prefix_name + ".MemHeapUsedUsage", PercentVal)
+
+    PercentVal = round(float(bean['MemHeapCommittedM']) / float(bean['MemHeapMaxM']) * 100, 2)
+    send_output_message(producer, topic, kafka_dict, metric_prefix_name + ".MemHeapCommittedUsage", PercentVal)
+
+
+def add_extended_metrics(producer, topic, metricMap, fat_bean):
+    cal_mem_usage(producer, topic, fat_bean, metricMap, "hadoop.namenode.jvm")
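The extended metrics above are plain used/max ratios scaled to percent and rounded to two decimals. A standalone Python 3 sketch of the same arithmetic (the helper name and sample bean values are mine, not part of the commit):

```python
def heap_usage_percentages(bean):
    # Derive JVM memory usage percentages from a JMX JvmMetrics bean,
    # mirroring the round(used / max * 100, 2) arithmetic in cal_mem_usage.
    return {
        "MemNonHeapUsedUsage": round(
            float(bean["MemNonHeapUsedM"]) / float(bean["MemNonHeapMaxM"]) * 100, 2),
        "MemNonHeapCommittedUsage": round(
            float(bean["MemNonHeapCommittedM"]) / float(bean["MemNonHeapMaxM"]) * 100, 2),
        "MemHeapUsedUsage": round(
            float(bean["MemHeapUsedM"]) / float(bean["MemHeapMaxM"]) * 100, 2),
        "MemHeapCommittedUsage": round(
            float(bean["MemHeapCommittedM"]) / float(bean["MemHeapMaxM"]) * 100, 2),
    }

# Hypothetical bean values, in megabytes:
sample = {"MemNonHeapUsedM": 40.0, "MemNonHeapMaxM": 130.0,
          "MemNonHeapCommittedM": 60.0,
          "MemHeapUsedM": 512.0, "MemHeapMaxM": 1024.0,
          "MemHeapCommittedM": 768.0}
print(heap_usage_percentages(sample)["MemHeapUsedUsage"])  # 50.0
```

Note that, like the original, this divides by the reported max values and will raise ZeroDivisionError if a max is 0 (JMX reports -1 or 0 for unbounded pools), so callers may want to guard that case.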

http://git-wip-us.apache.org/repos/asf/incubator-eagle/blob/db2bbf91/eagle-external/hadoop_jmx_collector/add_extended_metrics.pyc
----------------------------------------------------------------------
diff --git a/eagle-external/hadoop_jmx_collector/add_extended_metrics.pyc b/eagle-external/hadoop_jmx_collector/add_extended_metrics.pyc
new file mode 100644
index 0000000..f8ed8f2
Binary files /dev/null and b/eagle-external/hadoop_jmx_collector/add_extended_metrics.pyc differ

http://git-wip-us.apache.org/repos/asf/incubator-eagle/blob/db2bbf91/eagle-external/hadoop_jmx_collector/eagle-collector.conf
----------------------------------------------------------------------
diff --git a/eagle-external/hadoop_jmx_collector/eagle-collector.conf b/eagle-external/hadoop_jmx_collector/eagle-collector.conf
new file mode 100644
index 0000000..08ed74d
--- /dev/null
+++ b/eagle-external/hadoop_jmx_collector/eagle-collector.conf
@@ -0,0 +1,15 @@
+{
+   "env": {
+    "site": "sandbox"
+   },
+   "input": {
+    "component": "namenode",
+    "port": "50070",
+    "https": false
+   },
+   "filter": {
+    "monitoring.group.selected": ["hadoop", "java.lang"]
+   },
+   "output": {
+   }
+}

http://git-wip-us.apache.org/repos/asf/incubator-eagle/blob/db2bbf91/eagle-external/hadoop_jmx_collector/hadoop_jmx_kafka.py
----------------------------------------------------------------------
diff --git a/eagle-external/hadoop_jmx_collector/hadoop_jmx_kafka.py b/eagle-external/hadoop_jmx_collector/hadoop_jmx_kafka.py
new file mode 100644
index 0000000..8ebeb9f
--- /dev/null
+++ b/eagle-external/hadoop_jmx_collector/hadoop_jmx_kafka.py
@@ -0,0 +1,187 @@
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+
+#!/usr/bin/python
+
+import os
+import re
+import time
+import json
+import urllib2
+import sys
+import socket
+import types
+import httplib
+
+# load six
+sys.path.append(os.path.join(os.path.dirname(os.path.abspath(__file__)), '', 'lib/six'))
+import six
+
+# load kafka-python
+sys.path.append(os.path.join(os.path.dirname(os.path.abspath(__file__)), '', 'lib/kafka-python'))
+from kafka import KafkaClient, SimpleProducer, SimpleConsumer
+
+from util_func import *
+from add_extended_metrics import *
+
+
+DATA_TYPE = "hadoop"
+
+def readUrl(url, https=False):
+    jmxjson = 'error'
+    try:
+        if https:
+            print "Reading https://" + str(url) + "/jmx?anonymous=true"
+            c = httplib.HTTPSConnection(url, timeout=57)
+            c.request("GET", "/jmx?anonymous=true")
+            response = c.getresponse()
+        else:
+            print "Reading http://" + str(url) + "/jmx?anonymous=true"
+            response = urllib2.urlopen("http://" + url + '/jmx?anonymous=true', timeout=57)
+    except Exception, e:
+        print 'Reason: ', e
+    else:
+        # everything is fine
+        jmxjson = response.read()
+        response.close()
+    finally:
+        return jmxjson
+
+
+def get_metric_prefix_name(mbean_attribute, context):
+    mbean_list = list(prop.split("=", 1)
+                      for prop in mbean_attribute.split(","))
+    metric_prefix_name = None
+    if context == "":
+        metric_prefix_name = '.'.join([i[1] for i in mbean_list])
+    else:
+        name_index = [i[0] for i in mbean_list].index('name')
+        mbean_list[name_index][1] = context
+        metric_prefix_name = '.'.join([i[1] for i in mbean_list])
+    return DATA_TYPE + "." + metric_prefix_name
+
+
+def getHadoopData(producer, topic, config, beans, dataMap, fat_bean):
+    selected_group = [s.encode('utf-8') for s in config[u'filter'].get('monitoring.group.selected')]
+    #print selected_group
+
+    for bean in beans:
+        kafka_dict = dataMap.copy()
+
+        # mbean is of the form "domain:key=value,...,foo=bar"
+        mbean = bean[u'name']
+        mbean_domain, mbean_attribute = mbean.rstrip().split(":", 1)
+        mbean_domain = mbean_domain.lower()
+
+        # print mbean_domain
+        if mbean_domain not in selected_group:
+            # print "Unexpected mbean domain = %s on %s" % (mbean_domain, mbean)
+            continue
+
+        fat_bean.update(bean)
+
+        context = bean.get("tag.Context", "")
+        metric_prefix_name = get_metric_prefix_name(mbean_attribute, context)
+
+        # print kafka_dict
+        for key, value in bean.iteritems():
+            #print key, value
+            key = key.lower()
+            if not isNumber(value) or re.match(r'tag.*', key):
+                continue
+
+            if mbean_domain == 'hadoop' and re.match(r'^namespace', key):
+                #print key
+                items = re.split('_table_', key)
+                key = items[1]
+                items = re.split('_region_', key)
+                kafka_dict['table'] = items[0]
+                items = re.split('_metric_', items[1])
+                kafka_dict['region'] = items[0]
+                key = items[1]
+
+            metric = metric_prefix_name + '.' + key
+            send_output_message(producer, topic, kafka_dict, metric, value)
+
+
+def loadJmxData(host, inputConfig):
+    port = inputConfig.get('port')
+    https = inputConfig.get('https')
+
+    url = host + ':' + port
+    #print url
+
+    jmxjson = readUrl(url, https)
+
+    if jmxjson == 'error':
+        print 'jmx load error'
+
+    # transfer the json string into dict
+    jmx = json.loads(jmxjson)
+    beans = jmx[u'beans']
+
+    return beans
+
+
+def main():
+    kafka = None
+    producer = None
+    topic = None
+
+    try:
+        #start = time.clock()
+
+        # read the kafka.ini
+        config = loadConfigFile('eagle-collector.conf')
+        #print config
+
+        site = config[u'env'].get('site').encode('utf-8')
+        component = config[u'input'].get('component').encode('utf-8')
+        host = socket.getfqdn()
+        #host="10.249.66.185"
+
+        beans = loadJmxData(host, config[u'input'])
+
+        outputs = [s.encode('utf-8') for s in config[u'output']]
+        #print outputs
+
+        if('kafka' in outputs):
+            kafkaConfig = config[u'output'].get(u'kafka')
+            brokerList = kafkaConfig.get('brokerList')
+            topic = kafkaConfig.get('topic')
+            #print brokerList
+            kafka, producer = kafka_connect(brokerList)
+
+        dataMap = {"site": site, "host": host, "timestamp": '', "component": component, "metric": '', "value": ''}
+        fat_bean = dict()
+        getHadoopData(producer, topic, config, beans, dataMap, fat_bean)
+        add_extended_metrics(producer, topic, dataMap, fat_bean)
+
+    except Exception, e:
+        print 'main except: ', e
+
+    finally:
+        if kafka != None and producer != None:
+            kafka_close(kafka, producer)
+
+        #elapsed = (time.clock() - start)
+        #print("Time used:",elapsed)
+
+
+if __name__ == "__main__":
+    main()
+
+
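The `get_metric_prefix_name` function above turns the property list of an mbean name (the part after `domain:` in `"domain:key=value,...,name=Foo"`) into a dotted metric prefix, substituting the bean's `tag.Context` for the `name` property when one is present. A Python 3 re-implementation of that parsing, for illustration only (the committed code is Python 2; the behavior should match):

```python
DATA_TYPE = "hadoop"

def metric_prefix(mbean_attribute, context=""):
    # Turn "service=NameNode,name=JvmMetrics" into
    # "hadoop.NameNode.JvmMetrics"; when a tag.Context is given, it
    # replaces the value of the "name" property, as in
    # get_metric_prefix_name (sketch).
    props = [prop.split("=", 1) for prop in mbean_attribute.split(",")]
    if context:
        name_index = [p[0] for p in props].index("name")
        props[name_index][1] = context
    return DATA_TYPE + "." + ".".join(p[1] for p in props)

print(metric_prefix("service=NameNode,name=JvmMetrics"))
# hadoop.NameNode.JvmMetrics
```

For example, with `context="jvm"` the same mbean yields `hadoop.NameNode.jvm`, and each numeric bean attribute is then appended to this prefix to form the full metric name.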

http://git-wip-us.apache.org/repos/asf/incubator-eagle/blob/db2bbf91/eagle-external/hadoop_jmx_collector/lib/kafka-python/.gitignore
----------------------------------------------------------------------
diff --git a/eagle-external/hadoop_jmx_collector/lib/kafka-python/.gitignore b/eagle-external/hadoop_jmx_collector/lib/kafka-python/.gitignore
new file mode 100644
index 0000000..30d663d
--- /dev/null
+++ b/eagle-external/hadoop_jmx_collector/lib/kafka-python/.gitignore
@@ -0,0 +1,11 @@
+*.egg-info
+*.pyc
+.tox
+build
+dist
+MANIFEST
+env
+servers/*/kafka-bin
+.coverage
+.noseids
+docs/_build

http://git-wip-us.apache.org/repos/asf/incubator-eagle/blob/db2bbf91/eagle-external/hadoop_jmx_collector/lib/kafka-python/.gitmodules
----------------------------------------------------------------------
diff --git a/eagle-external/hadoop_jmx_collector/lib/kafka-python/.gitmodules b/eagle-external/hadoop_jmx_collector/lib/kafka-python/.gitmodules
new file mode 100644
index 0000000..e69de29

http://git-wip-us.apache.org/repos/asf/incubator-eagle/blob/db2bbf91/eagle-external/hadoop_jmx_collector/lib/kafka-python/.travis.yml
----------------------------------------------------------------------
diff --git a/eagle-external/hadoop_jmx_collector/lib/kafka-python/.travis.yml b/eagle-external/hadoop_jmx_collector/lib/kafka-python/.travis.yml
new file mode 100644
index 0000000..7184bc8
--- /dev/null
+++ b/eagle-external/hadoop_jmx_collector/lib/kafka-python/.travis.yml
@@ -0,0 +1,44 @@
+language: python
+
+python:
+    - 2.6
+    - 2.7
+    - 3.3
+    - 3.4
+    - pypy
+
+env:
+    - UNIT_AND_LINT_ONLY=true
+    - KAFKA_VERSION=0.8.0
+    - KAFKA_VERSION=0.8.1
+    - KAFKA_VERSION=0.8.1.1
+    - KAFKA_VERSION=0.8.2.0
+
+before_install:
+    - sudo apt-get install libsnappy-dev
+    - ./build_integration.sh
+
+install:
+    - pip install tox coveralls
+    - pip install .
+    # Deal with issue on Travis builders re: multiprocessing.Queue :(
+    # See https://github.com/travis-ci/travis-cookbooks/issues/155
+    - sudo rm -rf /dev/shm && sudo ln -s /run/shm /dev/shm
+
+deploy:
+  provider: pypi
+  server: https://pypi.python.org/pypi
+  user: mumrah
+  password:
+    secure: TIZNKxktOm42/LHLDCuKuPqmAfYKekyHL4MqEFpnqDI5T5sHzG9IQaOwppYfQNggHiILUBzk1j6w/FPJunJyd62AFtydkKtIccqENIIAio78afeCRMQDynstNXjDefmt0s90xLGSlLzDMxCEWB4F6frEtPl/8KpNSFB2fvj+HXY=
+  on:
+    tags: true
+    all_branches: true
+    # TODO replace all_branches with "branch: master" after https://github.com/travis-ci/travis-ci/issues/1675 is fixed
+    # branch: master
+
+script:
+  - if [ -n "$UNIT_AND_LINT_ONLY" ]; then tox -e lint,`./travis_selector.sh $TRAVIS_PYTHON_VERSION`; else tox -e `./travis_selector.sh $TRAVIS_PYTHON_VERSION`; fi
+
+after_success:
+  - coveralls

http://git-wip-us.apache.org/repos/asf/incubator-eagle/blob/db2bbf91/eagle-external/hadoop_jmx_collector/lib/kafka-python/AUTHORS.md
----------------------------------------------------------------------
diff --git a/eagle-external/hadoop_jmx_collector/lib/kafka-python/AUTHORS.md b/eagle-external/hadoop_jmx_collector/lib/kafka-python/AUTHORS.md
new file mode 100644
index 0000000..67e3789
--- /dev/null
+++ b/eagle-external/hadoop_jmx_collector/lib/kafka-python/AUTHORS.md
@@ -0,0 +1,16 @@
+# Contributors
+
+Top 10 contributors, listed by contribution. See https://github.com/mumrah/kafka-python/graphs/contributors for the full list
+
+* David Arthur, [@mumrah](https://github.com/mumrah)
+* Dana Powers, [@dpkp](https://github.com/dpkp)
+* Mahendra M, [@mahendra](https://github.com/mahendra)
+* Mark Roberts, [@wizzat](https://github.com/wizzat)
+* Omar, [@rdiomar](https://github.com/rdiomar) - RIP, Omar. 2014
+* Bruno Renié, [@brutasse](https://github.com/brutasse)
+* Marc Labbé, [@mrtheb](https://github.com/mrtheb)
+* Ivan Pouzyrevsky, [@sandello](https://github.com/sandello)
+* Thomas Dimson, [@cosbynator](https://github.com/cosbynator)
+* Zack Dever, [@zever](https://github.com/zever)
+
+Thanks to all who have contributed!

http://git-wip-us.apache.org/repos/asf/incubator-eagle/blob/db2bbf91/eagle-external/hadoop_jmx_collector/lib/kafka-python/CHANGES.md
----------------------------------------------------------------------
diff --git a/eagle-external/hadoop_jmx_collector/lib/kafka-python/CHANGES.md b/eagle-external/hadoop_jmx_collector/lib/kafka-python/CHANGES.md
new file mode 100644
index 0000000..5704afa
--- /dev/null
+++ b/eagle-external/hadoop_jmx_collector/lib/kafka-python/CHANGES.md
@@ -0,0 +1,73 @@
+# 0.9.3 (Feb 3, 2015)
+
+* Add coveralls.io support (sontek PR 307)
+* Fix python2.6 threading.Event bug in ReentrantTimer (dpkp PR 312)
+* Add kafka 0.8.2.0 to travis integration tests (dpkp PR 310)
+* Auto-convert topics to utf-8 bytes in Producer (sontek PR 306)
+* Fix reference cycle between SimpleConsumer and ReentrantTimer (zhaopengzp PR 309)
+* Add Sphinx API docs (wedaly PR 282)
+* Handle additional error cases exposed by 0.8.2.0 kafka server (dpkp PR 295)
+* Refactor error class management (alexcb PR 289)
+* Expose KafkaConsumer in __all__ for easy imports (Dinoshauer PR 286)
+* SimpleProducer starts on random partition by default (alexcb PR 288)
+* Add keys to compressed messages (meandthewallaby PR 281)
+* Add new high-level KafkaConsumer class based on java client api (dpkp PR 234)
+* Add KeyedProducer.send_messages api (pubnub PR 277)
+* Fix consumer pending() method (jettify PR 276)
+* Update low-level demo in README (sunisdown PR 274)
+* Include key in KeyedProducer messages (se7entyse7en PR 268)
+* Fix SimpleConsumer timeout behavior in get_messages (dpkp PR 238)
+* Fix error in consumer.py test against max_buffer_size (rthille/wizzat PR 225/242)
+* Improve string concat performance on pypy / py3 (dpkp PR 233)
+* Reorg directory layout for consumer/producer/partitioners (dpkp/wizzat PR 232/243)
+* Add OffsetCommitContext (locationlabs PR 217)
+* Metadata Refactor (dpkp  PR 223)
+* Add Python 3 support (brutasse/wizzat - PR 227)
+* Minor cleanups - imports / README / PyPI classifiers (dpkp - PR 221)
+* Fix socket test (dpkp - PR 222)
+* Fix exception catching bug in test_failover_integration (zever - PR 216)
+
+# 0.9.2 (Aug 26, 2014)
+
+* Warn users that async producer does not reliably handle failures (dpkp - PR 213)
+* Fix spurious ConsumerFetchSizeTooSmall error in consumer (DataDog - PR 136)
+* Use PyLint for static error checking (dpkp - PR 208)
+* Strictly enforce str message type in producer.send_messages (dpkp - PR 211)
+* Add test timers via nose-timer plugin; list 10 slowest timings by default (dpkp)
+* Move fetching last known offset logic to a stand alone function (zever - PR 177)
+* Improve KafkaConnection and add more tests (dpkp - PR 196)
+* Raise TypeError if necessary when encoding strings (mdaniel - PR 204) 
+* Use Travis-CI to publish tagged releases to pypi (tkuhlman / mumrah)
+* Use official binary tarballs for integration tests and parallelize travis tests (dpkp - PR 193)
+* Improve new-topic creation handling (wizzat - PR 174)
+
+# 0.9.1 (Aug 10, 2014)
+
+* Add codec parameter to Producers to enable compression (patricklucas - PR 166)
+* Support IPv6 hosts and network (snaury - PR 169)
+* Remove dependency on distribute (patricklucas - PR 163)
+* Fix connection error timeout and improve tests (wizzat - PR 158)
+* SimpleProducer randomization of initial round robin ordering (alexcb - PR 139)
+* Fix connection timeout in KafkaClient and KafkaConnection (maciejkula - PR 161)
+* Fix seek + commit behavior (wizzat - PR 148) 
+
+
+# 0.9.0 (Mar 21, 2014)
+
+* Connection refactor and test fixes (wizzat - PR 134)
+* Fix when partition has no leader (mrtheb - PR 109)
+* Change Producer API to take topic as send argument, not as instance variable (rdiomar - PR 111)
+* Substantial refactor and Test Fixing (rdiomar - PR 88)
+* Fix Multiprocess Consumer on windows (mahendra - PR 62)
+* Improve fault tolerance; add integration tests (jimjh)
+* PEP8 / Flakes / Style cleanups (Vetoshkin Nikita; mrtheb - PR 59)
+* Setup Travis CI (jimjh - PR 53/54)
+* Fix import of BufferUnderflowError (jimjh - PR 49)
+* Fix code examples in README (StevenLeRoux - PR 47/48)
+
+# 0.8.0
+
+* Changing auto_commit to False in [SimpleConsumer](kafka/consumer.py), until 0.8.1 is release offset commits are unsupported
+* Adding fetch_size_bytes to SimpleConsumer constructor to allow for user-configurable fetch sizes
+* Allow SimpleConsumer to automatically increase the fetch size if a partial message is read and no other messages were read during that fetch request. The increase factor is 1.5
+* Exception classes moved to kafka.common

http://git-wip-us.apache.org/repos/asf/incubator-eagle/blob/db2bbf91/eagle-external/hadoop_jmx_collector/lib/kafka-python/LICENSE
----------------------------------------------------------------------
diff --git a/eagle-external/hadoop_jmx_collector/lib/kafka-python/LICENSE b/eagle-external/hadoop_jmx_collector/lib/kafka-python/LICENSE
new file mode 100644
index 0000000..412a2b6
--- /dev/null
+++ b/eagle-external/hadoop_jmx_collector/lib/kafka-python/LICENSE
@@ -0,0 +1,202 @@
+                                 Apache License
+                           Version 2.0, January 2004
+                        http://www.apache.org/licenses/
+
+   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
+
+   1. Definitions.
+
+      "License" shall mean the terms and conditions for use, reproduction,
+      and distribution as defined by Sections 1 through 9 of this document.
+
+      "Licensor" shall mean the copyright owner or entity authorized by
+      the copyright owner that is granting the License.
+
+      "Legal Entity" shall mean the union of the acting entity and all
+      other entities that control, are controlled by, or are under common
+      control with that entity. For the purposes of this definition,
+      "control" means (i) the power, direct or indirect, to cause the
+      direction or management of such entity, whether by contract or
+      otherwise, or (ii) ownership of fifty percent (50%) or more of the
+      outstanding shares, or (iii) beneficial ownership of such entity.
+
+      "You" (or "Your") shall mean an individual or Legal Entity
+      exercising permissions granted by this License.
+
+      "Source" form shall mean the preferred form for making modifications,
+      including but not limited to software source code, documentation
+      source, and configuration files.
+
+      "Object" form shall mean any form resulting from mechanical
+      transformation or translation of a Source form, including but
+      not limited to compiled object code, generated documentation,
+      and conversions to other media types.
+
+      "Work" shall mean the work of authorship, whether in Source or
+      Object form, made available under the License, as indicated by a
+      copyright notice that is included in or attached to the work
+      (an example is provided in the Appendix below).
+
+      "Derivative Works" shall mean any work, whether in Source or Object
+      form, that is based on (or derived from) the Work and for which the
+      editorial revisions, annotations, elaborations, or other modifications
+      represent, as a whole, an original work of authorship. For the purposes
+      of this License, Derivative Works shall not include works that remain
+      separable from, or merely link (or bind by name) to the interfaces of,
+      the Work and Derivative Works thereof.
+
+      "Contribution" shall mean any work of authorship, including
+      the original version of the Work and any modifications or additions
+      to that Work or Derivative Works thereof, that is intentionally
+      submitted to Licensor for inclusion in the Work by the copyright owner
+      or by an individual or Legal Entity authorized to submit on behalf of
+      the copyright owner. For the purposes of this definition, "submitted"
+      means any form of electronic, verbal, or written communication sent
+      to the Licensor or its representatives, including but not limited to
+      communication on electronic mailing lists, source code control systems,
+      and issue tracking systems that are managed by, or on behalf of, the
+      Licensor for the purpose of discussing and improving the Work, but
+      excluding communication that is conspicuously marked or otherwise
+      designated in writing by the copyright owner as "Not a Contribution."
+
+      "Contributor" shall mean Licensor and any individual or Legal Entity
+      on behalf of whom a Contribution has been received by Licensor and
+      subsequently incorporated within the Work.
+
+   2. Grant of Copyright License. Subject to the terms and conditions of
+      this License, each Contributor hereby grants to You a perpetual,
+      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+      copyright license to reproduce, prepare Derivative Works of,
+      publicly display, publicly perform, sublicense, and distribute the
+      Work and such Derivative Works in Source or Object form.
+
+   3. Grant of Patent License. Subject to the terms and conditions of
+      this License, each Contributor hereby grants to You a perpetual,
+      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+      (except as stated in this section) patent license to make, have made,
+      use, offer to sell, sell, import, and otherwise transfer the Work,
+      where such license applies only to those patent claims licensable
+      by such Contributor that are necessarily infringed by their
+      Contribution(s) alone or by combination of their Contribution(s)
+      with the Work to which such Contribution(s) was submitted. If You
+      institute patent litigation against any entity (including a
+      cross-claim or counterclaim in a lawsuit) alleging that the Work
+      or a Contribution incorporated within the Work constitutes direct
+      or contributory patent infringement, then any patent licenses
+      granted to You under this License for that Work shall terminate
+      as of the date such litigation is filed.
+
+   4. Redistribution. You may reproduce and distribute copies of the
+      Work or Derivative Works thereof in any medium, with or without
+      modifications, and in Source or Object form, provided that You
+      meet the following conditions:
+
+      (a) You must give any other recipients of the Work or
+          Derivative Works a copy of this License; and
+
+      (b) You must cause any modified files to carry prominent notices
+          stating that You changed the files; and
+
+      (c) You must retain, in the Source form of any Derivative Works
+          that You distribute, all copyright, patent, trademark, and
+          attribution notices from the Source form of the Work,
+          excluding those notices that do not pertain to any part of
+          the Derivative Works; and
+
+      (d) If the Work includes a "NOTICE" text file as part of its
+          distribution, then any Derivative Works that You distribute must
+          include a readable copy of the attribution notices contained
+          within such NOTICE file, excluding those notices that do not
+          pertain to any part of the Derivative Works, in at least one
+          of the following places: within a NOTICE text file distributed
+          as part of the Derivative Works; within the Source form or
+          documentation, if provided along with the Derivative Works; or,
+          within a display generated by the Derivative Works, if and
+          wherever such third-party notices normally appear. The contents
+          of the NOTICE file are for informational purposes only and
+          do not modify the License. You may add Your own attribution
+          notices within Derivative Works that You distribute, alongside
+          or as an addendum to the NOTICE text from the Work, provided
+          that such additional attribution notices cannot be construed
+          as modifying the License.
+
+      You may add Your own copyright statement to Your modifications and
+      may provide additional or different license terms and conditions
+      for use, reproduction, or distribution of Your modifications, or
+      for any such Derivative Works as a whole, provided Your use,
+      reproduction, and distribution of the Work otherwise complies with
+      the conditions stated in this License.
+
+   5. Submission of Contributions. Unless You explicitly state otherwise,
+      any Contribution intentionally submitted for inclusion in the Work
+      by You to the Licensor shall be under the terms and conditions of
+      this License, without any additional terms or conditions.
+      Notwithstanding the above, nothing herein shall supersede or modify
+      the terms of any separate license agreement you may have executed
+      with Licensor regarding such Contributions.
+
+   6. Trademarks. This License does not grant permission to use the trade
+      names, trademarks, service marks, or product names of the Licensor,
+      except as required for reasonable and customary use in describing the
+      origin of the Work and reproducing the content of the NOTICE file.
+
+   7. Disclaimer of Warranty. Unless required by applicable law or
+      agreed to in writing, Licensor provides the Work (and each
+      Contributor provides its Contributions) on an "AS IS" BASIS,
+      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
+      implied, including, without limitation, any warranties or conditions
+      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
+      PARTICULAR PURPOSE. You are solely responsible for determining the
+      appropriateness of using or redistributing the Work and assume any
+      risks associated with Your exercise of permissions under this License.
+
+   8. Limitation of Liability. In no event and under no legal theory,
+      whether in tort (including negligence), contract, or otherwise,
+      unless required by applicable law (such as deliberate and grossly
+      negligent acts) or agreed to in writing, shall any Contributor be
+      liable to You for damages, including any direct, indirect, special,
+      incidental, or consequential damages of any character arising as a
+      result of this License or out of the use or inability to use the
+      Work (including but not limited to damages for loss of goodwill,
+      work stoppage, computer failure or malfunction, or any and all
+      other commercial damages or losses), even if such Contributor
+      has been advised of the possibility of such damages.
+
+   9. Accepting Warranty or Additional Liability. While redistributing
+      the Work or Derivative Works thereof, You may choose to offer,
+      and charge a fee for, acceptance of support, warranty, indemnity,
+      or other liability obligations and/or rights consistent with this
+      License. However, in accepting such obligations, You may act only
+      on Your own behalf and on Your sole responsibility, not on behalf
+      of any other Contributor, and only if You agree to indemnify,
+      defend, and hold each Contributor harmless for any liability
+      incurred by, or claims asserted against, such Contributor by reason
+      of your accepting any such warranty or additional liability.
+
+   END OF TERMS AND CONDITIONS
+
+   APPENDIX: How to apply the Apache License to your work.
+
+      To apply the Apache License to your work, attach the following
+      boilerplate notice, with the fields enclosed by brackets "[]"
+      replaced with your own identifying information. (Don't include
+      the brackets!)  The text should be enclosed in the appropriate
+      comment syntax for the file format. We also recommend that a
+      file or class name and description of purpose be included on the
+      same "printed page" as the copyright notice for easier
+      identification within third-party archives.
+
+   Copyright 2015 David Arthur
+
+   Licensed under the Apache License, Version 2.0 (the "License");
+   you may not use this file except in compliance with the License.
+   You may obtain a copy of the License at
+
+       http://www.apache.org/licenses/LICENSE-2.0
+
+   Unless required by applicable law or agreed to in writing, software
+   distributed under the License is distributed on an "AS IS" BASIS,
+   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+   See the License for the specific language governing permissions and
+   limitations under the License.
+

http://git-wip-us.apache.org/repos/asf/incubator-eagle/blob/db2bbf91/eagle-external/hadoop_jmx_collector/lib/kafka-python/MANIFEST.in
----------------------------------------------------------------------
diff --git a/eagle-external/hadoop_jmx_collector/lib/kafka-python/MANIFEST.in b/eagle-external/hadoop_jmx_collector/lib/kafka-python/MANIFEST.in
new file mode 100644
index 0000000..68bd793
--- /dev/null
+++ b/eagle-external/hadoop_jmx_collector/lib/kafka-python/MANIFEST.in
@@ -0,0 +1,2 @@
+include VERSION
+recursive-include kafka *.py

http://git-wip-us.apache.org/repos/asf/incubator-eagle/blob/db2bbf91/eagle-external/hadoop_jmx_collector/lib/kafka-python/POWERED-BY.md
----------------------------------------------------------------------
diff --git a/eagle-external/hadoop_jmx_collector/lib/kafka-python/POWERED-BY.md b/eagle-external/hadoop_jmx_collector/lib/kafka-python/POWERED-BY.md
new file mode 100644
index 0000000..f2e323c
--- /dev/null
+++ b/eagle-external/hadoop_jmx_collector/lib/kafka-python/POWERED-BY.md
@@ -0,0 +1,6 @@
+# Project/People/Companies using kafka-python
+
+If you're using this library and care to give us a shout out, please fork the project,
+add yourself here, and submit a pull request. Thanks!
+
+* [@mumrah](https://github.com/mumrah), adding myself as an example

http://git-wip-us.apache.org/repos/asf/incubator-eagle/blob/db2bbf91/eagle-external/hadoop_jmx_collector/lib/kafka-python/README.rst
----------------------------------------------------------------------
diff --git a/eagle-external/hadoop_jmx_collector/lib/kafka-python/README.rst b/eagle-external/hadoop_jmx_collector/lib/kafka-python/README.rst
new file mode 100644
index 0000000..5405f92
--- /dev/null
+++ b/eagle-external/hadoop_jmx_collector/lib/kafka-python/README.rst
@@ -0,0 +1,53 @@
+Kafka Python client
+------------------------
+.. image:: https://api.travis-ci.org/mumrah/kafka-python.png?branch=master
+    :target: https://travis-ci.org/mumrah/kafka-python
+    :alt: Build Status
+
+.. image:: https://coveralls.io/repos/mumrah/kafka-python/badge.svg?branch=master
+    :target: https://coveralls.io/r/mumrah/kafka-python?branch=master
+    :alt: Coverage Status
+
+.. image:: https://readthedocs.org/projects/kafka-python/badge/?version=latest
+    :target: http://kafka-python.readthedocs.org/en/latest/
+    :alt: Full documentation available on ReadTheDocs
+
+`Full documentation available on ReadTheDocs`_
+
+This module provides low-level protocol support for Apache Kafka as well as
+high-level consumer and producer classes. The protocol also supports request
+batching, broker-aware request routing, and Gzip and Snappy compression of
+message sets.
+
+http://kafka.apache.org/
+
+On Freenode IRC at #kafka-python, as well as #apache-kafka
+
+For general discussion of kafka-client design and implementation (not python specific),
+see https://groups.google.com/forum/#!forum/kafka-clients
+
+License
+----------
+Copyright 2015, David Arthur under Apache License, v2.0. See `LICENSE`
+
+Status
+----------
+The current stable version of this package is `0.9.3`_ and is compatible with:
+
+Kafka broker versions
+
+- 0.8.2.0 [offset management currently ZK only -- does not support ConsumerCoordinator offset management APIs]
+- 0.8.1.1
+- 0.8.1
+- 0.8.0
+
+Python versions
+
+- 2.6 (tested on 2.6.9)
+- 2.7 (tested on 2.7.9)
+- 3.3 (tested on 3.3.5)
+- 3.4 (tested on 3.4.2)
+- pypy (tested on pypy 2.4.0 / python 2.7.8)
+
+.. _Full documentation available on ReadTheDocs: http://kafka-python.readthedocs.org/en/latest/
+.. _0.9.3: https://github.com/mumrah/kafka-python/releases/tag/v0.9.3

http://git-wip-us.apache.org/repos/asf/incubator-eagle/blob/db2bbf91/eagle-external/hadoop_jmx_collector/lib/kafka-python/VERSION
----------------------------------------------------------------------
diff --git a/eagle-external/hadoop_jmx_collector/lib/kafka-python/VERSION b/eagle-external/hadoop_jmx_collector/lib/kafka-python/VERSION
new file mode 100644
index 0000000..8caff32
--- /dev/null
+++ b/eagle-external/hadoop_jmx_collector/lib/kafka-python/VERSION
@@ -0,0 +1 @@
+0.9.4-dev

http://git-wip-us.apache.org/repos/asf/incubator-eagle/blob/db2bbf91/eagle-external/hadoop_jmx_collector/lib/kafka-python/build_integration.sh
----------------------------------------------------------------------
diff --git a/eagle-external/hadoop_jmx_collector/lib/kafka-python/build_integration.sh b/eagle-external/hadoop_jmx_collector/lib/kafka-python/build_integration.sh
new file mode 100755
index 0000000..2b81745
--- /dev/null
+++ b/eagle-external/hadoop_jmx_collector/lib/kafka-python/build_integration.sh
@@ -0,0 +1,67 @@
+#!/bin/bash
+
+# Versions available for testing via binary distributions
+OFFICIAL_RELEASES="0.8.0 0.8.1 0.8.1.1 0.8.2.0"
+
+# Useful configuration vars, with sensible defaults
+if [ -z "$SCALA_VERSION" ]; then
+  SCALA_VERSION=2.10
+fi
+
+# On travis CI, empty KAFKA_VERSION means skip integration tests
+# so we don't try to fetch binaries
+# Otherwise it means test all official releases, so we get all of them!
+if [ -z "$KAFKA_VERSION" -a -z "$TRAVIS" ]; then
+  KAFKA_VERSION=$OFFICIAL_RELEASES
+fi
+
+# By default look for binary releases at archive.apache.org
+if [ -z "$DIST_BASE_URL" ]; then
+  DIST_BASE_URL="https://archive.apache.org/dist/kafka/"
+fi
+
+# When testing against source builds, use this git repo
+if [ -z "$KAFKA_SRC_GIT" ]; then
+  KAFKA_SRC_GIT="https://github.com/apache/kafka.git"
+fi
+
+pushd servers
+  mkdir -p dist
+  pushd dist
+    for kafka in $KAFKA_VERSION; do
+      if [ "$kafka" == "trunk" ]; then
+        if [ ! -d "$kafka" ]; then
+          git clone $KAFKA_SRC_GIT $kafka
+        fi
+        pushd $kafka
+          git pull
+          ./gradlew -PscalaVersion=$SCALA_VERSION -Pversion=$kafka releaseTarGz -x signArchives
+        popd
+        # Not sure how to construct the .tgz name accurately, so use a wildcard (ugh)
+        tar xzvf $kafka/core/build/distributions/kafka_*.tgz -C ../$kafka/
+        rm $kafka/core/build/distributions/kafka_*.tgz
+        mv ../$kafka/kafka_* ../$kafka/kafka-bin
+      else
+        echo "-------------------------------------"
+        echo "Checking kafka binaries for ${kafka}"
+        echo
+        # kafka 0.8.0 is only available w/ scala 2.8.0
+        if [ "$kafka" == "0.8.0" ]; then
+          KAFKA_ARTIFACT="kafka_2.8.0-${kafka}"
+        else
+          KAFKA_ARTIFACT="kafka_${SCALA_VERSION}-${kafka}"
+        fi
+        wget -N https://archive.apache.org/dist/kafka/$kafka/${KAFKA_ARTIFACT}.tgz || wget -N https://archive.apache.org/dist/kafka/$kafka/${KAFKA_ARTIFACT}.tar.gz
+        echo
+        if [ ! -d "../$kafka/kafka-bin" ]; then
+          echo "Extracting kafka binaries for ${kafka}"
+          tar xzvf ${KAFKA_ARTIFACT}.t* -C ../$kafka/
+          mv ../$kafka/${KAFKA_ARTIFACT} ../$kafka/kafka-bin
+        else
+          echo "$kafka/kafka-bin directory already exists -- skipping tgz extraction"
+        fi
+      fi
+      echo
+    done
+  popd
+popd

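The artifact-name selection in build_integration.sh above (Scala 2.8.0 pinned for Kafka 0.8.0, `$SCALA_VERSION` otherwise) can be sketched as a small Python helper; `kafka_artifact` is a hypothetical name introduced here for illustration:

```python
def kafka_artifact(kafka_version, scala_version="2.10"):
    """Mirror the script's KAFKA_ARTIFACT logic: Kafka 0.8.0 was only
    published against Scala 2.8.0, so that release is special-cased;
    every later release uses the configured SCALA_VERSION."""
    scala = "2.8.0" if kafka_version == "0.8.0" else scala_version
    return "kafka_%s-%s" % (scala, kafka_version)

# e.g. the tarball name fetched for the 0.8.2.0 broker
name = kafka_artifact("0.8.2.0")
```

The same name is then used both for the download URL and for the directory the tarball is extracted into, which is why the script renames it to `kafka-bin` afterwards.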
http://git-wip-us.apache.org/repos/asf/incubator-eagle/blob/db2bbf91/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/Makefile
----------------------------------------------------------------------
diff --git a/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/Makefile b/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/Makefile
new file mode 100644
index 0000000..5751f68
--- /dev/null
+++ b/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/Makefile
@@ -0,0 +1,177 @@
+# Makefile for Sphinx documentation
+#
+
+# You can set these variables from the command line.
+SPHINXOPTS    =
+SPHINXBUILD   = sphinx-build
+PAPER         =
+BUILDDIR      = _build
+
+# User-friendly check for sphinx-build
+ifeq ($(shell which $(SPHINXBUILD) >/dev/null 2>&1; echo $$?), 1)
+$(error The '$(SPHINXBUILD)' command was not found. Make sure you have Sphinx installed, then set the SPHINXBUILD environment variable to point to the full path of the '$(SPHINXBUILD)' executable. Alternatively you can add the directory with the executable to your PATH. If you don't have Sphinx installed, grab it from http://sphinx-doc.org/)
+endif
+
+# Internal variables.
+PAPEROPT_a4     = -D latex_paper_size=a4
+PAPEROPT_letter = -D latex_paper_size=letter
+ALLSPHINXOPTS   = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
+# the i18n builder cannot share the environment and doctrees with the others
+I18NSPHINXOPTS  = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
+
+.PHONY: help clean html dirhtml singlehtml pickle json htmlhelp qthelp devhelp epub latex latexpdf latexpdfja text man texinfo info gettext changes xml pseudoxml linkcheck doctest
+
+help:
+	@echo "Please use \`make <target>' where <target> is one of"
+	@echo "  html       to make standalone HTML files"
+	@echo "  dirhtml    to make HTML files named index.html in directories"
+	@echo "  singlehtml to make a single large HTML file"
+	@echo "  pickle     to make pickle files"
+	@echo "  json       to make JSON files"
+	@echo "  htmlhelp   to make HTML files and a HTML help project"
+	@echo "  qthelp     to make HTML files and a qthelp project"
+	@echo "  devhelp    to make HTML files and a Devhelp project"
+	@echo "  epub       to make an epub"
+	@echo "  latex      to make LaTeX files, you can set PAPER=a4 or PAPER=letter"
+	@echo "  latexpdf   to make LaTeX files and run them through pdflatex"
+	@echo "  latexpdfja to make LaTeX files and run them through platex/dvipdfmx"
+	@echo "  text       to make text files"
+	@echo "  man        to make manual pages"
+	@echo "  texinfo    to make Texinfo files"
+	@echo "  info       to make Texinfo files and run them through makeinfo"
+	@echo "  gettext    to make PO message catalogs"
+	@echo "  changes    to make an overview of all changed/added/deprecated items"
+	@echo "  xml        to make Docutils-native XML files"
+	@echo "  pseudoxml  to make pseudoxml-XML files for display purposes"
+	@echo "  linkcheck  to check all external links for integrity"
+	@echo "  doctest    to run all doctests embedded in the documentation (if enabled)"
+
+clean:
+	rm -rf $(BUILDDIR)/*
+
+html:
+	$(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
+	@echo
+	@echo "Build finished. The HTML pages are in $(BUILDDIR)/html."
+
+dirhtml:
+	$(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml
+	@echo
+	@echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml."
+
+singlehtml:
+	$(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml
+	@echo
+	@echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml."
+
+pickle:
+	$(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle
+	@echo
+	@echo "Build finished; now you can process the pickle files."
+
+json:
+	$(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json
+	@echo
+	@echo "Build finished; now you can process the JSON files."
+
+htmlhelp:
+	$(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp
+	@echo
+	@echo "Build finished; now you can run HTML Help Workshop with the" \
+	      ".hhp project file in $(BUILDDIR)/htmlhelp."
+
+qthelp:
+	$(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp
+	@echo
+	@echo "Build finished; now you can run "qcollectiongenerator" with the" \
+	      ".qhcp project file in $(BUILDDIR)/qthelp, like this:"
+	@echo "# qcollectiongenerator $(BUILDDIR)/qthelp/kafka-python.qhcp"
+	@echo "To view the help file:"
+	@echo "# assistant -collectionFile $(BUILDDIR)/qthelp/kafka-python.qhc"
+
+devhelp:
+	$(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp
+	@echo
+	@echo "Build finished."
+	@echo "To view the help file:"
+	@echo "# mkdir -p $$HOME/.local/share/devhelp/kafka-python"
+	@echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/kafka-python"
+	@echo "# devhelp"
+
+epub:
+	$(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub
+	@echo
+	@echo "Build finished. The epub file is in $(BUILDDIR)/epub."
+
+latex:
+	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
+	@echo
+	@echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex."
+	@echo "Run \`make' in that directory to run these through (pdf)latex" \
+	      "(use \`make latexpdf' here to do that automatically)."
+
+latexpdf:
+	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
+	@echo "Running LaTeX files through pdflatex..."
+	$(MAKE) -C $(BUILDDIR)/latex all-pdf
+	@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."
+
+latexpdfja:
+	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
+	@echo "Running LaTeX files through platex and dvipdfmx..."
+	$(MAKE) -C $(BUILDDIR)/latex all-pdf-ja
+	@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."
+
+text:
+	$(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text
+	@echo
+	@echo "Build finished. The text files are in $(BUILDDIR)/text."
+
+man:
+	$(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man
+	@echo
+	@echo "Build finished. The manual pages are in $(BUILDDIR)/man."
+
+texinfo:
+	$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
+	@echo
+	@echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo."
+	@echo "Run \`make' in that directory to run these through makeinfo" \
+	      "(use \`make info' here to do that automatically)."
+
+info:
+	$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
+	@echo "Running Texinfo files through makeinfo..."
+	make -C $(BUILDDIR)/texinfo info
+	@echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo."
+
+gettext:
+	$(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale
+	@echo
+	@echo "Build finished. The message catalogs are in $(BUILDDIR)/locale."
+
+changes:
+	$(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes
+	@echo
+	@echo "The overview file is in $(BUILDDIR)/changes."
+
+linkcheck:
+	$(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck
+	@echo
+	@echo "Link check complete; look for any errors in the above output " \
+	      "or in $(BUILDDIR)/linkcheck/output.txt."
+
+doctest:
+	$(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest
+	@echo "Testing of doctests in the sources finished, look at the " \
+	      "results in $(BUILDDIR)/doctest/output.txt."
+
+xml:
+	$(SPHINXBUILD) -b xml $(ALLSPHINXOPTS) $(BUILDDIR)/xml
+	@echo
+	@echo "Build finished. The XML files are in $(BUILDDIR)/xml."
+
+pseudoxml:
+	$(SPHINXBUILD) -b pseudoxml $(ALLSPHINXOPTS) $(BUILDDIR)/pseudoxml
+	@echo
+	@echo "Build finished. The pseudo-XML files are in $(BUILDDIR)/pseudoxml."

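Each target in the Sphinx Makefile above is the same `sphinx-build` invocation with a different `-b` builder and output directory. A sketch of how a target's command line is assembled from the variables at the top of the Makefile (`BUILDDIR`, `SPHINXOPTS`, `PAPER`; `sphinx_command` is a hypothetical helper, not part of Sphinx):

```python
def sphinx_command(builder, builddir="_build", sphinxopts="", paper=""):
    """Assemble the sphinx-build command a Makefile target runs:
    -b selects the builder, -d points at the shared doctree cache,
    and PAPER maps to a -D latex_paper_size override."""
    paperopts = {"a4": "-D latex_paper_size=a4",
                 "letter": "-D latex_paper_size=letter"}.get(paper, "")
    parts = ["sphinx-build", "-b", builder,
             "-d", "%s/doctrees" % builddir]
    if paperopts:
        parts.extend(paperopts.split())
    if sphinxopts:
        parts.extend(sphinxopts.split())
    parts.extend([".", "%s/%s" % (builddir, builder)])
    return " ".join(parts)

# e.g. what `make html` effectively runs
cmd = sphinx_command("html")
```

Sharing one doctree cache (`-d $(BUILDDIR)/doctrees`) across builders is why only the `gettext` target uses the separate `I18NSPHINXOPTS`: the i18n builder cannot reuse those doctrees.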
http://git-wip-us.apache.org/repos/asf/incubator-eagle/blob/db2bbf91/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/apidoc/kafka.consumer.rst
----------------------------------------------------------------------
diff --git a/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/apidoc/kafka.consumer.rst b/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/apidoc/kafka.consumer.rst
new file mode 100644
index 0000000..8595f99
--- /dev/null
+++ b/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/apidoc/kafka.consumer.rst
@@ -0,0 +1,46 @@
+kafka.consumer package
+======================
+
+Submodules
+----------
+
+kafka.consumer.base module
+--------------------------
+
+.. automodule:: kafka.consumer.base
+    :members:
+    :undoc-members:
+    :show-inheritance:
+
+kafka.consumer.kafka module
+---------------------------
+
+.. automodule:: kafka.consumer.kafka
+    :members:
+    :undoc-members:
+    :show-inheritance:
+
+kafka.consumer.multiprocess module
+----------------------------------
+
+.. automodule:: kafka.consumer.multiprocess
+    :members:
+    :undoc-members:
+    :show-inheritance:
+
+kafka.consumer.simple module
+----------------------------
+
+.. automodule:: kafka.consumer.simple
+    :members:
+    :undoc-members:
+    :show-inheritance:
+
+
+Module contents
+---------------
+
+.. automodule:: kafka.consumer
+    :members:
+    :undoc-members:
+    :show-inheritance:

http://git-wip-us.apache.org/repos/asf/incubator-eagle/blob/db2bbf91/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/apidoc/kafka.partitioner.rst
----------------------------------------------------------------------
diff --git a/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/apidoc/kafka.partitioner.rst b/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/apidoc/kafka.partitioner.rst
new file mode 100644
index 0000000..ea215f1
--- /dev/null
+++ b/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/apidoc/kafka.partitioner.rst
@@ -0,0 +1,38 @@
+kafka.partitioner package
+=========================
+
+Submodules
+----------
+
+kafka.partitioner.base module
+-----------------------------
+
+.. automodule:: kafka.partitioner.base
+    :members:
+    :undoc-members:
+    :show-inheritance:
+
+kafka.partitioner.hashed module
+-------------------------------
+
+.. automodule:: kafka.partitioner.hashed
+    :members:
+    :undoc-members:
+    :show-inheritance:
+
+kafka.partitioner.roundrobin module
+-----------------------------------
+
+.. automodule:: kafka.partitioner.roundrobin
+    :members:
+    :undoc-members:
+    :show-inheritance:
+
+
+Module contents
+---------------
+
+.. automodule:: kafka.partitioner
+    :members:
+    :undoc-members:
+    :show-inheritance:

http://git-wip-us.apache.org/repos/asf/incubator-eagle/blob/db2bbf91/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/apidoc/kafka.producer.rst
----------------------------------------------------------------------
diff --git a/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/apidoc/kafka.producer.rst b/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/apidoc/kafka.producer.rst
new file mode 100644
index 0000000..bd850bb
--- /dev/null
+++ b/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/apidoc/kafka.producer.rst
@@ -0,0 +1,38 @@
+kafka.producer package
+======================
+
+Submodules
+----------
+
+kafka.producer.base module
+--------------------------
+
+.. automodule:: kafka.producer.base
+    :members:
+    :undoc-members:
+    :show-inheritance:
+
+kafka.producer.keyed module
+---------------------------
+
+.. automodule:: kafka.producer.keyed
+    :members:
+    :undoc-members:
+    :show-inheritance:
+
+kafka.producer.simple module
+----------------------------
+
+.. automodule:: kafka.producer.simple
+    :members:
+    :undoc-members:
+    :show-inheritance:
+
+
+Module contents
+---------------
+
+.. automodule:: kafka.producer
+    :members:
+    :undoc-members:
+    :show-inheritance:

http://git-wip-us.apache.org/repos/asf/incubator-eagle/blob/db2bbf91/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/apidoc/kafka.rst
----------------------------------------------------------------------
diff --git a/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/apidoc/kafka.rst b/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/apidoc/kafka.rst
new file mode 100644
index 0000000..eb04c35
--- /dev/null
+++ b/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/apidoc/kafka.rst
@@ -0,0 +1,79 @@
+kafka package
+=============
+
+Subpackages
+-----------
+
+.. toctree::
+
+    kafka.consumer
+    kafka.partitioner
+    kafka.producer
+
+Submodules
+----------
+
+kafka.client module
+-------------------
+
+.. automodule:: kafka.client
+    :members:
+    :undoc-members:
+    :show-inheritance:
+
+kafka.codec module
+------------------
+
+.. automodule:: kafka.codec
+    :members:
+    :undoc-members:
+    :show-inheritance:
+
+kafka.common module
+-------------------
+
+.. automodule:: kafka.common
+    :members:
+    :undoc-members:
+    :show-inheritance:
+
+kafka.conn module
+-----------------
+
+.. automodule:: kafka.conn
+    :members:
+    :undoc-members:
+    :show-inheritance:
+
+kafka.context module
+--------------------
+
+.. automodule:: kafka.context
+    :members:
+    :undoc-members:
+    :show-inheritance:
+
+kafka.protocol module
+---------------------
+
+.. automodule:: kafka.protocol
+    :members:
+    :undoc-members:
+    :show-inheritance:
+
+kafka.util module
+-----------------
+
+.. automodule:: kafka.util
+    :members:
+    :undoc-members:
+    :show-inheritance:
+
+
+Module contents
+---------------
+
+.. automodule:: kafka
+    :members:
+    :undoc-members:
+    :show-inheritance:

http://git-wip-us.apache.org/repos/asf/incubator-eagle/blob/db2bbf91/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/apidoc/modules.rst
----------------------------------------------------------------------
diff --git a/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/apidoc/modules.rst b/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/apidoc/modules.rst
new file mode 100644
index 0000000..db3e580
--- /dev/null
+++ b/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/apidoc/modules.rst
@@ -0,0 +1,7 @@
+kafka
+=====
+
+.. toctree::
+   :maxdepth: 4
+
+   kafka

http://git-wip-us.apache.org/repos/asf/incubator-eagle/blob/db2bbf91/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/conf.py
----------------------------------------------------------------------
diff --git a/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/conf.py b/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/conf.py
new file mode 100644
index 0000000..ea223c2
--- /dev/null
+++ b/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/conf.py
@@ -0,0 +1,272 @@
+# -*- coding: utf-8 -*-
+#
+# kafka-python documentation build configuration file, created by
+# sphinx-quickstart on Sun Jan  4 12:21:50 2015.
+#
+# This file is execfile()d with the current directory set to its
+# containing dir.
+#
+# Note that not all possible configuration values are present in this
+# autogenerated file.
+#
+# All configuration values have a default; values that are commented out
+# serve to show the default.
+
+import sys
+import os
+
+# If extensions (or modules to document with autodoc) are in another directory,
+# add these directories to sys.path here. If the directory is relative to the
+# documentation root, use os.path.abspath to make it absolute, like shown here.
+#sys.path.insert(0, os.path.abspath('.'))
+
+# -- General configuration ------------------------------------------------
+
+# If your documentation needs a minimal Sphinx version, state it here.
+#needs_sphinx = '1.0'
+
+# Add any Sphinx extension module names here, as strings. They can be
+# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
+# ones.
+extensions = [
+    'sphinx.ext.autodoc',
+    'sphinx.ext.intersphinx',
+    'sphinx.ext.viewcode',
+    'sphinxcontrib.napoleon',
+]
+
+# Add any paths that contain templates here, relative to this directory.
+templates_path = ['_templates']
+
+# The suffix of source filenames.
+source_suffix = '.rst'
+
+# The encoding of source files.
+#source_encoding = 'utf-8-sig'
+
+# The master toctree document.
+master_doc = 'index'
+
+# General information about the project.
+project = u'kafka-python'
+copyright = u'2015, David Arthur'
+
+# The version info for the project you're documenting, acts as replacement for
+# |version| and |release|, also used in various other places throughout the
+# built documents.
+#
+# The short X.Y version.
+with open('../VERSION') as version_file:
+    version = version_file.read().strip()
+
+# The full version, including alpha/beta/rc tags.
+release = version
+
+# The language for content autogenerated by Sphinx. Refer to documentation
+# for a list of supported languages.
+#language = None
+
+# There are two options for replacing |today|: either, you set today to some
+# non-false value, then it is used:
+#today = ''
+# Else, today_fmt is used as the format for a strftime call.
+#today_fmt = '%B %d, %Y'
+
+# List of patterns, relative to source directory, that match files and
+# directories to ignore when looking for source files.
+exclude_patterns = ['_build']
+
+# The reST default role (used for this markup: `text`) to use for all
+# documents.
+#default_role = None
+
+# If true, '()' will be appended to :func: etc. cross-reference text.
+#add_function_parentheses = True
+
+# If true, the current module name will be prepended to all description
+# unit titles (such as .. function::).
+#add_module_names = True
+
+# If true, sectionauthor and moduleauthor directives will be shown in the
+# output. They are ignored by default.
+#show_authors = False
+
+# The name of the Pygments (syntax highlighting) style to use.
+pygments_style = 'sphinx'
+
+# A list of ignored prefixes for module index sorting.
+#modindex_common_prefix = []
+
+# If true, keep warnings as "system message" paragraphs in the built documents.
+#keep_warnings = False
+
+
+# -- Options for HTML output ----------------------------------------------
+
+# The theme to use for HTML and HTML Help pages.  See the documentation for
+# a list of builtin themes.
+html_theme = 'default'
+
+# Theme options are theme-specific and customize the look and feel of a theme
+# further.  For a list of options available for each theme, see the
+# documentation.
+#html_theme_options = {}
+
+# Add any paths that contain custom themes here, relative to this directory.
+#html_theme_path = []
+
+# The name for this set of Sphinx documents.  If None, it defaults to
+# "<project> v<release> documentation".
+#html_title = None
+
+# A shorter title for the navigation bar.  Default is the same as html_title.
+#html_short_title = None
+
+# The name of an image file (relative to this directory) to place at the top
+# of the sidebar.
+#html_logo = None
+
+# The name of an image file (within the static path) to use as favicon of the
+# docs.  This file should be a Windows icon file (.ico) being 16x16 or 32x32
+# pixels large.
+#html_favicon = None
+
+# Add any paths that contain custom static files (such as style sheets) here,
+# relative to this directory. They are copied after the builtin static files,
+# so a file named "default.css" will overwrite the builtin "default.css".
+html_static_path = ['_static']
+
+# Add any extra paths that contain custom files (such as robots.txt or
+# .htaccess) here, relative to this directory. These files are copied
+# directly to the root of the documentation.
+#html_extra_path = []
+
+# If not '', a 'Last updated on:' timestamp is inserted at every page bottom,
+# using the given strftime format.
+#html_last_updated_fmt = '%b %d, %Y'
+
+# If true, SmartyPants will be used to convert quotes and dashes to
+# typographically correct entities.
+#html_use_smartypants = True
+
+# Custom sidebar templates, maps document names to template names.
+#html_sidebars = {}
+
+# Additional templates that should be rendered to pages, maps page names to
+# template names.
+#html_additional_pages = {}
+
+# If false, no module index is generated.
+#html_domain_indices = True
+
+# If false, no index is generated.
+#html_use_index = True
+
+# If true, the index is split into individual pages for each letter.
+#html_split_index = False
+
+# If true, links to the reST sources are added to the pages.
+#html_show_sourcelink = True
+
+# If true, "Created using Sphinx" is shown in the HTML footer. Default is True.
+#html_show_sphinx = True
+
+# If true, "(C) Copyright ..." is shown in the HTML footer. Default is True.
+#html_show_copyright = True
+
+# If true, an OpenSearch description file will be output, and all pages will
+# contain a <link> tag referring to it.  The value of this option must be the
+# base URL from which the finished HTML is served.
+#html_use_opensearch = ''
+
+# This is the file name suffix for HTML files (e.g. ".xhtml").
+#html_file_suffix = None
+
+# Output file base name for HTML help builder.
+htmlhelp_basename = 'kafka-pythondoc'
+
+
+# -- Options for LaTeX output ---------------------------------------------
+
+latex_elements = {
+# The paper size ('letterpaper' or 'a4paper').
+#'papersize': 'letterpaper',
+
+# The font size ('10pt', '11pt' or '12pt').
+#'pointsize': '10pt',
+
+# Additional stuff for the LaTeX preamble.
+#'preamble': '',
+}
+
+# Grouping the document tree into LaTeX files. List of tuples
+# (source start file, target name, title,
+#  author, documentclass [howto, manual, or own class]).
+latex_documents = [
+  ('index', 'kafka-python.tex', u'kafka-python Documentation',
+   u'David Arthur', 'manual'),
+]
+
+# The name of an image file (relative to this directory) to place at the top of
+# the title page.
+#latex_logo = None
+
+# For "manual" documents, if this is true, then toplevel headings are parts,
+# not chapters.
+#latex_use_parts = False
+
+# If true, show page references after internal links.
+#latex_show_pagerefs = False
+
+# If true, show URL addresses after external links.
+#latex_show_urls = False
+
+# Documents to append as an appendix to all manuals.
+#latex_appendices = []
+
+# If false, no module index is generated.
+#latex_domain_indices = True
+
+
+# -- Options for manual page output ---------------------------------------
+
+# One entry per manual page. List of tuples
+# (source start file, name, description, authors, manual section).
+man_pages = [
+    ('index', 'kafka-python', u'kafka-python Documentation',
+     [u'David Arthur'], 1)
+]
+
+# If true, show URL addresses after external links.
+#man_show_urls = False
+
+
+# -- Options for Texinfo output -------------------------------------------
+
+# Grouping the document tree into Texinfo files. List of tuples
+# (source start file, target name, title, author,
+#  dir menu entry, description, category)
+texinfo_documents = [
+  ('index', 'kafka-python', u'kafka-python Documentation',
+   u'David Arthur', 'kafka-python', 'One line description of project.',
+   'Miscellaneous'),
+]
+
+# Documents to append as an appendix to all manuals.
+#texinfo_appendices = []
+
+# If false, no module index is generated.
+#texinfo_domain_indices = True
+
+# How to display URL addresses: 'footnote', 'no', or 'inline'.
+#texinfo_show_urls = 'footnote'
+
+# If true, do not generate a @detailmenu in the "Top" node's menu.
+#texinfo_no_detailmenu = False
+
+on_rtd = os.environ.get('READTHEDOCS', None) == 'True'
+
+if not on_rtd:  # only import and set the theme if we're building docs locally
+    import sphinx_rtd_theme
+    html_theme = 'sphinx_rtd_theme'
+    html_theme_path = [sphinx_rtd_theme.get_html_theme_path()]

http://git-wip-us.apache.org/repos/asf/incubator-eagle/blob/db2bbf91/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/index.rst
----------------------------------------------------------------------
diff --git a/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/index.rst b/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/index.rst
new file mode 100644
index 0000000..e4a9ac7
--- /dev/null
+++ b/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/index.rst
@@ -0,0 +1,58 @@
+kafka-python
+============
+
+This module provides low-level protocol support for Apache Kafka as well as
+high-level consumer and producer classes. The protocol also supports request
+batching and broker-aware request routing, and Gzip and Snappy compression
+for message sets.
+
+http://kafka.apache.org/
+
+On Freenode IRC at #kafka-python, as well as #apache-kafka
+
+For general discussion of kafka-client design and implementation (not Python-specific),
+see https://groups.google.com/forum/m/#!forum/kafka-clients
+
+Status
+------
+
+The current stable version of this package is `0.9.3 <https://github.com/mumrah/kafka-python/releases/tag/v0.9.3>`_ and is compatible with:
+
+Kafka broker versions
+
+* 0.8.2.0 [offset management currently ZK only -- does not support ConsumerCoordinator offset management APIs]
+* 0.8.1.1
+* 0.8.1
+* 0.8.0
+
+Python versions
+
+* 2.6 (tested on 2.6.9)
+* 2.7 (tested on 2.7.9)
+* 3.3 (tested on 3.3.5)
+* 3.4 (tested on 3.4.2)
+* pypy (tested on pypy 2.4.0 / python 2.7.8)
+
+License
+-------
+
+Copyright 2015, David Arthur under Apache License, v2.0. See `LICENSE <https://github.com/mumrah/kafka-python/blob/master/LICENSE>`_.
+
+
+Contents
+--------
+
+.. toctree::
+   :maxdepth: 2
+
+   install
+   tests
+   usage
+   API reference </apidoc/modules>
+
+Indices and tables
+==================
+
+* :ref:`genindex`
+* :ref:`modindex`
+* :ref:`search`
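
The Gzip message-set compression mentioned in the index above can be sketched with
the standard library alone. This is only an illustration of the codec-level idea,
not the library's actual compression API; the function names are illustrative:

```python
import gzip
import io


def gzip_encode(payload: bytes) -> bytes:
    """Compress a message-set payload, as a gzip codec would."""
    buf = io.BytesIO()
    with gzip.GzipFile(fileobj=buf, mode="wb") as f:
        f.write(payload)
    return buf.getvalue()


def gzip_decode(data: bytes) -> bytes:
    """Decompress a gzip-encoded message-set payload."""
    with gzip.GzipFile(fileobj=io.BytesIO(data), mode="rb") as f:
        return f.read()


# Round-trip check: decoding an encoded payload recovers the original bytes.
messages = b"hello kafka"
assert gzip_decode(gzip_encode(messages)) == messages
```

Snappy follows the same encode/decode shape but requires the optional
`python-snappy` dependency described in install.rst.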

http://git-wip-us.apache.org/repos/asf/incubator-eagle/blob/db2bbf91/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/install.rst
----------------------------------------------------------------------
diff --git a/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/install.rst b/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/install.rst
new file mode 100644
index 0000000..1dd6d4e
--- /dev/null
+++ b/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/install.rst
@@ -0,0 +1,79 @@
+Install
+=======
+
+Install with your favorite package manager.
+
+Latest Release
+--------------
+Pip:
+
+.. code:: bash
+
+    pip install kafka-python
+
+Releases are also listed at https://github.com/mumrah/kafka-python/releases
+
+
+Bleeding-Edge
+-------------
+
+.. code:: bash
+
+    git clone https://github.com/mumrah/kafka-python
+    pip install ./kafka-python
+
+Setuptools:
+
+.. code:: bash
+
+    git clone https://github.com/mumrah/kafka-python
+    easy_install ./kafka-python
+
+Using `setup.py` directly:
+
+.. code:: bash
+
+    git clone https://github.com/mumrah/kafka-python
+    cd kafka-python
+    python setup.py install
+
+
+Optional Snappy install
+-----------------------
+
+Install Development Libraries
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+Download and build Snappy from http://code.google.com/p/snappy/downloads/list
+
+Ubuntu:
+
+.. code:: bash
+
+    apt-get install libsnappy-dev
+
+OSX:
+
+.. code:: bash
+
+    brew install snappy
+
+From Source:
+
+.. code:: bash
+
+    wget http://snappy.googlecode.com/files/snappy-1.0.5.tar.gz
+    tar xzvf snappy-1.0.5.tar.gz
+    cd snappy-1.0.5
+    ./configure
+    make
+    sudo make install
+
+Install Python Module
+^^^^^^^^^^^^^^^^^^^^^
+
+Install the `python-snappy` module:
+
+.. code:: bash
+
+    pip install python-snappy

http://git-wip-us.apache.org/repos/asf/incubator-eagle/blob/db2bbf91/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/make.bat
----------------------------------------------------------------------
diff --git a/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/make.bat b/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/make.bat
new file mode 100644
index 0000000..9d38ac1
--- /dev/null
+++ b/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/make.bat
@@ -0,0 +1,242 @@
+@ECHO OFF
+
+REM Command file for Sphinx documentation
+
+if "%SPHINXBUILD%" == "" (
+	set SPHINXBUILD=sphinx-build
+)
+set BUILDDIR=_build
+set ALLSPHINXOPTS=-d %BUILDDIR%/doctrees %SPHINXOPTS% .
+set I18NSPHINXOPTS=%SPHINXOPTS% .
+if NOT "%PAPER%" == "" (
+	set ALLSPHINXOPTS=-D latex_paper_size=%PAPER% %ALLSPHINXOPTS%
+	set I18NSPHINXOPTS=-D latex_paper_size=%PAPER% %I18NSPHINXOPTS%
+)
+
+if "%1" == "" goto help
+
+if "%1" == "help" (
+	:help
+	echo.Please use `make ^<target^>` where ^<target^> is one of
+	echo.  html       to make standalone HTML files
+	echo.  dirhtml    to make HTML files named index.html in directories
+	echo.  singlehtml to make a single large HTML file
+	echo.  pickle     to make pickle files
+	echo.  json       to make JSON files
+	echo.  htmlhelp   to make HTML files and a HTML help project
+	echo.  qthelp     to make HTML files and a qthelp project
+	echo.  devhelp    to make HTML files and a Devhelp project
+	echo.  epub       to make an epub
+	echo.  latex      to make LaTeX files, you can set PAPER=a4 or PAPER=letter
+	echo.  text       to make text files
+	echo.  man        to make manual pages
+	echo.  texinfo    to make Texinfo files
+	echo.  gettext    to make PO message catalogs
+	echo.  changes    to make an overview over all changed/added/deprecated items
+	echo.  xml        to make Docutils-native XML files
+	echo.  pseudoxml  to make pseudoxml-XML files for display purposes
+	echo.  linkcheck  to check all external links for integrity
+	echo.  doctest    to run all doctests embedded in the documentation if enabled
+	goto end
+)
+
+if "%1" == "clean" (
+	for /d %%i in (%BUILDDIR%\*) do rmdir /q /s %%i
+	del /q /s %BUILDDIR%\*
+	goto end
+)
+
+
+%SPHINXBUILD% 2> nul
+if errorlevel 9009 (
+	echo.
+	echo.The 'sphinx-build' command was not found. Make sure you have Sphinx
+	echo.installed, then set the SPHINXBUILD environment variable to point
+	echo.to the full path of the 'sphinx-build' executable. Alternatively you
+	echo.may add the Sphinx directory to PATH.
+	echo.
+	echo.If you don't have Sphinx installed, grab it from
+	echo.http://sphinx-doc.org/
+	exit /b 1
+)
+
+if "%1" == "html" (
+	%SPHINXBUILD% -b html %ALLSPHINXOPTS% %BUILDDIR%/html
+	if errorlevel 1 exit /b 1
+	echo.
+	echo.Build finished. The HTML pages are in %BUILDDIR%/html.
+	goto end
+)
+
+if "%1" == "dirhtml" (
+	%SPHINXBUILD% -b dirhtml %ALLSPHINXOPTS% %BUILDDIR%/dirhtml
+	if errorlevel 1 exit /b 1
+	echo.
+	echo.Build finished. The HTML pages are in %BUILDDIR%/dirhtml.
+	goto end
+)
+
+if "%1" == "singlehtml" (
+	%SPHINXBUILD% -b singlehtml %ALLSPHINXOPTS% %BUILDDIR%/singlehtml
+	if errorlevel 1 exit /b 1
+	echo.
+	echo.Build finished. The HTML pages are in %BUILDDIR%/singlehtml.
+	goto end
+)
+
+if "%1" == "pickle" (
+	%SPHINXBUILD% -b pickle %ALLSPHINXOPTS% %BUILDDIR%/pickle
+	if errorlevel 1 exit /b 1
+	echo.
+	echo.Build finished; now you can process the pickle files.
+	goto end
+)
+
+if "%1" == "json" (
+	%SPHINXBUILD% -b json %ALLSPHINXOPTS% %BUILDDIR%/json
+	if errorlevel 1 exit /b 1
+	echo.
+	echo.Build finished; now you can process the JSON files.
+	goto end
+)
+
+if "%1" == "htmlhelp" (
+	%SPHINXBUILD% -b htmlhelp %ALLSPHINXOPTS% %BUILDDIR%/htmlhelp
+	if errorlevel 1 exit /b 1
+	echo.
+	echo.Build finished; now you can run HTML Help Workshop with the ^
+.hhp project file in %BUILDDIR%/htmlhelp.
+	goto end
+)
+
+if "%1" == "qthelp" (
+	%SPHINXBUILD% -b qthelp %ALLSPHINXOPTS% %BUILDDIR%/qthelp
+	if errorlevel 1 exit /b 1
+	echo.
+	echo.Build finished; now you can run "qcollectiongenerator" with the ^
+.qhcp project file in %BUILDDIR%/qthelp, like this:
+	echo.^> qcollectiongenerator %BUILDDIR%\qthelp\kafka-python.qhcp
+	echo.To view the help file:
+	echo.^> assistant -collectionFile %BUILDDIR%\qthelp\kafka-python.qhc
+	goto end
+)
+
+if "%1" == "devhelp" (
+	%SPHINXBUILD% -b devhelp %ALLSPHINXOPTS% %BUILDDIR%/devhelp
+	if errorlevel 1 exit /b 1
+	echo.
+	echo.Build finished.
+	goto end
+)
+
+if "%1" == "epub" (
+	%SPHINXBUILD% -b epub %ALLSPHINXOPTS% %BUILDDIR%/epub
+	if errorlevel 1 exit /b 1
+	echo.
+	echo.Build finished. The epub file is in %BUILDDIR%/epub.
+	goto end
+)
+
+if "%1" == "latex" (
+	%SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex
+	if errorlevel 1 exit /b 1
+	echo.
+	echo.Build finished; the LaTeX files are in %BUILDDIR%/latex.
+	goto end
+)
+
+if "%1" == "latexpdf" (
+	%SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex
+	cd %BUILDDIR%/latex
+	make all-pdf
+	cd %BUILDDIR%/..
+	echo.
+	echo.Build finished; the PDF files are in %BUILDDIR%/latex.
+	goto end
+)
+
+if "%1" == "latexpdfja" (
+	%SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex
+	cd %BUILDDIR%/latex
+	make all-pdf-ja
+	cd %BUILDDIR%/..
+	echo.
+	echo.Build finished; the PDF files are in %BUILDDIR%/latex.
+	goto end
+)
+
+if "%1" == "text" (
+	%SPHINXBUILD% -b text %ALLSPHINXOPTS% %BUILDDIR%/text
+	if errorlevel 1 exit /b 1
+	echo.
+	echo.Build finished. The text files are in %BUILDDIR%/text.
+	goto end
+)
+
+if "%1" == "man" (
+	%SPHINXBUILD% -b man %ALLSPHINXOPTS% %BUILDDIR%/man
+	if errorlevel 1 exit /b 1
+	echo.
+	echo.Build finished. The manual pages are in %BUILDDIR%/man.
+	goto end
+)
+
+if "%1" == "texinfo" (
+	%SPHINXBUILD% -b texinfo %ALLSPHINXOPTS% %BUILDDIR%/texinfo
+	if errorlevel 1 exit /b 1
+	echo.
+	echo.Build finished. The Texinfo files are in %BUILDDIR%/texinfo.
+	goto end
+)
+
+if "%1" == "gettext" (
+	%SPHINXBUILD% -b gettext %I18NSPHINXOPTS% %BUILDDIR%/locale
+	if errorlevel 1 exit /b 1
+	echo.
+	echo.Build finished. The message catalogs are in %BUILDDIR%/locale.
+	goto end
+)
+
+if "%1" == "changes" (
+	%SPHINXBUILD% -b changes %ALLSPHINXOPTS% %BUILDDIR%/changes
+	if errorlevel 1 exit /b 1
+	echo.
+	echo.The overview file is in %BUILDDIR%/changes.
+	goto end
+)
+
+if "%1" == "linkcheck" (
+	%SPHINXBUILD% -b linkcheck %ALLSPHINXOPTS% %BUILDDIR%/linkcheck
+	if errorlevel 1 exit /b 1
+	echo.
+	echo.Link check complete; look for any errors in the above output ^
+or in %BUILDDIR%/linkcheck/output.txt.
+	goto end
+)
+
+if "%1" == "doctest" (
+	%SPHINXBUILD% -b doctest %ALLSPHINXOPTS% %BUILDDIR%/doctest
+	if errorlevel 1 exit /b 1
+	echo.
+	echo.Testing of doctests in the sources finished, look at the ^
+results in %BUILDDIR%/doctest/output.txt.
+	goto end
+)
+
+if "%1" == "xml" (
+	%SPHINXBUILD% -b xml %ALLSPHINXOPTS% %BUILDDIR%/xml
+	if errorlevel 1 exit /b 1
+	echo.
+	echo.Build finished. The XML files are in %BUILDDIR%/xml.
+	goto end
+)
+
+if "%1" == "pseudoxml" (
+	%SPHINXBUILD% -b pseudoxml %ALLSPHINXOPTS% %BUILDDIR%/pseudoxml
+	if errorlevel 1 exit /b 1
+	echo.
+	echo.Build finished. The pseudo-XML files are in %BUILDDIR%/pseudoxml.
+	goto end
+)
+
+:end

http://git-wip-us.apache.org/repos/asf/incubator-eagle/blob/db2bbf91/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/requirements.txt
----------------------------------------------------------------------
diff --git a/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/requirements.txt b/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/requirements.txt
new file mode 100644
index 0000000..86b4f05
--- /dev/null
+++ b/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/requirements.txt
@@ -0,0 +1,7 @@
+sphinx
+sphinxcontrib-napoleon
+
+# Install kafka-python in editable mode
+# This allows the sphinx autodoc module
+# to load the Python modules and extract docstrings.
+# -e ..

http://git-wip-us.apache.org/repos/asf/incubator-eagle/blob/db2bbf91/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/tests.rst
----------------------------------------------------------------------
diff --git a/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/tests.rst b/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/tests.rst
new file mode 100644
index 0000000..df9a3ef
--- /dev/null
+++ b/eagle-external/hadoop_jmx_collector/lib/kafka-python/docs/tests.rst
@@ -0,0 +1,59 @@
+Tests
+=====
+
+Run the unit tests
+------------------
+
+.. code:: bash
+
+    tox
+
+
+Run a subset of unit tests
+--------------------------
+
+.. code:: bash
+
+    # run protocol tests only
+    tox -- -v test.test_protocol
+
+    # test with pypy only
+    tox -e pypy
+
+    # Run only 1 test, and use python 2.7
+    tox -e py27 -- -v --with-id --collect-only
+
+    # pick a test number from the list like #102
+    tox -e py27 -- -v --with-id 102
+
+
+Run the integration tests
+-------------------------
+
+The integration tests will actually start up a real local Zookeeper
+instance and Kafka brokers, and send messages in using the client.
+
+First, get the kafka binaries for integration testing:
+
+.. code:: bash
+
+    ./build_integration.sh
+
+By default, the build_integration.sh script will download binary
+distributions for all supported Kafka versions.
+To test against the latest source build, set KAFKA_VERSION=trunk
+and optionally set SCALA_VERSION (defaults to 2.8.0, but 2.10.1 is recommended).
+
+.. code:: bash
+
+    SCALA_VERSION=2.10.1 KAFKA_VERSION=trunk ./build_integration.sh
+
+To run the tests against a specific Kafka version, simply set the `KAFKA_VERSION`
+env variable to the server build you want to use for testing:
+
+.. code:: bash
+
+    KAFKA_VERSION=0.8.0 tox
+    KAFKA_VERSION=0.8.1 tox
+    KAFKA_VERSION=0.8.1.1 tox
+    KAFKA_VERSION=trunk tox

