eagle-commits mailing list archives

From h..@apache.org
Subject svn commit: r1728410 [2/2] - in /incubator/eagle: ./ site/ site/about/ site/css/ site/data/ site/development/ site/docs/ site/docs/tutorial/ site/fonts/ site/images/ site/images/docs/ site/images/posts/ site/images/slider/ site/images/usecases/ site/js...
Date Thu, 04 Feb 2016 06:27:17 GMT
Added: incubator/eagle/web/index.html
URL: http://svn.apache.org/viewvc/incubator/eagle/web/index.html?rev=1728410&view=auto
==============================================================================
--- incubator/eagle/web/index.html (added)
+++ incubator/eagle/web/index.html Thu Feb  4 06:27:16 2016
@@ -0,0 +1,474 @@
+<!DOCTYPE html>
+<html>
+<head>
+<title>Apache Eagle - Secure Hadoop in Real Time</title>
+<meta name="keywords" content="Apache Eagle, Hadoop, Security, Real Time">
+<meta name="description" content="Apache Eagle - Secure Hadoop in Real Time">
+<meta name="author" content="eBay Inc">
+<meta charset="utf-8">
+<meta name="viewport" content="initial-scale=1">
+
+<!-- Style Sheets -->
+<link rel="stylesheet" href="css/animate.css">
+<link rel="stylesheet" href="css/bootstrap.min.css">
+<link rel="stylesheet" href="css/misc.css">
+<link rel="stylesheet" href="css/style.css">
+<link rel="stylesheet" href="css/styles.css">
+<link rel="stylesheet" href="css/colorbox.css">
+<link rel="shortcut icon" href="images/favicon.png">
+<script src="//load.sumome.com/" data-sumo-site-id="4f1f82ddde38afb72321e7c702051b89fb2ab9d54d314434184ebe0ea5b5fa37" async></script>
+
+  <!-- Baidu Analytics Tracking-->
+  <script>
+  var _hmt = _hmt || [];
+  (function() {
+    var hm = document.createElement("script");
+    hm.src = "//hm.baidu.com/hm.js?fedc55df2ea52777a679192e8f849ece";
+    var s = document.getElementsByTagName("script")[0]; 
+    s.parentNode.insertBefore(hm, s);
+  })();
+  </script>
+  
+  <!-- Google Analytics Tracking -->
+  <script>
+    (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
+    (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
+    m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
+    })(window,document,'script','//www.google-analytics.com/analytics.js','ga');
+    ga('create', 'UA-68929805-1', 'auto');
+    ga('send', 'pageview');
+  </script>
+</head>
+<body>
+<!-- header start -->
+<div id="home_page">
+  <div class="topbar">
+    <div class="container">
+      <div class="row" >
+        <nav class="navbar navbar-default">
+          <div class="container-fluid"> 
+            <!-- Brand and toggle get grouped for better mobile display -->
+            <div class="navbar-header">
+              <button type="button" class="navbar-toggle collapsed" data-toggle="collapse" data-target="#bs-example-navbar-collapse-1"> <span class="sr-only">Toggle navigation</span> <span class="icon-bar"></span> <span class="icon-bar"></span> <span class="icon-bar"></span> </button>
+              <a class="navbar-brand" href="#"><img src="images/logo2.png" height="44px" style="margin-top:-7px"></a> </div>
+            
+            <!-- Collect the nav links, forms, and other content for toggling -->
+            <div class="collapse navbar-collapse" id="bs-example-navbar-collapse-1">
+              <ul class="nav navbar-nav navbar-right" id="top-menu">
+                <li><a class="menu" href="#home_page">HOME</a></li>
+                <li><a class="menu" href="#about_page">ABOUT</a></li>
+                <li><a class="menu" href="#diagram_page">ARCHITECTURE</a></li>
+                <li><a class="menu" href="#modules_page">MODULES</a></li>
+                <li><a class="menu" href="#usecase_page">USE CASES</a></li>
+                <li><a class="menu" href="#community_page">COMMUNITY</a></li>
+              </ul>
+            </div>
+            <!-- /.navbar-collapse --> 
+          </div>
+          <!-- /.container-fluid --> 
+        </nav>
+      </div>
+    </div>
+  </div>
+  <div class="headerimage">
+    <div class="flexslider">
+
+      <ul class="slides">
+      
+        <li><img src="images/slider/3.jpg" alt="Slide 1"></li>
+      </ul>
+    </div>
+  </div>
+    <div class="particles"> </div><!---particles-->
+  <div class="slider-caption">
+    <div class="homewrapper">
+      <div class="hometitle"> <img src="images/feather.png" height="60px"> </div>
+      <div class="hometext">
+        <h2 style="font-weight:500;">Apache Eagle</h2>
+        <h3>Secure Hadoop in Real Time</h3>
+     </div>
+    </div>
+    
+     <div class="download"><a href="https://github.com/apache/incubator-eagle" target="_blank" title="Github">GITHUB</a></div>   <div class="download" style="margin-left:10px;" title="Documents"><a href="docs/">DOCS</a></div>
+  </div>
+</div>
+
+<!-- header end -->
+
+<!-- team start -->
+<div class="workwrapper" id="about_page">
+  <div class="container">
+    <div class="row">
+      <h2 class="sectiontile">ABOUT EAGLE</h2>
+      <div class="col-md-12">
+        <p style="width:80%; margin-left:auto; margin-right:auto;"> Apache Eagle is an open-source monitoring solution, contributed by <a href="http://www.ebayinc.com/">eBay Inc</a>, that instantly identifies access to sensitive data, recognizes attacks and malicious activities in Hadoop, and takes action in real time.
+</p> 
+<div class="sepline">
+ <a class='youtube' width="560" height="315"  href="https://www.youtube.com/embed/6j5jXgAgjv8"><div class="video">Watch the Intro Video</div></a> 
+ </div>
+<!--<iframe width="560" height="315" src="https://www.youtube.com/embed/6j5jXgAgjv8" frameborder="0" allowfullscreen></iframe>-->
+
+        <p>Eagle was accepted as an Apache Incubator project on Oct 26, 2015.</p>
+        <p>Eagle secures Hadoop in real time in 3 steps:</p>
+      </div>
+    </div>
+    <section id="cd-timeline" class="cd-container" style="margin-top:-3px;">
+      <div class="cd-timeline-block">
+        <div class="cd-timeline-img cd-picture"> <img src="images/step1.png" alt="Picture"> </div>
+        <div class="cd-timeline-content service-box-content">
+          <h3>Step 1</h3>
+          <p>Listens to various user-generated signals by streaming the Hadoop audit and system logs. </p>
+        </div>
+      </div>
+      <div class="cd-timeline-block">
+        <div class="cd-timeline-img cd-movie"> <img src="images/step2.png" alt="Picture"> </div>
+        <div class="cd-timeline-content service-box-content">
+          <h3>Step 2</h3>
+          <p>Analyzes these signals to detect anomalies, based on predefined security policies and machine-learning models built from the user's past behavior.</p>
+        </div>
+      </div>
+      <div class="cd-timeline-block">
+        <div class="cd-timeline-img cd-icon"> <img src="images/step3.png" alt="Picture"> </div>
+        <div class="cd-timeline-content service-box-content">
+          <h3>Step 3</h3>
+          <p>Generates alerts when anomalies are detected, and remediates user activity or access if necessary. </p>
+        </div>
+      </div>
+      <div class="cd-timeline-block">
+        <div class="cd-timeline-img cd-location"> <img src="images/step4.png" alt="Picture"> </div>
+        <div class="cd-timeline-content service-box-content">
+          <h3>Additional Bonus</h3>
+          <p>Eagle also provides an extensible architecture and framework so developers can easily onboard new use cases.</p>
+        </div>
+      </div>
+    </section>
+  </div>
+</div>
+</div>
+
+<!-- team end -->
+
+<div class="clear"></div>
+
+<div class="servicewrapper">
+  <div class="container">
+
+    <div class="row">
+        
+      <div class="row" style="margin-top:50px;">
+        <div class="col-md-3 workmargin">
+       
+          <h3 class="featuretext">Ease of Use</h3>
+          <h4 class="featureexplain"> Start using Eagle in a few minutes</h4>
+          <p>Eagle can be installed as an Ambari plugin, and a single Eagle instance can monitor multiple clusters.</p>
+        </div>
+        <div class="col-md-3 workmargin">
+        
+         <h3 class="featuretext">Policy Management</h3>
+          <h4 class="featureexplain"> Create complex monitoring policies</h4>
+          <p>Eagle offers a wide variety of policies on HDFS and Hive data sets, and policies can be created and deployed with a few clicks. </p>
+        </div>
+        <div class="col-md-3 workmargin">
+         
+         <h3 class="featuretext">Scalability</h3>
+          <h4 class="featureexplain"> Scale for Big Data </h4>
+          <p> Eagle processes access events in a scalable way, supporting billions of events and thousands of policies.</p>
+        </div>
+        <div class="col-md-3 workmargin">
+         
+         <h3 class="featuretext">Extensibility</h3>
+          <h4 class="featureexplain"> Eagle is extensible at all levels </h4>
+          <p>Eagle is easy to integrate with data classification and Hadoop access-control products, and the Eagle backend can support data sources other than Hadoop.</p>
+        </div>
+      </div>
+    </div>
+   
+  </div>
+</div>
+
+
+<!-- team end --> 
+
+
+<div class="clear"></div>
+
+<!-- diagram start -->
+
+<div class="workwrapper" id="diagram_page">
+  <div class="container">
+    <div class="row">
+      <h2 class="sectiontile">ARCHITECTURE</h2>
+      <p style="width:80%; margin-left:auto; margin-right:auto;">Apache Eagle</p>
+      <div class="col-md-12 marginbot"> <img src="images/diagram2.png" style="border-radius: 10px; margin-top:10px;"> </div>
+    </div>
+  </div>
+</div>
+<!-- diagram end -->
+
+<div class="clear"></div>
+<!-- diagram start -->
+
+<div class="client_wrapper" id="modules_page">
+  <div class="container">
+    <div class="row">
+      <h2 class="sectiontile">MODULES</h2>
+      <p style="width:80%; margin-left:auto; margin-right:auto;">Eagle is built using big data open source products to scale for big data use cases.</p>
+    </div>
+    <div class="row" style="margin-top:20px;">
+      <div class="col-md-4">
+        <div class="well2">
+          <h4>Processing Engine</h4>
+          <p>The Eagle stream-processing API is an abstraction over Apache Storm and is extensible to other streaming engines. Eagle processing is built to scale to a large number of events and policies.</p>
+        </div>
+      </div>
+      <div class="col-md-4">
+        <div class="well2">
+          <h4>Alerting Framework</h4>
+          <p>Eagle’s extensible policy engine framework allows developers to plug in a new policy engine with a few lines of code. Eagle supports the WSO2 Siddhi CEP engine.</p>
+        </div>
+      </div>
+      <div class="col-md-4">
+        <div class="well2">
+          <h4>Machine Learning module</h4>
+         
+          <p>Eagle provides capabilities to define profiles for Hadoop users based on their behavior. Eagle uses Spark for model training and Storm for near-real-time anomaly detection.</p>
+        </div>
+      </div>
+      <div class="col-md-4">
+        <div class="well2">
+          <h4>Data Classification</h4>
+          <p>Eagle allows you to easily classify HDFS and Hive data based on the sensitivity of the data. Eagle can also be easily extended to import data classification from other tools. </p>
+        </div>
+      </div>
+      <div class="col-md-4">
+        <div class="well2">
+          <h4>Policy Manager</h4>
+          <p>The Eagle policy manager provides an easy-to-use interface and a RESTful API for users to define policies with a few clicks. Eagle supports window functions for creating complex policies.</p>
+        </div>
+      </div>
+      <div class="col-md-4">
+        <div class="well2">
+          <h4>Query Service</h4>
+          <p>Eagle provides a SQL-like service API to support comprehensive computation over huge data sets, for example filtering, aggregation, histograms, sorting, top-N, arithmetical expressions, pagination, etc.</p>
+        </div>
+      </div>
+    </div>
+  </div>
+</div>
+<!-- diagram end -->
+
+<div class="clear"></div>
+
+<!-- team start -->
+<div class="workwrapper" id="usecase_page">
+  <div class="container">
+    <div class="row">
+      <h2 class="sectiontile">USE CASES</h2>
+      <div class="row" style="margin-top:50px;">
+        <div class="col-md-3 workmargin">
+          <div><img src="images/usecases/1.png" class="usecaseimage"></div>
+          <h4>Data loss prevention</h4>
+          <p>Prevent a user with malicious intent from moving, copying, or deleting sensitive data in Hadoop.</p>
+        </div>
+        <div class="col-md-3 workmargin">
+          <div><img src="images/usecases/3.png" class="usecaseimage"></div>
+          <h4>Active Access Monitoring</h4>
+          <p>Monitor every single user access to Hadoop data sets in HDFS and Hive in real time.</p>
+        </div>
+        <div class="col-md-3 workmargin">
+          <div><img src="images/usecases/4.png" class="usecaseimage"></div>
+          <h4>User Profiles</h4>
+          <p>Create user profiles based on user behavior using machine-learning algorithms, and use these models to detect and alert on anomalies.</p>
+        </div>
+        <div class="col-md-3 workmargin">
+          <div><img src="images/usecases/2.png" class="usecaseimage"></div>
+          <h4>Unauthorized Access</h4>
+          <p>Identify, alert, and stop an unauthorized user trying to access sensitive data stored in the Hadoop cluster. </p>
+        </div>
+      </div>
+    </div>
+  </div>
+</div>
+
+<div class="clear"></div>
+<!-- team start -->
+<div class="client_wrapper" id="community_page">
+  <div class="container">
+    <div class="row">
+      <h2 class="sectiontile">WHO USES EAGLE</h2>
+      <div class="row" style="margin-top:50px;">
+        <div class="workmargin">
+          <a href="http://www.ebay.com/"><img style="width: 130px" src="/images/ebay.png" class="usecaseimage"></a>
+          <a href="https://www.paypal.com"><img style="width: 160px;margin-left:60px;" src="/images/paypal.svg" class="usecaseimage"></a>
+        </div>
+      </div>
+    </div>
+  </div>
+</div>
+<!-- team end --> 
+
+<div class="clear"></div>
+
+<!-- team start -->
+<div class="workwrapper">
+  <div class="container">
+    <div class="row">
+      <h2 class="sectiontile">COMMUNITY</h2>
+      <div class="row" style="margin-top:50px;">
+        <div class="col-md-4 workmargin">
+          <h4>Discussion and Contribute</h4>
+          <div style="text-align:left">
+          <p>Get help using Eagle or contribute to the project</p>
+          <ul>
+            <li>
+              <a href="/docs/community.html"><b>Mailing Lists</b></a>
+            </li>
+            <li>
+              <a href="https://issues.apache.org/jira/browse/EAGLE"><b>Issues Tracking</b></a>
+            </li>
+            <li>
+              <a href="https://github.com/eBay/Eagle/wiki/How-to-Contribute"><b>How to Contribute</b></a>
+            </li>
+          </ul>
+          </div>
+        </div>
+        <div class="col-md-4 workmargin">
+          <h4>Events and Meetups</h4>
+          <div style="text-align:left">
+            <p>Learn more about Eagle at conferences and meetups</p>
+            <ul>
+              <li><a href="/docs/community.html"><b>Conferences</b></a></li>
+              <li><a href="/docs/community.html"><b>Meetups</b></a></li>
+              <li><a href="/docs/community.html"><b>News</b></a></li>
+            </ul>
+          </div>
+        </div>
+        <div class="col-md-4 workmargin">
+          <h4>Find Us at</h4>
+          <div style="text-align:left">
+            
+
+            <p>Get the latest updates about Eagle through:</p>
+        <div class="row">
+          <div class="col-md-6">
+<iframe src="https://ghbtns.com/github-btn.html?user=apache&repo=incubator-eagle&type=star&count=true" frameborder="0" scrolling="0" width="150px" height="20px"></iframe>
+                <iframe src="https://ghbtns.com/github-btn.html?user=apache&repo=incubator-eagle&type=fork&count=true" frameborder="0" scrolling="0" width="150px" height="20px"></iframe>
+<br/>
+
+<a href="https://twitter.com/TheApacheEagle" class="twitter-follow-button" data-show-count="false">Follow @TheApacheEagle</a>
+<script>!function(d,s,id){var js,fjs=d.getElementsByTagName(s)[0],p=/^http:/.test(d.location)?'http':'https';if(!d.getElementById(id)){js=d.createElement(s);js.id=id;js.src=p+'://platform.twitter.com/widgets.js';fjs.parentNode.insertBefore(js,fjs);}}(document, 'script', 'twitter-wjs');</script>
+<br/>
+
+<div class="fb-like" data-href="https://www.facebook.com/TheApacheEagle" data-layout="button_count" data-action="like" data-show-faces="true" data-share="true"></div>
+
+          </div>
+          <div class="col-md-6">
+            <img width="110" height="110" src="/images/qrcode-8cm.jpg">
+          </div>
+        </div>
+      </div>
+    </div>
+  </div>
+</div>
+</div>
+</div>
+
+<!-- footer start -->
+<div class="footerwrapper">
+  <div class="container">
+    <div class="row">
+      <div class="col-md-12"><div style="margin-left:auto; margin-right:auto; text-align:center;font-size:12px;">
+<div>
+Apache Eagle is an effort undergoing incubation at The Apache Software Foundation (ASF), sponsored by the Apache Incubator. Incubation is required of all newly accepted projects until a further review indicates that the infrastructure, communications, and decision making process have stabilized in a manner consistent with other successful ASF projects. While incubation status is not necessarily a reflection of the completeness or stability of the code, it does indicate that the project has yet to be fully endorsed by the ASF.
+</div>
+<div>
+<a href="http://www.apache.org">
+<img id="asf-logo" alt="Apache Software Foundation" src="/images/apache-logo-small.gif">
+</a>
+<a href="http://incubator.apache.org">
+<img id="incubator-logo" alt="Apache Incubator" src="/images/apache-incubator-logo-small.png">
+</a>
+</div>
+<div>
+Copyright © 2015 <a href="http://www.apache.org">The Apache Software Foundation</a>, Licensed under the <a href="http://www.apache.org/licenses/LICENSE-2.0">Apache License, Version 2.0</a>.
+</div>
+<div>
+Apache Eagle, Eagle, Apache, the Apache feather logo, and the Apache Incubator project logo are trademarks of The Apache Software Foundation.
+</div>
+</div></div>
+    </div>
+  </div>
+</div>
+<!-- footer end --> 
+
+<!-- JavaScripts -->
+
+<script src="js/jquery-1.11.1.min.js"></script>
+<script src="js/jquery.singlePageNav.js"></script>
+<script src="js/jquery.flexslider.js"></script>
+<script src="js/custom.js"></script>
+<script src="js/jquery.colorbox.js"></script>
+<script src="js/modernizr.min.js"></script>
+<script src="js/svg.js"></script>
+
+<script>
+$(".youtube").colorbox({iframe:true, innerWidth:728, innerHeight:410});
+
+var lastId,
+    topMenu = $("#top-menu"),
+    topMenuHeight = topMenu.outerHeight() + 15,
+    // All list items
+    menuItems = topMenu.find("a"),
+    // Anchors corresponding to menu items
+    scrollItems = menuItems.map(function() {
+        var item = $($(this).attr("href"));
+        if (item.length) {
+            return item;
+        }
+    });
+
+menuItems.click(function(e) {
+    var href = $(this).attr("href"),
+        offsetTop = href === "#" ? 0 : $(href).offset().top - topMenuHeight + 1;
+    $('html, body').stop().animate({
+        scrollTop: offsetTop
+    }, 300);
+    e.preventDefault();
+});
+// Bind to scroll
+$(window).scroll(function() {
+  
+
+    // Get container scroll position
+    var fromTop = $(this).scrollTop() + topMenuHeight;
+    // Get id of current scroll item
+    var cur = scrollItems.map(function() {
+        if ($(this).offset().top < fromTop)
+            return this;
+    });
+    // Get the id of the current element
+    cur = cur[cur.length - 1];
+    var id = cur && cur.length ? cur[0].id : "";
+    if (lastId !== id) {
+        lastId = id;
+        // Set/remove active class
+        menuItems
+            .parent().removeClass("active")
+            .end().filter("[href='#" + id + "']").parent().addClass("active");
+    }
+});
+</script>
+<div id="fb-root"></div>
+<script>(function(d, s, id) {
+  var js, fjs = d.getElementsByTagName(s)[0];
+  if (d.getElementById(id)) return;
+  js = d.createElement(s); js.id = id;
+  js.src = "//connect.facebook.net/en_US/sdk.js#xfbml=1&version=v2.5";
+  fjs.parentNode.insertBefore(js, fjs);
+}(document, 'script', 'facebook-jssdk'));</script>
+</body>
+</html>

Propchange: incubator/eagle/web/index.html
------------------------------------------------------------------------------
    svn:eol-style = native

Propchange: incubator/eagle/web/index.html
------------------------------------------------------------------------------
    svn:executable = *

Added: incubator/eagle/web/install.md
URL: http://svn.apache.org/viewvc/incubator/eagle/web/install.md?rev=1728410&view=auto
==============================================================================
--- incubator/eagle/web/install.md (added)
+++ incubator/eagle/web/install.md Thu Feb  4 06:27:16 2016
@@ -0,0 +1,70 @@
+---
+layout: doc
+title:  "Install Eagle" 
+permalink: /docs/installation.html
+---
+
+### Install Eagle Sandbox
+
+#### Pre-requisites
+
+> To install Eagle in a sandbox you need Oracle VirtualBox and an HDP sandbox image.
+
+1. [Oracle VirtualBox](https://www.virtualbox.org/wiki/Downloads).
+2. [Hortonworks Sandbox](http://hortonworks.com/products/hortonworks-sandbox/#install) v 2.2.4 or later.
+
+#### Register HDP sandbox 
+
+1. [Register](http://127.0.0.1:8888/) Hortonworks sandbox.
+2. [Enable Ambari](http://127.0.0.1:8000/). Click on Enable Button.
+3. [Login](http://127.0.0.1:8080) as admin/admin.
+
+#### Install Eagle
+
+* **Step 1**: Clone the stable version from [eagle github](https://github.corp.ebay.com/eagle/eagle/tree/release1.0) and build the project:
+
+      $ mvn clean install -DskipTests=true
+
+* **Step 2**: Download the eagle-0.1.0-bin.tar.gz package from a successful build into your HDP sandbox.
+
+    * Option 1: `scp -P 2222  eagle/eagle-assembly/target/eagle-0.1.0-bin.tar.gz root@127.0.0.1:/usr/hdp/current/`
+
+
+    * Option 2: Create shared directory between host and Sandbox, and restart Sandbox. Then you can find the shared directory under /media in Sandbox.
+
+		![Adding a shared folder](/images/docs/Sharedfolder.jpg "Adding a shared folder")
+
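Option 2 can also be set up from the host's command line with VirtualBox's `VBoxManage` tool. A sketch, where the VM name and host path are assumptions for illustration (check the actual VM name with `VBoxManage list vms`):

```shell
# Hypothetical VM name and host path; adjust both to your setup.
VM_NAME="Hortonworks Sandbox with HDP 2.2.4"
SHARE_DIR="$HOME/eagle-share"

mkdir -p "$SHARE_DIR"

# Attach the folder; with --automount it appears under /media in the guest
# after the Sandbox is restarted.
VBoxManage sharedfolder add "$VM_NAME" \
  --name eagle-share --hostpath "$SHARE_DIR" --automount \
  || echo "VBoxManage not available or VM not found"
```

After restarting the Sandbox, the folder shows up under /media as described above.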
+* **Step 3**: Extract eagle tarball package
+
+      $ cd /usr/hdp/current
+      $ tar -zxvf eagle-0.1.0-bin.tar.gz
+      $ mv eagle-0.1.0 eagle
+
+* **Step 4**: Add root as an HBase superuser via [Ambari](http://127.0.0.1:8080/#/main/services/HBASE/configs) (optional; alternatively, a user can operate HBase with `sudo su hbase`).
+
+* **Step 5**: Install the Eagle Ambari service:
+
+      $ /usr/hdp/current/eagle/bin/eagle-ambari.sh install
+
+* **Step 6**: Restart [Ambari](http://127.0.0.1:8000/): click "Disable", then "Enable" to bring Ambari back.
+
+* **Step 7**: Start HBase, Storm & Kafka.
+From the Ambari UI, restart any suggested components ("Restart" button on top), then start Storm ("Nimbus", "Supervisor" & "Storm UI Server"), Kafka ("Kafka Broker"), and HBase ("RegionServer" and "HBase Master").
+>
+![Restart Services](/images/docs/Services.png "Services")
+
+* **Step 8**: Add the Eagle service to Ambari. (Click For Video)
+
+	* Click on "Add Service" under the Actions button on the Ambari main page.
+
+		![AddService](/images/docs/AddService.png "AddService")
+	
+	* Select "Eagle" in the list of services and proceed to install all Eagle services.
+
+		![Eagle Services](/images/docs/EagleServiceSuccess.png "Eagle Services")
+
+* **Step 9**: Add the policies and metadata required by Eagle by running the scripts below.
+
+      $ /usr/hdp/current/eagle/examples/sample-sensitivity-resource-create.sh 
+      $ /usr/hdp/current/eagle/examples/sample-policy-create.sh
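Once the scripts finish, a quick way to confirm that the sample policies were registered is to query the policy API. This is a hedged sketch: port 9099 is assumed for eagle-service, and admin/secret are placeholder credentials to replace with your own.

```shell
# List policies registered for the sandbox site.
# Assumptions: eagle-service listens on 127.0.0.1:9099; admin/secret are placeholders.
AUTH=$(printf '%s' 'admin:secret' | base64)   # base64 of "user:password"

curl -s -H "Authorization: Basic ${AUTH}" \
  'http://127.0.0.1:9099/eagle-service/rest/list?query=AlertDefinitionService%5B%40site%3D%22sandbox%22%5D%7B*%7D&pageSize=100' \
  || echo "eagle-service not reachable yet"
```

A JSON response with `"success": true` indicates the policies are in place.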
+

Propchange: incubator/eagle/web/install.md
------------------------------------------------------------------------------
    svn:eol-style = native

Added: incubator/eagle/web/introduction.md
URL: http://svn.apache.org/viewvc/incubator/eagle/web/introduction.md?rev=1728410&view=auto
==============================================================================
--- incubator/eagle/web/introduction.md (added)
+++ incubator/eagle/web/introduction.md Thu Feb  4 06:27:16 2016
@@ -0,0 +1,30 @@
+---
+layout: doc
+title:  "Introduction" 
+permalink: /docs/index.html
+---
+
+### Welcome to Eagle
+
+> Eagle is an open-source monitoring solution for Hadoop that instantly detects access to sensitive data, recognizes attacks and malicious activities, and blocks access in real time. 
+
+### Key Qualities
+
+* **Real Time**: We understand the importance of timing and acting fast in case of a security breach, so we designed Eagle to generate alerts within a second and to stop the anomalous activity if it is a real threat.
+* **Scalability**: At eBay, Eagle is deployed on multiple large Hadoop clusters with petabytes of data and 800M access events every day.
+* **Ease of Use**: Usability is one of our core design principles. It takes only a few minutes to get started with the Eagle sandbox, the examples make it easy to get going, and policies can be added with a few clicks.
+* **User Profiles**: Eagle can create user profiles based on user behaviour in Hadoop. Out-of-the-box machine-learning algorithms let you build models over different HDFS features and get alerted on anomalies. 
+* **Open Source**: Eagle is built from the ground up on open-source standards and various products from the big-data space. We decided to open-source Eagle to help the community, and we look forward to your feedback, collaboration, and support.
+* **Extensibility**: Eagle is designed with extensibility in mind. You can integrate Eagle with existing data classification tools and monitoring tools easily.
+
+### Use Cases
+
+Eagle data-activity monitoring is currently deployed at eBay to monitor data-access activities in a 2,500-node Hadoop cluster, with plans to extend it to other Hadoop clusters covering 10,000 nodes by the end of this year. We have a wide range of policies to detect and prevent data loss, data copies to unsecured locations, sensitive data access from unauthorized zones, etc. The flexibility of creating policies in Eagle allows us to expand further and add more complex policies.
+
+Some other typical use cases:
+
+* Monitor data access traffic on Hadoop
+* Discover intrusions and security breach
+* Discover and prevent sensitive data loss 
+* Policy based detection and alerting 
+* Anomalous data access detection based on user behaviour 

Propchange: incubator/eagle/web/introduction.md
------------------------------------------------------------------------------
    svn:eol-style = native

Propchange: incubator/eagle/web/js/
------------------------------------------------------------------------------
--- svn:ignore (added)
+++ svn:ignore Thu Feb  4 06:27:16 2016
@@ -0,0 +1 @@
+.*

Added: incubator/eagle/web/policy-api.md
URL: http://svn.apache.org/viewvc/incubator/eagle/web/policy-api.md?rev=1728410&view=auto
==============================================================================
--- incubator/eagle/web/policy-api.md (added)
+++ incubator/eagle/web/policy-api.md Thu Feb  4 06:27:16 2016
@@ -0,0 +1,188 @@
+---
+layout: doc
+title:  "Policy API"
+permalink: /docs/policy-api.html
+---
+
+Eagle provides RESTful APIs to create, update, query, and delete alert policies:
+
+* Policy Definition API  
+* Stream Definition API  
+
+------  
+
+### Policy Definition API  
+
+------  
+
+#### **Create/Update Policy Example**      
+
+URL               |||    http://host:port/eagle-service/rest/entities?serviceName=AlertDefinitionService   
+METHOD            |||    POST
+HEADERS           |||    "Content-Type:application/json"   
+                  |||    "Authorization:Basic encodedusrpwd"  (encodedusrpwd is base64 encoded string for "user:password")  
+DATA              |||    [{  
+                  |||    &nbsp;&nbsp;"tags": {  
+                  |||    &nbsp;&nbsp;&nbsp;&nbsp; "site": "sandbox",  
+                  |||    &nbsp;&nbsp;&nbsp;&nbsp; "dataSource": "hdfsAuditLog",  
+                  |||    &nbsp;&nbsp;&nbsp;&nbsp; "policyId": "testPolicy",  
+                  |||    &nbsp;&nbsp;&nbsp;&nbsp; "alertExecutorId": "hdfsAuditLogAlertExecutor",  
+                  |||    &nbsp;&nbsp;&nbsp;&nbsp; "policyType": "siddhiCEPEngine"  
+                  |||    &nbsp;&nbsp;&nbsp;&nbsp;},  
+                  |||    &nbsp;&nbsp;"desc": "test alert policy",  
+                  |||    &nbsp;&nbsp;"policyDef": "{\"type\":\"siddhiCEPEngine\",\"expression\":\"from hdfsAuditLogEventStream[src =='/tmp/private'] select * insert into outputStream;\"}",  
+                  |||    &nbsp;&nbsp;"notificationDef": "[{
+                  |||    &nbsp;&nbsp;&nbsp;&nbsp; "sender":"noreply-eagle@company.com",
+                  |||    &nbsp;&nbsp;&nbsp;&nbsp; "recipients":"user@company.com",
+                  |||    &nbsp;&nbsp;&nbsp;&nbsp; "subject":"test alert policy",
+                  |||    &nbsp;&nbsp;&nbsp;&nbsp; "flavor":"email",
+                  |||    &nbsp;&nbsp;&nbsp;&nbsp; "id":"email_1"
+                  |||    &nbsp;&nbsp;&nbsp;&nbsp;}]",  
+                  |||    &nbsp;&nbsp;"enabled": true  
+                  |||    }]  
+
+**Field Specification**  
+
+Tags             |||    All tags together form the key for an alert policy  
+                 |||    1) site: the site the policy is for, e.g. sandbox  
+                 |||    2) dataSource: the data source the policy consumes from, e.g. hdfsAuditLog  
+                 |||    3) policyId: unique id of the policy  
+                 |||    4) alertExecutorId: the executor within which the policy is evaluated, e.g. hdfsAuditLogAlertExecutor  
+                 |||    5) policyType: the engine the policy is executed with, e.g. siddhiCEPEngine  
+policyDef        |||    Definition of the policy, telling  
+                 |||    1) which engine the policy should be executed with  
+                 |||    2) the policy expression to be evaluated  
+notificationDef  |||    Currently only email notification is supported; the fields of a notification definition are  
+                 |||    1) sender: email sender  
+                 |||    2) recipients: email recipient(s)  
+                 |||    3) subject: email subject  
+                 |||    4) flavor: notification method, currently only "email" is supported  
+                 |||    5) id: notification id  
+enabled          |||    Whether the policy is enabled, true/false  
+desc             |||    Description of the policy  
+  
+**Response Body**  
+{  
+&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;"meta": {  
+&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;     "elapsedms": 11,  
+&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;     "totalResults": 1  
+&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;},  
+&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;"success": true,  
+&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;"obj": [  
+&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;     "YEktKX_____62aP_6x97yoSv3B0ANd9Hby--xyCZKe1hk6BkS9hcZXeJk1Je-7-Mrq0lGQ"  
+&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;],  
+&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;"type": "java.lang.String"  
+}  
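The request above can be issued with `curl`. A minimal sketch, where the host, port, and the admin/secret credentials are placeholders for illustration:

```shell
# Create/update the example policy; host, port, and credentials are placeholders.
EAGLE_SERVICE="http://localhost:9099"
AUTH=$(printf '%s' 'admin:secret' | base64)   # base64 of "user:password"

# The payload from the example above, on one line, with the quotes inside
# policyDef escaped as the JSON string value requires.
PAYLOAD='[{"tags":{"site":"sandbox","dataSource":"hdfsAuditLog","policyId":"testPolicy","alertExecutorId":"hdfsAuditLogAlertExecutor","policyType":"siddhiCEPEngine"},"desc":"test alert policy","policyDef":"{\"type\":\"siddhiCEPEngine\",\"expression\":\"from hdfsAuditLogEventStream[src == '\''/tmp/private'\''] select * insert into outputStream;\"}","enabled":true}]'

curl -s -X POST "${EAGLE_SERVICE}/eagle-service/rest/entities?serviceName=AlertDefinitionService" \
  -H "Content-Type: application/json" \
  -H "Authorization: Basic ${AUTH}" \
  -d "${PAYLOAD}" || echo "eagle-service not reachable"
```

On success, the response contains the encodedRowkey shown above, which is later needed to delete the policy.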
+
+------ 
+
+#### **Get Policy Example**  
+
+URL               |||    http://host:port/eagle-service/rest/list?query=AlertDefinitionService[@dataSource="hdfsAuditLog" AND @site="sandbox"]{*}&pageSize=100  
+METHOD            |||    GET
+HEADERS           |||    "Content-Type:application/json"   
+                  |||    "Authorization:Basic encodedusrpwd"  (encodedusrpwd is base64 encoded string for "user:password")  
+
+**Response Body**   
+{  
+&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;prefix: "alertdef",  
+&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;tags: {  
+&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;site: "sandbox",  
+&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;dataSource: "hdfsAuditLog",  
+&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;policyId: "testPolicy",  
+&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;alertExecutorId: "hdfsAuditLogAlertExecutor",  
+&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;policyType: "siddhiCEPEngine"  
+&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;},  
+&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;encodedRowkey: "YEktKX_____62aP_6x97yoSv3B0ANd9Hby--xyCZKe1hk6BkS9hcZXeJk1Je-7-Mrq0lGQ",  
+&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;desc: "nope alert for test",  
+&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;policyDef: "{\"type\":\"siddhiCEPEngine\",\"expression\":\"from hdfsAuditLogEventStream[src=='/tmp/private'] select * into outputStream;\"}",  
+&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;notificationDef: "[{\"sender\":\"noreply-eagle@company.com\",\"recipients\":\"user@company.com\",\"subject\":\"testPolicy\",\"flavor\":\"email\",\"id\":\"email_1\"}]",  
+&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;enabled: true  
+}  
+
+------  
+
+#### **Delete Policy Example**    
+
+Delete policy by encodedRowkey
+
+URL               |||    http://host:port/eagle-service/rest/entities/delete?serviceName=AlertDefinitionService&byId=true  
+METHOD            |||    POST  
+HEADERS           |||    "Content-Type:application/json"  
+                  |||    "Authorization:Basic encodedusrpwd"  (encodedusrpwd is base64 encoded string for "user:password")  
+DATA              |||    [  
+                  |||       "YEktKX_____62aP_6x97yoSv3B0ANd9Hby--xyCZKe1hk6BkS9hcZXeJk1Je-7-Mrq0lGQ"  
+                  |||    ]  
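As a sketch, the delete request above could be built in Python as follows (host:port and the auth token are placeholders; the request is only constructed here, not sent):

```python
import json
import urllib.request

# Placeholder endpoint; replace host:port with your Eagle service address.
url = ("http://host:port/eagle-service/rest/entities/delete"
       "?serviceName=AlertDefinitionService&byId=true")
payload = json.dumps(
    ["YEktKX_____62aP_6x97yoSv3B0ANd9Hby--xyCZKe1hk6BkS9hcZXeJk1Je-7-Mrq0lGQ"]
)

req = urllib.request.Request(
    url,
    data=payload.encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Basic encodedusrpwd",  # base64 of "user:password"
    },
    method="POST",
)
# urllib.request.urlopen(req) would send it against a live Eagle service.
print(req.get_method())
```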
+
+**Delete Request Response Body**  
+
+The following is the response body of a successful delete request  
+{  
+&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;"meta": {  
+&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;     "elapsedms": 5,  
+&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;     "totalResults": 1  
+&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;},  
+&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;"success": true,  
+&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;"obj": [  
+&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;     "YEktKX_____62aP_6x97yoSv3B0ANd9Hby--xyCZKe1hk6BkS9hcZXeJk1Je-7-Mrq0lGQ"  
+&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;],  
+&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;"type": "java.lang.String"  
+}  
+
+-----
+
+### Stream Definition API  
+
+In the policy definition, if the policyType is "siddhiCEPEngine", we need to specify which stream the query runs against, e.g. "from hdfsAuditLogEventStream"   
+
+So we need to further define the stream schema along with the policy
+
+The response body of the stream schema API is similar to that of the policy API, so we do not duplicate it here  
+
+------  
+
+#### **Create/Update Stream Schema Example**   
+
+URL               |||    http://host:port/eagle-service/rest/entities?serviceName=AlertStreamSchemaService   
+METHOD            |||    POST  
+HEADERS           |||    "Content-Type:application/json"   
+                  |||    "Authorization:Basic encodedusrpwd"  (encodedusrpwd is base64 encoded string for "user:password")  
+DATA              |||    [{  
+                  |||    &nbsp;&nbsp;"tags": {  
+                  |||    &nbsp;&nbsp;&nbsp;&nbsp; "dataSource": "hiveQueryLog",  
+                  |||    &nbsp;&nbsp;&nbsp;&nbsp; "attrName": "user",  
+                  |||    &nbsp;&nbsp;&nbsp;&nbsp; "streamName": "hiveAccessLogStream"  
+                  |||    &nbsp;&nbsp;&nbsp;&nbsp; },  
+                  |||    &nbsp;&nbsp;"attrType": "string",  
+                  |||    &nbsp;&nbsp;"attrDescription": "process user"  
+                  |||    }]                  
+
+**Field Specification**  
+
+Tags             |||    All tags form the key for the stream schema  
+                 |||    1) dataSource: The data source the stream comes from, e.g. "hdfsAuditLog"  
+                 |||    2) attrName: Attribute's name, e.g. "user"  
+                 |||    3) streamName: Stream's name, e.g.  "hiveAccessLogStream"  
+attrType         |||    Attribute's type, e.g. string, boolean, int, long  
+attrDescription  |||    Description of the attribute
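Since each stream attribute needs its own entry, the payload for several attributes can be generated in one pass; a sketch (the second attribute below is purely illustrative):

```python
import json

def schema_entries(data_source, stream_name, attrs):
    """Build one AlertStreamSchemaService entry per (name, type, description) tuple."""
    return [
        {
            "tags": {
                "dataSource": data_source,
                "attrName": name,
                "streamName": stream_name,
            },
            "attrType": attr_type,
            "attrDescription": desc,
        }
        for name, attr_type, desc in attrs
    ]

payload = schema_entries(
    "hiveQueryLog",
    "hiveAccessLogStream",
    [
        ("user", "string", "process user"),
        ("command", "string", "hive command type"),  # illustrative extra attribute
    ],
)
print(json.dumps(payload, indent=2))
```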
+  
+------  
+
+#### **Get Stream Schema Example**  
+
+URL               |||    http://host:port/eagle-service/rest/list?query=AlertStreamSchemaService[@dataSource="hdfsAuditLog" AND @streamName="hiveAccessLogStream"]{*}&pageSize=100  
+METHOD            |||    GET  
+HEADERS           |||    "Content-Type:application/json"   
+                  |||    "Authorization:Basic encodedusrpwd"  (encodedusrpwd is base64 encoded string for "user:password")  
+
+------  
+   
+#### **Delete Stream Schema Example**    
+
+Delete a stream schema by encodedRowkey
+
+URL               |||    http://host:port/eagle-service/rest/entities/delete?serviceName=AlertStreamSchemaService&byId=true  
+METHOD            |||    POST  
+HEADERS           |||    "Content-Type:application/json"  
+                  |||    "Authorization:Basic encodedusrpwd"  (encodedusrpwd is base64 encoded string for "user:password")  
+DATA              |||    [ "YEktKX_____62aP_6x97yoSv3B0ANd9Hby--xyCZKe1hk6BkS9hcZXeJk1Je-7-Mrq0lGQ" ]    

Propchange: incubator/eagle/web/policy-api.md
------------------------------------------------------------------------------
    svn:eol-style = native

Added: incubator/eagle/web/quick-start.md
URL: http://svn.apache.org/viewvc/incubator/eagle/web/quick-start.md?rev=1728410&view=auto
==============================================================================
--- incubator/eagle/web/quick-start.md (added)
+++ incubator/eagle/web/quick-start.md Thu Feb  4 06:27:16 2016
@@ -0,0 +1,68 @@
+---
+layout: doc
+title:  "Quick Start" 
+permalink: /docs/quick-start.html
+---
+
+This is a tutorial-style guide that gives users a quick overview of Eagle. The main contents are:
+
+* Downloading
+* Installation
+* Demos
+
+### Download/Build tarball
+
+* Download tarball directly from latest released [binary package](http://66.211.190.194/eagle-0.1.0.tar.gz)
+
+* Build manually by cloning latest code from [github](https://github.com/apache/incubator-eagle) with [Apache Maven](https://maven.apache.org/):
+
+	  $ git clone git@github.com:apache/incubator-eagle.git
+	  $ cd incubator-eagle
+	  $ mvn clean package -DskipTests
+
+	After building successfully, you will get the tarball under `eagle-assembly/target/`, named `eagle-${version}-bin.tar.gz`
+<br/>
+
+### Installation
+The fastest way to start with Eagle is to:
+
+* [Install Eagle with Sandbox](/docs/deployment-in-sandbox.html)
+* [Install Eagle with Docker](https://issues.apache.org/jira/browse/EAGLE-3)(under development)
+
+If you want to deploy Eagle in a production environment, please refer to:
+
+* [Deploy Eagle in the Production](/docs/deployment-in-production.html)
+<br/>
+
+### Demos
+
+* Define policy with Eagle web
+    * Step 1: Select the site which is monitored by the backend topologies. For example "sandbox"
+        ![](/images/docs/selectSite.png)
+    * Step 2: Create a policy
+        ![](/images/docs/hdfs-policy1.png)
+
+    To learn more about how to define policies, please refer to the tutorial [Policy Management](/docs/tutorial/policy.html)
+<br/>
+
+* Test policy and check alerting
+
+    **Example 1** (HDFSAuditLog): validate sample policy “viewPrivate” on [Eagle web](http://localhost:9099/eagle-service) by running a HDFS command
+
+      $ hdfs dfs -cat /tmp/private
+
+    You should see an alert for policy “viewPrivate” on the Alerts page in [Eagle web](http://localhost:9099/eagle-service).
+
+    **Example 2** (HiveQueryLog): validate sample policy “queryPhoneNumber” in [Eagle web](http://localhost:9099/eagle-service) by submitting a hive job
+
+      $ su hive
+      $ hive
+      > set hive.execution.engine=mr;
+      > use xademo;
+      > select a.phone_number from customer_details a, call_detail_records b where a.phone_number=b.phone_number;
+
+  You should see an alert for policy “queryPhoneNumber” on the Alerts page in [Eagle web](http://localhost:9099/eagle-service).
+
+<br/>
+
+<br/>

Propchange: incubator/eagle/web/quick-start.md
------------------------------------------------------------------------------
    svn:eol-style = native

Added: incubator/eagle/web/standalone-install.md
URL: http://svn.apache.org/viewvc/incubator/eagle/web/standalone-install.md?rev=1728410&view=auto
==============================================================================
--- incubator/eagle/web/standalone-install.md (added)
+++ incubator/eagle/web/standalone-install.md Thu Feb  4 06:27:16 2016
@@ -0,0 +1,20 @@
+---
+layout: doc
+title:  "Overview" 
+permalink: /docs/standalone-install.html
+---
+
+## Hardware Requirements
+
+TBF
+
+## Software Requirements
+TBF
+
+## Installation Procedure
+
+TBF
+
+Below are some of the features we are working on:
+
+TBF
\ No newline at end of file

Propchange: incubator/eagle/web/standalone-install.md
------------------------------------------------------------------------------
    svn:eol-style = native

Added: incubator/eagle/web/terminology.md
URL: http://svn.apache.org/viewvc/incubator/eagle/web/terminology.md?rev=1728410&view=auto
==============================================================================
--- incubator/eagle/web/terminology.md (added)
+++ incubator/eagle/web/terminology.md Thu Feb  4 06:27:16 2016
@@ -0,0 +1,16 @@
+---
+layout: doc
+title:  "Terminology" 
+permalink: /docs/terminology.html
+---
+
+Here are some of the terms used in Apache Eagle. They are basic concepts, and understanding them will help you better understand Eagle.
+
+* **Site**: a site can be considered as a Hadoop environment. Eagle distinguishes different Hadoop environment with different sites.
+* **Policy**: a policy defines the rule for alerting. Users can define their own policies based on the metadata of data sources.
+* **Data source**: a data source is the target data being monitored. Currently Eagle only supports two kinds of data sources: Hive query log and HDFS audit log.
+* **Stream**: a stream is the streaming data from a data source. Each data source has its own stream.
+* **Data activity monitoring**: data activity monitoring watches the data from each data source and alerts according to the policies defined by users.
+* **User profile**: a user profile is a historical activity model generated with machine-learning algorithms, which can be used for real-time online anomaly detection.
+* **Data classification**: data classification provides the ability to classify different data sources with different levels of sensitivity.

Propchange: incubator/eagle/web/terminology.md
------------------------------------------------------------------------------
    svn:eol-style = native

Added: incubator/eagle/web/tutorial-classfication.md
URL: http://svn.apache.org/viewvc/incubator/eagle/web/tutorial-classfication.md?rev=1728410&view=auto
==============================================================================
--- incubator/eagle/web/tutorial-classfication.md (added)
+++ incubator/eagle/web/tutorial-classfication.md Thu Feb  4 06:27:16 2016
@@ -0,0 +1,49 @@
+---
+layout: doc
+title:  "Data Classification Tutorial" 
+permalink: /docs/tutorial/classification.html
+---
+
+Eagle Data classification provides the ability to classify HDFS and Hive data with different levels of sensitivity.
+For both HDFS and Hive, a user can browse the resources and add/remove the sensitivity information.
+
+The document has two parts. The first part describes how to add/remove sensitivity on files/directories; the second part shows how
+sensitivity is applied in policy definitions. HDFS is used as the example.
+
+> **WARNING**: sensitivity is classified by sites. Please select the right site first when there are multiple ones.
+
+#### **Part 1: Sensitivity Edit**
+
+  * Add the sensitivity mark to files/directories.
+
+    * **Basic**: Label sensitive files directly (**recommended**)
+
+       ![HDFS classification](/images/docs/hdfs-mark1.png)
+       ![HDFS classification](/images/docs/hdfs-mark2.png)
+       ![HDFS classification](/images/docs/hdfs-mark3.png)
+    * **Advanced**: Import json file/content
+
+        ![HDFS classification](/images/docs/hdfs-import1.png)
+        ![HDFS classification](/images/docs/hdfs-import2.png)
+        ![HDFS classification](/images/docs/hdfs-import3.png)
+
+
+ * Remove the sensitivity mark from files/directories
+
+   * **Basic**: remove label directly
+
+        ![HDFS classification](/images/docs/hdfs-delete1.png)
+        ![HDFS classification](/images/docs/hdfs-delete2.png)
+
+   * **Advanced**: delete in batch
+
+        ![HDFS classification](/images/docs/hdfs-remove.png)
+
+#### **Part 2: Sensitivity Usage in Policy Definition**
+
+You can mark a particular folder/file as "PRIVATE". Once you have this information you can create policies using this label.
+
+> For example: the following policy monitors all the operations to resources with sensitivity type "PRIVATE".
+
+![sensitivity type policy](/images/docs/sensitivity-policy.png)
+

Propchange: incubator/eagle/web/tutorial-classfication.md
------------------------------------------------------------------------------
    svn:eol-style = native

Added: incubator/eagle/web/tutorial-ldap.md
URL: http://svn.apache.org/viewvc/incubator/eagle/web/tutorial-ldap.md?rev=1728410&view=auto
==============================================================================
--- incubator/eagle/web/tutorial-ldap.md (added)
+++ incubator/eagle/web/tutorial-ldap.md Thu Feb  4 06:27:16 2016
@@ -0,0 +1,38 @@
+---
+layout: doc
+title:  "Eagle LDAP Tutorial"
+permalink: /docs/tutorial/ldap.html
+---
+
+To enable Eagle LDAP authentication on the web, two steps are needed.
+
+Step 1: edit the configuration under `lib/tomcat/webapps/eagle-service/WEB-INF/classes/ldap.properties`.
+
+    ldap.server=ldap://localhost:10389
+    ldap.username=uid=admin,ou=system
+    ldap.password=secret
+    ldap.user.searchBase=ou=Users,o=mojo
+    ldap.user.searchPattern=(uid={0})
+    ldap.user.groupSearchBase=ou=groups,o=mojo
+    acl.adminRole=
+    acl.defaultRole=ROLE_USER
+
+acl.adminRole and acl.defaultRole are two custom properties for Eagle. Eagle manages admin users by group. If you set acl.adminRole to ROLE_{EAGLE-ADMIN-GROUP-NAME}, members of this group have admin privileges. acl.defaultRole is ROLE_USER.
+
+Step 2: edit conf/eagle-service.conf, and add springActiveProfile="default"
+
+    eagle{
+        service{
+            storage-type="hbase"
+            hbase-zookeeper-quorum="localhost"
+            hbase-zookeeper-property-clientPort=2181
+            zookeeper-znode-parent="/hbase",
+            springActiveProfile="default"
+        }
+    }
+
+
+
+
+
+

Propchange: incubator/eagle/web/tutorial-ldap.md
------------------------------------------------------------------------------
    svn:eol-style = native

Added: incubator/eagle/web/tutorial-policy.md
URL: http://svn.apache.org/viewvc/incubator/eagle/web/tutorial-policy.md?rev=1728410&view=auto
==============================================================================
--- incubator/eagle/web/tutorial-policy.md (added)
+++ incubator/eagle/web/tutorial-policy.md Thu Feb  4 06:27:16 2016
@@ -0,0 +1,59 @@
+---
+layout: doc
+title:  "Policy Tutorial" 
+permalink: /docs/tutorial/policy.html
+---
+
+Eagle currently supports customizing policies for the following data sources for each site:
+
+* HDFS Audit Log
+* Hive Query Log
+
+> NOTICE: policies are classified by sites. Please select the site first when there are multiple ones.
+
+### How to define HDFS Policy?
+In this example we will go through the steps for creating the following HDFS policy.
+
+> Example Policy: Create a policy to alert when a user is trying to delete a file with sensitive data
+
+* **Step 1**: Select Source as HDFS and Stream as HDFS Audit Log
+
+	![HDFS Policies](/images/docs/hdfs-policy1.png)
+
+* **Step 2**: Eagle supports a variety of properties as match criteria, where users can set different values. Eagle also supports window functions to extend policies with time functions.
+
+	  command = delete 
+	  (Eagle currently supports the following commands: open, delete, copy, append, copy from local, get, move, mkdir, create, list, change permissions)
+		
+	  source = /tmp/private 
+	  (Eagle supports wildcarding for property values for example /tmp/*)
+
+	![HDFS Policies](/images/docs/hdfs-policy2.png)
+
+* **Step 3**: Name your policy and select de-duplication options if you need to avoid getting duplicate alerts within a particular time window. You have an option to configure email notifications for the alerts.
+
+	![HDFS Policies](/images/docs/hdfs-policy3.png)
+
+
+### How to define HIVE Policy?
+In this example we will go through the steps for creating the following Hive policy.
+
+> Example Policy: Create a policy to alert when a user is trying to select PHONE_NUMBER from a hive table with sensitive data
+
+* **Step 1**:  Select Source as Hive and Stream as Hive Query Log
+
+	![Hive Policies](/images/docs/hive-policy1.png)
+
+* **Step 2**: Eagle supports a variety of properties as match criteria, where users can set different values. Eagle also supports window functions to extend policies with time functions.
+
+	  command = Select 
+	  (Eagle currently supports the following DDL statements: Create, Drop, Alter, Truncate, Show)
+		
+	  sensitivity type = PHONE_NUMBER
+	  (Eagle supports classifying data in Hive with different sensitivity types. Users can use these sensitivity types to create policies)
+
+	![Hive Policies](/images/docs/hive-policy2.png)
+
+* **Step 3**: Name your policy and select de-duplication options if you need to avoid getting duplicate alerts within a particular time window. You have an option to configure email notifications for the alerts.
+
+	![Hive Policies](/images/docs/hive-policy3.png)

Propchange: incubator/eagle/web/tutorial-policy.md
------------------------------------------------------------------------------
    svn:eol-style = native

Added: incubator/eagle/web/tutorial-setup.md
URL: http://svn.apache.org/viewvc/incubator/eagle/web/tutorial-setup.md?rev=1728410&view=auto
==============================================================================
--- incubator/eagle/web/tutorial-setup.md (added)
+++ incubator/eagle/web/tutorial-setup.md Thu Feb  4 06:27:16 2016
@@ -0,0 +1,97 @@
+---
+layout: doc
+title:  "Site Management"
+permalink: /docs/tutorial/setup.html
+---
+
+Eagle identifies different Hadoop environments as different sites, such as sandbox, datacenter1, datacenter2. In each site,
+a user can add different data sources as the monitoring targets. For each data source, a connection configuration is required.
+
+#### Step 1: Add Site
+
+The following is an example which creates a new site "Demo" and adds two data sources as its monitoring targets.
+![setup a site](/images/docs/new-site.png)
+
+#### Step 2: Add Configuration
+
+After creating a new site, we need to edit the configuration to connect to the cluster. 
+![hdfs setup](/images/docs/hdfs-setup.png)
+
+
+* HDFS
+
+    * Base case
+
+        You may configure the default path for Hadoop clients to connect to the remote HDFS namenode.
+
+            {"fs.defaultFS":"hdfs://sandbox.hortonworks.com:8020"}
+
+    * HA case
+
+        Basically, you point your fs.defaultFS at your nameservice and let the client know how it is configured (the backing namenodes) and how to fail over between them in HA mode:
+
+            {"fs.defaultFS":"hdfs://nameservice1",
+             "dfs.nameservices": "nameservice1",
+             "dfs.ha.namenodes.nameservice1":"namenode1,namenode2",
+             "dfs.namenode.rpc-address.nameservice1.namenode1": "hadoopnamenode01:8020",
+             "dfs.namenode.rpc-address.nameservice1.namenode2": "hadoopnamenode02:8020",
+             "dfs.client.failover.proxy.provider.nameservice1": "org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider"
+            }
+
+    * Kerberos-secured cluster
+
+        For a Kerberos-secured cluster, you need to get a keytab file and the principal from your admin, then configure "eagle.keytab.file" and "eagle.kerberos.principal" to authenticate Eagle's access.
+
+            { "eagle.keytab.file":"/EAGLE-HOME/.keytab/b_eagle.keytab_apd",
+              "eagle.kerberos.principal":"eagle@APD.EBAY.COM"
+            }
+
+        If there is an exception about "invalid server principal name", you may need to check the DNS resolver, or the data-transfer properties, such as "dfs.encrypt.data.transfer", "dfs.encrypt.data.transfer.algorithm", "dfs.trustedchannel.resolver.class", "dfs.datatransfer.client.encrypt".
+
+* Hive
+    * Basic
+
+            {
+              "accessType": "metastoredb_jdbc",
+              "password": "hive",
+              "user": "hive",
+              "jdbcDriverClassName": "com.mysql.jdbc.Driver",
+              "jdbcUrl": "jdbc:mysql://sandbox.hortonworks.com/hive?createDatabaseIfNotExist=true"
+            }
+
+
+* HBase
+
+    * Basic case
+
+        You need to set the "hbase.zookeeper.quorum" and "hbase.zookeeper.property.clientPort" properties.
+
+            {
+                "hbase.zookeeper.property.clientPort":"2181",
+                "hbase.zookeeper.quorum":"localhost"
+            }
+
+    * Kerberos-secured cluster
+
+        Depending on your environment, you can add or remove some of the following properties. Here is a reference:
+
+            {
+                "hbase.zookeeper.property.clientPort":"2181",
+                "hbase.zookeeper.quorum":"localhost",
+                "hbase.security.authentication":"kerberos",
+                "hbase.master.kerberos.principal":"hadoop/_HOST@EXAMPLE.COM",
+                "zookeeper.znode.parent":"/hbase",
+                "eagle.keytab.file":"/EAGLE-HOME/.keytab/eagle.keytab",
+                "eagle.kerberos.principal":"eagle@EXAMPLE.COM"
+            }
+
+* UserProfile
+
+        {
+          "features": "getfileinfo,open,listStatus,setTimes,setPermission,rename,mkdirs,create,setReplication,contentSummary,delete,setOwner,fsck"
+        }
+
+#### Step 3: Checking the connection
+After the configuration is ready, you can go to the [classification page](/docs/tutorial/classification.html) and browse the data. If the configuration is correct, data will be returned immediately.
+
+For any questions on Kerberos configuration, please first check the [FAQ](/docs/FAQ.html)
\ No newline at end of file

Propchange: incubator/eagle/web/tutorial-setup.md
------------------------------------------------------------------------------
    svn:eol-style = native

Added: incubator/eagle/web/tutorial-userprofile.md
URL: http://svn.apache.org/viewvc/incubator/eagle/web/tutorial-userprofile.md?rev=1728410&view=auto
==============================================================================
--- incubator/eagle/web/tutorial-userprofile.md (added)
+++ incubator/eagle/web/tutorial-userprofile.md Thu Feb  4 06:27:16 2016
@@ -0,0 +1,65 @@
+---
+layout: doc
+title:  "User Profile Tutorial"
+permalink: /docs/tutorial/userprofile.html
+---
+This document introduces how to start online processing of user profiles. It assumes Eagle has been installed and the [Eagle service](http://sandbox.hortonworks.com:9099/eagle-service)
+is started.
+
+### User Profile Offline Training
+
+* **Step 1**: Start Spark if not started
+![Start Spark](/images/docs/start-spark.png)
+
+* **Step 2**: Start the offline scheduler
+
+	* Option 1: command line
+
+	      $ cd <eagle-home>
+	      $ bin/eagle-userprofile-scheduler.sh --site sandbox start
+
+	* Option 2: start via Ambari
+	![Click "ops"](/images/docs/offline-userprofile.png)
+
+* **Step 3**: Generate a model
+
+	![Click "ops"](/images/docs/userProfile1.png)
+	![Click "Update Now"](/images/docs/userProfile2.png)
+	![Click "Confirm"](/images/docs/userProfile3.png)
+	![Check](/images/docs/userprofile4.png)
+
+### User Profile Online Detection
+
+Two options to start the topology are provided.
+
+* **Option 1**: command line
+
+	Submit the userProfiles topology if it is not already shown on the [topology UI](http://sandbox.hortonworks.com:8744)
+
+      $ bin/eagle-topology.sh --main eagle.security.userprofile.UserProfileDetectionMain --config conf/sandbox-userprofile-topology.conf start
+
+* **Option 2**: Ambari
+	
+	![Online userProfiles](/images/docs/online-userprofile.png)
+
+### Evaluate User Profile in Sandbox
+
+1. Prepare sample data for ML training and validation
+* a. Download the following sample data to be used for training:
+	* [`user1.hdfs-audit.2015-10-11-00.txt`](/data/user1.hdfs-audit.2015-10-11-00.txt) 
+	* [`user1.hdfs-audit.2015-10-11-01.txt`](/data/user1.hdfs-audit.2015-10-11-01.txt)
+* b. Download the [`userprofile-validate.txt`](/data/userprofile-validate.txt) file, which contains data points you can use to test the models
+
+2. Copy the files (downloaded in the previous step) into a location in sandbox 
+For example: `/usr/hdp/current/eagle/lib/userprofile/data/`
+3. Modify `<Eagle-home>/conf/sandbox-userprofile-scheduler.conf`:
+update `training-audit-path` to point to the training data sample (the path you used in Step 1.a);
+update `detection-audit-path` to point to the validation data (the path you used in Step 1.b)
+4. Run ML training program from eagle UI
+5. Produce Kafka data using the contents of the validation file (Step 1.b). 
+Run the following command (assuming the Eagle configuration uses Kafka topic `sandbox_hdfs_audit_log`): 
+
+		./kafka-console-producer.sh --broker-list sandbox.hortonworks.com:6667 --topic sandbox_hdfs_audit_log
+
+6. Paste a few lines of data from the validation file into kafka-console-producer. 
+Check [http://localhost:9099/eagle-service/#/dam/alertList](http://localhost:9099/eagle-service/#/dam/alertList) for generated alerts. 

Propchange: incubator/eagle/web/tutorial-userprofile.md
------------------------------------------------------------------------------
    svn:eol-style = native

Added: incubator/eagle/web/usecase.md
URL: http://svn.apache.org/viewvc/incubator/eagle/web/usecase.md?rev=1728410&view=auto
==============================================================================
--- incubator/eagle/web/usecase.md (added)
+++ incubator/eagle/web/usecase.md Thu Feb  4 06:27:16 2016
@@ -0,0 +1,14 @@
+---
+layout: doc
+title:  "Use Cases" 
+permalink: /docs/usecase.html
+---
+
+Here are some of the Eagle data activity monitoring use cases for Hadoop:
+
+* Detect anomalous data access
+* Monitor data access traffic
+* Discover intrusions and security breaches
+* Discover and prevent sensitive data loss and leaks
+* Monitor data access based on user behavior patterns
+* Analyze access logs for audit purposes
\ No newline at end of file

Propchange: incubator/eagle/web/usecase.md
------------------------------------------------------------------------------
    svn:eol-style = native

Added: incubator/eagle/web/user-profile-ml.md
URL: http://svn.apache.org/viewvc/incubator/eagle/web/user-profile-ml.md?rev=1728410&view=auto
==============================================================================
--- incubator/eagle/web/user-profile-ml.md (added)
+++ incubator/eagle/web/user-profile-ml.md Thu Feb  4 06:27:16 2016
@@ -0,0 +1,23 @@
+---
+layout: doc
+title:  "User Profile Machine Learning" 
+permalink: /docs/user-profile-ml.html
+---
+
+Eagle provides capabilities to define user activity patterns or user profiles for Hadoop users based on the user behavior in the platform. The idea is to provide anomaly detection capability without setting hard thresholds in the system. The user profiles generated by our system are modeled using machine-learning algorithms and used for detection of anomalous user activities, where users’ activity pattern differs from their pattern history. Currently Eagle uses two algorithms for anomaly detection: Eigen-Value Decomposition and Density Estimation. The algorithms read data from HDFS audit logs, slice and dice data, and generate models for each user in the system. Once models are generated, Eagle uses the Storm framework for near-real-time anomaly detection to determine if current user activities are suspicious or not with respect to their model. The block diagram below shows the current pipeline for user profile training and online detection.
+
+![](/images/docs/userprofile-arch.png)
+
+Eagle online anomaly detection uses the Eagle policy framework, and the user profile is defined as one of the policies in the system. The user profile policy is evaluated by a machine-learning evaluator extended from the Eagle policy evaluator. Policy definition includes the features that are needed for anomaly detection (same as the ones used for training purposes).
+
+A scheduler runs a Spark-based offline training program (to generate user profiles or models) at a configurable time interval; currently, the training program generates new models once every month.
+
+The following are some details on the algorithms.
+
+* **Density Estimation**: In this algorithm, the idea is to evaluate, for each user, a probability density function from the observed training data sample. We mean-normalize a training dataset for each feature. Normalization allows datasets to be on the same scale. In our probability density estimation, we use a Gaussian distribution function as the method for computing probability density. We assume features are conditionally independent of one another; therefore, the final Gaussian probability density can be computed by factorizing each feature’s probability density. During the online detection phase, we compute the probability of a user’s activity. If the probability of the user performing the activity is below a threshold (determined from the training program, using a method called the Matthews Correlation Coefficient), we signal anomaly alerts.
+* **Eigen-Value Decomposition**: Our goal in user profile generation is to find interesting behavioral patterns for users. One way to achieve that goal is to consider a combination of features and see how each one influences the others. When the data volume is large, which is generally the case for us, abnormal patterns among features may go unnoticed due to the huge number of normal patterns. As normal behavioral patterns can lie within a very low-dimensional subspace, we can potentially reduce the dimension of the dataset to better understand the user behavior pattern. This method also reduces noise, if any, in the training dataset. Based on the amount of variance of the data we maintain for a user, which is usually 95% for our case, we seek to find the number of principal components k that represents 95% of the variance. We consider the first k principal components as the normal subspace for the user. The remaining (n-k) principal components are considered the abnormal subspace.
+
+During online anomaly detection, if the user behavior lies near the normal subspace, we consider the behavior to be normal. On the other hand, if the user behavior lies near the abnormal subspace, we raise an alarm, as we believe usual user behavior should generally fall within the normal subspace. We use the Euclidean distance method to compute whether a user’s current activity is near the normal or abnormal subspace.
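To make the density-estimation idea concrete, here is a toy sketch (not Eagle's actual implementation; the feature values and the threshold below are made up for illustration):

```python
import math

def fit_gaussians(rows):
    """Per-feature mean and variance from training rows (equal-length feature lists)."""
    n = len(rows)
    dims = len(rows[0])
    means = [sum(r[d] for r in rows) / n for d in range(dims)]
    variances = [sum((r[d] - means[d]) ** 2 for r in rows) / n for d in range(dims)]
    return means, variances

def density(x, means, variances):
    """Product of per-feature Gaussian densities (features assumed independent)."""
    p = 1.0
    for v, mu, var in zip(x, means, variances):
        p *= math.exp(-((v - mu) ** 2) / (2 * var)) / math.sqrt(2 * math.pi * var)
    return p

# Made-up training sample: counts of two HDFS operations per hour for one user.
train = [[10.0, 2.0], [12.0, 3.0], [11.0, 2.0], [9.0, 3.0]]
means, variances = fit_gaussians(train)
threshold = 1e-6  # in Eagle this comes from training (Matthews Correlation Coefficient)

def is_anomalous(x):
    return density(x, means, variances) < threshold

print(is_anomalous([11.0, 2.0]))    # activity close to the profile
print(is_anomalous([500.0, 90.0]))  # activity far outside the profile
```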
+
+![](/images/docs/userprofile-model.png)
+

Propchange: incubator/eagle/web/user-profile-ml.md
------------------------------------------------------------------------------
    svn:eol-style = native


