Mailing-List: contact hadoop-commits-help@lucene.apache.org; run by ezmlm Reply-To: hadoop-dev@lucene.apache.org Delivered-To: mailing list hadoop-commits@lucene.apache.org Subject: svn commit: r582443 - in /lucene/hadoop/trunk/src/contrib/hbase: ./ src/java/org/apache/hadoop/hbase/generated/ src/java/org/apache/hadoop/hbase/generated/master/ src/java/org/apache/hadoop/hbase/generated/regionserver/ src/java/org/apache/hadoop/hbase... 
Date: Sat, 06 Oct 2007 03:49:46 -0000 To: hadoop-commits@lucene.apache.org From: stack@apache.org X-Mailer: svnmailer-1.0.8 Message-Id: <20071006034948.AD1CF1A9832@eris.apache.org> X-Virus-Checked: Checked by ClamAV on apache.org Author: stack Date: Fri Oct 5 20:49:43 2007 New Revision: 582443 URL: http://svn.apache.org/viewvc?rev=582443&view=rev Log: HADOOP-1957 Web UI with report on cluster state and basic browsing of tables Added: lucene/hadoop/trunk/src/contrib/hbase/build-webapps.xml lucene/hadoop/trunk/src/contrib/hbase/src/java/org/apache/hadoop/hbase/generated/ lucene/hadoop/trunk/src/contrib/hbase/src/java/org/apache/hadoop/hbase/generated/master/ lucene/hadoop/trunk/src/contrib/hbase/src/java/org/apache/hadoop/hbase/generated/master/hql_jsp.java lucene/hadoop/trunk/src/contrib/hbase/src/java/org/apache/hadoop/hbase/generated/master/master_jsp.java lucene/hadoop/trunk/src/contrib/hbase/src/java/org/apache/hadoop/hbase/generated/regionserver/ lucene/hadoop/trunk/src/contrib/hbase/src/java/org/apache/hadoop/hbase/generated/regionserver/regionserver_jsp.java lucene/hadoop/trunk/src/contrib/hbase/src/java/org/apache/hadoop/hbase/shell/formatter/ lucene/hadoop/trunk/src/contrib/hbase/src/java/org/apache/hadoop/hbase/shell/formatter/AsciiTableFormatter.java lucene/hadoop/trunk/src/contrib/hbase/src/java/org/apache/hadoop/hbase/shell/formatter/HtmlTableFormatter.java lucene/hadoop/trunk/src/contrib/hbase/src/java/org/apache/hadoop/hbase/util/InfoServer.java lucene/hadoop/trunk/src/contrib/hbase/src/test/org/apache/hadoop/hbase/TestInfoServers.java lucene/hadoop/trunk/src/contrib/hbase/src/test/org/apache/hadoop/hbase/TestSerialization.java lucene/hadoop/trunk/src/contrib/hbase/src/webapps/ lucene/hadoop/trunk/src/contrib/hbase/src/webapps/master/ lucene/hadoop/trunk/src/contrib/hbase/src/webapps/master/WEB-INF/ lucene/hadoop/trunk/src/contrib/hbase/src/webapps/master/WEB-INF/web.xml lucene/hadoop/trunk/src/contrib/hbase/src/webapps/master/hql.jsp 
lucene/hadoop/trunk/src/contrib/hbase/src/webapps/master/index.html lucene/hadoop/trunk/src/contrib/hbase/src/webapps/master/master.jsp lucene/hadoop/trunk/src/contrib/hbase/src/webapps/regionserver/ lucene/hadoop/trunk/src/contrib/hbase/src/webapps/regionserver/WEB-INF/ lucene/hadoop/trunk/src/contrib/hbase/src/webapps/regionserver/WEB-INF/web.xml lucene/hadoop/trunk/src/contrib/hbase/src/webapps/regionserver/index.html lucene/hadoop/trunk/src/contrib/hbase/src/webapps/regionserver/regionserver.jsp lucene/hadoop/trunk/src/contrib/hbase/src/webapps/static/ lucene/hadoop/trunk/src/contrib/hbase/src/webapps/static/hbase.css Removed: lucene/hadoop/trunk/src/contrib/hbase/src/java/org/apache/hadoop/hbase/shell/HelpContents.java lucene/hadoop/trunk/src/contrib/hbase/src/java/org/apache/hadoop/hbase/shell/HelpManager.java Modified: lucene/hadoop/trunk/src/contrib/hbase/src/test/hbase-site.xml Added: lucene/hadoop/trunk/src/contrib/hbase/build-webapps.xml URL: http://svn.apache.org/viewvc/lucene/hadoop/trunk/src/contrib/hbase/build-webapps.xml?rev=582443&view=auto ============================================================================== --- lucene/hadoop/trunk/src/contrib/hbase/build-webapps.xml (added) +++ lucene/hadoop/trunk/src/contrib/hbase/build-webapps.xml Fri Oct 5 20:49:43 2007 @@ -0,0 +1,80 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + Added: lucene/hadoop/trunk/src/contrib/hbase/src/java/org/apache/hadoop/hbase/generated/master/hql_jsp.java URL: http://svn.apache.org/viewvc/lucene/hadoop/trunk/src/contrib/hbase/src/java/org/apache/hadoop/hbase/generated/master/hql_jsp.java?rev=582443&view=auto ============================================================================== --- lucene/hadoop/trunk/src/contrib/hbase/src/java/org/apache/hadoop/hbase/generated/master/hql_jsp.java (added) +++ lucene/hadoop/trunk/src/contrib/hbase/src/java/org/apache/hadoop/hbase/generated/master/hql_jsp.java Fri Oct 5 20:49:43 2007 @@ -0,0 +1,93 @@ +package 
org.apache.hadoop.hbase.generated.master; + +import javax.servlet.*; +import javax.servlet.http.*; +import javax.servlet.jsp.*; +import java.util.*; +import org.apache.hadoop.hbase.HBaseConfiguration; +import org.apache.hadoop.hbase.shell.TableFormatter; +import org.apache.hadoop.hbase.shell.ReturnMsg; +import org.apache.hadoop.hbase.shell.generated.Parser; +import org.apache.hadoop.hbase.shell.Command; +import org.apache.hadoop.hbase.shell.formatter.HtmlTableFormatter; + +public final class hql_jsp extends org.apache.jasper.runtime.HttpJspBase + implements org.apache.jasper.runtime.JspSourceDependent { + + private static java.util.Vector _jspx_dependants; + + public java.util.List getDependants() { + return _jspx_dependants; + } + + public void _jspService(HttpServletRequest request, HttpServletResponse response) + throws java.io.IOException, ServletException { + + JspFactory _jspxFactory = null; + PageContext pageContext = null; + HttpSession session = null; + ServletContext application = null; + ServletConfig config = null; + JspWriter out = null; + Object page = this; + JspWriter _jspx_out = null; + PageContext _jspx_page_context = null; + + + try { + _jspxFactory = JspFactory.getDefaultFactory(); + response.setContentType("text/html;charset=UTF-8"); + pageContext = _jspxFactory.getPageContext(this, request, response, + null, true, 8192, true); + _jspx_page_context = pageContext; + application = pageContext.getServletContext(); + config = pageContext.getServletConfig(); + session = pageContext.getSession(); + out = pageContext.getOut(); + _jspx_out = out; + + out.write("\n \n\n\nHQL\n\n\n\n\n

HQL

\n

Home

\n"); + String query = request.getParameter("query"); + if (query == null) { + query = ""; + } + + out.write("\n
\n

\n \n \n \n

\n
\n

Enter 'help;' -- that's 'help' plus a semi-colon -- for a list of HQL commands.\n Data Definition, SHELL, INSERTS, DELETES, and UPDATE commands are disabled in this interface.\n

\n \n "); + + if (query.length() > 0) { + + out.write("\n
\n "); + + Parser parser = new Parser(query, out, new HtmlTableFormatter(out)); + Command cmd = parser.terminatedCommand(); + if (cmd.getCommandType() != Command.CommandType.SELECT) { + + out.write("\n

"); + out.print( cmd.getCommandType() ); + out.write("-type commands are disabled in this interface.

\n "); + + } else { + ReturnMsg rm = cmd.execute(new HBaseConfiguration()); + String summary = rm == null? "": rm.toString(); + + out.write("\n

"); + out.print( summary ); + out.write("

\n "); + } + } + + out.write("\n\n"); + } catch (Throwable t) { + if (!(t instanceof SkipPageException)){ + out = _jspx_out; + if (out != null && out.getBufferSize() != 0) + out.clearBuffer(); + if (_jspx_page_context != null) _jspx_page_context.handlePageException(t); + } + } finally { + if (_jspxFactory != null) _jspxFactory.releasePageContext(_jspx_page_context); + } + } +} Added: lucene/hadoop/trunk/src/contrib/hbase/src/java/org/apache/hadoop/hbase/generated/master/master_jsp.java URL: http://svn.apache.org/viewvc/lucene/hadoop/trunk/src/contrib/hbase/src/java/org/apache/hadoop/hbase/generated/master/master_jsp.java?rev=582443&view=auto ============================================================================== --- lucene/hadoop/trunk/src/contrib/hbase/src/java/org/apache/hadoop/hbase/generated/master/master_jsp.java (added) +++ lucene/hadoop/trunk/src/contrib/hbase/src/java/org/apache/hadoop/hbase/generated/master/master_jsp.java Fri Oct 5 20:49:43 2007 @@ -0,0 +1,140 @@ +package org.apache.hadoop.hbase.generated.master; + +import javax.servlet.*; +import javax.servlet.http.*; +import javax.servlet.jsp.*; +import java.util.*; +import org.apache.hadoop.io.Text; +import org.apache.hadoop.hbase.HMaster; +import org.apache.hadoop.hbase.HConstants; +import org.apache.hadoop.hbase.HMaster.MetaRegion; +import org.apache.hadoop.hbase.HBaseAdmin; +import org.apache.hadoop.hbase.HServerInfo; +import org.apache.hadoop.hbase.HServerAddress; +import org.apache.hadoop.hbase.HRegionInfo; +import org.apache.hadoop.hbase.HBaseConfiguration; +import org.apache.hadoop.hbase.shell.ShowCommand; +import org.apache.hadoop.hbase.shell.TableFormatter; +import org.apache.hadoop.hbase.shell.ReturnMsg; +import org.apache.hadoop.hbase.shell.formatter.HtmlTableFormatter; +import org.apache.hadoop.hbase.HTableDescriptor; + +public final class master_jsp extends org.apache.jasper.runtime.HttpJspBase + implements org.apache.jasper.runtime.JspSourceDependent { + + private static 
java.util.Vector _jspx_dependants; + + public java.util.List getDependants() { + return _jspx_dependants; + } + + public void _jspService(HttpServletRequest request, HttpServletResponse response) + throws java.io.IOException, ServletException { + + JspFactory _jspxFactory = null; + PageContext pageContext = null; + HttpSession session = null; + ServletContext application = null; + ServletConfig config = null; + JspWriter out = null; + Object page = this; + JspWriter _jspx_out = null; + PageContext _jspx_page_context = null; + + + try { + _jspxFactory = JspFactory.getDefaultFactory(); + response.setContentType("text/html;charset=UTF-8"); + pageContext = _jspxFactory.getPageContext(this, request, response, + null, true, 8192, true); + _jspx_page_context = pageContext; + application = pageContext.getServletContext(); + config = pageContext.getServletConfig(); + session = pageContext.getSession(); + out = pageContext.getOut(); + _jspx_out = out; + + + HMaster master = (HMaster)getServletContext().getAttribute(HMaster.MASTER); + HBaseConfiguration conf = new HBaseConfiguration(); + TableFormatter formatter = new HtmlTableFormatter(out); + ShowCommand show = new ShowCommand(out, formatter, "tables"); + HServerAddress rootLocation = master.getRootRegionLocation(); + Map onlineRegions = master.getOnlineMetaRegions(); + Map serverToServerInfos = + master.getServersToServerInfo(); + + out.write("\n \n\n\nHbase Master: "); + out.print( master.getMasterAddress()); + out.write("\n\n\n\n\n

Hbase Master: "); + out.print(master.getMasterAddress()); + out.write("

\n

HQL,\nLocal logs, Thread Dump

\n\n

Master Attributes

\n\n\n\n\n
Attribute NameValue
Filesystem"); + out.print( conf.get("fs.default.name") ); + out.write("
Hbase Root Directory"); + out.print( master.getRootDir().toString() ); + out.write("
\n\n

Online META Regions

\n"); + if (rootLocation != null) { + out.write("\n\n\n\n"); + + if (onlineRegions != null && onlineRegions.size() > 0) { + out.write('\n'); + out.write(' '); + out.write(' '); + for (Map.Entry e: onlineRegions.entrySet()) { + MetaRegion meta = e.getValue(); + + out.write("\n \n "); + } + } + out.write("\n
NameServer
"); + out.print( HConstants.ROOT_TABLE_NAME.toString() ); + out.write(""); + out.print( rootLocation.toString() ); + out.write("
"); + out.print( meta.getRegionName().toString() ); + out.write(""); + out.print( meta.getServer().toString() ); + out.write("
\n"); + } + out.write("\n\n

Tables

\n"); + ReturnMsg msg = show.execute(conf); + out.write("\n

"); + out.print(msg ); + out.write("

\n\n

Region Servers

\n"); + if (serverToServerInfos != null && serverToServerInfos.size() > 0) { + out.write("\n\n\n"); + for (Map.Entry e: serverToServerInfos.entrySet()) { + HServerInfo hsi = e.getValue(); + String url = "http://" + + hsi.getServerAddress().getBindAddress().toString() + ":" + + hsi.getInfoPort() + "/"; + String load = hsi.getLoad().toString(); + long startCode = hsi.getStartCode(); + String address = hsi.getServerAddress().toString(); + + out.write("\n\n"); + } + out.write("\n
AddressStart CodeLoad
'); + out.print( address ); + out.write(""); + out.print( startCode ); + out.write(""); + out.print( load ); + out.write("
\n"); + } + out.write("\n\n"); + } catch (Throwable t) { + if (!(t instanceof SkipPageException)){ + out = _jspx_out; + if (out != null && out.getBufferSize() != 0) + out.clearBuffer(); + if (_jspx_page_context != null) _jspx_page_context.handlePageException(t); + } + } finally { + if (_jspxFactory != null) _jspxFactory.releasePageContext(_jspx_page_context); + } + } +} Added: lucene/hadoop/trunk/src/contrib/hbase/src/java/org/apache/hadoop/hbase/generated/regionserver/regionserver_jsp.java URL: http://svn.apache.org/viewvc/lucene/hadoop/trunk/src/contrib/hbase/src/java/org/apache/hadoop/hbase/generated/regionserver/regionserver_jsp.java?rev=582443&view=auto ============================================================================== --- lucene/hadoop/trunk/src/contrib/hbase/src/java/org/apache/hadoop/hbase/generated/regionserver/regionserver_jsp.java (added) +++ lucene/hadoop/trunk/src/contrib/hbase/src/java/org/apache/hadoop/hbase/generated/regionserver/regionserver_jsp.java Fri Oct 5 20:49:43 2007 @@ -0,0 +1,88 @@ +package org.apache.hadoop.hbase.generated.regionserver; + +import javax.servlet.*; +import javax.servlet.http.*; +import javax.servlet.jsp.*; +import java.util.*; +import org.apache.hadoop.io.Text; +import org.apache.hadoop.hbase.HRegionServer; +import org.apache.hadoop.hbase.HRegion; +import org.apache.hadoop.hbase.HConstants; +import org.apache.hadoop.hbase.HServerInfo; +import org.apache.hadoop.hbase.HRegionInfo; + +public final class regionserver_jsp extends org.apache.jasper.runtime.HttpJspBase + implements org.apache.jasper.runtime.JspSourceDependent { + + private static java.util.Vector _jspx_dependants; + + public java.util.List getDependants() { + return _jspx_dependants; + } + + public void _jspService(HttpServletRequest request, HttpServletResponse response) + throws java.io.IOException, ServletException { + + JspFactory _jspxFactory = null; + PageContext pageContext = null; + HttpSession session = null; + ServletContext application = 
null; + ServletConfig config = null; + JspWriter out = null; + Object page = this; + JspWriter _jspx_out = null; + PageContext _jspx_page_context = null; + + + try { + _jspxFactory = JspFactory.getDefaultFactory(); + response.setContentType("text/html;charset=UTF-8"); + pageContext = _jspxFactory.getPageContext(this, request, response, + null, true, 8192, true); + _jspx_page_context = pageContext; + application = pageContext.getServletContext(); + config = pageContext.getServletConfig(); + session = pageContext.getSession(); + out = pageContext.getOut(); + _jspx_out = out; + + + HRegionServer regionServer = (HRegionServer)getServletContext().getAttribute(HRegionServer.REGIONSERVER); + HServerInfo serverInfo = regionServer.getServerInfo(); + SortedMap onlineRegions = regionServer.getOnlineRegions(); + + out.write("\n \n\n\nHbase Region Server: "); + out.print( serverInfo.getServerAddress().toString() ); + out.write("\n\n\n\n\n

Hbase Region Server: "); + out.print( serverInfo.getServerAddress().toString() ); + out.write("

\n

Local logs, Thread Dump

\n\n

Region Server Attributes

\n\n\n\n
Attribute NameValue
Load"); + out.print( serverInfo.getLoad().toString() ); + out.write("
\n\n

Online Regions

\n"); + if (onlineRegions != null && onlineRegions.size() > 0) { + out.write("\n\n\n"); + for (HRegion r: onlineRegions.values()) { + out.write("\n\n"); + } + out.write("\n
Region NameStart KeyEnd Key
"); + out.print( r.getRegionName().toString() ); + out.write(""); + out.print( r.getStartKey().toString() ); + out.write(""); + out.print( r.getEndKey().toString() ); + out.write("
\n

Region names are made of the containing table's name, a comma,\nthe start key, a comma, and a randomly generated region id. To illustrate,\nthe region named\ndomains,apache.org,5464829424211263407 is party to the table \ndomains, has an id of 5464829424211263407, and the first key\nin the region is apache.org. The -ROOT-\nand .META. 'tables' are internal system tables.\nThe -ROOT- table keeps a list of all regions in the .META. table. The .META. table\nkeeps a list of all regions in the system. The empty key is used to denote\ntable start and table end. A region with an\nempty start key is the first region in a table. If a region has both an empty\nstart and an empty end key, it's the only region in the table. See\nHbase Home for\nfurther explication.

\n"); + } else { + out.write("\n

Not serving regions

\n"); + } + out.write("\n\n"); + } catch (Throwable t) { + if (!(t instanceof SkipPageException)){ + out = _jspx_out; + if (out != null && out.getBufferSize() != 0) + out.clearBuffer(); + if (_jspx_page_context != null) _jspx_page_context.handlePageException(t); + } + } finally { + if (_jspxFactory != null) _jspxFactory.releasePageContext(_jspx_page_context); + } + } +} Added: lucene/hadoop/trunk/src/contrib/hbase/src/java/org/apache/hadoop/hbase/shell/formatter/AsciiTableFormatter.java URL: http://svn.apache.org/viewvc/lucene/hadoop/trunk/src/contrib/hbase/src/java/org/apache/hadoop/hbase/shell/formatter/AsciiTableFormatter.java?rev=582443&view=auto ============================================================================== --- lucene/hadoop/trunk/src/contrib/hbase/src/java/org/apache/hadoop/hbase/shell/formatter/AsciiTableFormatter.java (added) +++ lucene/hadoop/trunk/src/contrib/hbase/src/java/org/apache/hadoop/hbase/shell/formatter/AsciiTableFormatter.java Fri Oct 5 20:49:43 2007 @@ -0,0 +1,151 @@ +package org.apache.hadoop.hbase.shell.formatter; + +import java.io.IOException; +import java.io.Writer; + +import org.apache.hadoop.hbase.shell.TableFormatter; + + +/** + * Formatter that outputs data inside an ASCII table. + * If only a single cell result, then no formatting is done. Presumption is + * that client manages serial access outputting tables. Does not close passed + * {@link Writer}. + */ +public class AsciiTableFormatter implements TableFormatter { + private static final String COLUMN_DELIMITER = "| "; + private static final String COLUMN_CLOSER = "|"; + private static final int DEFAULT_COLUMN_WIDTH = 26; + // Width is a line of content + delimiter + private int columnWidth = DEFAULT_COLUMN_WIDTH; + // Amount of width to use for a line of content. 
+ private int columnContentWidth = + DEFAULT_COLUMN_WIDTH - COLUMN_DELIMITER.length(); + // COLUMN_LINE is put at head and foot of a column and per column, is drawn + // as row delimiter + private String columnHorizLine; + private final String COLUMN_HORIZ_LINE_CLOSER = "+"; + // Used padding content to fill column + private final String PADDING_CHAR = " "; + // True if we are to output no formatting. + private boolean noFormatting = false; + private final Writer out; + private final String LINE_SEPARATOR = System.getProperty("line.separator"); + + // Not instantiable + @SuppressWarnings("unused") + private AsciiTableFormatter() { + this(null); + } + + public AsciiTableFormatter(final Writer o) { + this.out = o; + } + + public Writer getOut() { + return this.out; + } + + /** + * @param titles List of titles. Pass null if no formatting (i.e. + * no header, no footer, etc. + * @throws IOException + */ + public void header(String[] titles) throws IOException { + if (titles == null) { + // print nothing. + setNoFormatting(true); + return; + } + // Calculate width of columns. + this.columnWidth = titles.length == 1? 3 * DEFAULT_COLUMN_WIDTH: + titles.length == 2? 39: DEFAULT_COLUMN_WIDTH; + this.columnContentWidth = this.columnWidth - COLUMN_DELIMITER.length(); + // Create the horizontal line to draw across the top of each column. + this.columnHorizLine = calculateColumnHorizLine(this.columnWidth); + // Print out a column topper per column. + printRowDelimiter(titles.length); + row(titles); + } + + public void row(String [] cells) throws IOException { + if (isNoFormatting()) { + getOut().write(cells[0]); + getOut().flush(); + return; + } + // Ok. Output cells a line at a time w/ delimiters between cells. 
+ int [] indexes = new int[cells.length]; + for (int i = 0; i < indexes.length; i++) { + indexes[i] = 0; + } + int allFinished = 0; + while (allFinished < indexes.length) { + StringBuffer sb = new StringBuffer(); + for (int i = 0; i < cells.length; i++) { + sb.append(COLUMN_DELIMITER); + int offset = indexes[i]; + if (offset + this.columnContentWidth >= cells[i].length()) { + String substr = cells[i].substring(offset); + if (substr.length() > 0) { + // This column is finished + allFinished++; + sb.append(substr); + } + for (int j = 0; j < this.columnContentWidth - substr.length(); j++) { + sb.append(PADDING_CHAR); + } + indexes[i] = cells[i].length(); + } else { + String substr = cells[i].substring(indexes[i], + indexes[i] + this.columnContentWidth); + indexes[i] += this.columnContentWidth; + sb.append(substr); + } + } + sb.append(COLUMN_CLOSER); + getOut().write(sb.toString()); + getOut().write(LINE_SEPARATOR); + getOut().flush(); + } + printRowDelimiter(cells.length); + } + + public void footer() throws IOException { + if (isNoFormatting()) { + // If no formatting, output a newline to delimit cell and the + // result summary output at end of every command. + getOut().write(LINE_SEPARATOR); + getOut().flush(); + } + // We're done. Clear flag. 
+ setNoFormatting(false); + } + + private void printRowDelimiter(final int columnCount) throws IOException { + for (int i = 0; i < columnCount; i++) { + getOut().write(this.columnHorizLine); + + } + getOut().write(COLUMN_HORIZ_LINE_CLOSER); + getOut().write(LINE_SEPARATOR); + getOut().flush(); + } + + private String calculateColumnHorizLine(final int width) { + StringBuffer sb = new StringBuffer(); + sb.append("+"); + for (int i = 1; i < width; i++) { + sb.append("-"); + } + return sb.toString(); + } + + public boolean isNoFormatting() { + return this.noFormatting; + } + + public void setNoFormatting(boolean noFormatting) { + this.noFormatting = noFormatting; + } +} \ No newline at end of file Added: lucene/hadoop/trunk/src/contrib/hbase/src/java/org/apache/hadoop/hbase/shell/formatter/HtmlTableFormatter.java URL: http://svn.apache.org/viewvc/lucene/hadoop/trunk/src/contrib/hbase/src/java/org/apache/hadoop/hbase/shell/formatter/HtmlTableFormatter.java?rev=582443&view=auto ============================================================================== --- lucene/hadoop/trunk/src/contrib/hbase/src/java/org/apache/hadoop/hbase/shell/formatter/HtmlTableFormatter.java (added) +++ lucene/hadoop/trunk/src/contrib/hbase/src/java/org/apache/hadoop/hbase/shell/formatter/HtmlTableFormatter.java Fri Oct 5 20:49:43 2007 @@ -0,0 +1,111 @@ +package org.apache.hadoop.hbase.shell.formatter; + +import java.io.IOException; +import java.io.Writer; + +import org.apache.hadoop.hbase.shell.TableFormatter; +import org.znerd.xmlenc.LineBreak; +import org.znerd.xmlenc.XMLOutputter; + +/** + * Formatter that outputs data inside an HTML table. + * If only a single cell result, then no formatting is done. Presumption is + * that client manages serial access outputting tables. Does not close passed + * {@link Writer}. + *

TODO: Uses xmlenc. Hopefully it flushes every so often (it claims to be a + * stream-based outputter). Verify. + *

For now, invoke it this way (until shell starts to take cmdline params); + * $ HBASE_OPTS='-Dhbaseshell.formatter=org.apache.hadoop.hbase.shell.TableFormatterFactory$HtmlTableFormatter' ./bin/hbase shell + */ +public class HtmlTableFormatter implements TableFormatter { + private final XMLOutputter outputter; + private boolean noFormatting = false; + private final Writer out; + + // Uninstantiable + @SuppressWarnings("unused") + private HtmlTableFormatter() { + this(null); + } + + public HtmlTableFormatter(final Writer o) { + this.out = o; + try { + // Looking at the xmlenc source, there should be no issue w/ wrapping + // the stream -- i.e. no hanging resources. + this.outputter = new XMLOutputter(this.out, "UTF-8"); + String os = System.getProperty("os.name").toLowerCase(); + // Shell likes the DOS output. + this.outputter.setLineBreak(os.contains("windows")? + LineBreak.DOS: LineBreak.UNIX); + this.outputter.setIndentation(" "); + } catch (Exception e) { + throw new RuntimeException(e); + } + } + + + /** + * @param titles List of titles. Pass null if no formatting (i.e. + * no header, no footer, etc. + * @throws IOException + */ + public void header(String[] titles) throws IOException { + if (titles == null) { + // print nothing. 
+ setNoFormatting(true); + return; + } + // Can't add a 'border=1' attribute because its included on the end in + + this.outputter.startTag("table"); + this.outputter.startTag("tr"); + for (int i = 0; i < titles.length; i++) { + this.outputter.startTag("th"); + this.outputter.pcdata(titles[i]); + this.outputter.endTag(); + } + this.outputter.endTag(); + } + + public void row(String [] cells) throws IOException{ + if (isNoFormatting()) { + this.outputter.pcdata(cells[0]); + return; + } + this.outputter.startTag("tr"); + for (int i = 0; i < cells.length; i++) { + this.outputter.startTag("td"); + this.outputter.pcdata(cells[i]); + this.outputter.endTag(); + } + this.outputter.endTag(); + } + + public void footer() throws IOException { + if (!isNoFormatting()) { + // To close the table + this.outputter.endTag(); + this.outputter.endDocument(); + } + // We're done. Clear flag. + this.setNoFormatting(false); + // If no formatting, output a newline to delimit cell and the + // result summary output at end of every command. If html, also emit a + // newline to delimit html and summary line. 
+ getOut().write(System.getProperty("line.separator")); + getOut().flush(); + } + + public Writer getOut() { + return this.out; + } + + public boolean isNoFormatting() { + return this.noFormatting; + } + + public void setNoFormatting(boolean noFormatting) { + this.noFormatting = noFormatting; + } +} \ No newline at end of file Added: lucene/hadoop/trunk/src/contrib/hbase/src/java/org/apache/hadoop/hbase/util/InfoServer.java URL: http://svn.apache.org/viewvc/lucene/hadoop/trunk/src/contrib/hbase/src/java/org/apache/hadoop/hbase/util/InfoServer.java?rev=582443&view=auto ============================================================================== --- lucene/hadoop/trunk/src/contrib/hbase/src/java/org/apache/hadoop/hbase/util/InfoServer.java (added) +++ lucene/hadoop/trunk/src/contrib/hbase/src/java/org/apache/hadoop/hbase/util/InfoServer.java Fri Oct 5 20:49:43 2007 @@ -0,0 +1,229 @@ +/** + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ */ +package org.apache.hadoop.hbase.util; + +import java.io.File; +import java.io.FileNotFoundException; +import java.io.IOException; +import java.net.URL; + +import javax.servlet.http.HttpServlet; + +import org.apache.commons.logging.Log; +import org.apache.commons.logging.LogFactory; +import org.apache.hadoop.mapred.StatusHttpServer; +import org.mortbay.http.HttpContext; +import org.mortbay.http.SocketListener; +import org.mortbay.http.handler.ResourceHandler; +import org.mortbay.jetty.servlet.WebApplicationContext; + +/** + * Create a Jetty embedded server to answer http requests. The primary goal + * is to serve up status information for the server. + * There are three contexts: + * "/stacks/" -> points to stack trace + * "/static/" -> points to common static files (src/webapps/static) + * "/" -> the jsp server code from (src/webapps/) + */ +public class InfoServer { + // Bulk of this class is copied from + // {@link org.apache.hadoop.mapred.StatusHttpServer}. StatusHttpServer + // is not amenable to subclassing. It keeps webAppContext inaccessible + // and will find webapps only in the jar the class StatusHttpServer was + // loaded from. + private static final Log LOG = LogFactory.getLog(InfoServer.class.getName()); + private org.mortbay.jetty.Server webServer; + private SocketListener listener; + private boolean findPort; + private WebApplicationContext webAppContext; + + /** + * Create a status server on the given port. + * The jsp scripts are taken from src/webapps/name. + * @param name The name of the server + * @param port The port to use on the server + * @param findPort whether the server should start at the given port and + * increment by 1 until it finds a free port. 
+ * @param bindAddress The address to bind to
+ */ + public InfoServer(String name, String bindAddress, int port, boolean findPort) + throws IOException { + this.webServer = new org.mortbay.jetty.Server(); + this.findPort = findPort; + this.listener = new SocketListener(); + this.listener.setPort(port); + this.listener.setHost(bindAddress); + this.webServer.addListener(listener); + + // Set up the context for "/static/*" + String appDir = getWebAppsPath(); + + // Set up the context for "/logs/" if "hadoop.log.dir" property is defined. + String logDir = System.getProperty("hadoop.log.dir"); + if (logDir != null) { + HttpContext logContext = new HttpContext(); + logContext.setContextPath("/logs/*"); + logContext.setResourceBase(logDir); + logContext.addHandler(new ResourceHandler()); + webServer.addContext(logContext); + } + + HttpContext staticContext = new HttpContext(); + staticContext.setContextPath("/static/*"); + staticContext.setResourceBase(appDir + "/static"); + staticContext.addHandler(new ResourceHandler()); + this.webServer.addContext(staticContext); + + // set up the context for "/" jsp files + String webappDir = null; + try { + webappDir = getWebAppsPath("webapps" + File.separator + name); + } catch (FileNotFoundException e) { + // Retry. Resource may be inside jar on a windows machine. + webappDir = getWebAppsPath("webapps/" + name); + } + this.webAppContext = + this.webServer.addWebApplication("/", webappDir); + addServlet("stacks", "/stacks", StatusHttpServer.StackServlet.class); + } + + /** + * Set a value in the webapp context. These values are available to the jsp + * pages as "application.getAttribute(name)". + * @param name The name of the attribute + * @param value The value of the attribute + */ + public void setAttribute(String name, Object value) { + this.webAppContext.setAttribute(name, value); + } + + /** + * Add a servlet in the server. 
+ * @param name The name of the servlet (can be passed as null) + * @param pathSpec The path spec for the servlet + * @param servletClass The servlet class + */ + public void addServlet(String name, String pathSpec, + Class servletClass) { + WebApplicationContext context = webAppContext; + try { + if (name == null) { + context.addServlet(pathSpec, servletClass.getName()); + } else { + context.addServlet(name, pathSpec, servletClass.getName()); + } + } catch (ClassNotFoundException ex) { + throw makeRuntimeException("Problem instantiating class", ex); + } catch (InstantiationException ex) { + throw makeRuntimeException("Problem instantiating class", ex); + } catch (IllegalAccessException ex) { + throw makeRuntimeException("Problem instantiating class", ex); + } + } + + private static RuntimeException makeRuntimeException(String msg, Throwable cause) { + RuntimeException result = new RuntimeException(msg); + if (cause != null) { + result.initCause(cause); + } + return result; + } + + /** + * Get the value in the webapp context. + * @param name The name of the attribute + * @return The value of the attribute + */ + public Object getAttribute(String name) { + return this.webAppContext.getAttribute(name); + } + + /** + * Get the pathname to the webapps files. + * @return the pathname as a URL + */ + private static String getWebAppsPath() throws IOException { + return getWebAppsPath("webapps"); + } + + /** + * Get the pathname to the files under the given path. + * @param path Path to find. 
+ * @return the pathname as a URL + */ + private static String getWebAppsPath(final String path) throws IOException { + URL url = InfoServer.class.getClassLoader().getResource(path); + if (url == null) + throw new IOException("webapps not found in CLASSPATH"); + return url.toString(); + } + + /** + * Get the port that the server is on + * @return the port + */ + public int getPort() { + return this.listener.getPort(); + } + + public void setThreads(int min, int max) { + this.listener.setMinThreads(min); + this.listener.setMaxThreads(max); + } + + /** + * Start the server. Does not wait for the server to start. + */ + public void start() throws IOException { + try { + while (true) { + try { + this.webServer.start(); + break; + } catch (org.mortbay.util.MultiException ex) { + // look for the multi exception containing a bind exception, + // in that case try the next port number. + boolean needNewPort = false; + for(int i=0; i < ex.size(); ++i) { + Exception sub = ex.getException(i); + if (sub instanceof java.net.BindException) { + needNewPort = true; + break; + } + } + if (!findPort || !needNewPort) { + throw ex; + } + this.listener.setPort(listener.getPort() + 1); + } + } + } catch (IOException ie) { + throw ie; + } catch (Exception e) { + IOException ie = new IOException("Problem starting http server"); + ie.initCause(e); + throw ie; + } + } + + /** + * stop the server + */ + public void stop() throws InterruptedException { + this.webServer.stop(); + } +} \ No newline at end of file Modified: lucene/hadoop/trunk/src/contrib/hbase/src/test/hbase-site.xml URL: http://svn.apache.org/viewvc/lucene/hadoop/trunk/src/contrib/hbase/src/test/hbase-site.xml?rev=582443&r1=582442&r2=582443&view=diff ============================================================================== --- lucene/hadoop/trunk/src/contrib/hbase/src/test/hbase-site.xml (original) +++ lucene/hadoop/trunk/src/contrib/hbase/src/test/hbase-site.xml Fri Oct 5 20:49:43 2007 @@ -82,6 +82,20 @@ + 
hbase.master.info.port + -1 + The port for the hbase master web UI + Set to -1 if you do not want the info server to run. + + + + hbase.regionserver.info.port + -1 + The port for the hbase regionserver web UI + Set to -1 if you do not want the info server to run. + + + hbase.master.lease.thread.wakefrequency 3000 The interval between checks for expired region server leases. Added: lucene/hadoop/trunk/src/contrib/hbase/src/test/org/apache/hadoop/hbase/TestInfoServers.java URL: http://svn.apache.org/viewvc/lucene/hadoop/trunk/src/contrib/hbase/src/test/org/apache/hadoop/hbase/TestInfoServers.java?rev=582443&view=auto ============================================================================== --- lucene/hadoop/trunk/src/contrib/hbase/src/test/org/apache/hadoop/hbase/TestInfoServers.java (added) +++ lucene/hadoop/trunk/src/contrib/hbase/src/test/org/apache/hadoop/hbase/TestInfoServers.java Fri Oct 5 20:49:43 2007 @@ -0,0 +1,84 @@ +/** + * Copyright 2007 The Apache Software Foundation + * + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ */ +package org.apache.hadoop.hbase; + +import java.io.BufferedInputStream; +import java.io.IOException; +import java.net.URL; + +import org.apache.commons.logging.Log; +import org.apache.commons.logging.LogFactory; +import org.apache.hadoop.io.Text; + +/** + * Info servers are disabled while testing. This test enables them and checks + * that they serve pages. + */ +public class TestInfoServers extends HBaseTestCase { + static final Log LOG = LogFactory.getLog(TestInfoServers.class); + + protected void setUp() throws Exception { + super.setUp(); + } + + protected void tearDown() throws Exception { + super.tearDown(); + } + + public void testInfoServersAreUp() throws Exception { + // Bring up info servers on 'odd' port numbers in case the test is not + // sourcing the src/test/hbase-default.xml. + this.conf.setInt("hbase.master.info.port", 60011); + this.conf.setInt("hbase.regionserver.info.port", 60031); + MiniHBaseCluster miniHbase = new MiniHBaseCluster(this.conf, 1); + // Create table so info servers are given time to spin up. + HBaseAdmin a = new HBaseAdmin(conf); + a.createTable(new HTableDescriptor(getName())); + assertTrue(a.tableExists(new Text(getName()))); + try { + int port = miniHbase.getMasterThread().getMaster().infoServer.getPort(); + assertHasExpectedContent(new URL("http://localhost:" + port + + "/index.html"), "Master"); + port = miniHbase.getRegionThreads().get(0).getRegionServer(). 
+ infoServer.getPort(); + assertHasExpectedContent(new URL("http://localhost:" + port + + "/index.html"), "Region Server"); + } finally { + miniHbase.shutdown(); + } + } + + private void assertHasExpectedContent(final URL u, final String expected) + throws IOException { + LOG.info("Testing " + u.toString() + " has " + expected); + java.net.URLConnection c = u.openConnection(); + c.connect(); + assertTrue(c.getContentLength() > 0); + StringBuilder sb = new StringBuilder(c.getContentLength()); + BufferedInputStream bis = new BufferedInputStream(c.getInputStream()); + byte [] bytes = new byte[1024]; + for (int read = -1; (read = bis.read(bytes)) != -1;) { + sb.append(new String(bytes, 0, read)); + } + bis.close(); + String content = sb.toString(); + // Assert the check rather than discarding the result of matches(). + assertTrue(content.indexOf(expected) > -1); + } +} \ No newline at end of file Added: lucene/hadoop/trunk/src/contrib/hbase/src/test/org/apache/hadoop/hbase/TestSerialization.java URL: http://svn.apache.org/viewvc/lucene/hadoop/trunk/src/contrib/hbase/src/test/org/apache/hadoop/hbase/TestSerialization.java?rev=582443&view=auto ============================================================================== --- lucene/hadoop/trunk/src/contrib/hbase/src/test/org/apache/hadoop/hbase/TestSerialization.java (added) +++ lucene/hadoop/trunk/src/contrib/hbase/src/test/org/apache/hadoop/hbase/TestSerialization.java Fri Oct 5 20:49:43 2007 @@ -0,0 +1,52 @@ +/** + * Copyright 2007 The Apache Software Foundation + * + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. 
You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ +package org.apache.hadoop.hbase; + +import java.io.ByteArrayInputStream; +import java.io.ByteArrayOutputStream; +import java.io.DataInputStream; +import java.io.DataOutputStream; + +import junit.framework.TestCase; + +public class TestSerialization extends TestCase { + + protected void setUp() throws Exception { + super.setUp(); + } + + protected void tearDown() throws Exception { + super.tearDown(); + } + + public void testServerInfo() throws Exception { + HServerInfo hsi = new HServerInfo(new HServerAddress("0.0.0.0:123"), -1, + 1245); + ByteArrayOutputStream baos = new ByteArrayOutputStream(); + DataOutputStream dao = new DataOutputStream(baos); + hsi.write(dao); + dao.close(); + ByteArrayInputStream bais = new ByteArrayInputStream(baos.toByteArray()); + DataInputStream dis = new DataInputStream(bais); + HServerInfo deserializedHsi = new HServerInfo(); + deserializedHsi.readFields(dis); + assertTrue(hsi.equals(deserializedHsi)); + } +} Added: lucene/hadoop/trunk/src/contrib/hbase/src/webapps/master/WEB-INF/web.xml URL: http://svn.apache.org/viewvc/lucene/hadoop/trunk/src/contrib/hbase/src/webapps/master/WEB-INF/web.xml?rev=582443&view=auto ============================================================================== --- lucene/hadoop/trunk/src/contrib/hbase/src/webapps/master/WEB-INF/web.xml (added) +++ lucene/hadoop/trunk/src/contrib/hbase/src/webapps/master/WEB-INF/web.xml Fri Oct 5 20:49:43 2007 @@ -0,0 +1,33 @@ + + + + + + + + + org.apache.hadoop.hbase.generated.master.hql_jsp + org.apache.hadoop.hbase.generated.master.hql_jsp + + + + 
org.apache.hadoop.hbase.generated.master.master_jsp + org.apache.hadoop.hbase.generated.master.master_jsp + + + + org.apache.hadoop.hbase.generated.master.hql_jsp + /hql.jsp + + + + org.apache.hadoop.hbase.generated.master.master_jsp + /master.jsp + + + + Added: lucene/hadoop/trunk/src/contrib/hbase/src/webapps/master/hql.jsp URL: http://svn.apache.org/viewvc/lucene/hadoop/trunk/src/contrib/hbase/src/webapps/master/hql.jsp?rev=582443&view=auto ============================================================================== --- lucene/hadoop/trunk/src/contrib/hbase/src/webapps/master/hql.jsp (added) +++ lucene/hadoop/trunk/src/contrib/hbase/src/webapps/master/hql.jsp Fri Oct 5 20:49:43 2007 @@ -0,0 +1,57 @@ +<%@ page contentType="text/html;charset=UTF-8" + import="java.util.*" + import="org.apache.hadoop.hbase.HBaseConfiguration" + import="org.apache.hadoop.hbase.shell.TableFormatter" + import="org.apache.hadoop.hbase.shell.ReturnMsg" + import="org.apache.hadoop.hbase.shell.generated.Parser" + import="org.apache.hadoop.hbase.shell.Command" + import="org.apache.hadoop.hbase.shell.formatter.HtmlTableFormatter" +%> + + + +HQL + + + + +

HQL

+

Home

+<% String query = request.getParameter("query"); + if (query == null) { + query = ""; + } +%> +
+

+ + + +

+
+

Enter 'help;' -- that's 'help' plus a semi-colon -- for a list of HQL commands. + Data Definition, SHELL, INSERTS, DELETES, and UPDATE commands are disabled in this interface.

+ + <% + if (query.length() > 0) { + %> +
+ <% + Parser parser = new Parser(query, out, new HtmlTableFormatter(out)); + Command cmd = parser.terminatedCommand(); + if (cmd.getCommandType() != Command.CommandType.SELECT) { + %> +

<%= cmd.getCommandType() %>-type commands are disabled in this interface.

+ <% + } else { + ReturnMsg rm = cmd.execute(new HBaseConfiguration()); + String summary = rm == null? "": rm.toString(); + %> +

<%= summary %>

+ <% } + } + %> + + \ No newline at end of file Added: lucene/hadoop/trunk/src/contrib/hbase/src/webapps/master/index.html URL: http://svn.apache.org/viewvc/lucene/hadoop/trunk/src/contrib/hbase/src/webapps/master/index.html?rev=582443&view=auto ============================================================================== --- lucene/hadoop/trunk/src/contrib/hbase/src/webapps/master/index.html (added) +++ lucene/hadoop/trunk/src/contrib/hbase/src/webapps/master/index.html Fri Oct 5 20:49:43 2007 @@ -0,0 +1 @@ + Added: lucene/hadoop/trunk/src/contrib/hbase/src/webapps/master/master.jsp URL: http://svn.apache.org/viewvc/lucene/hadoop/trunk/src/contrib/hbase/src/webapps/master/master.jsp?rev=582443&view=auto ============================================================================== --- lucene/hadoop/trunk/src/contrib/hbase/src/webapps/master/master.jsp (added) +++ lucene/hadoop/trunk/src/contrib/hbase/src/webapps/master/master.jsp Fri Oct 5 20:49:43 2007 @@ -0,0 +1,84 @@ +<%@ page contentType="text/html;charset=UTF-8" + import="java.util.*" + import="org.apache.hadoop.io.Text" + import="org.apache.hadoop.hbase.HMaster" + import="org.apache.hadoop.hbase.HConstants" + import="org.apache.hadoop.hbase.HMaster.MetaRegion" + import="org.apache.hadoop.hbase.HBaseAdmin" + import="org.apache.hadoop.hbase.HServerInfo" + import="org.apache.hadoop.hbase.HServerAddress" + import="org.apache.hadoop.hbase.HRegionInfo" + import="org.apache.hadoop.hbase.HBaseConfiguration" + import="org.apache.hadoop.hbase.shell.ShowCommand" + import="org.apache.hadoop.hbase.shell.TableFormatter" + import="org.apache.hadoop.hbase.shell.ReturnMsg" + import="org.apache.hadoop.hbase.shell.formatter.HtmlTableFormatter" + import="org.apache.hadoop.hbase.HTableDescriptor" %><% + HMaster master = (HMaster)getServletContext().getAttribute(HMaster.MASTER); + HBaseConfiguration conf = new HBaseConfiguration(); + TableFormatter formatter = new HtmlTableFormatter(out); + ShowCommand show = new 
ShowCommand(out, formatter, "tables"); + HServerAddress rootLocation = master.getRootRegionLocation(); + Map onlineRegions = master.getOnlineMetaRegions(); + Map serverToServerInfos = + master.getServersToServerInfo(); +%> + + + +Hbase Master: <%= master.getMasterAddress()%> + + + + +

Hbase Master: <%=master.getMasterAddress()%>

+

HQL, +Local logs, Thread Dump

+ +

Master Attributes

+ + + + +
Attribute NameValue
Filesystem<%= conf.get("fs.default.name") %>
Hbase Root Directory<%= master.getRootDir().toString() %>
+ +

Online META Regions

+<% if (rootLocation != null) { %> + + + +<% + if (onlineRegions != null && onlineRegions.size() > 0) { %> + <% for (Map.Entry e: onlineRegions.entrySet()) { + MetaRegion meta = e.getValue(); + %> + + <% } + } %> +
NameServer
<%= HConstants.ROOT_TABLE_NAME.toString() %><%= rootLocation.toString() %>
<%= meta.getRegionName().toString() %><%= meta.getServer().toString() %>
+<% } %> + +

Tables

+<% ReturnMsg msg = show.execute(conf); %> +

<%=msg %>

+ +

Region Servers

+<% if (serverToServerInfos != null && serverToServerInfos.size() > 0) { %> + + +<% for (Map.Entry e: serverToServerInfos.entrySet()) { + HServerInfo hsi = e.getValue(); + String url = "http://" + + hsi.getServerAddress().getBindAddress().toString() + ":" + + hsi.getInfoPort() + "/"; + String load = hsi.getLoad().toString(); + long startCode = hsi.getStartCode(); + String address = hsi.getServerAddress().toString(); +%> + +<% } %> +
AddressStart CodeLoad
<%= address %><%= startCode %><%= load %>
+<% } %> + + \ No newline at end of file Added: lucene/hadoop/trunk/src/contrib/hbase/src/webapps/regionserver/WEB-INF/web.xml URL: http://svn.apache.org/viewvc/lucene/hadoop/trunk/src/contrib/hbase/src/webapps/regionserver/WEB-INF/web.xml?rev=582443&view=auto ============================================================================== --- lucene/hadoop/trunk/src/contrib/hbase/src/webapps/regionserver/WEB-INF/web.xml (added) +++ lucene/hadoop/trunk/src/contrib/hbase/src/webapps/regionserver/WEB-INF/web.xml Fri Oct 5 20:49:43 2007 @@ -0,0 +1,23 @@ + + + + + + + + + org.apache.hadoop.hbase.generated.regionserver.regionserver_jsp + org.apache.hadoop.hbase.generated.regionserver.regionserver_jsp + + + + org.apache.hadoop.hbase.generated.regionserver.regionserver_jsp + /regionserver.jsp + + + + Added: lucene/hadoop/trunk/src/contrib/hbase/src/webapps/regionserver/index.html URL: http://svn.apache.org/viewvc/lucene/hadoop/trunk/src/contrib/hbase/src/webapps/regionserver/index.html?rev=582443&view=auto ============================================================================== --- lucene/hadoop/trunk/src/contrib/hbase/src/webapps/regionserver/index.html (added) +++ lucene/hadoop/trunk/src/contrib/hbase/src/webapps/regionserver/index.html Fri Oct 5 20:49:43 2007 @@ -0,0 +1 @@ + Added: lucene/hadoop/trunk/src/contrib/hbase/src/webapps/regionserver/regionserver.jsp URL: http://svn.apache.org/viewvc/lucene/hadoop/trunk/src/contrib/hbase/src/webapps/regionserver/regionserver.jsp?rev=582443&view=auto ============================================================================== --- lucene/hadoop/trunk/src/contrib/hbase/src/webapps/regionserver/regionserver.jsp (added) +++ lucene/hadoop/trunk/src/contrib/hbase/src/webapps/regionserver/regionserver.jsp Fri Oct 5 20:49:43 2007 @@ -0,0 +1,57 @@ +<%@ page contentType="text/html;charset=UTF-8" + import="java.util.*" + import="org.apache.hadoop.io.Text" + import="org.apache.hadoop.hbase.HRegionServer" + 
import="org.apache.hadoop.hbase.HRegion" + import="org.apache.hadoop.hbase.HConstants" + import="org.apache.hadoop.hbase.HServerInfo" + import="org.apache.hadoop.hbase.HRegionInfo" %><% + HRegionServer regionServer = (HRegionServer)getServletContext().getAttribute(HRegionServer.REGIONSERVER); + HServerInfo serverInfo = regionServer.getServerInfo(); + SortedMap onlineRegions = regionServer.getOnlineRegions(); +%> + + + +Hbase Region Server: <%= serverInfo.getServerAddress().toString() %> + + + + +

Hbase Region Server: <%= serverInfo.getServerAddress().toString() %>

+

Local logs, Thread Dump

+ +

Region Server Attributes

+ + + +
Attribute NameValue
Load<%= serverInfo.getLoad().toString() %>
+ +

Online Regions

+<% if (onlineRegions != null && onlineRegions.size() > 0) { %> + + +<% for (HRegion r: onlineRegions.values()) { %> + +<% } %> +
Region NameStart KeyEnd Key
<%= r.getRegionName().toString() %><%= r.getStartKey().toString() %><%= r.getEndKey().toString() %>
+

Region names are made of the containing table's name, a comma, +the start key, a comma, and a randomly generated region id. To illustrate, +the region named +domains,apache.org,5464829424211263407 belongs to the table +domains, has an id of 5464829424211263407, and the first key +in the region is apache.org. The -ROOT- +and .META. 'tables' are internal system tables. +The -ROOT- table keeps a list of all regions in the .META. table. The .META. table +keeps a list of all regions in the system. The empty key is used to denote +table start and table end. A region with an +empty start key is the first region in a table. If a region has both an empty +start and an empty end key, it's the only region in the table. See +Hbase Home for +further explanation.

+<% } else { %> +

Not serving regions

+<% } %> + + \ No newline at end of file Added: lucene/hadoop/trunk/src/contrib/hbase/src/webapps/static/hbase.css URL: http://svn.apache.org/viewvc/lucene/hadoop/trunk/src/contrib/hbase/src/webapps/static/hbase.css?rev=582443&view=auto ============================================================================== --- lucene/hadoop/trunk/src/contrib/hbase/src/webapps/static/hbase.css (added) +++ lucene/hadoop/trunk/src/contrib/hbase/src/webapps/static/hbase.css Fri Oct 5 20:49:43 2007 @@ -0,0 +1,5 @@ +h1, h2, h3 { color: DarkSlateBlue } +table { border: thin solid DodgerBlue } +tr { border: thin solid DodgerBlue } +td { border: thin solid DodgerBlue } +th { border: thin solid DodgerBlue }
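[Editor's note] The InfoServer.start() loop in this patch retries on a bind failure by bumping the listener's port by one until it finds a free port (when findPort is set). A minimal standalone sketch of that same technique, using only java.net.ServerSocket in place of the Jetty SocketListener -- the class and method names here are illustrative, not part of the patch or of the HBase API:

```java
import java.io.IOException;
import java.net.BindException;
import java.net.InetSocketAddress;
import java.net.ServerSocket;

/**
 * Hypothetical sketch of InfoServer's findPort behavior: try the
 * requested port and, on a bind failure, increment by one until a
 * free port is found.
 */
public class FindPortSketch {
  static ServerSocket bindWithRetry(int startPort, boolean findPort)
      throws IOException {
    int port = startPort;
    while (true) {
      ServerSocket ss = new ServerSocket();
      try {
        ss.bind(new InetSocketAddress("127.0.0.1", port));
        return ss;
      } catch (BindException e) {
        ss.close();
        if (!findPort) {
          // Mirrors InfoServer.start(): without findPort, a busy port is fatal.
          throw e;
        }
        // Mirrors this.listener.setPort(listener.getPort() + 1).
        port++;
      }
    }
  }

  public static void main(String[] args) throws IOException {
    ServerSocket first = bindWithRetry(60010, true);
    // Asking again for the now-occupied port slides up to the next free one.
    ServerSocket second = bindWithRetry(first.getLocalPort(), true);
    System.out.println(second.getLocalPort() > first.getLocalPort());
    first.close();
    second.close();
  }
}
```

The committed code differs in one respect: Jetty surfaces the BindException wrapped inside an org.mortbay.util.MultiException, so start() walks the contained exceptions looking for a BindException before deciding to retry.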