Subject: svn commit: r835241 - in /hadoop/hbase/trunk: lib/ src/java/org/apache/hadoop/hbase/regionserver/ src/java/org/apache/hadoop/hbase/regionserver/wal/ src/java/org/apache/hadoop/hbase/util/ src/test/org/apache/hadoop/hbase/ src/test/org/apache/hadoop/hba...
Date: Thu, 12 Nov 2009 05:48:57 -0000
To: hbase-commits@hadoop.apache.org
From: stack@apache.org

Author: stack
Date: Thu Nov 12 05:48:56 2009
New Revision: 835241

URL: http://svn.apache.org/viewvc?rev=835241&view=rev
Log:
HBASE-1974 Update to latest on hadoop 0.21 branch (November 11th, 2009)

Added:
    hadoop/hbase/trunk/lib/hadoop-core-0.21.0-SNAPSHOT-r832250.jar   (with props)
    hadoop/hbase/trunk/lib/hadoop-core-test-0.21.0-SNAPSHOT-r832250.jar   (with props)
    hadoop/hbase/trunk/lib/hadoop-hdfs-0.21.0-dev-r833507.jar   (with props)
    hadoop/hbase/trunk/lib/hadoop-hdfs-test-0.21.0-dev-r833507.jar   (with props)
    hadoop/hbase/trunk/lib/hadoop-mapred-0.21.0-dev-r833993.jar   (with props)
    hadoop/hbase/trunk/lib/hadoop-mapred-test-0.21.0-dev-r833993.jar   (with props)
Removed:
    hadoop/hbase/trunk/lib/hadoop-core-0.21.0-SNAPSHOT-r831142.jar
    hadoop/hbase/trunk/lib/hadoop-core-test-0.21.0-SNAPSHOT-r831142.jar
    hadoop/hbase/trunk/lib/hadoop-hdfs-0.21.0-dev-r831142.jar
    hadoop/hbase/trunk/lib/hadoop-hdfs-test-0.21.0-dev-r831142.jar
    hadoop/hbase/trunk/lib/hadoop-mapred-0.21.0-dev-r831135.jar
    hadoop/hbase/trunk/lib/hadoop-mapred-test-0.21.0-dev-r831135.jar
Modified:
    hadoop/hbase/trunk/src/java/org/apache/hadoop/hbase/regionserver/HRegion.java
    hadoop/hbase/trunk/src/java/org/apache/hadoop/hbase/regionserver/wal/HLog.java
    hadoop/hbase/trunk/src/java/org/apache/hadoop/hbase/util/Merge.java
    hadoop/hbase/trunk/src/test/org/apache/hadoop/hbase/HBaseTestingUtility.java
    hadoop/hbase/trunk/src/test/org/apache/hadoop/hbase/regionserver/wal/TestHLog.java

Added: hadoop/hbase/trunk/lib/hadoop-core-0.21.0-SNAPSHOT-r832250.jar
URL: http://svn.apache.org/viewvc/hadoop/hbase/trunk/lib/hadoop-core-0.21.0-SNAPSHOT-r832250.jar?rev=835241&view=auto
==============================================================================
Binary file - no diff available.
Propchange: hadoop/hbase/trunk/lib/hadoop-core-0.21.0-SNAPSHOT-r832250.jar
------------------------------------------------------------------------------
    svn:mime-type = application/octet-stream

Added: hadoop/hbase/trunk/lib/hadoop-core-test-0.21.0-SNAPSHOT-r832250.jar
URL: http://svn.apache.org/viewvc/hadoop/hbase/trunk/lib/hadoop-core-test-0.21.0-SNAPSHOT-r832250.jar?rev=835241&view=auto
==============================================================================
Binary file - no diff available.

Propchange: hadoop/hbase/trunk/lib/hadoop-core-test-0.21.0-SNAPSHOT-r832250.jar
------------------------------------------------------------------------------
    svn:mime-type = application/octet-stream

Added: hadoop/hbase/trunk/lib/hadoop-hdfs-0.21.0-dev-r833507.jar
URL: http://svn.apache.org/viewvc/hadoop/hbase/trunk/lib/hadoop-hdfs-0.21.0-dev-r833507.jar?rev=835241&view=auto
==============================================================================
Binary file - no diff available.

Propchange: hadoop/hbase/trunk/lib/hadoop-hdfs-0.21.0-dev-r833507.jar
------------------------------------------------------------------------------
    svn:mime-type = application/octet-stream

Added: hadoop/hbase/trunk/lib/hadoop-hdfs-test-0.21.0-dev-r833507.jar
URL: http://svn.apache.org/viewvc/hadoop/hbase/trunk/lib/hadoop-hdfs-test-0.21.0-dev-r833507.jar?rev=835241&view=auto
==============================================================================
Binary file - no diff available.

Propchange: hadoop/hbase/trunk/lib/hadoop-hdfs-test-0.21.0-dev-r833507.jar
------------------------------------------------------------------------------
    svn:mime-type = application/octet-stream

Added: hadoop/hbase/trunk/lib/hadoop-mapred-0.21.0-dev-r833993.jar
URL: http://svn.apache.org/viewvc/hadoop/hbase/trunk/lib/hadoop-mapred-0.21.0-dev-r833993.jar?rev=835241&view=auto
==============================================================================
Binary file - no diff available.

Propchange: hadoop/hbase/trunk/lib/hadoop-mapred-0.21.0-dev-r833993.jar
------------------------------------------------------------------------------
    svn:mime-type = application/octet-stream

Added: hadoop/hbase/trunk/lib/hadoop-mapred-test-0.21.0-dev-r833993.jar
URL: http://svn.apache.org/viewvc/hadoop/hbase/trunk/lib/hadoop-mapred-test-0.21.0-dev-r833993.jar?rev=835241&view=auto
==============================================================================
Binary file - no diff available.

Propchange: hadoop/hbase/trunk/lib/hadoop-mapred-test-0.21.0-dev-r833993.jar
------------------------------------------------------------------------------
    svn:mime-type = application/octet-stream

Modified: hadoop/hbase/trunk/src/java/org/apache/hadoop/hbase/regionserver/HRegion.java
URL: http://svn.apache.org/viewvc/hadoop/hbase/trunk/src/java/org/apache/hadoop/hbase/regionserver/HRegion.java?rev=835241&r1=835240&r2=835241&view=diff
==============================================================================
--- hadoop/hbase/trunk/src/java/org/apache/hadoop/hbase/regionserver/HRegion.java (original)
+++ hadoop/hbase/trunk/src/java/org/apache/hadoop/hbase/regionserver/HRegion.java Thu Nov 12 05:48:56 2009
@@ -1177,8 +1177,7 @@
       if (writeToWAL) {
         this.log.append(regionInfo.getRegionName(),
-          regionInfo.getTableDesc().getName(), kvs,
-          (regionInfo.isMetaRegion() || regionInfo.isRootRegion()), now);
+          regionInfo.getTableDesc().getName(), kvs, now);
       }
       flush = isFlushSize(size);
     } finally {
@@ -1451,8 +1450,7 @@
       if (writeToWAL) {
         long now = System.currentTimeMillis();
         this.log.append(regionInfo.getRegionName(),
-          regionInfo.getTableDesc().getName(), edits,
-          (regionInfo.isMetaRegion() || regionInfo.isRootRegion()), now);
+          regionInfo.getTableDesc().getName(), edits, now);
       }
       long size = 0;
       Store store = getStore(family);
@@ -2363,8 +2361,7 @@
         List<KeyValue> edits = new ArrayList<KeyValue>(1);
         edits.add(newKv);
         this.log.append(regionInfo.getRegionName(),
-          regionInfo.getTableDesc().getName(), edits,
-          (regionInfo.isMetaRegion() || regionInfo.isRootRegion()), now);
+          regionInfo.getTableDesc().getName(), edits, now);
       }

       // Now request the ICV to the store, this will set the timestamp
@@ -2550,4 +2547,4 @@
     if (bc != null) bc.shutdown();
     }
   }
-}
\ No newline at end of file
+}

Modified: hadoop/hbase/trunk/src/java/org/apache/hadoop/hbase/regionserver/wal/HLog.java
URL: http://svn.apache.org/viewvc/hadoop/hbase/trunk/src/java/org/apache/hadoop/hbase/regionserver/wal/HLog.java?rev=835241&r1=835240&r2=835241&view=diff
==============================================================================
--- hadoop/hbase/trunk/src/java/org/apache/hadoop/hbase/regionserver/wal/HLog.java (original)
+++ hadoop/hbase/trunk/src/java/org/apache/hadoop/hbase/regionserver/wal/HLog.java Thu Nov 12 05:48:56 2009
@@ -649,9 +649,7 @@
       // region being flushed is removed if the sequence number of the flush
       // is greater than or equal to the value in lastSeqWritten.
       this.lastSeqWritten.putIfAbsent(regionName, Long.valueOf(seqNum));
-      boolean sync = regionInfo.isMetaRegion() || regionInfo.isRootRegion();
-      doWrite(logKey, logEdit, sync, logKey.getWriteTime());
-
+      doWrite(logKey, logEdit, logKey.getWriteTime());
       this.unflushedEntries.incrementAndGet();
       this.numEntries.incrementAndGet();
     }
@@ -682,12 +680,11 @@
    * @param regionName
    * @param tableName
    * @param edits
-   * @param sync
    * @param now
    * @throws IOException
    */
   public void append(byte [] regionName, byte [] tableName, List<KeyValue> edits,
-    boolean sync, final long now)
+    final long now)
   throws IOException {
     if (this.closed) {
       throw new IOException("Cannot append; log is closed");
@@ -702,7 +699,7 @@
       int counter = 0;
       for (KeyValue kv: edits) {
         HLogKey logKey = makeKey(regionName, tableName, seqNum[counter++], now);
-        doWrite(logKey, kv, sync, now);
+        doWrite(logKey, kv, now);
         this.numEntries.incrementAndGet();
       }
@@ -808,13 +805,7 @@
     logSyncerThread.addToSyncQueue(force);
   }

-  /**
-   * Multiple threads will call sync() at the same time, only the winner
-   * will actually flush if there is any race or build up.
-   *
-   * @throws IOException
-   */
-  protected void hflush() throws IOException {
+  public void hflush() throws IOException {
     synchronized (this.updateLock) {
       if (this.closed) {
         return;
@@ -822,6 +813,7 @@
       if (this.forceSync ||
           this.unflushedEntries.get() >= this.flushlogentries) {
         try {
+          LOG.info("hflush remove");
           this.writer.sync();
           if (this.writer_out != null) {
             this.writer_out.sync();
@@ -837,20 +829,25 @@
     }
   }

+  public void hsync() throws IOException {
+    // Not yet implemented up in hdfs so just call hflush.
+    hflush();
+  }
+
   private void requestLogRoll() {
     if (this.listener != null) {
       this.listener.logRollRequested();
     }
   }

-  private void doWrite(HLogKey logKey, KeyValue logEdit, boolean sync,
-      final long now)
+  private void doWrite(HLogKey logKey, KeyValue logEdit, final long now)
   throws IOException {
     if (!this.enabled) {
       return;
     }
     try {
       this.editsSize.addAndGet(logKey.heapSize() + logEdit.heapSize());
+      if (this.numEntries.get() % this.flushlogentries == 0) LOG.info("edit=" + this.numEntries.get() + ", write=" + logKey.toString());
       this.writer.append(logKey, logEdit);
       long took = System.currentTimeMillis() - now;
       if (took > 1000) {

Modified: hadoop/hbase/trunk/src/java/org/apache/hadoop/hbase/util/Merge.java
URL: http://svn.apache.org/viewvc/hadoop/hbase/trunk/src/java/org/apache/hadoop/hbase/util/Merge.java?rev=835241&r1=835240&r2=835241&view=diff
==============================================================================
--- hadoop/hbase/trunk/src/java/org/apache/hadoop/hbase/util/Merge.java (original)
+++ hadoop/hbase/trunk/src/java/org/apache/hadoop/hbase/util/Merge.java Thu Nov 12 05:48:56 2009
@@ -333,7 +333,7 @@
    *
    * @throws IOException
    */
-  private int parseArgs(String[] args) {
+  private int parseArgs(String[] args) throws IOException {
     GenericOptionsParser parser =
       new GenericOptionsParser(this.getConf(), args);

Modified: hadoop/hbase/trunk/src/test/org/apache/hadoop/hbase/HBaseTestingUtility.java
URL: http://svn.apache.org/viewvc/hadoop/hbase/trunk/src/test/org/apache/hadoop/hbase/HBaseTestingUtility.java?rev=835241&r1=835240&r2=835241&view=diff
==============================================================================
--- hadoop/hbase/trunk/src/test/org/apache/hadoop/hbase/HBaseTestingUtility.java (original)
+++ hadoop/hbase/trunk/src/test/org/apache/hadoop/hbase/HBaseTestingUtility.java Thu Nov 12 05:48:56 2009
@@ -46,7 +46,6 @@
 import org.apache.hadoop.hbase.util.Writables;
 import org.apache.hadoop.hdfs.MiniDFSCluster;
 import org.apache.hadoop.mapred.MiniMRCluster;
-import com.sun.corba.se.pept.transport.Connection;

 /**
  * Facility for testing HBase. Added as tool to abet junit4 testing. Replaces
@@ -471,4 +470,4 @@
       ((Jdk14Logger) l).getLogger().setLevel(java.util.logging.Level.ALL);
     }
   }
-}
\ No newline at end of file
+}

Modified: hadoop/hbase/trunk/src/test/org/apache/hadoop/hbase/regionserver/wal/TestHLog.java
URL: http://svn.apache.org/viewvc/hadoop/hbase/trunk/src/test/org/apache/hadoop/hbase/regionserver/wal/TestHLog.java?rev=835241&r1=835240&r2=835241&view=diff
==============================================================================
--- hadoop/hbase/trunk/src/test/org/apache/hadoop/hbase/regionserver/wal/TestHLog.java (original)
+++ hadoop/hbase/trunk/src/test/org/apache/hadoop/hbase/regionserver/wal/TestHLog.java Thu Nov 12 05:48:56 2009
@@ -92,7 +92,7 @@
             System.currentTimeMillis(), column));
         System.out.println("Region " + i + ": " + edit);
         log.append(Bytes.toBytes("" + i), tableName, edit,
-          false, System.currentTimeMillis());
+          System.currentTimeMillis());
       }
     }
     log.rollWriter();
@@ -132,7 +132,7 @@
     for (int i = 0; i < total; i++) {
       List<KeyValue> kvs = new ArrayList<KeyValue>();
       kvs.add(new KeyValue(Bytes.toBytes(i), bytes, bytes));
-      wal.append(bytes, bytes, kvs, false, System.currentTimeMillis());
+      wal.append(bytes, bytes, kvs, System.currentTimeMillis());
     }
     // Now call sync and try reading.  Opening a Reader before you sync just
    // gives you EOFE.
@@ -150,7 +150,7 @@
     for (int i = 0; i < total; i++) {
       List<KeyValue> kvs = new ArrayList<KeyValue>();
       kvs.add(new KeyValue(Bytes.toBytes(i), bytes, bytes));
-      wal.append(bytes, bytes, kvs, false, System.currentTimeMillis());
+      wal.append(bytes, bytes, kvs, System.currentTimeMillis());
     }
     reader = HLog.getReader(this.fs, walPath, this.conf);
     count = 0;
@@ -169,7 +169,7 @@
     for (int i = 0; i < total; i++) {
       List<KeyValue> kvs = new ArrayList<KeyValue>();
       kvs.add(new KeyValue(Bytes.toBytes(i), bytes, value));
-      wal.append(bytes, bytes, kvs, false, System.currentTimeMillis());
+      wal.append(bytes, bytes, kvs, System.currentTimeMillis());
     }
     // Now I should have written out lots of blocks.  Sync then read.
     wal.sync();
@@ -238,7 +238,7 @@
         Bytes.toBytes(Integer.toString(i)),
         timestamp, new byte[] { (byte)(i + '0') }));
     }
-    log.append(regionName, tableName, cols, false, System.currentTimeMillis());
+    log.append(regionName, tableName, cols, System.currentTimeMillis());
     long logSeqId = log.startCacheFlush();
     log.completeCacheFlush(regionName, tableName, logSeqId);
     log.close();
@@ -275,4 +275,4 @@
       }
     }
   }
-}
\ No newline at end of file
+}
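The net effect of this patch is that callers of `HLog.append` no longer pass a per-edit `sync` boolean; the log decides internally when to flush (once `unflushedEntries` reaches `flushlogentries`), and a new `hsync()` simply delegates to `hflush()` until HDFS grows a real hsync. The following toy class is a minimal standalone sketch of that shape under those assumptions; `ToyWal` and its fields are illustrative names, not the actual HBase classes:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.atomic.AtomicLong;

// Toy write-ahead log illustrating the reworked append/hflush/hsync shape.
// This is a sketch, not the HBase implementation.
public class ToyWal {
  private final List<String> buffer = new ArrayList<String>();
  private final AtomicLong unflushedEntries = new AtomicLong();
  private final long flushLogEntries = 2; // flush every N edits, like flushlogentries
  private long flushed = 0;

  // After the change: no per-call "sync" boolean; the log itself decides.
  public synchronized void append(String edit, long now) {
    buffer.add(now + " " + edit);
    unflushedEntries.incrementAndGet();
  }

  public synchronized void hflush() {
    // Only flush once enough edits have accumulated.
    if (unflushedEntries.get() >= flushLogEntries) {
      flushed += unflushedEntries.getAndSet(0); // pretend to sync to disk
    }
  }

  // Not yet implemented up in HDFS, so just call hflush (as in the patch).
  public void hsync() {
    hflush();
  }

  public synchronized long flushedCount() {
    return flushed;
  }

  public static void main(String[] args) {
    ToyWal wal = new ToyWal();
    wal.append("put:row1", System.currentTimeMillis());
    wal.hsync(); // below threshold: nothing flushed yet
    wal.append("put:row2", System.currentTimeMillis());
    wal.hsync(); // threshold reached: both edits flushed
    System.out.println(wal.flushedCount()); // prints 2
  }
}
```

This mirrors why the diff also drops the `(regionInfo.isMetaRegion() || regionInfo.isRootRegion())` argument at every call site: sync policy moved from the caller into the log.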