From: stack@apache.org
To: commits@hbase.apache.org
Reply-To: dev@hbase.apache.org
Date: Tue, 08 Nov 2016 13:50:37 -0000
Message-Id: <9182cdd9d9274fe5808ecd661c1f38a4@git.apache.org>
Subject: [42/52] [partial] hbase-site git commit: Published site at 28de528c6ea19c261213ee229381a18ed3b5ef94.

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/f96628d5/apidocs/org/apache/hadoop/hbase/client/class-use/Put.html
----------------------------------------------------------------------
diff --git a/apidocs/org/apache/hadoop/hbase/client/class-use/Put.html b/apidocs/org/apache/hadoop/hbase/client/class-use/Put.html
index ff11c1f..a9848ed 100644

New rows added to the class-use table for Put:

  default CompletableFuture<Boolean>  AsyncTable.checkAndPut(byte[] row, byte[] family, byte[] qualifier, byte[] value, Put put)
      Atomically checks if a row/family/qualifier value equals the expected value.

  CompletableFuture<Boolean>  AsyncTable.checkAndPut(byte[] row, byte[] family, byte[] qualifier, CompareFilter.CompareOp compareOp, byte[] value, Put put)
      Atomically checks if a row/family/qualifier value matches the expected value.

The existing rows for boolean Table.checkAndPut(byte[] row, byte[] family, byte[] qualifier, ...) and HTableMultiplexer.put(byte[] tableName, ...) keep their descriptions; only their position in the generated table shifts.
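The new AsyncTable.checkAndPut overloads mirror the blocking Table.checkAndPut but complete a CompletableFuture<Boolean> instead of blocking on the RPC. A minimal sketch of the default (equality) overload follows; the "cf"/"state" column names and the way the AsyncTable instance is obtained are illustrative assumptions, not part of this change.

    import java.util.concurrent.CompletableFuture;
    import org.apache.hadoop.hbase.client.AsyncTable;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.util.Bytes;

    public class CheckAndPutSketch {
      // Moves cf:state from "pending" to "done" only if it still reads "pending".
      // The AsyncTable is assumed to come from an async connection created elsewhere.
      static CompletableFuture<Boolean> markDone(AsyncTable table, byte[] row) {
        byte[] cf = Bytes.toBytes("cf");
        byte[] q = Bytes.toBytes("state");
        Put put = new Put(row).addColumn(cf, q, Bytes.toBytes("done"));
        return table.checkAndPut(row, cf, q, Bytes.toBytes("pending"), put);
      }
    }

As with the blocking Table.checkAndPut, the resulting boolean should indicate whether the Put was applied.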
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/f96628d5/apidocs/org/apache/hadoop/hbase/client/class-use/Result.html
----------------------------------------------------------------------
diff --git a/apidocs/org/apache/hadoop/hbase/client/class-use/Result.html b/apidocs/org/apache/hadoop/hbase/client/class-use/Result.html
index 74d6bdd..c64460a 100644

The AsyncTable rows in the class-use table for Result now include:

  CompletableFuture<Result>  AsyncTable.append(Append append)
      Appends values to one or more columns within a single row.

  CompletableFuture<Result>  AsyncTable.increment(Increment increment)
      Increments one or more columns within a single row.

  default CompletableFuture<List<Result>>  AsyncTable.smallScan(Scan scan)
  CompletableFuture<List<Result>>          AsyncTable.smallScan(Scan scan, int limit)
      Return all the results that match the given scan object.

alongside the AsyncTable.get(Get get) row ("Extracts certain cells from a given row.").
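Together with get, these rows cover the single-row read/modify calls on AsyncTable. A short usage sketch, assuming an AsyncTable instance is already available and using made-up row and column names; the limit of 100 passed to smallScan is likewise only an example.

    import java.util.List;
    import java.util.concurrent.CompletableFuture;
    import org.apache.hadoop.hbase.client.Append;
    import org.apache.hadoop.hbase.client.AsyncTable;
    import org.apache.hadoop.hbase.client.Get;
    import org.apache.hadoop.hbase.client.Increment;
    import org.apache.hadoop.hbase.client.Result;
    import org.apache.hadoop.hbase.client.Scan;
    import org.apache.hadoop.hbase.util.Bytes;

    public class AsyncSingleRowSketch {
      static void demo(AsyncTable table) {
        byte[] cf = Bytes.toBytes("cf");
        byte[] row = Bytes.toBytes("row-1");

        // Non-blocking point read; the callback fires when the RPC completes.
        CompletableFuture<Result> get = table.get(new Get(row));
        get.thenAccept(r -> System.out.println(
            Bytes.toString(r.getValue(cf, Bytes.toBytes("q")))));

        // Append and increment are single-row, server-side atomic operations.
        table.append(new Append(row).add(cf, Bytes.toBytes("log"), Bytes.toBytes("|event")));
        table.increment(new Increment(row).addColumn(cf, Bytes.toBytes("hits"), 1L));

        // smallScan materializes the (bounded) result list in a single future.
        CompletableFuture<List<Result>> rows =
            table.smallScan(new Scan().setStartRow(Bytes.toBytes("row-")), 100);
        rows.thenAccept(list -> System.out.println(list.size() + " rows"));
      }
    }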
The remaining hunks in Result.html only swap the order of rows in the generated index; no signatures or descriptions change. The rows involved are the record-reader factories -- TableInputFormatBase.getRecordReader and MultiTableSnapshotInputFormat.getRecordReader (org.apache.hadoop.mapred), and TableInputFormatBase.createRecordReader and MultiTableInputFormatBase.createRecordReader (org.apache.hadoop.mapreduce), described as "Builds a TableRecordReader." -- and the mapred table mappers GroupingTableMap.map ("Extract the grouping columns from value to construct a new key.") and IdentityTableMap.map ("Pass the key, value to reduce"), each taking (ImmutableBytesWritable key, Result value, org.apache.hadoop.mapred.OutputCollector<ImmutableBytesWritable,Result> output, org.apache.hadoop.mapred.Reporter reporter).
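For context, these are the hooks the MapReduce integrations hang off: TableInputFormatBase and its mapreduce counterpart turn a configured Scan into one TableRecordReader per split, and the table mappers receive (row key, Result) pairs. Below is a sketch of a pass-through job in the mapreduce API, in the spirit of IdentityTableMap; the table name "my_table" and column family "cf" are placeholders.

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.Result;
    import org.apache.hadoop.hbase.client.Scan;
    import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
    import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
    import org.apache.hadoop.hbase.mapreduce.TableMapper;
    import org.apache.hadoop.hbase.util.Bytes;
    import org.apache.hadoop.mapreduce.Job;

    public class ScanJobSketch {
      // Emits each row key with its Result, unchanged.
      static class PassThroughMapper extends TableMapper<ImmutableBytesWritable, Result> {
        @Override
        protected void map(ImmutableBytesWritable key, Result value, Context context)
            throws IOException, InterruptedException {
          context.write(key, value);
        }
      }

      public static Job createJob(Configuration conf) throws IOException {
        Job job = Job.getInstance(HBaseConfiguration.create(conf), "scan-sketch");
        Scan scan = new Scan();
        scan.addFamily(Bytes.toBytes("cf"));   // placeholder column family
        scan.setCaching(500);
        scan.setCacheBlocks(false);            // usually recommended for MR scans
        // TableInputFormatBase builds a TableRecordReader per split from this Scan.
        TableMapReduceUtil.initTableMapperJob("my_table", scan,
            PassThroughMapper.class, ImmutableBytesWritable.class, Result.class, job);
        return job;
      }
    }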
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/f96628d5/apidocs/org/apache/hadoop/hbase/client/class-use/Row.html
----------------------------------------------------------------------
diff --git a/apidocs/org/apache/hadoop/hbase/client/class-use/Row.html b/apidocs/org/apache/hadoop/hbase/client/class-use/Row.html
index 00b7f4a..84a0748 100644

The only change here is the ordering of the int compareTo(Row) rows for Mutation, Get, RowMutations and Increment in the generated table.

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/f96628d5/apidocs/org/apache/hadoop/hbase/client/class-use/RowMutations.html
----------------------------------------------------------------------
diff --git a/apidocs/org/apache/hadoop/hbase/client/class-use/RowMutations.html b/apidocs/org/apache/hadoop/hbase/client/class-use/RowMutations.html
index db4779f..85a960e 100644

New rows added to the class-use table for RowMutations:

  default CompletableFuture<Boolean>  AsyncTable.checkAndMutate(byte[] row, byte[] family, byte[] qualifier, byte[] value, RowMutations mutation)
      Atomically checks if a row/family/qualifier value equals the expected value.

  CompletableFuture<Boolean>  AsyncTable.checkAndMutate(byte[] row, byte[] family, byte[] qualifier, CompareFilter.CompareOp compareOp, byte[] value, RowMutations mutation)
      Atomically checks if a row/family/qualifier value matches the expected value.

  CompletableFuture<Void>  AsyncTable.mutateRow(RowMutations mutation)
      Performs multiple mutations atomically on a single row.

The existing rows for boolean Table.checkAndMutate(byte[] row, byte[] family, byte[] qualifier, ...) ("Atomically checks if a row/family/qualifier value matches the expected value.") and void Table.mutateRow(RowMutations rm) ("Performs multiple mutations atomically on a single row.") only shift position.
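The async checkAndMutate and mutateRow rows parallel the Table methods listed above. A sketch of the default checkAndMutate overload guarding a two-mutation RowMutations batch follows; the status and lock columns are invented for the example, and the AsyncTable instance is assumed to exist already.

    import java.io.IOException;
    import java.util.concurrent.CompletableFuture;
    import org.apache.hadoop.hbase.client.AsyncTable;
    import org.apache.hadoop.hbase.client.Delete;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.client.RowMutations;
    import org.apache.hadoop.hbase.util.Bytes;

    public class CheckAndMutateSketch {
      // Atomically sets cf:status to "closed" and deletes cf:lock, but only if
      // cf:status still holds the expected value "open".
      static CompletableFuture<Boolean> closeIfOpen(AsyncTable table, byte[] row)
          throws IOException {
        byte[] cf = Bytes.toBytes("cf");
        RowMutations rm = new RowMutations(row);
        rm.add(new Put(row).addColumn(cf, Bytes.toBytes("status"), Bytes.toBytes("closed")));
        rm.add(new Delete(row).addColumns(cf, Bytes.toBytes("lock")));
        return table.checkAndMutate(row, cf, Bytes.toBytes("status"),
            Bytes.toBytes("open"), rm);
      }
    }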
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/f96628d5/apidocs/org/apache/hadoop/hbase/client/class-use/Scan.html
----------------------------------------------------------------------
diff --git a/apidocs/org/apache/hadoop/hbase/client/class-use/Scan.html b/apidocs/org/apache/hadoop/hbase/client/class-use/Scan.html
index 038b799..4e05157 100644

New rows added to the class-use table for Scan:

  default CompletableFuture<List<Result>>  AsyncTable.smallScan(Scan scan)
  CompletableFuture<List<Result>>          AsyncTable.smallScan(Scan scan, int limit)
      Return all the results that match the given scan object.

A later hunk in the same file only reorders the void setScan(Scan scan) rows for TableInputFormatBase, TableRecordReaderImpl and TableRecordReader ("Sets the scan defining the actual details like columns etc.").

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/f96628d5/apidocs/org/apache/hadoop/hbase/client/class-use/SnapshotType.html
----------------------------------------------------------------------
diff --git a/apidocs/org/apache/hadoop/hbase/client/class-use/SnapshotType.html b/apidocs/org/apache/hadoop/hbase/client/class-use/SnapshotType.html
index 48f841d..eff4031 100644

The SnapshotDescription constructors that identify the table by a plain String are now marked:

  Deprecated. Use the version with the TableName instance instead

This applies to:

  SnapshotDescription(String name, String table, SnapshotType type)
  SnapshotDescription(String name, String table, SnapshotType type, String owner)
  SnapshotDescription(String name, String table, SnapshotType type, String owner, long creationTime, int version)

and the following TableName-based constructors are added:

  SnapshotDescription(String name, TableName table, SnapshotType type)
  SnapshotDescription(String name, TableName table, SnapshotType type, String owner)
  SnapshotDescription(String name, TableName table, SnapshotType type, String owner, long creationTime, int version)
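The replacement constructors take an org.apache.hadoop.hbase.TableName rather than a raw table-name String. A small sketch using the three-argument form listed above; the snapshot and table names are placeholders, and SnapshotType.FLUSH is just one of the enum values.

    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.SnapshotDescription;
    import org.apache.hadoop.hbase.client.SnapshotType;

    public class SnapshotDescriptionSketch {
      // Describes a flush-based snapshot of "my_table" using the
      // non-deprecated TableName-based constructor.
      static SnapshotDescription describe() {
        return new SnapshotDescription("daily-backup",
            TableName.valueOf("my_table"), SnapshotType.FLUSH);
      }
    }

Such a description would typically be handed to the Admin snapshot call.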
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/f96628d5/apidocs/org/apache/hadoop/hbase/client/package-tree.html
----------------------------------------------------------------------
diff --git a/apidocs/org/apache/hadoop/hbase/client/package-tree.html b/apidocs/org/apache/hadoop/hbase/client/package-tree.html
index 0c31367..582f01d 100644

@@ -207,13 +207,13 @@

http://git-wip-us.apache.org/repos/asf/hbase-site/blob/f96628d5/apidocs/org/apache/hadoop/hbase/exceptions/RegionInRecoveryException.html
----------------------------------------------------------------------
diff --git a/apidocs/org/apache/hadoop/hbase/exceptions/RegionInRecoveryException.html b/apidocs/org/apache/hadoop/hbase/exceptions/RegionInRecoveryException.html
index 965e19c..31acf7e 100644

@@ -44,7 +44,7 @@
http://git-wip-us.apache.org/repos/asf/hbase-site/blob/f96628d5/apidocs/org/apache/hadoop/hbase/exceptions/ScannerResetException.html
----------------------------------------------------------------------
diff --git a/apidocs/org/apache/hadoop/hbase/exceptions/ScannerResetException.html b/apidocs/org/apache/hadoop/hbase/exceptions/ScannerResetException.html
index c7312b0..909003a 100644

@@ -43,7 +43,7 @@