Message-ID: <1224217859.1231808819993.JavaMail.jira@brutus>
Date: Mon, 12 Jan 2009 17:06:59 -0800 (PST)
From: "Michael Gottesman (JIRA)"
To: hbase-dev@hadoop.apache.org
Reply-To: hbase-dev@hadoop.apache.org
Subject: [jira] Commented: (HBASE-1064) HBase REST xml/json improvements
In-Reply-To: <2141440793.1229461004312.JavaMail.jira@brutus>
MIME-Version: 1.0
Content-Type: text/plain; charset=utf-8
Content-Transfer-Encoding: 7bit

    [ https://issues.apache.org/jira/browse/HBASE-1064?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12663169#action_12663169 ]

Michael Gottesman commented on HBASE-1064:
------------------------------------------

@Brian I am going to submit it to json.org soon (I should have done it six months ago, but things got in the way). It should really be called Agile-Json2.0.jar, because the whole idea behind it is to make serializing lots and lots of classes to JSON obscenely easy, and then to replace it with more hardcore serialization later if more performance is needed. Imagine writing JSON.org serialization code for 5 objects. Now imagine writing JSON.org serialization code for 100 objects. Wouldn't you rather just mark them with @ToJSON? That is the premise, anyway.

@Stack the source is here: http://github.com/gottesmm/agile-json-2.0/tree/master
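To make that premise concrete, here is a minimal, purely illustrative sketch of annotation-driven serialization in plain Java. It is not the agile-json-2.0 API (only the @ToJSON name comes from the comment above); the annotation, classes, and method below are made up for the example:

    import java.lang.annotation.ElementType;
    import java.lang.annotation.Retention;
    import java.lang.annotation.RetentionPolicy;
    import java.lang.annotation.Target;
    import java.lang.reflect.Field;

    // Toy marker annotation in the spirit of @ToJSON; the real library may differ.
    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.TYPE)
    @interface ToJSON {}

    // Any class you want serialized just gets the marker; no per-class JSON code.
    @ToJSON
    class TableInfo {
        String name = "test14";
        int maxVersions = 2;
        boolean inMemory = false;
    }

    public class ToJsonSketch {
        // Reflect over the fields of an @ToJSON-marked object and emit JSON.
        static String toJson(Object o) throws IllegalAccessException {
            if (!o.getClass().isAnnotationPresent(ToJSON.class)) {
                throw new IllegalArgumentException(o.getClass() + " is not marked @ToJSON");
            }
            StringBuilder sb = new StringBuilder("{");
            Field[] fields = o.getClass().getDeclaredFields();
            for (int i = 0; i < fields.length; i++) {
                fields[i].setAccessible(true);
                Object value = fields[i].get(o);
                sb.append('"').append(fields[i].getName()).append("\":");
                sb.append(value instanceof String ? "\"" + value + "\"" : value);
                if (i < fields.length - 1) sb.append(',');
            }
            return sb.append('}').toString();
        }

        public static void main(String[] args) throws Exception {
            // Expected output (field order may vary): {"name":"test14","maxVersions":2,"inMemory":false}
            System.out.println(toJson(new TableInfo()));
        }
    }

The point is the scaling argument from the comment: the 100th serializable class costs one annotation, not another hand-written block of JSON.org calls.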
> HBase REST xml/json improvements
> --------------------------------
>
>                 Key: HBASE-1064
>                 URL: https://issues.apache.org/jira/browse/HBASE-1064
>             Project: Hadoop HBase
>          Issue Type: Improvement
>          Components: rest
>            Reporter: Brian Beggs
>         Attachments: hbase-1064-patch-v2.patch, hbase-1064-patch-v3.patch, json2.jar, REST-Upgrade-Notes.txt, RESTPatch-pass1.patch
>
>
> I've begun work on creating a REST-based interface for HBase that can use both JSON and XML and would be extensible enough to add new formats down the road. I'm at a point with this where I would like to submit it for review and to get feedback as I continue to work towards new features.
> Attached to this issue you will find the patch for the changes to this point, along with a necessary jar file for the JSON serialization. Also below you will find my notes on how to use what is finished with the interface to this point.
> This patch is based on JIRA issues HBASE-814 and HBASE-815.
> I am interested in getting feedback on:
> - what you guys think works
> - what doesn't work for the project
> - anything that may need to be added
> - code style
> - anything else...
> Finished components:
> - framework around parsing json/xml input
> - framework around serializing xml/json output
> - changes to exception handling
> - changes to the response object to better handle the serializing of output data
> - table CRUD calls
> - full table fetching
> - creating/fetching scanners
> TODO:
> - fix up the filtering with scanners
> - row insert/delete operations
> - individual row fetching
> - cell fetching interface
> - scanner use interface
> Here are the wiki(ish) notes for what is done to this point:
> REST Service for HBase Notes:
>
> GET /
> - retrieves a list of all the tables with their metadata in HBase
> curl -v -H "Accept: text/xml" -X GET -T - http://localhost:60050/
> curl -v -H "Accept: application/json" -X GET -T - http://localhost:60050/
>
> POST /
> - Create a table
> curl -H "Content-Type: text/xml" -H "Accept: text/xml" -v -X POST -T - http://localhost:60050/newTable
> test14
> subscription
> 2
> NONE
> false
> true
> Response:
> 200success
> JSON:
> curl -H "Content-Type: application/json" -H "Accept: application/json" -v -X POST -T - http://localhost:60050/newTable
> {"name":"test5", "column_families":[{
>   "name":"columnfam1",
>   "bloomfilter":true,
>   "time_to_live":10,
>   "in_memory":false,
>   "max_versions":2,
>   "compression":"",
>   "max_value_length":50,
>   "block_cache_enabled":true
>   }
> ]}
> *NOTE* the compression value is an enum defined in class HColumnDescriptor.CompressionType
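As a rough illustration of how a client might drive the create-table call above from Java rather than curl, a sketch along these lines should work. It assumes only the endpoint and JSON body shown in the notes; the class name and the use of HttpURLConnection are illustrative, not part of the patch:

    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class CreateTableSketch {
        public static void main(String[] args) throws Exception {
            // Same JSON table definition used in the curl example above.
            String body = "{\"name\":\"test5\",\"column_families\":[{"
                + "\"name\":\"columnfam1\",\"bloomfilter\":true,\"time_to_live\":10,"
                + "\"in_memory\":false,\"max_versions\":2,\"compression\":\"\","
                + "\"max_value_length\":50,\"block_cache_enabled\":true}]}";

            // POST it to the same URL the notes use for table creation.
            URL url = new URL("http://localhost:60050/newTable");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("POST");
            conn.setRequestProperty("Content-Type", "application/json");
            conn.setRequestProperty("Accept", "application/json");
            conn.setDoOutput(true);

            OutputStream out = conn.getOutputStream();
            out.write(body.getBytes("UTF-8"));
            out.close();

            // Per the notes, a 200/success response indicates the table was created.
            System.out.println("HTTP " + conn.getResponseCode());
        }
    }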
> GET /[table_name]
> - returns all records for the table
> curl -v -H "Accept: text/xml" -X GET -T - http://localhost:60050/tablename
> curl -v -H "Accept: application/json" -X GET -T - http://localhost:60050/tablename
>
> GET /[table_name]
> - Parameter action:
>   metadata - returns the metadata for this table
>   regions - returns the regions for this table
> curl -v -H "Accept: text/xml" -X GET -T - http://localhost:60050/pricing1?action=metadata
>
> Update Table
> PUT /[table_name]
> - updates a table
> curl -v -H "Content-Type: text/xml" -H "Accept: text/xml" -X PUT -T - http://localhost:60050/pricing1
> subscription
> 3
> NONE
> false
> true
> subscription1
> 3
> NONE
> false
> true
> curl -v -H "Content-Type: application/json" -H "Accept: application/json" -X PUT -T - http://localhost:60050/pricing1
> {"column_families":[{
>   "name":"columnfam1",
>   "bloomfilter":true,
>   "time_to_live":10,
>   "in_memory":false,
>   "max_versions":2,
>   "compression":"",
>   "max_value_length":50,
>   "block_cache_enabled":true
>   },
>   {
>   "name":"columnfam2",
>   "bloomfilter":true,
>   "time_to_live":10,
>   "in_memory":false,
>   "max_versions":2,
>   "compression":"",
>   "max_value_length":50,
>   "block_cache_enabled":true
>   }
> ]}
>
> Delete Table
> curl -v -H "Content-Type: text/xml" -H "Accept: text/xml" -X DELETE -T - http://localhost:60050/TEST16
>
> Creating a scanner
> curl -v -H "Content-Type: application/json" -H "Accept: application/json" -X POST -T - http://localhost:60050/TEST16?action=newscanner
> //TODO fix up the scanner filters.
> response:
> xml:
> 2
> json:
> {"id":1}
>
> Using a scanner
> curl -v -H "Content-Type: application/json" -H "Accept: application/json" -X POST -T - "http://localhost:60050/TEST16?action=scan&scannerId=&numrows="
>
> This would be my first submission to an open source project of this size, so please, give it to me rough. =)
> Thanks.

--
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.