From: "Kennedy, Sean C." <sean.kennedy@merck.com>
To: user@hive.apache.org
Date: Wed, 29 Jan 2014 18:03:43 -0500
Subject: hbase importtsv problem

I am trying to use the following command to import a simple text file into an HBase table, but I am having issues. The job appears to run, reading from HDFS and writing toward HBase; however, I think I have some fundamental data-setup issues. A copy of my input file and the state of the HBase employee table are pasted below. Any help, or a pointer to a simple example, would be appreciated.
Hadoop version: 1.2.0
HBase version: 0.94.15
HBase table name: employee

Command I am issuing:

/hd/hadoop/bin/hadoop jar /hbase/hbase-0.94.15/hbase-0.94.15.jar importtsv -Dimporttsv.columns=HBASE_ROW_KEY,basic_info:empname,basic_info:age employee /user/hduser/employee1

14/01/29 17:36:02 INFO zookeeper.ZooKeeper: Client environment:zookeeper.version=3.4.5-1392090, built on 09/30/2012 17:52 GMT
14/01/29 17:36:02 INFO zookeeper.ZooKeeper: Client environment:host.name=usann01
14/01/29 17:36:02 INFO zookeeper.ZooKeeper: Client environment:java.version=1.7.0_25
14/01/29 17:36:02 INFO zookeeper.ZooKeeper: Client environment:java.vendor=Oracle Corporation
14/01/29 17:36:02 INFO zookeeper.ZooKeeper: Client environment:java.home=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.25/jre
14/01/29 17:36:02 INFO zookeeper.ZooKeeper: Client environment:java.class.path=/hd/hadoop/libexec/../conf:/usr/bin/java/lib/tools.jar:/hd/hadoop/libexec/..:/hd/hadoop/libexec/../hadoop-core-1.2.1.jar:/hd/hadoop/libexec/../lib/asm-3.2.jar:/hd/hadoop/libexec/../lib/aspectjrt-1.6.11.jar:/hd/hadoop/libexec/../lib/aspectjtools-1.6.11.jar:/hd/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/hd/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/hd/hadoop/libexec/../lib/commons-cli-1.2.jar:/hd/hadoop/libexec/../lib/commons-codec-1.4.jar:/hd/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/hd/hadoop/libexec/../lib/commons-configuration-1.6.jar:/hd/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/hd/hadoop/libexec/../lib/commons-digester-1.8.jar:/hd/hadoop/libexec/../lib/commons-el-1.0.jar:/hd/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/hd/hadoop/libexec/../lib/commons-io-2.1.jar:/hd/hadoop/libexec/../lib/commons-lang-2.4.jar:/hd/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/hd/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/hd/hadoop/libexec/../lib/commons-math-2.1.jar:/hd/hadoop/libexec/../lib/commons-net-3.1.jar:/hd/hadoop/libexec/../lib/core-3.1.1.jar:/hd/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.2.1.jar:/hd/hadoop/libexec/../lib/hadoop-fairscheduler-1.2.1.jar:/hd/hadoop/libexec/../lib/hadoop-thriftfs-1.2.1.jar:/hd/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/hd/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/hd/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/hd/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/hd/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/hd/hadoop/libexec/../lib/jdeb-0.8.jar:/hd/hadoop/libexec/../lib/jersey-core-1.8.jar:/hd/hadoop/libexec/../lib/jersey-json-1.8.jar:/hd/hadoop/libexec/../lib/jersey-server-1.8.jar:/hd/hadoop/libexec/../lib/jets3t-0.6.1.jar:/hd/hadoop/libexec/../lib/jetty-6.1.26.jar:/hd/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/hd/hadoop/libexec/../lib/jsch-0.1.42.jar:/hd/hadoop/libexec/../lib/junit-4.5.jar:/hd/hadoop/libexec/../lib/kfs-0.2.2.jar:/hd/hadoop/libexec/../lib/log4j-1.2.15.jar:/hd/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/hd/hadoop/libexec/../lib/oro-2.0.8.jar:/hd/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/hd/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/hd/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/hd/hadoop/libexec/../lib/xmlenc-0.52.jar:/hd/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/hd/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar::/hbase/hbase-0.94.15/lib/guava-11.0.2.jar:/hbase/hbase-0.94.15/lib/zookeeper-3.4.5.jar:/hbase/hbase-0.94.15/lib/protobuf-java-2.4.0a.jar
14/01/29 17:36:02 INFO zookeeper.ZooKeeper: Client environment:java.library.path=/hd/hadoop/libexec/../lib/native/Linux-i386-32
14/01/29 17:36:02 INFO zookeeper.ZooKeeper: Client environment:java.io.tmpdir=/tmp
14/01/29 17:36:02 INFO zookeeper.ZooKeeper: Client environment:java.compiler=<NA>
14/01/29 17:36:02 INFO zookeeper.ZooKeeper: Client environment:os.name=Linux
14/01/29 17:36:02 INFO zookeeper.ZooKeeper: Client environment:os.arch=i386
14/01/29 17:36:02 INFO zookeeper.ZooKeeper: Client environment:os.version=2.6.32-358.14.1.el6.i686
14/01/29 17:36:02 INFO zookeeper.ZooKeeper: Client environment:user.name=hduser
14/01/29 17:36:02 INFO zookeeper.ZooKeeper: Client environment:user.home=/home/hduser
14/01/29 17:36:02 INFO zookeeper.ZooKeeper: Client environment:user.dir=/hbase/hbase-0.94.15/bin
14/01/29 17:36:02 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=localhost:2181 sessionTimeout=180000 watcher=hconnection
14/01/29 17:36:02 INFO zookeeper.RecoverableZooKeeper: The identifier of this process is 18287@usann01
14/01/29 17:36:02 INFO zookeeper.ClientCnxn: Opening socket connection to server localhost/127.0.0.1:2181. Will not attempt to authenticate using SASL (unknown error)
14/01/29 17:36:02 INFO zookeeper.ClientCnxn: Socket connection established to localhost/127.0.0.1:2181, initiating session
14/01/29 17:36:02 INFO zookeeper.ClientCnxn: Session establishment complete on server localhost/127.0.0.1:2181, sessionid = 0x143dff9049b0007, negotiated timeout = 40000
14/01/29 17:36:04 INFO mapreduce.TableOutputFormat: Created table instance for employee
14/01/29 17:36:04 INFO input.FileInputFormat: Total input paths to process : 1
14/01/29 17:36:04 INFO util.NativeCodeLoader: Loaded the native-hadoop library
14/01/29 17:36:04 WARN snappy.LoadSnappy: Snappy native library not loaded
14/01/29 17:36:04 INFO mapred.JobClient: Running job: job_201401291611_0001
14/01/29 17:36:05 INFO mapred.JobClient:  map 0% reduce 0%
14/01/29 17:36:19 INFO mapred.JobClient:  map 100% reduce 0%
14/01/29 17:36:21 INFO mapred.JobClient: Job complete: job_201401291611_0001
14/01/29 17:36:21 INFO mapred.JobClient: Counters: 19
14/01/29 17:36:21 INFO mapred.JobClient:   Job Counters
14/01/29 17:36:21 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=8187
14/01/29 17:36:21 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
14/01/29 17:36:21 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
14/01/29 17:36:21 INFO mapred.JobClient:     Launched map tasks=1
14/01/29 17:36:21 INFO mapred.JobClient:     Data-local map tasks=1
14/01/29 17:36:21 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
14/01/29 17:36:21 INFO mapred.JobClient:   ImportTsv
14/01/29 17:36:21 INFO mapred.JobClient:     Bad Lines=3
14/01/29 17:36:21 INFO mapred.JobClient:   File Output Format Counters
14/01/29 17:36:21 INFO mapred.JobClient:     Bytes Written=0
14/01/29 17:36:21 INFO mapred.JobClient:   FileSystemCounters
14/01/29 17:36:21 INFO mapred.JobClient:     HDFS_BYTES_READ=131
14/01/29 17:36:21 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=79714
14/01/29 17:36:21 INFO mapred.JobClient:   File Input Format Counters
14/01/29 17:36:21 INFO mapred.JobClient:     Bytes Read=24
14/01/29 17:36:21 INFO mapred.JobClient:   Map-Reduce Framework
14/01/29 17:36:21 INFO mapred.JobClient:     Map input records=3
14/01/29 17:36:21 INFO mapred.JobClient:     Physical memory (bytes) snapshot=38928384
14/01/29 17:36:21 INFO mapred.JobClient:     Spilled Records=0
14/01/29 17:36:21 INFO mapred.JobClient:     CPU time spent (ms)=120
14/01/29 17:36:21 INFO mapred.JobClient:     Total committed heap usage (bytes)=9502720
14/01/29 17:36:21 INFO mapred.JobClient:     Virtual memory (bytes) snapshot=347398144
14/01/29 17:36:21 INFO mapred.JobClient:     Map output records=0
14/01/29 17:36:21 INFO mapred.JobClient:     SPLIT_RAW_BYTES=107
[hduser@usann01 bin]$

Here is my file on HDFS:

[hduser@usann01 bin]$ /hd/hadoop/bin/hadoop dfs -cat /user/hduser/employee1
emp1,24
emp2,26
emp3,24

Here is my table in HBase:

hbase(main):003:0> scan 'employee'
ROW                          COLUMN+CELL
0 row(s) in 0.0160 seconds

hbase(main):004:0>

Notice: This e-mail message, together with any attachments, contains information of Merck & Co., Inc. (One Merck Drive, Whitehouse Station, New Jersey, USA 08889) and/or its affiliates (direct contact information for affiliates is available at http://www.merck.com/contact/contacts.html) that may be confidential, proprietary, copyrighted and/or legally privileged. It is intended solely for the use of the individual or entity named on this message. If you are not the intended recipient, and have received this message in error, please notify us immediately by reply e-mail and then delete it from your system.
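For reference, here is my (possibly wrong) mental model of how ImportTsv decides a line is "bad": it splits each input line on a single separator character, tab by default and overridable with -Dimporttsv.separator, and it rejects any line that does not yield one field per column named in -Dimporttsv.columns. A toy sketch of that check in Python — my own simplification for discussion, not HBase code:

```python
def parse_line(line, columns, separator="\t"):
    """Simplified stand-in for ImportTsv's per-line parse: split on the
    separator and require exactly one field per declared column.
    Anything else would be counted under the "Bad Lines" counter."""
    fields = line.rstrip("\n").split(separator)
    if len(fields) != len(columns):
        return None  # a "bad line"
    return dict(zip(columns, fields))

columns = ["HBASE_ROW_KEY", "basic_info:age"]

# With the default tab separator, my comma-separated rows never parse:
print(parse_line("emp1,24", columns))                 # None
# With a comma separator they would:
print(parse_line("emp1,24", columns, separator=","))  # {'HBASE_ROW_KEY': 'emp1', 'basic_info:age': '24'}
```

If that model is right, my file would need to be tab-separated (or the job run with -Dimporttsv.separator=,), and each row would also need one field for every column in -Dimporttsv.columns — my command names three columns, but each row has only two fields. Is that the correct reading?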
