From: Jean-Marc Spaggiari
Date: Sat, 14 Dec 2013 13:02:16 -0500
Subject: Re: HBase 0.96 LogLevel?
To: dev@hbase.apache.org

Finally I copied the Hadoop 2.2.0 log4j.properties in place of the HBase one and got what I was looking for ;) Thanks for your recommendations.

2013/12/14 Ted Yu

> TRACE,RFA was used in the command you posted - it overrode DEBUG,console in
> your properties file.
>
> I should have mentioned that I got the snippet from
> /grid/0/var/log/hbase/hbase.log
> where /grid/0/var/log/hbase is the log dir.
>
> You can try specifying TRACE,console
>
> Cheers
>
>
> On Sat, Dec 14, 2013 at 7:47 AM, Jean-Marc Spaggiari <
> jean-marc@spaggiari.org> wrote:
>
> > I have updated the log4j.properties.
> >
> > Here is the output with log4j debug enabled:
> >
> > hbase@hbasetest1:~$ bin/hbase -Dlog4j.debug
> > org.apache.hadoop.hbase.util.CompressionTest file:///tmp/test.txt snappy
> > log4j: Trying to find [log4j.xml] using context classloader
> > sun.misc.Launcher$AppClassLoader@713c817.
> > log4j: Trying to find [log4j.xml] using
> > sun.misc.Launcher$AppClassLoader@713c817 class loader.
> > log4j: Trying to find [log4j.xml] using ClassLoader.getSystemResource().
> > log4j: Trying to find [log4j.properties] using context classloader
> > sun.misc.Launcher$AppClassLoader@713c817.
> > log4j: Using URL [file:/home/hbase/conf/log4j.properties] for automatic
> > log4j configuration.
> > log4j: Reading configuration from URL
> > file:/home/hbase/conf/log4j.properties
> > log4j: Hierarchy threshold set to [ALL].
> > log4j: Parsing for [root] with value=[INFO,console].
> > log4j: Level token is [INFO].
> > log4j: Category root set to INFO
> > log4j: Parsing appender named "console".
> > log4j: Parsing layout options for "console".
> > log4j: Setting property [conversionPattern] to [%d{ISO8601} %-5p [%t]
> > %c{2}: %m%n].
> > log4j: End of parsing for "console".
> > log4j: Setting property [target] to [System.err].
> > log4j: Parsed "console" options.
> > log4j: Parsing for [SecurityLogger] with value=[INFO,NullAppender].
> > log4j: Level token is [INFO].
> > log4j: Category SecurityLogger set to INFO
> > log4j: Parsing appender named "NullAppender".
> > log4j: Parsed "NullAppender" options.
> > log4j: Handling log4j.additivity.SecurityLogger=[false]
> > log4j: Setting additivity for "SecurityLogger" to false
> > log4j: Finished configuring.
> > 2013-12-14 10:36:45,985 INFO  [main] Configuration.deprecation:
> > hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> > Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library
> > /home/hbase/lib/native/Linux-amd64-64/libhadoop.so which might have
> > disabled stack guard. The VM will try to fix the stack guard now.
> > It's highly recommended that you fix the library with 'execstack -c
> > <libfile>', or link it with '-z noexecstack'.
> > 2013-12-14 10:36:46,577 WARN  [main] util.NativeCodeLoader: Unable to load
> > native-hadoop library for your platform...
> > using builtin-java classes where
> > applicable
> > 2013-12-14 10:36:46,881 INFO  [main] util.ChecksumType: Checksum using
> > org.apache.hadoop.util.PureJavaCrc32
> > 2013-12-14 10:36:46,883 INFO  [main] util.ChecksumType: Checksum can use
> > org.apache.hadoop.util.PureJavaCrc32C
> > Exception in thread "main" java.lang.UnsatisfiedLinkError:
> > org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
> >   at org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native Method)
> >   at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:62)
> >   at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:131)
> >   at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:147)
> >   at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:162)
> >   at org.apache.hadoop.hbase.io.compress.Compression$Algorithm.getCompressor(Compression.java:312)
> >   at org.apache.hadoop.hbase.io.encoding.HFileBlockDefaultEncodingContext.<init>(HFileBlockDefaultEncodingContext.java:79)
> >   at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:719)
> >   at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:131)
> >   at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:122)
> >   at org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:105)
> >   at org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:426)
> >   at org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:115)
> >   at org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:145)
> >
> > As you can see, it's reading from the default log4j.properties file.
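[Editor's sketch] The run above parses [root] with value=[INFO,console] even though the edited file says DEBUG. One plausible explanation, assuming the stock HBase 0.96 conf layout: the shipped log4j.properties resolves the root logger through a variable, and the bin/hbase launcher passes -Dhbase.root.logger on the JVM command line; during ${...} substitution, log4j gives system properties precedence over a same-named key in the file, so in-file edits to hbase.root.logger are shadowed. This is also consistent with Ted's point that TRACE,RFA from the command line overrode the properties file.

```properties
# Stock-style HBase log4j.properties stanza (sketch, not the user's exact file).
# The in-file default below only applies when the hbase.root.logger system
# property is NOT set; bin/hbase normally passes -Dhbase.root.logger=INFO,console,
# which wins over this default when ${hbase.root.logger} is substituted.
hbase.root.logger=INFO,console
log4j.rootLogger=${hbase.root.logger}
```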
> > Inside of which I have this:
> >
> > # Define some default values that can be overridden by system properties
> > hbase.root.logger=DEBUG,console
> > hbase.security.logger=DEBUG,console
> > hbase.log.dir=.
> > hbase.log.file=hbase.log
> > root.logger=DEBUG.console
> > log4j.rootLogger=DEBUG,console
> >
> > However as you can see, no debug info is displayed.
> >
> > Using your parameter seems to be working:
> >
> > hbase@hbasetest1:~$ bin/hbase -Dlog4j.debug -Dhbase.root.logger=TRACE,RFA
> > org.apache.hadoop.hbase.util.CompressionTest file:///tmp/test.txt snappy
> > /home/hbase/bin/../lib/native/Linux-amd64-64
> > log4j: Trying to find [log4j.xml] using context classloader
> > sun.misc.Launcher$AppClassLoader@713c817.
> > log4j: Trying to find [log4j.xml] using
> > sun.misc.Launcher$AppClassLoader@713c817 class loader.
> > log4j: Trying to find [log4j.xml] using ClassLoader.getSystemResource().
> > log4j: Trying to find [log4j.properties] using context classloader
> > sun.misc.Launcher$AppClassLoader@713c817.
> > log4j: Using URL [file:/home/hbase/conf/log4j.properties] for automatic
> > log4j configuration.
> > log4j: Reading configuration from URL
> > file:/home/hbase/conf/log4j.properties
> > log4j: Hierarchy threshold set to [ALL].
> > log4j: Parsing for [root] with value=[TRACE,RFA].
> > log4j: Level token is [TRACE].
> > log4j: Category root set to TRACE
> > log4j: Parsing appender named "RFA".
> > log4j: Parsing layout options for "RFA".
> > log4j: Setting property [conversionPattern] to [%d{ISO8601} %-5p [%t]
> > %c{2}: %m%n].
> > log4j: End of parsing for "RFA".
> > log4j: Setting property [maxBackupIndex] to [20].
> > log4j: Setting property [file] to [/home/hbase/bin/../logs/hbase.log].
> > log4j: Setting property [maxFileSize] to [256MB].
> > log4j: setFile called: /home/hbase/bin/../logs/hbase.log, true
> > log4j: setFile ended
> > log4j: Parsed "RFA" options.
> > log4j: Parsing for [SecurityLogger] with value=[INFO,NullAppender].
> > log4j: Level token is [INFO].
> > log4j: Category SecurityLogger set to INFO
> > log4j: Parsing appender named "NullAppender".
> > log4j: Parsed "NullAppender" options.
> > log4j: Handling log4j.additivity.SecurityLogger=[false]
> > log4j: Setting additivity for "SecurityLogger" to false
> > log4j: Finished configuring.
> > Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library
> > /home/hbase/lib/native/Linux-amd64-64/libhadoop.so which might have
> > disabled stack guard. The VM will try to fix the stack guard now.
> > It's highly recommended that you fix the library with 'execstack -c
> > <libfile>', or link it with '-z noexecstack'.
> > Exception in thread "main" java.lang.UnsatisfiedLinkError:
> > org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
> >   at org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native Method)
> >   at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:62)
> >   at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:131)
> >   at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:147)
> >   at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:162)
> >   at org.apache.hadoop.hbase.io.compress.Compression$Algorithm.getCompressor(Compression.java:312)
> >   at org.apache.hadoop.hbase.io.encoding.HFileBlockDefaultEncodingContext.<init>(HFileBlockDefaultEncodingContext.java:79)
> >   at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:719)
> >   at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:131)
> >   at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:122)
> >   at org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:105)
> >   at org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:426)
> >   at
> > org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:115)
> >   at org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:145)
> >
> > But still no debug info displayed. My goal is to display the debug logs of
> > NativeCodeLoader for its static piece, but so far, no luck...
> >
> > 2013/12/14 Ted Yu
> >
> > > How did you set the log level to DEBUG?
> > >
> > > I tried the following command and it worked:
> > >
> > > hbase -Dhbase.root.logger=DEBUG,RFA
> > > org.apache.hadoop.hbase.util.CompressionTest file:///tmp/test.txt snappy
> > >
> > > Snippet of the log file:
> > >
> > > 2013-12-14 15:23:37,853 DEBUG [main] hdfs.BlockReaderLocal: The
> > > short-circuit local reads feature is enabled.
> > > 2013-12-14 15:23:37,900 INFO  [main] compress.CodecPool: Got brand-new
> > > decompressor [.snappy]
> > > 2013-12-14 15:23:37,901 DEBUG [main] compress.CodecPool: Got recycled
> > > decompressor
> > >
> > > Cheers
> > >
> > >
> > > On Sat, Dec 14, 2013 at 5:42 AM, Jean-Marc Spaggiari <
> > > jean-marc@spaggiari.org> wrote:
> > >
> > > > Hi there,
> > > >
> > > > I'm trying this tool:
> > > > bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/test.txt
> > > > snappy
> > > >
> > > > And I want to set the log level to DEBUG to see why it fails. But it
> > > > seems that it's not taking the log4j.conf into consideration. I tried
> > > > to remove it, same result. I tried to set it to DEBUG, same result.
> > > >
> > > > Any idea how to change the log level and why it's not taking our default
> > > > config file into consideration?
> > > >
> > > > JM
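[Editor's sketch] The UnsatisfiedLinkError in the traces above is the JVM's generic failure for a native method whose backing library symbol was never loaded; here it indicates that the libhadoop.so on java.library.path was built without snappy support (or that libsnappy.so is missing), so raising the log level cannot fix it. A minimal, stdlib-only sketch of the same failure mode; the library name "snappy_demo" is made up for illustration and is not part of Hadoop:

```java
// Demonstrates the failure mode behind the stack traces in this thread:
// asking the JVM for a native library that java.library.path cannot supply.
public class NativeCheck {
    public static void main(String[] args) {
        try {
            // Looks for libsnappy_demo.so (Linux) on java.library.path.
            System.loadLibrary("snappy_demo");
            System.out.println("native library loaded");
        } catch (UnsatisfiedLinkError e) {
            // Same error class CompressionTest surfaced; the fix is a matching
            // native build on java.library.path, not a logging change.
            System.out.println("UnsatisfiedLinkError: library not on java.library.path");
        }
    }
}
```

Hadoop's real probe is org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(), a native method that only resolves when libhadoop.so was compiled against snappy.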