Date: Fri, 24 May 2013 06:57:20 +0000 (UTC)
From: "Jing Zhao (JIRA)"
To: hdfs-issues@hadoop.apache.org
Subject: [jira] [Updated] (HDFS-4846) Snapshot CLI commands output stacktrace for invalid arguments

     [ https://issues.apache.org/jira/browse/HDFS-4846?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Jing Zhao updated HDFS-4846:
----------------------------
    Labels: snapshot  (was: snapshot snapshots)
    Status: Patch Available  (was: Open)

> Snapshot CLI commands output stacktrace for invalid arguments
> --------------------------------------------------------------
>
>                 Key: HDFS-4846
>                 URL: https://issues.apache.org/jira/browse/HDFS-4846
>             Project: Hadoop HDFS
>          Issue Type: Bug
>    Affects Versions: 3.0.0
>            Reporter: Stephen Chu
>            Assignee: Jing Zhao
>            Priority: Minor
>              Labels: snapshot
>         Attachments: HDFS-4846.001.patch, HDFS-4846.002.patch
>
>
> It'd be useful to clean up the stacktraces output by the snapshot CLI commands when the commands are used incorrectly. This will make things more readable for operators and hopefully prevent confusion. (A sketch of the message-only error reporting this asks for appears after the examples below.)
> Allowing a snapshot on a directory that doesn't exist
> {code}
> schu-mbp:~ schu$ hdfs dfsadmin -allowSnapshot adfasdf
> 2013-05-23 15:46:46.052 java[24580:1203] Unable to load realm info from SCDynamicStore
> 2013-05-23 15:46:46,066 WARN [main] util.NativeCodeLoader (NativeCodeLoader.java:(62)) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> allowSnapshot: Directory does not exist: /user/schu/adfasdf
> 	at org.apache.hadoop.hdfs.server.namenode.INodeDirectory.valueOf(INodeDirectory.java:52)
> 	at org.apache.hadoop.hdfs.server.namenode.snapshot.SnapshotManager.setSnapshottable(SnapshotManager.java:106)
> 	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.allowSnapshot(FSNamesystem.java:5861)
> 	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.allowSnapshot(NameNodeRpcServer.java:1121)
> 	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.allowSnapshot(ClientNamenodeProtocolServerSideTranslatorPB.java:932)
> 	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:48087)
> 	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:527)
> 	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1033)
> 	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1842)
> 	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1838)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:396)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1489)
> 	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1836)
> schu-mbp:~ schu$
> {code}
> Disallow a snapshot on a directory that isn't snapshottable
> {code}
> schu-mbp:~ schu$ hdfs dfsadmin -disallowSnapshot /user
> 2013-05-23 15:49:07.251 java[24687:1203] Unable to load realm info from SCDynamicStore
> 2013-05-23 15:49:07,265 WARN [main] util.NativeCodeLoader (NativeCodeLoader.java:(62)) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> disallowSnapshot: Directory is not a snapshottable directory: /user
> 	at org.apache.hadoop.hdfs.server.namenode.snapshot.INodeDirectorySnapshottable.valueOf(INodeDirectorySnapshottable.java:68)
> 	at org.apache.hadoop.hdfs.server.namenode.snapshot.SnapshotManager.resetSnapshottable(SnapshotManager.java:151)
> 	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.disallowSnapshot(FSNamesystem.java:5889)
> 	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.disallowSnapshot(NameNodeRpcServer.java:1128)
> 	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.disallowSnapshot(ClientNamenodeProtocolServerSideTranslatorPB.java:943)
> 	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:48089)
> 	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:527)
> 	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1033)
> 	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1842)
> 	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1838)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:396)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1489)
> 	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1836)
> {code}
> Snapshot diffs with non-existent snapshot paths
> {code}
> chu-mbp:~ schu$ hdfs snapshotDiff / gibberish1 gibberish2
> 2013-05-23 15:53:32.986 java[24877:1203] Unable to load realm info from SCDynamicStore
> 2013-05-23 15:53:33,001 WARN [main] util.NativeCodeLoader (NativeCodeLoader.java:(62)) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> Exception in thread "main" org.apache.hadoop.hdfs.server.namenode.snapshot.SnapshotException: Cannot find the snapshot of directory / with name gibberish1
> 	at org.apache.hadoop.hdfs.server.namenode.snapshot.INodeDirectorySnapshottable.getSnapshotByName(INodeDirectorySnapshottable.java:389)
> 	at org.apache.hadoop.hdfs.server.namenode.snapshot.INodeDirectorySnapshottable.computeDiff(INodeDirectorySnapshottable.java:363)
> 	at org.apache.hadoop.hdfs.server.namenode.snapshot.SnapshotManager.diff(SnapshotManager.java:358)
> 	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getSnapshotDiffReport(FSNamesystem.java:6035)
> 	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getSnapshotDiffReport(NameNodeRpcServer.java:1153)
> 	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getSnapshotDiffReport(ClientNamenodeProtocolServerSideTranslatorPB.java:985)
> 	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:48095)
> 	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:527)
> 	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1033)
> 	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1842)
> 	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1838)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:396)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1489)
> 	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1836)
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
> 	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
> 	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
> 	at org.apache.hadoop.hdfs.DFSClient.getSnapshotDiffReport(DFSClient.java:2161)
> 	at org.apache.hadoop.hdfs.DistributedFileSystem.getSnapshotDiffReport(DistributedFileSystem.java:990)
> 	at org.apache.hadoop.hdfs.tools.snapshot.SnapshotDiff.main(SnapshotDiff.java:85)
> Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.snapshot.SnapshotException): Cannot find the snapshot of directory / with name gibberish1
> 	at org.apache.hadoop.hdfs.server.namenode.snapshot.INodeDirectorySnapshottable.getSnapshotByName(INodeDirectorySnapshottable.java:389)
> 	at org.apache.hadoop.hdfs.server.namenode.snapshot.INodeDirectorySnapshottable.computeDiff(INodeDirectorySnapshottable.java:363)
> 	at org.apache.hadoop.hdfs.server.namenode.snapshot.SnapshotManager.diff(SnapshotManager.java:358)
> 	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getSnapshotDiffReport(FSNamesystem.java:6035)
> 	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getSnapshotDiffReport(NameNodeRpcServer.java:1153)
> 	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getSnapshotDiffReport(ClientNamenodeProtocolServerSideTranslatorPB.java:985)
> 	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:48095)
> 	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:527)
> 	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1033)
> 	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1842)
> 	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1838)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:396)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1489)
> 	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1836)
> 	at org.apache.hadoop.ipc.Client.call(Client.java:1303)
> 	at org.apache.hadoop.ipc.Client.call(Client.java:1255)
> 	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:204)
> 	at com.sun.proxy.$Proxy9.getSnapshotDiffReport(Unknown Source)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> 	at java.lang.reflect.Method.invoke(Method.java:597)
> 	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:163)
> 	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:82)
> 	at com.sun.proxy.$Proxy9.getSnapshotDiffReport(Unknown Source)
> 	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getSnapshotDiffReport(ClientNamenodeProtocolTranslatorPB.java:975)
> 	at org.apache.hadoop.hdfs.DFSClient.getSnapshotDiffReport(DFSClient.java:2158)
> 	... 2 more
> schu-mbp:~ schu$
> {code}
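A minimal, hypothetical sketch of the message-only error reporting the description asks for; it is not taken from the attached patches. The idea: the CLI entry point catches the RemoteException raised for an invalid argument, unwraps it, and prints only its message instead of the full NameNode-side stack trace. The class name, method name, and command wiring below are placeholders; the only Hadoop APIs assumed are DistributedFileSystem#allowSnapshot(Path) and RemoteException#unwrapRemoteException().

{code}
import java.io.IOException;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hdfs.DistributedFileSystem;
import org.apache.hadoop.ipc.RemoteException;

// Hypothetical helper, not part of the HDFS-4846 patches: demonstrates
// "print the message, not the stack trace" handling for invalid arguments.
public class AllowSnapshotCommandSketch {

  /**
   * Runs the equivalent of "hdfs dfsadmin -allowSnapshot <dir>" and reports
   * server-side argument errors as a single line on stderr.
   */
  static int allowSnapshot(DistributedFileSystem dfs, String dir) throws IOException {
    try {
      dfs.allowSnapshot(new Path(dir));   // assumed client call
      System.out.println("Allowing snapshot on " + dir + " succeeded");
      return 0;
    } catch (RemoteException e) {
      // Unwrap the server-side exception and print only its message, e.g.
      // "allowSnapshot: Directory does not exist: /user/schu/adfasdf"
      IOException unwrapped = e.unwrapRemoteException();
      System.err.println("allowSnapshot: " + unwrapped.getLocalizedMessage());
      return -1;
    }
  }
}
{code}

With handling like this, the first example above would produce the single line "allowSnapshot: Directory does not exist: /user/schu/adfasdf" and a non-zero return code from the sketch, rather than the NameNode-side stack trace.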