hadoop-hdfs-issues mailing list archives

From gschen <10210240...@fudan.edu.cn>
Subject Not subscribed to hdfs-issues@hadoop.apache.org
Date Fri, 24 May 2013 07:10:29 GMT
On 05/24/2013 02:57 PM, Jing Zhao (JIRA) wrote:
>       [ https://issues.apache.org/jira/browse/HDFS-4846?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
>
> Jing Zhao updated HDFS-4846:
> ----------------------------
>
>      Labels: snapshot  (was: snapshot snapshots)
>      Status: Patch Available  (was: Open)
>      
>> Snapshot CLI commands output stacktrace for invalid arguments
>> -------------------------------------------------------------
>>
>>                  Key: HDFS-4846
>>                  URL: https://issues.apache.org/jira/browse/HDFS-4846
>>              Project: Hadoop HDFS
>>           Issue Type: Bug
>>     Affects Versions: 3.0.0
>>             Reporter: Stephen Chu
>>             Assignee: Jing Zhao
>>             Priority: Minor
>>               Labels: snapshot
>>          Attachments: HDFS-4846.001.patch, HDFS-4846.002.patch
>>
>>
>> It'd be useful to clean up the stacktraces output by the snapshot CLI commands when the commands are used incorrectly. This will make things more readable for operators and hopefully prevent confusion.
>> Allowing a snapshot on a directory that doesn't exist
>> {code}
>> schu-mbp:~ schu$ hdfs dfsadmin -allowSnapshot adfasdf
>> 2013-05-23 15:46:46.052 java[24580:1203] Unable to load realm info from SCDynamicStore
>> 2013-05-23 15:46:46,066 WARN  [main] util.NativeCodeLoader (NativeCodeLoader.java:<clinit>(62)) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
>> allowSnapshot: Directory does not exist: /user/schu/adfasdf
>> 	at org.apache.hadoop.hdfs.server.namenode.INodeDirectory.valueOf(INodeDirectory.java:52)
>> 	at org.apache.hadoop.hdfs.server.namenode.snapshot.SnapshotManager.setSnapshottable(SnapshotManager.java:106)
>> 	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.allowSnapshot(FSNamesystem.java:5861)
>> 	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.allowSnapshot(NameNodeRpcServer.java:1121)
>> 	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.allowSnapshot(ClientNamenodeProtocolServerSideTranslatorPB.java:932)
>> 	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:48087)
>> 	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:527)
>> 	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1033)
>> 	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1842)
>> 	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1838)
>> 	at java.security.AccessController.doPrivileged(Native Method)
>> 	at javax.security.auth.Subject.doAs(Subject.java:396)
>> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1489)
>> 	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1836)
>> schu-mbp:~ schu$
>> {code}
>> Disallow a snapshot on a directory that isn't snapshottable
>> {code}
>> schu-mbp:~ schu$ hdfs dfsadmin -disallowSnapshot /user
>> 2013-05-23 15:49:07.251 java[24687:1203] Unable to load realm info from SCDynamicStore
>> 2013-05-23 15:49:07,265 WARN  [main] util.NativeCodeLoader (NativeCodeLoader.java:<clinit>(62)) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
>> disallowSnapshot: Directory is not a snapshottable directory: /user
>> 	at org.apache.hadoop.hdfs.server.namenode.snapshot.INodeDirectorySnapshottable.valueOf(INodeDirectorySnapshottable.java:68)
>> 	at org.apache.hadoop.hdfs.server.namenode.snapshot.SnapshotManager.resetSnapshottable(SnapshotManager.java:151)
>> 	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.disallowSnapshot(FSNamesystem.java:5889)
>> 	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.disallowSnapshot(NameNodeRpcServer.java:1128)
>> 	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.disallowSnapshot(ClientNamenodeProtocolServerSideTranslatorPB.java:943)
>> 	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:48089)
>> 	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:527)
>> 	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1033)
>> 	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1842)
>> 	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1838)
>> 	at java.security.AccessController.doPrivileged(Native Method)
>> 	at javax.security.auth.Subject.doAs(Subject.java:396)
>> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1489)
>> 	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1836)
>> {code}
>> Snapshot diffs with non-existent snapshot paths
>> {code}
>> schu-mbp:~ schu$ hdfs snapshotDiff / gibberish1 gibberish2
>> 2013-05-23 15:53:32.986 java[24877:1203] Unable to load realm info from SCDynamicStore
>> 2013-05-23 15:53:33,001 WARN  [main] util.NativeCodeLoader (NativeCodeLoader.java:<clinit>(62)) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
>> Exception in thread "main" org.apache.hadoop.hdfs.server.namenode.snapshot.SnapshotException: Cannot find the snapshot of directory / with name gibberish1
>> 	at org.apache.hadoop.hdfs.server.namenode.snapshot.INodeDirectorySnapshottable.getSnapshotByName(INodeDirectorySnapshottable.java:389)
>> 	at org.apache.hadoop.hdfs.server.namenode.snapshot.INodeDirectorySnapshottable.computeDiff(INodeDirectorySnapshottable.java:363)
>> 	at org.apache.hadoop.hdfs.server.namenode.snapshot.SnapshotManager.diff(SnapshotManager.java:358)
>> 	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getSnapshotDiffReport(FSNamesystem.java:6035)
>> 	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getSnapshotDiffReport(NameNodeRpcServer.java:1153)
>> 	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getSnapshotDiffReport(ClientNamenodeProtocolServerSideTranslatorPB.java:985)
>> 	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:48095)
>> 	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:527)
>> 	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1033)
>> 	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1842)
>> 	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1838)
>> 	at java.security.AccessController.doPrivileged(Native Method)
>> 	at javax.security.auth.Subject.doAs(Subject.java:396)
>> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1489)
>> 	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1836)
>> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>> 	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
>> 	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
>> 	at org.apache.hadoop.hdfs.DFSClient.getSnapshotDiffReport(DFSClient.java:2161)
>> 	at org.apache.hadoop.hdfs.DistributedFileSystem.getSnapshotDiffReport(DistributedFileSystem.java:990)
>> 	at org.apache.hadoop.hdfs.tools.snapshot.SnapshotDiff.main(SnapshotDiff.java:85)
>> Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.snapshot.SnapshotException): Cannot find the snapshot of directory / with name gibberish1
>> 	at org.apache.hadoop.hdfs.server.namenode.snapshot.INodeDirectorySnapshottable.getSnapshotByName(INodeDirectorySnapshottable.java:389)
>> 	at org.apache.hadoop.hdfs.server.namenode.snapshot.INodeDirectorySnapshottable.computeDiff(INodeDirectorySnapshottable.java:363)
>> 	at org.apache.hadoop.hdfs.server.namenode.snapshot.SnapshotManager.diff(SnapshotManager.java:358)
>> 	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getSnapshotDiffReport(FSNamesystem.java:6035)
>> 	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getSnapshotDiffReport(NameNodeRpcServer.java:1153)
>> 	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getSnapshotDiffReport(ClientNamenodeProtocolServerSideTranslatorPB.java:985)
>> 	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:48095)
>> 	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:527)
>> 	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1033)
>> 	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1842)
>> 	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1838)
>> 	at java.security.AccessController.doPrivileged(Native Method)
>> 	at javax.security.auth.Subject.doAs(Subject.java:396)
>> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1489)
>> 	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1836)
>> 	at org.apache.hadoop.ipc.Client.call(Client.java:1303)
>> 	at org.apache.hadoop.ipc.Client.call(Client.java:1255)
>> 	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:204)
>> 	at com.sun.proxy.$Proxy9.getSnapshotDiffReport(Unknown Source)
>> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>> 	at java.lang.reflect.Method.invoke(Method.java:597)
>> 	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:163)
>> 	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:82)
>> 	at com.sun.proxy.$Proxy9.getSnapshotDiffReport(Unknown Source)
>> 	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getSnapshotDiffReport(ClientNamenodeProtocolTranslatorPB.java:975)
>> 	at org.apache.hadoop.hdfs.DFSClient.getSnapshotDiffReport(DFSClient.java:2158)
>> 	... 2 more
>> schu-mbp:~ schu$
>> {code}
> --
> This message is automatically generated by JIRA.
> If you think it was sent incorrectly, please contact your JIRA administrators
> For more information on JIRA, see: http://www.atlassian.com/software/jira
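
The quoted description asks the snapshot CLI commands to report a single-line error instead of dumping the server-side stack trace. Below is a minimal, hypothetical sketch of that pattern for the allowSnapshot case; it is not the code from HDFS-4846.001.patch or HDFS-4846.002.patch, and the class name, messages, and exact exception handling are assumptions. The idea is simply to unwrap the RemoteException on the client and print only its message.

{code}
// Hypothetical sketch only -- illustrates the "print one line instead of a
// stack trace" clean-up described above; not the actual HDFS-4846 patch.
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hdfs.DistributedFileSystem;
import org.apache.hadoop.ipc.RemoteException;

public class AllowSnapshotSketch {
  public static void main(String[] args) {
    if (args.length != 1) {
      System.err.println("Usage: allowSnapshot <snapshotDir>");
      System.exit(-1);
    }
    try {
      FileSystem fs = FileSystem.get(new Configuration());
      if (!(fs instanceof DistributedFileSystem)) {
        System.err.println("allowSnapshot: FileSystem is not HDFS: " + fs.getUri());
        System.exit(-1);
      }
      ((DistributedFileSystem) fs).allowSnapshot(new Path(args[0]));
      System.out.println("Allowing snapshot on " + args[0] + " succeeded");
    } catch (IOException ioe) {
      // A server-side failure typically arrives wrapped in a RemoteException;
      // unwrap it and report only the message, e.g.
      //   allowSnapshot: Directory does not exist: /user/schu/adfasdf
      IOException cause = (ioe instanceof RemoteException)
          ? ((RemoteException) ioe).unwrapRemoteException()
          : ioe;
      System.err.println("allowSnapshot: " + cause.getLocalizedMessage());
      System.exit(-1);
    }
  }
}
{code}

With handling along these lines, the first example above would print just "allowSnapshot: Directory does not exist: /user/schu/adfasdf" rather than the full server-side trace; the disallowSnapshot and snapshotDiff cases could follow the same pattern.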


