ambari-dev mailing list archives

From "Hudson (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (AMBARI-9319) HBase fails to start after adding HBase Service to a cluster that has NameNode HA already enabled
Date Sat, 24 Jan 2015 17:33:34 GMT

    [ https://issues.apache.org/jira/browse/AMBARI-9319?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14290721#comment-14290721 ]

Hudson commented on AMBARI-9319:
--------------------------------

SUCCESS: Integrated in Ambari-trunk-Commit #1595 (See [https://builds.apache.org/job/Ambari-trunk-Commit/1595/])
AMBARI-9319. HBase fails to start after adding HBase Service to a cluster that has NameNode HA already enabled (alexantonenko) (hiveww: http://git-wip-us.apache.org/repos/asf?p=ambari.git&a=commit&h=48a759ccec2090ed686e8ae775f7fbc786804b55)
* ambari-web/app/controllers/wizard/step7_controller.js
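
The changed file suggests the Add Service wizard was deriving hbase.rootdir from the individual NameNode host addresses instead of the HA nameservice URI. A minimal sketch of that kind of derivation, written as plain JavaScript with hypothetical names (this is not the actual step7_controller.js change):

{code}
// Sketch only: pick the HBase root directory based on fs.defaultFS.
// With NameNode HA enabled, fs.defaultFS is the logical nameservice URI
// (e.g. "hdfs://ha"), so the path must not name the NameNode hosts directly.
function hbaseRootDir(fsDefaultFS, nameNodeHost) {
  if (fsDefaultFS) {
    // Strip any trailing slashes before appending the HBase data path.
    return fsDefaultFS.replace(/\/+$/, '') + '/apps/hbase/data';
  }
  // Non-HA fallback: a single NameNode host with the default RPC port.
  return 'hdfs://' + nameNodeHost + ':8020/apps/hbase/data';
}

// Example: hbaseRootDir('hdfs://ha') === 'hdfs://ha/apps/hbase/data'
{code}

With fs.defaultFS set to hdfs://ha, this yields hdfs://ha/apps/hbase/data, which matches the filesystem the HDFS client resolves, rather than the comma-joined host list seen in the failure quoted below.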


> HBase fails to start after adding HBase Service to a cluster that has NameNode HA already enabled
> -------------------------------------------------------------------------------------------------
>
>                 Key: AMBARI-9319
>                 URL: https://issues.apache.org/jira/browse/AMBARI-9319
>             Project: Ambari
>          Issue Type: Bug
>          Components: ambari-web
>    Affects Versions: 2.0.0
>            Reporter: Antonenko Alexander
>            Assignee: Antonenko Alexander
>            Priority: Critical
>             Fix For: 2.0.0
>
>         Attachments: AMBARI-9319.patch, hbase-site.xml, hbase_error.txt, hbase_output.txt
>
>
> HBase fails to start when enabling HA in a 3-node cluster with Ambari 2.0.0 (build 339) and HDP 2.2.1.0-2165.
> Steps to reproduce (STR):
> 1. Install Ambari 2.0.0 with default settings.
> 2. Install HDP 2.2.1.0 on a single node with just HDFS and ZooKeeper.
> 3. Add 2 more nodes, with ZooKeeper servers on all 3 nodes.
> 4. Enable NameNode HA (nameservice ID is "ha").
> 5. Add the HBase service.
> {code}
> 2014-12-30 19:29:40,270 - Error while executing command 'start':
> Traceback (most recent call last):
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py",
line 142, in execute
>     method(env)
>   File "/var/lib/ambari-agent/cache/common-services/HBASE/0.96.0.2.0/package/scripts/hbase_master.py",
line 48, in start
>     self.configure(env) # for security
>   File "/var/lib/ambari-agent/cache/common-services/HBASE/0.96.0.2.0/package/scripts/hbase_master.py",
line 38, in configure
>     hbase(name='master')
>   File "/var/lib/ambari-agent/cache/common-services/HBASE/0.96.0.2.0/package/scripts/hbase.py",
line 150, in hbase
>     params.HdfsResource(None, action="execute")
>   File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148,
in __init__
>     self.env.run()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line
151, in run
>     self.run_action(resource, action)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line
117, in run_action
>     provider_action()
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py",
line 105, in action_execute
>     logoutput=logoutput,
>   File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148,
in __init__
>     self.env.run()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line
151, in run
>     self.run_action(resource, action)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line
117, in run_action
>     provider_action()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py",
line 265, in action_run
>     raise ex
> Fail: Execution of 'hadoop --config /etc/hadoop/conf jar /var/lib/ambari-agent/lib/fast-hdfs-resource.jar
/var/lib/ambari-agent/data/hdfs_resources.json hdfs://ha' returned 1. Creating: Resource [source=null,
target=hdfs://c6404.ambari.apache.org,c6405.ambari.apache.org:8020/apps/hbase/data, type=directory,
action=create, owner=hbase, group=null, mode=null, recursiveChown=false, recursiveChmod=false]
> Exception in thread "main" java.lang.IllegalArgumentException: Wrong FS: hdfs://c6404.ambari.apache.org,c6405.ambari.apache.org:8020/apps/hbase/data,
expected: hdfs://ha
> 	at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:645)
> 	at org.apache.hadoop.hdfs.DistributedFileSystem.getPathName(DistributedFileSystem.java:193)
> 	at org.apache.hadoop.hdfs.DistributedFileSystem.access$000(DistributedFileSystem.java:105)
> 	at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1118)
> 	at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1114)
> 	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
> 	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1114)
> 	at org.apache.hadoop.fs.FileSystem.isFile(FileSystem.java:1426)
> 	at org.apache.ambari.fast_hdfs_resource.Resource.checkResourceParameters(Resource.java:152)
> 	at org.apache.ambari.fast_hdfs_resource.Runner.main(Runner.java:72)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:606)
> 	at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
> 	at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
> {code}
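
The "Wrong FS" exception above is a URI mismatch: under NameNode HA the HDFS client is initialized from fs.defaultFS, i.e. the logical nameservice hdfs://ha, while the requested hbase.rootdir path carries the comma-joined NameNode hosts as its authority, so FileSystem.checkPath rejects it. A small JavaScript illustration of the two values (the URIs are copied from the log; the variable names are ours, not Ambari code):

{code}
// Illustration only; not Ambari code.
var nameNodeHosts = ['c6404.ambari.apache.org', 'c6405.ambari.apache.org'];

// What was generated: a comma-joined host list is not a valid single authority
// for a DistributedFileSystem whose URI is the nameservice.
var generatedRootDir = 'hdfs://' + nameNodeHosts.join(',') + ':8020/apps/hbase/data';

// What an HA cluster expects: the logical nameservice from fs.defaultFS.
var expectedRootDir = 'hdfs://ha/apps/hbase/data';

console.log(generatedRootDir); // hdfs://c6404.ambari.apache.org,c6405.ambari.apache.org:8020/apps/hbase/data
console.log(expectedRootDir);  // hdfs://ha/apps/hbase/data
{code}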



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
