ambari-commits mailing list archives

From abaranc...@apache.org
Subject [1/2] ambari git commit: AMBARI-11379 - [WinTP2] Stack Advisor reported an error during cluster deployment with AMS
Date Wed, 27 May 2015 11:55:38 GMT
Repository: ambari
Updated Branches:
  refs/heads/trunk 64ce4989e -> 9efa8f9c8


AMBARI-11379 - [WinTP2] Stack Advisor reported an error during cluster deployment with AMS


Project: http://git-wip-us.apache.org/repos/asf/ambari/repo
Commit: http://git-wip-us.apache.org/repos/asf/ambari/commit/e7d2d3fe
Tree: http://git-wip-us.apache.org/repos/asf/ambari/tree/e7d2d3fe
Diff: http://git-wip-us.apache.org/repos/asf/ambari/diff/e7d2d3fe

Branch: refs/heads/trunk
Commit: e7d2d3fee327ffba771f17df9af0ed3d7aa55b22
Parents: 64ce498
Author: Artem Baranchuk <abaranchuk@hortonworks.com>
Authored: Tue May 26 01:07:51 2015 +0300
Committer: Artem Baranchuk <abaranchuk@hortonworks.com>
Committed: Wed May 27 14:55:09 2015 +0300

----------------------------------------------------------------------
 .../main/resources/stacks/HDPWIN/2.1/services/stack_advisor.py   | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/ambari/blob/e7d2d3fe/ambari-server/src/main/resources/stacks/HDPWIN/2.1/services/stack_advisor.py
----------------------------------------------------------------------
diff --git a/ambari-server/src/main/resources/stacks/HDPWIN/2.1/services/stack_advisor.py
b/ambari-server/src/main/resources/stacks/HDPWIN/2.1/services/stack_advisor.py
index 308ff96..0affe64 100644
--- a/ambari-server/src/main/resources/stacks/HDPWIN/2.1/services/stack_advisor.py
+++ b/ambari-server/src/main/resources/stacks/HDPWIN/2.1/services/stack_advisor.py
@@ -636,7 +636,7 @@ class HDPWIN21StackAdvisor(DefaultStackAdvisor):
     if dir.startswith("hdfs://"):
       return None #TODO following code fails for hdfs://, is this valid check for hdfs?
 
-    dir = re.sub("^file://", "", dir, count=1)
+    dir = re.sub("^file:/*", "", dir, count=1)
     mountPoints = {}
     for mountPoint in hostInfo["disk_info"]:
       mountPoints[mountPoint["mountpoint"]] = to_number(mountPoint["available"])
@@ -782,7 +782,7 @@ def getMountPointForDir(dir, mountPoints):
     # "/", "/hadoop/hdfs", and "/hadoop/hdfs/data".
     # So take the one with the greatest number of segments.
     for mountPoint in mountPoints:
-      if dir.startswith(mountPoint):
+      if dir.startswith(os.path.splitdrive(mountPoint)[0].lower()):
         if bestMountFound is None:
           bestMountFound = mountPoint
         elif bestMountFound.count(os.path.sep) < mountPoint.count(os.path.sep):

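The two hunks above address Windows path handling: the old `^file://` pattern left a stray leading slash on URIs like `file:///c:/hadoop/hdfs/data`, and the mount-point comparison failed because `disk_info` reports mounts as drive letters such as `C:\` while the directory arrives with a lower-case drive and forward slashes. A minimal sketch condensing both fixes (function name and structure are illustrative, not the full `stack_advisor.py` code; `ntpath` is used so the Windows behavior is reproducible on any platform):

```python
import ntpath
import re

def get_mount_point_for_dir(dir, mount_points):
    # Fix 1: "^file:/*" strips the scheme plus any run of slashes, so
    # "file:///c:/hadoop/hdfs/data" becomes "c:/hadoop/hdfs/data";
    # the old "^file://" would have left "/c:/hadoop/hdfs/data".
    dir = re.sub("^file:/*", "", dir, count=1).lower()
    best_mount_found = None
    for mount_point in mount_points:
        # Fix 2: compare against the drive component only, lower-cased,
        # so "c:/hadoop/hdfs/data" matches the mount "C:\".
        if dir.startswith(ntpath.splitdrive(mount_point)[0].lower()):
            if best_mount_found is None or \
               best_mount_found.count(ntpath.sep) < mount_point.count(ntpath.sep):
                best_mount_found = mount_point
    return best_mount_found

print(get_mount_point_for_dir("file:///c:/hadoop/hdfs/data", ["C:\\", "D:\\"]))
# -> C:\
```

With the pre-patch code, neither the `file:///` URI nor the case difference would have matched any mount point, which is consistent with the reported Stack Advisor error during AMS deployment on Windows.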
