hadoop-mapreduce-commits mailing list archives

From tomwh...@apache.org
Subject svn commit: r903508 - in /hadoop/mapreduce/trunk: CHANGES.txt src/contrib/sqoop/src/java/org/apache/hadoop/sqoop/hive/TableDefWriter.java
Date Wed, 27 Jan 2010 00:05:22 GMT
Author: tomwhite
Date: Wed Jan 27 00:05:21 2010
New Revision: 903508

URL: http://svn.apache.org/viewvc?rev=903508&view=rev
Log:
MAPREDUCE-1394. Sqoop generates incorrect URIs in paths sent to Hive. Contributed by Aaron Kimball.

Modified:
    hadoop/mapreduce/trunk/CHANGES.txt
    hadoop/mapreduce/trunk/src/contrib/sqoop/src/java/org/apache/hadoop/sqoop/hive/TableDefWriter.java

Modified: hadoop/mapreduce/trunk/CHANGES.txt
URL: http://svn.apache.org/viewvc/hadoop/mapreduce/trunk/CHANGES.txt?rev=903508&r1=903507&r2=903508&view=diff
==============================================================================
--- hadoop/mapreduce/trunk/CHANGES.txt (original)
+++ hadoop/mapreduce/trunk/CHANGES.txt Wed Jan 27 00:05:21 2010
@@ -256,6 +256,9 @@
     MAPREDUCE-1327. Fix Sqoop handling of Oracle timezone with timestamp data
     types in import. (Leonid Furman via cdouglas)
 
+    MAPREDUCE-1394. Sqoop generates incorrect URIs in paths sent to Hive.
+    (Aaron Kimball via tomwhite)
+
 Release 0.21.0 - Unreleased
 
   INCOMPATIBLE CHANGES

Modified: hadoop/mapreduce/trunk/src/contrib/sqoop/src/java/org/apache/hadoop/sqoop/hive/TableDefWriter.java
URL: http://svn.apache.org/viewvc/hadoop/mapreduce/trunk/src/contrib/sqoop/src/java/org/apache/hadoop/sqoop/hive/TableDefWriter.java?rev=903508&r1=903507&r2=903508&view=diff
==============================================================================
--- hadoop/mapreduce/trunk/src/contrib/sqoop/src/java/org/apache/hadoop/sqoop/hive/TableDefWriter.java (original)
+++ hadoop/mapreduce/trunk/src/contrib/sqoop/src/java/org/apache/hadoop/sqoop/hive/TableDefWriter.java Wed Jan 27 00:05:21 2010
@@ -142,23 +142,6 @@
     FileSystem fs = FileSystem.get(configuration);
     Path finalPath = new Path(tablePath).makeQualified(fs);
     String finalPathStr = finalPath.toString();
-    if (finalPathStr.startsWith("hdfs://") && finalPathStr.indexOf(":", 7) == -1) {
-      // Hadoop removed the port number from the fully-qualified URL.
-      // We need to reinsert this or else Hive will complain.
-      // Do this right before the third instance of the '/' character.
-      int insertPoint = 0;
-      for (int i = 0; i < 3; i++) {
-        insertPoint = finalPathStr.indexOf("/", insertPoint + 1);
-      }
-
-      if (insertPoint == -1) {
-        LOG.warn("Fully-qualified HDFS path does not contain a port.");
-        LOG.warn("this may cause a Hive error.");
-      } else {
-        finalPathStr = finalPathStr.substring(0, insertPoint) + ":" + DEFAULT_HDFS_PORT
-            + finalPathStr.substring(insertPoint, finalPathStr.length());
-      }
-    }
 
     StringBuilder sb = new StringBuilder();
     sb.append("LOAD DATA INPATH '");
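For context, the block this commit deletes tried to splice a default port back into a fully-qualified `hdfs://` URI by locating the third `/` (the end of the authority component) and inserting `:` plus the port there. The sketch below reproduces that removed logic as a standalone method so its behavior is easy to see; the class name, the helper name `reinsertPort`, and the port value `8020` are illustrative assumptions, not the actual Sqoop/TableDefWriter API.

```java
/**
 * Standalone sketch of the port-reinsertion logic removed by r903508.
 * Names and the default port are hypothetical; the real constant lived
 * elsewhere in TableDefWriter.
 */
public class PortReinsertSketch {

    // Assumed default NameNode port for illustration.
    static final int DEFAULT_HDFS_PORT = 8020;

    static String reinsertPort(String finalPathStr) {
        // Only act on hdfs:// URIs whose authority carries no explicit port
        // (no ':' after the "hdfs://" prefix, which ends at index 7).
        if (finalPathStr.startsWith("hdfs://") && finalPathStr.indexOf(":", 7) == -1) {
            // Walk to the third '/', i.e. the slash that terminates the
            // authority ("hdfs:" + "//" + host), where the port belongs.
            int insertPoint = 0;
            for (int i = 0; i < 3; i++) {
                insertPoint = finalPathStr.indexOf("/", insertPoint + 1);
            }
            if (insertPoint != -1) {
                finalPathStr = finalPathStr.substring(0, insertPoint)
                        + ":" + DEFAULT_HDFS_PORT
                        + finalPathStr.substring(insertPoint);
            }
        }
        return finalPathStr;
    }

    public static void main(String[] args) {
        // "hdfs://namenode/user/foo" -> "hdfs://namenode:8020/user/foo"
        System.out.println(reinsertPort("hdfs://namenode/user/foo"));
    }
}
```

Deleting this block leaves `finalPathStr` exactly as returned by `Path.makeQualified(fs)`, avoiding the case where splicing in a guessed port produced a URI that did not match the filesystem's actual authority.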


