hadoop-mapreduce-user mailing list archives

From "Pandeng Li" <398168...@qq.com>
Subject An internal error occurs during "eclipse connecting to DFS"
Date Mon, 07 Dec 2015 08:10:09 GMT
Hello everyone:
    I asked this question before, but I think I used the wrong title, so I am sending it
again.
    I want to develop my application in Eclipse using the "Run on Hadoop" method, but when
Eclipse connects to the Hadoop namenode, an internal error occurs. I have searched the Internet and
read many similar cases, but none of them solved my problem.
     First, I built a plugin that works with both Hadoop 2.7.1 and Eclipse Mars, finally producing hadoop-eclipse-plugin-2.7.1.jar. Here
are the steps:
    1: Download hadoop2x-eclipse-hadoop-master from GitHub and extract it to a suitable
location.
    2: Open {$hadoop2x-eclipse-hadoop-master_HOME}/ivy/libraries.properties and compare the
versions of the artifacts used by Hadoop and its components with the jars under
{$HADOOP_HOME}/share/hadoop/tools/lib. If an artifact's version does not match, change
libraries.properties to match Hadoop's jar.
    Here are some of my changes:
    htrace.version=3.0.4   ==>    htrace.version=3.1.0-incubating
    slf4j-api.version=1.7.5  ==>   slf4j-api.version=1.7.10
    slf4j-log4j12.version=1.7.5  ==> slf4j-log4j12.version=1.7.10
    3: Run the shell command:
    >>ant package -Dversion=2.7.1 -Dhadoop.version=2.7.1 -Declipse.home=/opt/eclipse -Dhadoop.home=/opt/hadoop
    Finally, I got hadoop-eclipse-plugin-2.7.1.jar in {$hadoop2x-eclipse-hadoop-master_HOME}/build/contrib/eclipse-plugin and
added it to Eclipse.
    Unfortunately, when I add a new Hadoop location and set the host and port, I get this
error:
    >>An internal error occurred during: "Connecting to DFS hadoop2.7.1".
    >> Could not initialize class org.apache.hadoop.hdfs.DFSConfigKeys.
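    For reference, here is a minimal class-loading check I can think of (only a sketch: it assumes hadoop-hdfs-2.7.1.jar, hadoop-common-2.7.1.jar, and their dependencies are on the classpath, and the class name DfsConfigKeysCheck is just an illustration). "Could not initialize class" usually means the class's static initializer already failed on an earlier attempt, and loading the class in a plain JVM outside Eclipse shows the original exception:

        public class DfsConfigKeysCheck {
            public static void main(String[] args) throws Exception {
                // Load the class named in the Eclipse error. If its static
                // initializer throws (e.g. because a dependency jar is
                // missing), the JVM reports the ExceptionInInitializerError
                // here instead of the later "Could not initialize class".
                Class<?> c = Class.forName("org.apache.hadoop.hdfs.DFSConfigKeys");
                System.out.println("Loaded " + c.getName() + " successfully");
            }
        }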
    Here is some of my configuration:
    hdfs-site.xml:
        <property>
            <name>dfs.datanode.ipc.address</name>
            <value>master:50020</value>
        </property>
    core-site.xml:
        <property>
            <name>fs.defaultFS</name>
            <value>hdfs://master:9000</value>
        </property>
    mapred-site.xml:
        <property>
            <name>mapreduce.framework.name</name>
            <value>yarn</value>
        </property>
        <property>
            <name>mapreduce.jobtracker.address</name>
            <value>master:9002</value>
        </property>
    yarn-site.xml:
        <property>
            <name>yarn.resourcemanager.address</name>
            <value>master:8032</value>
        </property>

    I have tried ports 9002, 8032, and 50020 for the Map/Reduce(V2) Master, and port 9000
for the DFS Master.
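    To check whether the cluster itself accepts connections, a minimal HDFS client test outside Eclipse could look like the sketch below (assuming the Hadoop 2.7.1 client jars are on the classpath and fs.defaultFS is hdfs://master:9000 as in core-site.xml above; the class name DfsConnectTest is only illustrative):

        import java.net.URI;

        import org.apache.hadoop.conf.Configuration;
        import org.apache.hadoop.fs.FileStatus;
        import org.apache.hadoop.fs.FileSystem;
        import org.apache.hadoop.fs.Path;

        public class DfsConnectTest {
            public static void main(String[] args) throws Exception {
                Configuration conf = new Configuration();
                // Same address as fs.defaultFS in core-site.xml.
                FileSystem fs = FileSystem.get(URI.create("hdfs://master:9000"), conf);
                // Listing the root directory proves the RPC connection works.
                for (FileStatus status : fs.listStatus(new Path("/"))) {
                    System.out.println(status.getPath());
                }
                fs.close();
            }
        }

    If a listing like this succeeds but the plugin still fails, that would point at the plugin jar rather than the cluster configuration.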
    Does anyone know how to fix this? It has troubled me for a long time, and I cannot move on to the next
step until it is solved. I am looking forward to your replies.
    Sincerely
    Pandeng