hadoop-mapreduce-user mailing list archives

From 권병창 <magnu...@navercorp.com>
Subject Re: Installing just the HDFS client
Date Tue, 30 Aug 2016 04:40:16 GMT
Put the default core-site.xml, hdfs-site.xml, and log4j.properties (not your customized *.xml) into the tree below,
then rebuild the jar with 'mvn clean package'.
The resources (*.xml, *.properties) will then be packaged inside the jar.
 .
├── pom.xml 
└── src
    └── main
        └── resources
            ├── core-site.xml
            ├── hdfs-site.xml
            └── log4j.properties
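As a sketch, the tree above can be laid out with a few shell commands; the three resource files are created empty here and must be filled with your cluster's default configs before building:

```shell
# Create the Maven project skeleton shown above.
# The touched files are placeholders; copy your cluster's default
# core-site.xml, hdfs-site.xml, and log4j.properties over them.
mkdir -p src/main/resources
touch pom.xml
touch src/main/resources/core-site.xml
touch src/main/resources/hdfs-site.xml
touch src/main/resources/log4j.properties
```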
 
Then you can use the jar:
 
java -jar ${build_jar} -conf /opt/hadoop/etc/hdfs-site.xml -ls hdfs://${nameservices}/user/home
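As a minimal sketch, a core-site.xml bundled this way only needs the default filesystem entry; the URI below is an example value, not one from this thread:

```xml
<?xml version="1.0"?>
<configuration>
  <!-- Example value: replace with your cluster's namenode or nameservice URI -->
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://mycluster</value>
  </property>
</configuration>
```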


 
-----Original Message-----
From: "F21" <f21.groups@gmail.com>
To: "권병창" <magnum.c@navercorp.com>
Cc:
Sent: 2016-08-30 (Tue) 12:47:16
Subject: Re: Installing just the HDFS client
 

  
    
  
  
I am still getting the same error despite setting the variable. In my case, hdfs-site.xml and core-site.xml are customized at run time, so they cannot be compiled into the jar.

      

My core-site.xml and hdfs-site.xml are located in /opt/hbase/conf.

Here's what I did:

bash-4.3# export HADOOP_CONF_DIR="/opt/hbase/conf"
bash-4.3# echo $HADOOP_CONF_DIR
/opt/hbase/conf
bash-4.3# cd /opt/hadoop/
bash-4.3# java -jar hdfs-fs.jar
log4j:WARN No appenders could be found for logger (org.apache.hadoop.util.Shell).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in thread "main" java.lang.RuntimeException: core-site.xml not found
        at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2566)
        at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2492)
        at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2405)
        at org.apache.hadoop.conf.Configuration.set(Configuration.java:1143)
        at org.apache.hadoop.conf.Configuration.set(Configuration.java:1115)
        at org.apache.hadoop.conf.Configuration.setBoolean(Configuration.java:1451)
        at org.apache.hadoop.util.GenericOptionsParser.processGeneralOptions(GenericOptionsParser.java:321)
        at org.apache.hadoop.util.GenericOptionsParser.parseGeneralOptions(GenericOptionsParser.java:487)
        at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:170)
        at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:153)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:64)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
        at org.apache.hadoop.fs.FsShell.main(FsShell.java:340)

      

      On 30/08/2016 11:20 AM, 권병창 wrote:

    
    
      
Set the environment variable below:

export HADOOP_CONF_DIR=/opt/hadoop/etc

or

hdfs-site.xml and core-site.xml can be placed inside the jar.
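If the configs must stay outside the jar, one alternative invocation may help (a sketch, assuming the shaded hdfs-fs.jar from this thread): `java -jar` reads only the jar's manifest classpath and ignores the CLASSPATH environment variable, and Hadoop's Configuration loads core-site.xml as a classpath resource, so the conf directory can be put on the classpath explicitly with the main class named by hand:

```shell
# Sketch: place the external conf dir on the classpath ahead of the jar,
# since Configuration looks up core-site.xml as a classpath resource.
java -cp "/opt/hbase/conf:hdfs-fs.jar" org.apache.hadoop.fs.FsShell -ls /
```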
        

          
            

-----Original Message-----
From: F21 <f21.groups@gmail.com>
To: 권병창 <magnum.c@navercorp.com>
Cc:
Date: 2016-08-30 9:12:58 AM
Subject: Re: Installing just the HDFS client
            

            

          

          
            
Hi,

Thanks for the pom.xml. I was able to build it successfully. How do I point it to the config files? My core-site.xml and hdfs-site.xml are located in /opt/hadoop/etc.

              

I tried the following:

java -jar hdfs-fs.jar -ls /
java -jar hdfs-fs.jar --config /opt/hbase/etc -ls /
java -jar hdfs-fs.jar -conf /opt/hbase/etc -ls /

              

This is the error I am getting:

log4j:WARN No appenders could be found for logger (org.apache.hadoop.util.Shell).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in thread "main" java.lang.RuntimeException: core-site.xml not found
        at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2566)
        at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2492)
        at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2405)
        at org.apache.hadoop.conf.Configuration.set(Configuration.java:1143)
        at org.apache.hadoop.conf.Configuration.set(Configuration.java:1115)
        at org.apache.hadoop.conf.Configuration.setBoolean(Configuration.java:1451)
        at org.apache.hadoop.util.GenericOptionsParser.processGeneralOptions(GenericOptionsParser.java:321)
        at org.apache.hadoop.util.GenericOptionsParser.parseGeneralOptions(GenericOptionsParser.java:487)
        at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:170)
        at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:153)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:64)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
        at org.apache.hadoop.fs.FsShell.main(FsShell.java:340)

              

How do I point the jar to use the hdfs-site.xml and core-site.xml located in /opt/hadoop/etc?

              

              Cheers,

              Francis

              

              On 29/08/2016 5:18 PM, 권병창 wrote:

            
            
              
Hi,

Refer to the pom.xml below. You should change the Hadoop version to the one you want.

The build is simple: mvn clean package
It will produce a jar of about 34 MB.

Usage is simple:

java -jar ${build_jar}.jar -mkdir /user/home
java -jar ${build_jar}.jar -ls /user/home

pom.xml:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">

  <modelVersion>4.0.0</modelVersion>

  <groupId>com.naver.c3</groupId>
  <artifactId>hdfs-connector</artifactId>
  <version>1.0-SNAPSHOT</version>

  <dependencies>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-common</artifactId>
      <version>2.7.1</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-hdfs</artifactId>
      <version>2.7.1</version>
    </dependency>
  </dependencies>

  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-shade-plugin</artifactId>
        <version>2.3</version>
        <executions>
          <execution>
            <phase>package</phase>
            <goals>
              <goal>shade</goal>
            </goals>
            <configuration>
              <minimizeJar>false</minimizeJar>
              <transformers>
                <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                  <mainClass>org.apache.hadoop.fs.FsShell</mainClass>
                </transformer>
                <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
              </transformers>
            </configuration>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>

</project>
                 

                 

-----Original Message-----
From: "F21" <f21.groups@gmail.com>
To: <user@hadoop.apache.org>
Cc:
Sent: 2016-08-29 (Mon) 14:25:09
Subject: Installing just the HDFS client

Hi all,

I am currently building an HBase docker image. As part of the bootstrap process, I need to run some `hdfs dfs` commands to create directories on HDFS.

The whole hadoop distribution is pretty heavy and contains things to run namenodes, etc. I just need a copy of the dfs client for my docker image. I have done some poking around and see that I need to include the files in bin/, libexec/, lib/, share/hadoop/common and share/hadoop/hdfs.

However, including the above still takes up quite a bit of space. Is there a single JAR I can add to my image to perform operations against HDFS?

Cheers,
Francis

                

                

---------------------------------------------------------------------

                To unsubscribe, e-mail: user-unsubscribe@hadoop.apache.org

                For additional commands, e-mail: user-help@hadoop.apache.org
