ambari-dev mailing list archives

From "Andrew Onischuk (JIRA)" <j...@apache.org>
Subject [jira] [Resolved] (AMBARI-7910) Lzo package missing
Date Wed, 22 Oct 2014 20:27:34 GMT

     [ https://issues.apache.org/jira/browse/AMBARI-7910?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Andrew Onischuk resolved AMBARI-7910.
-------------------------------------
    Resolution: Fixed

Committed to trunk and branch-1.7.0

> Lzo package missing 
> --------------------
>
>                 Key: AMBARI-7910
>                 URL: https://issues.apache.org/jira/browse/AMBARI-7910
>             Project: Ambari
>          Issue Type: Bug
>            Reporter: Andrew Onischuk
>            Assignee: Andrew Onischuk
>             Fix For: 1.7.0
>
>
> Compression jobs are failing due to the missing LZO package.
> **console.log**
>     
>     
>     
>     2014-10-18 21:21:34,621|main|INFO|19267|139929561851648|MainThread|RUNNING TEST "test_Compression[com.hadoop.compression.lzo.LzoCodec-org.apache.hadoop.io.compress.DefaultCodec-NONE-TextFormat]" at location "tests/mapred/mapred_1/Compression/test_Compression_20.py" at line number "72"
>     2014-10-18 21:21:34,622|beaver.machine|INFO|19267|139929561851648|MainThread|RUNNING: /usr/hdp/current/hadoop-client/bin/hadoop jar /usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-examples.jar sort -Dmapreduce.map.output.compress=true -Dmapreduce.map.output.compress.codec=org.apache.hadoop.io.compress.DefaultCodec -Dmapreduce.output.fileoutputformat.compress=true -Dmapreduce.output.fileoutputformat.compression.type=NONE -Dmapreduce.output.fileoutputformat.compress.codec=com.hadoop.compression.lzo.LzoCodec -outKey org.apache.hadoop.io.Text -outValue org.apache.hadoop.io.Text  Compression/textinput Compression/textoutput-1413667294.62
>     2014-10-18 21:21:36,591|beaver.machine|INFO|19267|139929561851648|MainThread|14/10/18 21:21:36 INFO client.RMProxy: Connecting to ResourceManager at ambari-rerun-su-1.cs1cloud.internal/172.18.146.170:8050
>     2014-10-18 21:21:37,982|beaver.machine|INFO|19267|139929561851648|MainThread|Running on 1 nodes to sort from hdfs://ambari-rerun-su-1.cs1cloud.internal:8020/user/hrt_qa/Compression/textinput into hdfs://ambari-rerun-su-1.cs1cloud.internal:8020/user/hrt_qa/Compression/textoutput-1413667294.62 with 1 reduces.
>     2014-10-18 21:21:37,988|beaver.machine|INFO|19267|139929561851648|MainThread|Job started: Sat Oct 18 21:21:37 UTC 2014
>     2014-10-18 21:21:38,026|beaver.machine|INFO|19267|139929561851648|MainThread|14/10/18 21:21:38 INFO client.RMProxy: Connecting to ResourceManager at ambari-rerun-su-1.cs1cloud.internal/172.18.146.170:8050
>     2014-10-18 21:21:38,113|beaver.machine|INFO|19267|139929561851648|MainThread|14/10/18 21:21:38 INFO hdfs.DFSClient: Created HDFS_DELEGATION_TOKEN token 82 for hrt_qa on 172.18.146.170:8020
>     2014-10-18 21:21:38,138|beaver.machine|INFO|19267|139929561851648|MainThread|14/10/18 21:21:38 INFO security.TokenCache: Got dt for hdfs://ambari-rerun-su-1.cs1cloud.internal:8020; Kind: HDFS_DELEGATION_TOKEN, Service: 172.18.146.170:8020, Ident: (HDFS_DELEGATION_TOKEN token 82 for hrt_qa)
>     2014-10-18 21:21:38,695|beaver.machine|INFO|19267|139929561851648|MainThread|14/10/18 21:21:38 INFO input.FileInputFormat: Total input paths to process : 1
>     2014-10-18 21:21:38,908|beaver.machine|INFO|19267|139929561851648|MainThread|14/10/18 21:21:38 INFO mapreduce.JobSubmitter: number of splits:1
>     2014-10-18 21:21:39,273|beaver.machine|INFO|19267|139929561851648|MainThread|14/10/18 21:21:39 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1413586769062_0027
>     2014-10-18 21:21:39,275|beaver.machine|INFO|19267|139929561851648|MainThread|14/10/18 21:21:39 INFO mapreduce.JobSubmitter: Kind: HDFS_DELEGATION_TOKEN, Service: 172.18.146.170:8020, Ident: (HDFS_DELEGATION_TOKEN token 82 for hrt_qa)
>     2014-10-18 21:21:39,794|beaver.machine|INFO|19267|139929561851648|MainThread|14/10/18 21:21:39 INFO impl.YarnClientImpl: Submitted application application_1413586769062_0027
>     2014-10-18 21:21:39,865|beaver.machine|INFO|19267|139929561851648|MainThread|14/10/18 21:21:39 INFO mapreduce.Job: The url to track the job: http://ambari-rerun-su-1.cs1cloud.internal:8088/proxy/application_1413586769062_0027/
>     2014-10-18 21:21:39,866|beaver.machine|INFO|19267|139929561851648|MainThread|14/10/18 21:21:39 INFO mapreduce.Job: Running job: job_1413586769062_0027
>     2014-10-18 21:21:51,064|beaver.machine|INFO|19267|139929561851648|MainThread|14/10/18 21:21:51 INFO mapreduce.Job: Job job_1413586769062_0027 running in uber mode : false
>     2014-10-18 21:21:51,067|beaver.machine|INFO|19267|139929561851648|MainThread|14/10/18 21:21:51 INFO mapreduce.Job:  map 0% reduce 0%
>     2014-10-18 21:21:57,153|beaver.machine|INFO|19267|139929561851648|MainThread|14/10/18 21:21:57 INFO mapreduce.Job:  map 100% reduce 0%
>     2014-10-18 21:22:03,201|beaver.machine|INFO|19267|139929561851648|MainThread|14/10/18 21:22:03 INFO mapreduce.Job: Task Id : attempt_1413586769062_0027_r_000000_0, Status : FAILED
>     2014-10-18 21:22:03,221|beaver.machine|INFO|19267|139929561851648|MainThread|Error: java.lang.IllegalArgumentException: Compression codec com.hadoop.compression.lzo.LzoCodec was not found.
>     2014-10-18 21:22:03,222|beaver.machine|INFO|19267|139929561851648|MainThread|at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.getOutputCompressorClass(FileOutputFormat.java:122)
>     2014-10-18 21:22:03,222|beaver.machine|INFO|19267|139929561851648|MainThread|at org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat.getSequenceWriter(SequenceFileOutputFormat.java:56)
>     2014-10-18 21:22:03,222|beaver.machine|INFO|19267|139929561851648|MainThread|at org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat.getRecordWriter(SequenceFileOutputFormat.java:75)
>     2014-10-18 21:22:03,222|beaver.machine|INFO|19267|139929561851648|MainThread|at org.apache.hadoop.mapred.ReduceTask$NewTrackingRecordWriter.<init>(ReduceTask.java:540)
>     2014-10-18 21:22:03,222|beaver.machine|INFO|19267|139929561851648|MainThread|at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:614)
>     2014-10-18 21:22:03,222|beaver.machine|INFO|19267|139929561851648|MainThread|at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:389)
>     2014-10-18 21:22:03,223|beaver.machine|INFO|19267|139929561851648|MainThread|at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
>     2014-10-18 21:22:03,223|beaver.machine|INFO|19267|139929561851648|MainThread|at java.security.AccessController.doPrivileged(Native Method)
>     2014-10-18 21:22:03,223|beaver.machine|INFO|19267|139929561851648|MainThread|at javax.security.auth.Subject.doAs(Subject.java:415)
>     2014-10-18 21:22:03,223|beaver.machine|INFO|19267|139929561851648|MainThread|at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
>     2014-10-18 21:22:03,223|beaver.machine|INFO|19267|139929561851648|MainThread|at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
>     2014-10-18 21:22:03,223|beaver.machine|INFO|19267|139929561851648|MainThread|Caused by: java.lang.ClassNotFoundException: Class com.hadoop.compression.lzo.LzoCodec not found
>     2014-10-18 21:22:03,224|beaver.machine|INFO|19267|139929561851648|MainThread|at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1954)
>     2014-10-18 21:22:03,224|beaver.machine|INFO|19267|139929561851648|MainThread|at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.getOutputCompressorClass(FileOutputFormat.java:119)
>     2014-10-18 21:22:03,224|beaver.machine|INFO|19267|139929561851648|MainThread|... 10 more
>     2014-10-18 21:22:03,224|beaver.machine|INFO|19267|139929561851648|MainThread|
>     2014-10-18 21:22:09,269|beaver.machine|INFO|19267|139929561851648|MainThread|14/10/18 21:22:09 INFO mapreduce.Job: Task Id : attempt_1413586769062_0027_r_000000_1, Status : FAILED
>     2014-10-18 21:22:09,273|beaver.machine|INFO|19267|139929561851648|MainThread|Error: java.lang.IllegalArgumentException: Compression codec com.hadoop.compression.lzo.LzoCodec was not found.
>     2014-10-18 21:22:09,273|beaver.machine|INFO|19267|139929561851648|MainThread|at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.getOutputCompressorClass(FileOutputFormat.java:122)
>     2014-10-18 21:22:09,274|beaver.machine|INFO|19267|139929561851648|MainThread|at org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat.getSequenceWriter(SequenceFileOutputFormat.java:56)
>     2014-10-18 21:22:09,274|beaver.machine|INFO|19267|139929561851648|MainThread|at org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat.getRecordWriter(SequenceFileOutputFormat.java:75)
>     2014-10-18 21:22:09,274|beaver.machine|INFO|19267|139929561851648|MainThread|at org.apache.hadoop.mapred.ReduceTask$NewTrackingRecordWriter.<init>(ReduceTask.java:540)
>     2014-10-18 21:22:09,275|beaver.machine|INFO|19267|139929561851648|MainThread|at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:614)
>     2014-10-18 21:22:09,275|beaver.machine|INFO|19267|139929561851648|MainThread|at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:389)
>     2014-10-18 21:22:09,276|beaver.machine|INFO|19267|139929561851648|MainThread|at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
>     2014-10-18 21:22:09,276|beaver.machine|INFO|19267|139929561851648|MainThread|at java.security.AccessController.doPrivileged(Native Method)
>     2014-10-18 21:22:09,277|beaver.machine|INFO|19267|139929561851648|MainThread|at javax.security.auth.Subject.doAs(Subject.java:415)
>     2014-10-18 21:22:09,277|beaver.machine|INFO|19267|139929561851648|MainThread|at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
>     2014-10-18 21:22:09,278|beaver.machine|INFO|19267|139929561851648|MainThread|at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
>     2014-10-18 21:22:09,278|beaver.machine|INFO|19267|139929561851648|MainThread|Caused by: java.lang.ClassNotFoundException: Class com.hadoop.compression.lzo.LzoCodec not found
>     2014-10-18 21:22:09,279|beaver.machine|INFO|19267|139929561851648|MainThread|at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1954)
>     2014-10-18 21:22:09,279|beaver.machine|INFO|19267|139929561851648|MainThread|at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.getOutputCompressorClass(FileOutputFormat.java:119)
>     2014-10-18 21:22:09,280|beaver.machine|INFO|19267|139929561851648|MainThread|... 10 more
>     2014-10-18 21:22:09,280|beaver.machine|INFO|19267|139929561851648|MainThread|
>     2014-10-18 21:22:15,329|beaver.machine|INFO|19267|139929561851648|MainThread|14/10/18 21:22:15 INFO mapreduce.Job: Task Id : attempt_1413586769062_0027_r_000000_2, Status : FAILED
>     2014-10-18 21:22:15,333|beaver.machine|INFO|19267|139929561851648|MainThread|Error: java.lang.IllegalArgumentException: Compression codec com.hadoop.compression.lzo.LzoCodec was not found.
>     2014-10-18 21:22:15,333|beaver.machine|INFO|19267|139929561851648|MainThread|at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.getOutputCompressorClass(FileOutputFormat.java:122)
>     2014-10-18 21:22:15,334|beaver.machine|INFO|19267|139929561851648|MainThread|at org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat.getSequenceWriter(SequenceFileOutputFormat.java:56)
>     2014-10-18 21:22:15,334|beaver.machine|INFO|19267|139929561851648|MainThread|at org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat.getRecordWriter(SequenceFileOutputFormat.java:75)
>     2014-10-18 21:22:15,335|beaver.machine|INFO|19267|139929561851648|MainThread|at org.apache.hadoop.mapred.ReduceTask$NewTrackingRecordWriter.<init>(ReduceTask.java:540)
>     2014-10-18 21:22:15,336|beaver.machine|INFO|19267|139929561851648|MainThread|at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:614)
>     2014-10-18 21:22:15,336|beaver.machine|INFO|19267|139929561851648|MainThread|at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:389)
>     2014-10-18 21:22:15,337|beaver.machine|INFO|19267|139929561851648|MainThread|at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
>     2014-10-18 21:22:15,337|beaver.machine|INFO|19267|139929561851648|MainThread|at java.security.AccessController.doPrivileged(Native Method)
>     2014-10-18 21:22:15,338|beaver.machine|INFO|19267|139929561851648|MainThread|at javax.security.auth.Subject.doAs(Subject.java:415)
>     2014-10-18 21:22:15,339|beaver.machine|INFO|19267|139929561851648|MainThread|at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
>     2014-10-18 21:22:15,339|beaver.machine|INFO|19267|139929561851648|MainThread|at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
>     2014-10-18 21:22:15,340|beaver.machine|INFO|19267|139929561851648|MainThread|Caused by: java.lang.ClassNotFoundException: Class com.hadoop.compression.lzo.LzoCodec not found
>     2014-10-18 21:22:15,341|beaver.machine|INFO|19267|139929561851648|MainThread|at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1954)
>     2014-10-18 21:22:15,342|beaver.machine|INFO|19267|139929561851648|MainThread|at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.getOutputCompressorClass(FileOutputFormat.java:119)
>     2014-10-18 21:22:15,343|beaver.machine|INFO|19267|139929561851648|MainThread|... 10 more
>     2014-10-18 21:22:15,343|beaver.machine|INFO|19267|139929561851648|MainThread|
>     2014-10-18 21:22:23,398|beaver.machine|INFO|19267|139929561851648|MainThread|14/10/18 21:22:23 INFO mapreduce.Job:  map 100% reduce 100%
>     2014-10-18 21:22:23,418|beaver.machine|INFO|19267|139929561851648|MainThread|14/10/18 21:22:23 INFO mapreduce.Job: Job job_1413586769062_0027 failed with state FAILED due to: Task failed task_1413586769062_0027_r_000000
>     2014-10-18 21:22:23,419|beaver.machine|INFO|19267|139929561851648|MainThread|Job failed as tasks failed. failedMaps:0 failedReduces:1
>     2014-10-18 21:22:23,419|beaver.machine|INFO|19267|139929561851648|MainThread|
>     2014-10-18 21:22:23,592|beaver.machine|INFO|19267|139929561851648|MainThread|14/10/18 21:22:23 INFO mapreduce.Job: Counters: 37
>     2014-10-18 21:22:23,592|beaver.machine|INFO|19267|139929561851648|MainThread|File System Counters
>     2014-10-18 21:22:23,593|beaver.machine|INFO|19267|139929561851648|MainThread|FILE: Number of bytes read=0
>     2014-10-18 21:22:23,593|beaver.machine|INFO|19267|139929561851648|MainThread|FILE: Number of bytes written=115511
>     2014-10-18 21:22:23,593|beaver.machine|INFO|19267|139929561851648|MainThread|FILE: Number of read operations=0
>     2014-10-18 21:22:23,593|beaver.machine|INFO|19267|139929561851648|MainThread|FILE: Number of large read operations=0
>     2014-10-18 21:22:23,593|beaver.machine|INFO|19267|139929561851648|MainThread|FILE: Number of write operations=0
>     2014-10-18 21:22:23,593|beaver.machine|INFO|19267|139929561851648|MainThread|HDFS: Number of bytes read=1767
>     2014-10-18 21:22:23,594|beaver.machine|INFO|19267|139929561851648|MainThread|HDFS: Number of bytes written=0
>     2014-10-18 21:22:23,594|beaver.machine|INFO|19267|139929561851648|MainThread|HDFS: Number of read operations=4
>     2014-10-18 21:22:23,594|beaver.machine|INFO|19267|139929561851648|MainThread|HDFS: Number of large read operations=0
>     2014-10-18 21:22:23,594|beaver.machine|INFO|19267|139929561851648|MainThread|HDFS: Number of write operations=0
>     2014-10-18 21:22:23,594|beaver.machine|INFO|19267|139929561851648|MainThread|Job Counters
>     2014-10-18 21:22:23,594|beaver.machine|INFO|19267|139929561851648|MainThread|Failed reduce tasks=4
>     2014-10-18 21:22:23,594|beaver.machine|INFO|19267|139929561851648|MainThread|Launched map tasks=1
>     2014-10-18 21:22:23,595|beaver.machine|INFO|19267|139929561851648|MainThread|Launched reduce tasks=4
>     2014-10-18 21:22:23,595|beaver.machine|INFO|19267|139929561851648|MainThread|Data-local map tasks=1
>     2014-10-18 21:22:23,595|beaver.machine|INFO|19267|139929561851648|MainThread|Total time spent by all maps in occupied slots (ms)=3977
>     2014-10-18 21:22:23,595|beaver.machine|INFO|19267|139929561851648|MainThread|Total time spent by all reduces in occupied slots (ms)=15442
>     2014-10-18 21:22:23,595|beaver.machine|INFO|19267|139929561851648|MainThread|Total time spent by all map tasks (ms)=3977
>     2014-10-18 21:22:23,596|beaver.machine|INFO|19267|139929561851648|MainThread|Total time spent by all reduce tasks (ms)=15442
>     2014-10-18 21:22:23,596|beaver.machine|INFO|19267|139929561851648|MainThread|Total vcore-seconds taken by all map tasks=3977
>     2014-10-18 21:22:23,596|beaver.machine|INFO|19267|139929561851648|MainThread|Total vcore-seconds taken by all reduce tasks=15442
>     2014-10-18 21:22:23,597|beaver.machine|INFO|19267|139929561851648|MainThread|Total megabyte-seconds taken by all map tasks=4072448
>     2014-10-18 21:22:23,597|beaver.machine|INFO|19267|139929561851648|MainThread|Total megabyte-seconds taken by all reduce tasks=15812608
>     2014-10-18 21:22:23,598|beaver.machine|INFO|19267|139929561851648|MainThread|Map-Reduce Framework
>     2014-10-18 21:22:23,598|beaver.machine|INFO|19267|139929561851648|MainThread|Map input records=2
>     2014-10-18 21:22:23,598|beaver.machine|INFO|19267|139929561851648|MainThread|Map output records=2
>     2014-10-18 21:22:23,598|beaver.machine|INFO|19267|139929561851648|MainThread|Map output bytes=1514
>     2014-10-18 21:22:23,598|beaver.machine|INFO|19267|139929561851648|MainThread|Map output materialized bytes=885
>     2014-10-18 21:22:23,598|beaver.machine|INFO|19267|139929561851648|MainThread|Input split bytes=159
>     2014-10-18 21:22:23,599|beaver.machine|INFO|19267|139929561851648|MainThread|Combine input records=0
>     2014-10-18 21:22:23,599|beaver.machine|INFO|19267|139929561851648|MainThread|Spilled Records=2
>     2014-10-18 21:22:23,599|beaver.machine|INFO|19267|139929561851648|MainThread|Failed Shuffles=0
>     2014-10-18 21:22:23,599|beaver.machine|INFO|19267|139929561851648|MainThread|Merged Map outputs=0
>     2014-10-18 21:22:23,599|beaver.machine|INFO|19267|139929561851648|MainThread|GC time elapsed (ms)=17
>     2014-10-18 21:22:23,599|beaver.machine|INFO|19267|139929561851648|MainThread|CPU time spent (ms)=670
>     2014-10-18 21:22:23,599|beaver.machine|INFO|19267|139929561851648|MainThread|Physical memory (bytes) snapshot=595439616
>     2014-10-18 21:22:23,600|beaver.machine|INFO|19267|139929561851648|MainThread|Virtual memory (bytes) snapshot=1667874816
>     2014-10-18 21:22:23,600|beaver.machine|INFO|19267|139929561851648|MainThread|Total committed heap usage (bytes)=632291328
>     2014-10-18 21:22:23,600|beaver.machine|INFO|19267|139929561851648|MainThread|File Input Format Counters
>     2014-10-18 21:22:23,600|beaver.machine|INFO|19267|139929561851648|MainThread|Bytes Read=1608
>     2014-10-18 21:22:23,600|beaver.machine|INFO|19267|139929561851648|MainThread|Job ended: Sat Oct 18 21:22:23 UTC 2014
>     2014-10-18 21:22:23,600|beaver.machine|INFO|19267|139929561851648|MainThread|The job took 45 seconds.
>     
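The stack traces above show the failure path: Configuration.getClassByName throws ClassNotFoundException for the codec named in mapreduce.output.fileoutputformat.compress.codec, and FileOutputFormat.getOutputCompressorClass wraps it in the IllegalArgumentException that kills each reduce attempt. A minimal Python stand-in (hypothetical names, not Hadoop's actual Java code) sketching that two-step wrapping:

```python
# Illustrative sketch of Hadoop's codec lookup, in Python for brevity.
# The helper names mirror the Java methods in the stack trace above.
import importlib

def get_class_by_name(name):
    """Analogue of Configuration.getClassByName: resolve a dotted class name."""
    module_name, _, class_name = name.rpartition(".")
    try:
        module = importlib.import_module(module_name)
        return getattr(module, class_name)
    except (ImportError, AttributeError) as e:
        # Java throws ClassNotFoundException at this point.
        raise ModuleNotFoundError(f"Class {name} not found") from e

def get_output_compressor_class(conf):
    """Analogue of FileOutputFormat.getOutputCompressorClass."""
    name = conf["mapreduce.output.fileoutputformat.compress.codec"]
    try:
        return get_class_by_name(name)
    except ModuleNotFoundError as e:
        # Java wraps the lookup failure in IllegalArgumentException.
        raise ValueError(f"Compression codec {name} was not found.") from e

conf = {"mapreduce.output.fileoutputformat.compress.codec":
        "com.hadoop.compression.lzo.LzoCodec"}
try:
    get_output_compressor_class(conf)
except ValueError as e:
    print(e)  # Compression codec com.hadoop.compression.lzo.LzoCodec was not found.
```

The point of the sketch: the codec name is pure configuration, so the job submits fine and only the reduce tasks fail, once per retry, which matches the four failed attempts in the log.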
> **lzo package search in nano**
>     
>     
>     
>     find / -name "*lzo*"
>     /usr/hdp/2.2.0.0-908/hadoop/lib/hadoop-lzo-0.6.0.jar
>     /usr/hdp/current/share/lzo
>     /usr/hdp/current/share/lzo/0.6.0/lib/hadoop-lzo-0.6.0.jar
>     
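Finding the jar on disk (as above) is not enough; it has to appear on the task classpath. One way to verify that, assuming the `hadoop` CLI is installed, is `hadoop classpath | tr ':' '\n' | grep -i lzo`. An offline illustration of the same filter against a sample classpath string modeled on the paths found above:

```shell
# Hypothetical classpath check; the real command would be:
#   hadoop classpath | tr ':' '\n' | grep -i lzo
# Here we run the same pipeline against a sample classpath string.
CP='/usr/hdp/current/hadoop-client/hadoop-common.jar:/usr/hdp/2.2.0.0-908/hadoop/lib/hadoop-lzo-0.6.0.jar'
echo "$CP" | tr ':' '\n' | grep -i lzo
```

No output from the grep on a broken host would confirm the jar is missing from the classpath even if a copy exists somewhere on disk.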
> **lzo package search in ambari**
>     
>     
>     
>     sudo find / -name "*lzo*"
>     /usr/share/mime/application/x-lzop.xml
>     /grid/0/hadoopqe/set_tez_lzo.ps1
>     /grid/0/hadoopqe/tests/flume/conf/exec-file-hdfs-lzop.properties
>     /grid/0/hadoopqe/tests/flume/conf/exec-file-hdfs-lzo.properties
>     /grid/0/hadoopqe/tests/flume/conf/exec-memory-hdfs-lzop.properties
>     /grid/0/hadoopqe/tests/flume/conf/exec-memory-hdfs-lzo.properties
>     /lib/modules/2.6.32-358.el6.x86_64/kernel/crypto/lzo.ko
>     /lib/modules/2.6.32-358.el6.x86_64/kernel/lib/lzo
>     /lib/modules/2.6.32-358.el6.x86_64/kernel/lib/lzo/lzo_compress.ko
>     /lib/modules/2.6.32-358.el6.x86_64/kernel/lib/lzo/lzo_decompress.ko
>     
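For LzoCodec to resolve, the codec also has to be registered in the cluster configuration, not just present on disk. An illustrative core-site.xml fragment using the standard Hadoop property names (exact values on a given cluster are managed by Ambari, so treat this as a sketch, not the committed fix):

```xml
<!-- Illustrative core-site.xml fragment; property names are the standard
     Hadoop ones, values shown here are typical, not cluster-specific. -->
<property>
  <name>io.compression.codecs</name>
  <value>org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.DefaultCodec,com.hadoop.compression.lzo.LzoCodec,com.hadoop.compression.lzo.LzopCodec</value>
</property>
<property>
  <name>io.compression.codec.lzo.class</name>
  <value>com.hadoop.compression.lzo.LzoCodec</value>
</property>
```

Both the registration and the hadoop-lzo jar (plus its native library) must be present on every node that runs tasks.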
> **hadoop version**
>     
>     
>     
>     hadoop version
>     Hadoop 2.6.0.2.2.0.0-945
>     Subversion git@github.com:hortonworks/hadoop.git -r 5e72cc2773fc079a72735bd3f4fd347ed24df743
>     Compiled by jenkins on 2014-10-16T23:47Z
>     Compiled with protoc 2.5.0
>     From source with checksum af8da4bc9b78bbbd52225cb96f1bd71
>     This command was run using /usr/hdp/2.2.0.0-945/hadoop/hadoop-common-2.6.0.2.2.0.0-945.jar
>     



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
