mahout-user mailing list archives

From Wasim <wasim.k...@gmail.com>
Subject File name too long error on linux machine
Date Fri, 11 Mar 2011 18:44:05 GMT
Hi,

I ran the synthetic control data example from the Mahout wiki page on a
Linux machine. When I execute the following command to copy the output from
Hadoop to my local machine:

$HADOOP_HOME/bin/hadoop fs -get output $MAHOUT_HOME/examples

I get the following error:

get: File name too long

When I ran the following command to list all of the output files on
Hadoop:

$HADOOP_HOME/bin/hadoop fs -lsr output

I saw that some of the file names are indeed very long, for example:

/user/behoover-a001/output/clusteredPoints/_logs/history/localhost_1299864873686_job_201103111834_0013_behoover-a001_KMeans+Driver+running+clusterData+over+input%3A+outp

There are many others like this.
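If it helps, I can reproduce the same error locally without Hadoop at all,
which makes me suspect the limit comes from my local filesystem rather than
from HDFS (this is just my guess: common Linux filesystems such as ext3/ext4
allow at most 255 bytes per file name component, so creating any longer name
fails with the same message):

```shell
# Assumption: the error is the local filesystem's NAME_MAX limit (255 bytes
# on ext3/ext4/tmpfs), hit when `hadoop fs -get` creates the local copy.
# Build a 300-character file name and try to create it locally:
long_name=$(printf 'x%.0s' $(seq 1 300))
touch "/tmp/$long_name" 2>&1 | grep -o 'File name too long'
```

On my machine the `touch` fails with exactly the same "File name too long"
message that `hadoop fs -get` prints.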

Can anybody please suggest a solution to this problem?

-- 
Thank you & Regards
Muhammad Wasimullah Khan
Mobile:+46 72 03 29 205
Alt.Telephone: +92 345 21 98 451
Email: mwkhan@kth.se
Skype: muhammad.wasim.khan
