hadoop-mapreduce-user mailing list archives

From Alexander Striffeler <a.striffe...@students.unibe.ch>
Subject NNBench on external HDFS
Date Wed, 22 Jul 2015 07:41:29 GMT
Hi all

I'm pretty new to the Hadoop environment and I'm working on some
micro benchmarks. In particular, I'm struggling to run NNBench
against an external file system:

hadoop jar
/usr/hdp/2.2.6.0-2800/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-tests.jar
nnbench -Dfs.defaultFS='hdfs://<external.file.system>' -operation
create_write -bytesToWrite 10 -maps 2 -reduces 1 -numberOfFiles 100
-baseDir hdfs://dapsilon.daplab.ch/user/username/nnbench-`hostname -s`

which results in:
java.lang.IllegalArgumentException: Wrong FS:
hdfs://<external.file.system>/user/username/nnbench-hostname/data, 
expected: hdfs://<native fs>
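
As far as I understand it, the exception comes from the FileSystem that is
built from fs.defaultFS rejecting any path whose authority points at a
different cluster. A tiny sketch of what I mean (host names are just
placeholders, this is not taken from the NNBench source):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class WrongFsDemo {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    // default FS of the cluster the job runs on (placeholder host)
    conf.set("fs.defaultFS", "hdfs://native-cluster");

    FileSystem defaultFs = FileSystem.get(conf);
    Path external = new Path("hdfs://external-cluster/user/username/nnbench/data");

    // this would throw IllegalArgumentException: Wrong FS ... expected: hdfs://native-cluster
    // defaultFs.exists(external);

    // resolving the FileSystem from the path itself works across clusters
    FileSystem extFs = external.getFileSystem(conf);
    System.out.println("Resolved FS: " + extFs.getUri());
  }
}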

If I omit the external FS prefix in the baseDir, NNBench simply ignores
the -D option and writes the files to the native DFS. Does anyone have
an idea how to solve this and run NNBench against an external DFS?
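
In case it helps narrow things down, a plain FS-shell check against the
external cluster is what I would try first (the NameNode port 8020 is just
an assumption on my side):

# check that the external HDFS is reachable at all (port assumed)
hadoop fs -ls hdfs://dapsilon.daplab.ch:8020/user/username

# same idea via the generic -fs option, which sets fs.defaultFS for the command
hadoop fs -fs hdfs://dapsilon.daplab.ch:8020 -ls /user/username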

Thanks a lot, any hints are much appreciated!
Regards,
Alex
