hadoop-hdfs-user mailing list archives

From Scott <skes...@weather.com>
Subject Re: Cant get HDFS to load more than 1Gig total data
Date Thu, 07 Jan 2010 21:11:15 GMT
Thanks for the reply Allen.  I fixed it, though I am not sure of the
exact cause, as I found more than one thing wrong.  There were some
directories owned by root instead of hadoop that I think were the main
cause.
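In case it helps anyone hitting the same error, ownership problems like this are usually fixed by re-chowning the HDFS storage directories while the daemons are stopped. A rough sketch follows; the directory path is an example only (check `dfs.data.dir` and `dfs.name.dir` in your hdfs-site.xml for the real locations):

```shell
# Stop HDFS before touching the storage directories.
bin/stop-dfs.sh

# Re-own the storage directories to the hadoop user.
# /data/hadoop/dfs is a placeholder; use the paths from your
# dfs.data.dir / dfs.name.dir settings.
chown -R hadoop:hadoop /data/hadoop/dfs

bin/start-dfs.sh
```

After restarting, the datanodes should register with the namenode and writes should succeed again.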
Allen Wittenauer wrote:
> On 1/7/10 8:34 AM, "Scott" <skester@weather.com> wrote:
>> WARN hdfs.DFSClient: DataStreamer Exception:
>> org.apache.hadoop.ipc.RemoteException: java.io.IOException: File
>> /user/hadoop/ads3x11-1256301562.log.lzo could only be replicated to 0
>> nodes, instead of 1
>
> This almost always means your HDFS is in safemode and/or has no live
> datanodes.
>
>> I have checked quotas and found none.  I have also tried other users,
>> including the hadoop user, and get the same result.  Any ideas?
>
> How is the namenode heap?
>
> Are you out of physical space?
>
> What does hadoop fsck / say?
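For anyone debugging the same "could only be replicated to 0 nodes" error, the checks Allen lists map onto a few standard Hadoop commands (run from the Hadoop install directory; exact output varies by version):

```shell
# Is the namenode stuck in safemode?
bin/hadoop dfsadmin -safemode get

# How many live datanodes are there, and how much space
# does each report? Zero live nodes or full disks both
# produce the "replicated to 0 nodes" error.
bin/hadoop dfsadmin -report

# Overall filesystem health: missing, corrupt, or
# under-replicated blocks.
bin/hadoop fsck /
```

The namenode web UI (port 50070 by default) shows the same live-node and capacity numbers as dfsadmin -report.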

