hbase-user mailing list archives

From Liam Slusser <lslus...@gmail.com>
Subject Re: oldWALs: what it is and how can I clean it?
Date Thu, 26 Feb 2015 00:40:40 GMT
I'm having this same problem.  I had replication enabled, but it has since
been disabled.  However, oldWALs still grows.  There are so many files in
there that running "hadoop fs -ls /hbase/oldWALs" runs out of memory.
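
In case it helps, here is a rough sketch of the checks I would run. The heap
size and the CDH-style paths are assumptions, so adjust them for your layout:

  # Give the HDFS client JVM more heap so inspecting the huge directory
  # doesn't run out of memory; count files instead of listing them all
  export HADOOP_CLIENT_OPTS="-Xmx4g"
  hadoop fs -count /hbase/oldWALs

  # Check whether any replication peers are still registered; a stale peer
  # keeps the master's log cleaner from deleting the archived WALs
  echo "list_peers" | hbase shell
  echo "status 'replication'" | hbase shell

  # Replication queues are tracked in ZooKeeper under /hbase/replication/rs
  # (assuming the default zookeeper.znode.parent of /hbase)
  hbase zkcli ls /hbase/replication/rs

If a peer still shows up even though replication is supposedly disabled,
removing it with remove_peer in the hbase shell should let the cleaner start
purging oldWALs again.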

On Wed, Feb 25, 2015 at 9:27 AM, Nishanth S <nishanth.2884@gmail.com> wrote:

> Do you have replication turned on in HBase, and if so, is your slave
> consuming the replicated data?
>
> -Nishanth
>
> On Wed, Feb 25, 2015 at 10:19 AM, Madeleine Piffaretti <
> mpiffaretti@powerspace.com> wrote:
>
> > Hi all,
> >
> > We are running out of space in our small Hadoop cluster, so I was checking
> > disk usage on HDFS and I saw that most of the space was occupied by the
> > /hbase/oldWALs folder.
> >
> > I have checked the "HBase Definitive Book" and other books and websites,
> > and I have also searched for this issue on Google, but I didn't find a
> > proper answer...
> >
> > So I would like to know what this folder is, what it is used for, and how
> > I can free space from it without breaking everything...
> >
> >
> > In case it's related to a specific version: our cluster runs
> > 5.3.0-1.cdh5.3.0.p0.30 from Cloudera (HBase 0.98.6).
> >
> > Thx for your help!
> >
>
