apex-dev mailing list archives

From "Kottapalli, Venkatesh" <VKottapa...@DIRECTV.com>
Subject RE: Reg. files handled soft limit set in the application
Date Thu, 17 Mar 2016 08:38:24 GMT
Thank you, Sandeep.

-Venkatesh.

-----Original Message-----
From: Sandeep Deshmukh [mailto:sandeep@datatorrent.com] 
Sent: Wednesday, March 16, 2016 11:39 PM
To: dev
Subject: Re: Reg. files handled soft limit set in the application

There is no such limit imposed by Apex, but whatever applies to Hadoop will still
be applicable here, so you may have to do some tweaking in Hadoop.

https://wiki.apache.org/hadoop/TooManyOpenFiles

http://askubuntu.com/questions/162345/how-to-increase-open-file-limits-nofile-and-epoll-in-10-04
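
For example, one common way to raise the per-user open-file limit (a sketch
assuming a Linux host that uses pam_limits; the 'yarn' user name and the
values below are placeholders, adjust them for your cluster) is to add
entries to /etc/security/limits.conf for the account that runs the YARN
daemons:

    # hypothetical values for a 'yarn' service account
    yarn    soft    nofile    32768
    yarn    hard    nofile    65536

The daemons have to be restarted from a fresh login session before the new
limits take effect.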

Regards,
Sandeep

On Thu, Mar 17, 2016 at 11:24 AM, Kottapalli, Venkatesh < VKottapalli@directv.com> wrote:

> Hi,
>
> App Master fails with the exception below. The upper limit on the
> system is 1024. Could you please suggest what the possible cause
> might be? From what I see in the application there are no container
> failures, and I am not sure why it is opening so many files.
>
> 2016-03-16 23:34:46,269 [943875111@qtp-1149942716-31] FATAL conf.Configuration loadResource - error parsing conf core-site.xml
> java.io.FileNotFoundException: /var/run/cloudera-scm-agent/process/3149-yarn-NODEMANAGER/core-site.xml (Too many open files)
>         at java.io.FileInputStream.open(Native Method)
>         at java.io.FileInputStream.<init>(FileInputStream.java:146)
>         at java.io.FileInputStream.<init>(FileInputStream.java:101)
>         at sun.net.www.protocol.file.FileURLConnection.connect(FileURLConnection.java:90)
>         at sun.net.www.protocol.file.FileURLConnection.getInputStream(FileURLConnection.java:188)
>         at java.net.URL.openStream(URL.java:1037)
>         at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2378)
>         at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2449)
>         at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2402)
>         at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2319)
>         at org.apache.hadoop.conf.Configuration.get(Configuration.java:1146)
>         at com.datatorrent.stram.util.ConfigUtils.getSchemePrefix(ConfigUtils.java:73)
>         at com.datatorrent.stram.StreamingContainerManager.getAppMasterContainerInfo(StreamingContainerManager.java:418)
>         at com.datatorrent.stram.webapp.StramWebServices.listContainers(StramWebServices.java:442)
>         at sun.reflect.GeneratedMethodAccessor95.invoke(Unknown Source)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
>
> -Venkatesh.
>
>
>
>
> -----Original Message-----
> From: Tushar Gosavi [mailto:tushar@datatorrent.com]
> Sent: Monday, March 07, 2016 8:18 AM
> To: dev@apex.incubator.apache.org
> Subject: Re: Reg. files handled soft limit set in the application
>
> When running an application with many physical operators, the
> Application Master went down because it could not open any new
> connections after hitting the limit on file handles. Do we open a
> connection per container or per operator partition? There were also
> some failures; perhaps connections are not being closed when
> containers fail (the RPC timeout is set to a high value). The user
> has the soft limit set to 1024 and the hard limit set to 4096, which
> is also low.
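>
> For reference, the limits actually in effect for a running container
> process can be read from /proc (<pid> below is a placeholder for the
> container's process id):
>
>     cat /proc/<pid>/limits | grep 'Max open files'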
>
> Is there any way (an Apex configuration property) to run containers
> with an increased soft limit? Or does the user need to change the
> system configuration to allow more open files per process?
>
> - Tushar.
>
>
> On Mon, Mar 7, 2016 at 9:17 PM, Munagala Ramanath 
> <ram@datatorrent.com>
> wrote:
>
> > *sysctl fs.file-max*
> > should show you the kernel limit.
> >
> > *ulimit -n*
> > shows the soft limit for your current shell (inherited per process)
> >
> > You can see the list of files a process has open (where <pid> is
> > the process id) with:
> > *ls -l /proc/<pid>/fd*
> >
> > You can also use the *lsof* command described here:
> > http://www.thegeekstuff.com/2012/08/lsof-command-examples/
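> >
> > For example, to get an approximate count of the descriptors a
> > process is holding (the first line of lsof output is a header, so
> > the count is off by one):
> > *lsof -p <pid> | wc -l*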
> >
> > Are you running into the limit ? Can you share some details of the 
> > error you're seeing ?
> >
> > Ram
> >
> > On Mon, Mar 7, 2016 at 12:25 AM, Kottapalli, Venkatesh < 
> > VKottapalli@directv.com> wrote:
> >
> > > Hi,
> > >
> > >                 Is there a default limit set by the DT application
> > > on the number of files the application can have open? If so, is
> > > there a way to increase that soft limit?
> > >
> > > -Venkatesh.
> > >
> > >
> >
>