apex-users mailing list archives

From Raja.Aravapalli <Raja.Aravapa...@target.com>
Subject Re: DAG is failing due to memory issues
Date Tue, 12 Jul 2016 17:13:03 GMT

Hi,


What memory does the “allocated mem.” shown on the UI for a DAG refer to? The Application Master's
memory, or the container memory of the operators?


[Screenshot: the DAG's allocated memory as shown on the UI]


I included the properties below as well and re-triggered the DAG, but it is still showing only 32GB!


<property>
    <name>dt.application.<APP_NAME>.attr.MASTER_MEMORY_MB</name>
    <value>4096</value>
</property>

<property>
    <name>dt.application.<APP_NAME>.operator.*.attr.MEMORY_MB</name>
    <value>4096</value>
</property>


I have the same DAG running in another Hadoop environment, where it shows approx. 125GB, but
in this environment it shows only 32GB, which is what I am assuming to be the problem!


Regards,
Raja.


From: Sandesh Hegde <sandesh@datatorrent.com>
Reply-To: "users@apex.apache.org" <users@apex.apache.org>
Date: Tuesday, July 12, 2016 at 11:35 AM
To: "users@apex.apache.org" <users@apex.apache.org>
Subject: Re: DAG is failing due to memory issues

Raja,

Please increase the container size and launch the app again. yarn.scheduler.maximum-allocation-mb
applies to a container, not to the whole DAG, and the error message you showed is for a container.
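
As an aside, the 2.1 GB virtual limit in that diagnostic is just the container's 1 GB physical
allocation multiplied by YARN's virtual-memory ratio. A minimal yarn-site.xml sketch of the setting
involved, assuming the stock default of 2.1 (you normally don't need to change this):

<property>
  <!-- Assumed stock default: a container's virtual memory cap is its physical
       allocation times this ratio, so 1 GB physical gives the 2.1 GB virtual limit. -->
  <name>yarn.nodemanager.vmem-pmem-ratio</name>
  <value>2.1</value>
</property>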

Here is one quick way: use the following attribute.

<property>
  <name>dt.operator.*.attr.MEMORY_MB</name>
  <value>4096</value>
</property>
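
The wildcard above sizes every operator in the app; to give just one operator a different size, the
same attribute can be scoped to that operator's name instead (a sketch; "MyOperator" is a
hypothetical name standing in for the name assigned in your DAG):

<property>
  <!-- "MyOperator" is a placeholder for the operator's name in the DAG -->
  <name>dt.operator.MyOperator.attr.MEMORY_MB</name>
  <value>8192</value>
</property>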


On Tue, Jul 12, 2016 at 9:24 AM Raja.Aravapalli <Raja.Aravapalli@target.com> wrote:

Hi Ram,

Sorry, I did not share those details about the 32GB with you.

I am saying 32GB is allocated because I observed it on the UI while the application was
running. But now that the DAG has failed, I cannot take a screenshot and send it!


Regards,
Raja.

From: Munagala Ramanath <ram@datatorrent.com>
Reply-To: "users@apex.apache.org" <users@apex.apache.org>
Date: Tuesday, July 12, 2016 at 11:06 AM

To: "users@apex.apache.org" <users@apex.apache.org>
Subject: Re: DAG is failing due to memory issues

How do you know it is allocating 32GB? The diagnostic message you posted does not show
that.

Ram

On Tue, Jul 12, 2016 at 8:51 AM, Raja.Aravapalli <Raja.Aravapalli@target.com> wrote:

Thanks for the response Sandesh.

Since our yarn-site.xml is configured with the value 32768 for the property yarn.scheduler.maximum-allocation-mb,
it allocates a max of 32GB and not more than that!


I wish to know: is there a way I can increase the maximum allowed value? Or, since it is configured
in yarn-site.xml, is it something I cannot increase?



Regards,
Raja.

From: Sandesh Hegde <sandesh@datatorrent.com>
Reply-To: "users@apex.apache.org" <users@apex.apache.org>
Date: Tuesday, July 12, 2016 at 10:46 AM
To: "users@apex.apache.org" <users@apex.apache.org>
Subject: Re: DAG is failing due to memory issues

Quoting from the doc Ram shared, those parameters control the operator memory size:


actual container memory allocated by RM has to lie between [yarn.scheduler.minimum-allocation-mb, yarn.scheduler.maximum-allocation-mb]

On Tue, Jul 12, 2016 at 8:38 AM Raja.Aravapalli <Raja.Aravapalli@target.com> wrote:

Hi Ram,

I see that in the cluster's yarn-site.xml, the two properties below are configured with these settings:

yarn.scheduler.minimum-allocation-mb ===> 1024
yarn.scheduler.maximum-allocation-mb ===> 32768
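
In yarn-site.xml form, those correspond to (a sketch reconstructed from the values above):

<property>
  <!-- smallest container the RM will hand out, in MB -->
  <name>yarn.scheduler.minimum-allocation-mb</name>
  <value>1024</value>
</property>

<property>
  <!-- largest container the RM will hand out, in MB -->
  <name>yarn.scheduler.maximum-allocation-mb</name>
  <value>32768</value>
</property>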


So with the above settings at the cluster level, can I not increase the memory allocated for my
DAG? Is there any other way I can increase the memory?


Thanks a lot.


Regards,
Raja.

From: Munagala Ramanath <ram@datatorrent.com>
Reply-To: "users@apex.apache.org" <users@apex.apache.org>
Date: Tuesday, July 12, 2016 at 9:31 AM
To: "users@apex.apache.org" <users@apex.apache.org>
Subject: Re: DAG is failing due to memory issues

Please see: http://docs.datatorrent.com/troubleshooting/#configuring-memory

Ram

On Tue, Jul 12, 2016 at 6:57 AM, Raja.Aravapalli <Raja.Aravapalli@target.com> wrote:

Hi,

My DAG is failing with container memory issues. I am seeing the information below in the log:



Diagnostics: Container [pid=xxx,containerID=container_xyclksdjf] is running beyond physical
memory limits. Current usage: 1.0 GB of 1 GB physical memory used; 2.9 GB of 2.1 GB virtual
memory used. Killing container.


Can someone help me with how to fix this issue? Thanks a lot.



Regards,
Raja.

