spark-user mailing list archives

From Robineast <Robin.e...@xense.co.uk>
Subject Re: Various Apache Spark's deployment problems
Date Fri, 29 Apr 2016 20:10:19 GMT
Do you need two --num-executors flags? Your command passes both 17 and 5.
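For reference, the duplication can be seen directly in the command string. A quick illustrative check (CMD is copied from Submit Command1, flattened onto one line):

```shell
# The command passes --num-executors twice (17, then 5); the later value
# typically wins, so one of the two is dead weight. A quick sanity check
# on the command string before submitting:
CMD='spark-submit --class working.path.to.Main --master yarn --deploy-mode cluster --num-executors 17 --executor-cores 8 --executor-memory 25g --driver-memory 25g --num-executors 5 application-with-all-dependencies.jar'
echo "$CMD" | grep -o -- '--num-executors' | wc -l
# prints 2: the flag is duplicated; drop one of the two occurrences
```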

Sent from my iPhone

> On 29 Apr 2016, at 20:25, Ashish Sharma [via Apache Spark User List] <ml-node+s1001560n26847h62@n3.nabble.com> wrote:
> 
> Submit Command1: 
> 
>         spark-submit --class working.path.to.Main \ 
>                 --master yarn \ 
>                 --deploy-mode cluster \ 
>                 --num-executors 17 \ 
>                 --executor-cores 8 \ 
>                 --executor-memory 25g \ 
>                 --driver-memory 25g \ 
>                 --num-executors 5 \ 
>             application-with-all-dependencies.jar 
>     
> Error Log1: 
> 
>         User class threw exception: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient 
>         
> Submit Command2: 
> 
>         spark-submit --class working.path.to.Main \ 
>                 --master yarn \ 
>                 --deploy-mode cluster \ 
>                 --num-executors 17 \ 
>                 --executor-cores 8 \ 
>                 --executor-memory 25g \ 
>                 --driver-memory 25g \ 
>                 --num-executors 5 \ 
>                 --files /etc/hive/conf/hive-site.xml \ 
>                 application-with-all-dependencies.jar 
> 
> Error Log2: 
> 
>         User class threw exception: java.lang.NumberFormatException: For input string: "5s" 
> 
> Since I don't have administrative permissions, I cannot modify the configuration. 
> Well, I could contact the IT engineer and ask for the changes, but I'm looking for a 
> solution that involves as few changes to the configuration files as possible! 
> 
> Configuration changes were suggested here: 
> https://hadoopist.wordpress.com/2016/02/23/how-to-resolve-error-yarn-applicationmaster-user-class-threw-exception-java-lang-runtimeexception-java-lang-numberformatexception-for-input-string-5s-in-spark-submit/
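If editing the cluster's /etc/hive/conf/hive-site.xml is off the table, one possible workaround is to ship a locally edited copy via --files instead. This is a sketch, assuming the failure comes from time-suffixed values like "5s" that the older Hive client parses as a plain integer; `hive.metastore.client.socket.timeout` is used here only as an illustrative property, and the blog post above identifies the actual offending keys:

```shell
# Sketch: make a private copy of hive-site.xml, strip the time-unit suffix
# the old Hive client cannot parse, and pass the edited copy to --files.
# The <property> below is a stand-in so the snippet is self-contained;
# in practice you would start from:  cp /etc/hive/conf/hive-site.xml .
cat > hive-site.xml <<'EOF'
<property>
  <name>hive.metastore.client.socket.timeout</name>
  <value>5s</value>
</property>
EOF
# "5s" -> "5": plain seconds, parseable as an integer
sed -i 's|<value>5s</value>|<value>5</value>|' hive-site.xml
grep '<value>5</value>' hive-site.xml
# then submit with:  --files ./hive-site.xml
```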
> 
> Then I tried passing various jar files as arguments, as suggested in other discussion forums. 
> 
> Submit Command3: 
> 
>         spark-submit --class working.path.to.Main \ 
>                 --master yarn \ 
>                 --deploy-mode cluster \ 
>                 --num-executors 17 \ 
>                 --executor-cores 8 \ 
>                 --executor-memory 25g \ 
>                 --driver-memory 25g \ 
>                 --num-executors 5 \ 
>                 --jars /usr/hdp/2.3.0.0-2557/spark/lib/datanucleus-api-jdo-3.2.6.jar,/usr/hdp/2.3.0.0-2557/spark/lib/datanucleus-core-3.2.10.jar,/usr/hdp/2.3.0.0-2557/spark/lib/datanucleus-rdbms-3.2.9.jar \ 
>                 --files /etc/hive/conf/hive-site.xml \ 
>                 application-with-all-dependencies.jar 
> 
> Error Log3: 
> 
>         User class threw exception: java.lang.NumberFormatException: For input string: "5s" 
>     
> I didn't understand what happened with the following command and couldn't analyze the error log. 
> 
> Submit Command4: 
> 
>         spark-submit --class working.path.to.Main \ 
>                 --master yarn \ 
>                 --deploy-mode cluster \ 
>                 --num-executors 17 \ 
>                 --executor-cores 8 \ 
>                 --executor-memory 25g \ 
>                 --driver-memory 25g \ 
>                 --num-executors 5 \ 
>                 --jars /usr/hdp/2.3.0.0-2557/spark/lib/*.jar \ 
>                 --files /etc/hive/conf/hive-site.xml \ 
>                 application-with-all-dependencies.jar 
> 
> Submit Log4: 
> 
>         Application application_1461686223085_0014 failed 2 times due to AM Container for appattempt_1461686223085_0014_000002 exited with exitCode: 10 
>         For more detailed output, check application tracking page: http://cluster-host:XXXX/cluster/app/application_1461686223085_0014 
>         Then, click on links to logs of each attempt. 
>         Diagnostics: Exception from container-launch. 
>         Container id: container_e10_1461686223085_0014_02_000001 
>         Exit code: 10 
>         Stack trace: ExitCodeException exitCode=10: 
>         at org.apache.hadoop.util.Shell.runCommand(Shell.java:545) 
>         at org.apache.hadoop.util.Shell.run(Shell.java:456) 
>         at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:722) 
>         at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:211) 
>         at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302) 
>         at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82) 
>         at java.util.concurrent.FutureTask.run(FutureTask.java:266) 
>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) 
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) 
>         at java.lang.Thread.run(Thread.java:745) 
>         Container exited with a non-zero exit code 10 
>         Failing this attempt. Failing the application. 
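One likely contributor in Submit Command4: --jars expects a single comma-separated argument, while an unquoted glob like /usr/hdp/2.3.0.0-2557/spark/lib/*.jar is expanded by the shell into space-separated words, so only the first jar reaches --jars and the next one is misread as the application jar. A sketch of building the comma list explicitly (libdemo/ below is a throwaway stand-in directory so the snippet runs standalone):

```shell
# --jars wants one comma-separated argument; a shell glob expands to
# space-separated words, which spark-submit misparses. Build the list
# explicitly from the glob instead.
mkdir -p libdemo
touch libdemo/datanucleus-api-jdo-3.2.6.jar libdemo/datanucleus-core-3.2.10.jar
JARS=$(ls libdemo/*.jar | paste -sd, -)
echo "$JARS"
# prints: libdemo/datanucleus-api-jdo-3.2.6.jar,libdemo/datanucleus-core-3.2.10.jar
# then: spark-submit ... --jars "$JARS" ...
```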
>         
> 
> Any other possible options? Any kind of help will be highly appreciated. Please let me know if you need any other information. 
> 
> Thank you. 
> 




-----
Robin East 
Spark GraphX in Action Michael Malak and Robin East 
Manning Publications Co. 
http://www.manning.com/books/spark-graphx-in-action
