ambari-dev mailing list archives

From "Alejandro Fernandez (JIRA)" <j...@apache.org>
Subject [jira] [Created] (AMBARI-9954) Spark on Tez apps fail: tez.tar.gz needs to be copied to HDFS
Date Fri, 06 Mar 2015 03:22:38 GMT
Alejandro Fernandez created AMBARI-9954:
-------------------------------------------

             Summary: Spark on Tez apps fail: tez.tar.gz needs to be copied to HDFS
                 Key: AMBARI-9954
                 URL: https://issues.apache.org/jira/browse/AMBARI-9954
             Project: Ambari
          Issue Type: Bug
          Components: ambari-server
    Affects Versions: 2.0.0
            Reporter: Alejandro Fernandez
            Assignee: Alejandro Fernandez
             Fix For: 2.0.0


Spark on Tez apps fail because tez.tar.gz has not been copied to HDFS.
Currently, only the Pig service check and Hive START copy it there.

{noformat}
$ /usr/hdp/current/spark-client/bin/spark-submit  --class org.apache.spark.examples.SparkPi
--master execution-context:org.apache.spark.tez.TezJobExecutionContext /usr/hdp/current/spark-client/lib/spark-examples-1.2.1.2.2.2.0-2538-hadoop2.6.0.2.2.2.0-2538.jar
3

tput: No value for $TERM and no -T specified
Spark assembly has been built with Hive, including Datanucleus jars on classpath
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/grid/0/hdp/2.2.2.0-2538/spark/lib/spark-examples-1.2.1.2.2.2.0-2538-hadoop2.6.0.2.2.2.0-2538.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/grid/0/hdp/2.2.2.0-2538/spark/external/spark-native-yarn/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
15/03/04 09:27:53 INFO spark.SecurityManager: Changing view acls to: hrt_qa
15/03/04 09:27:53 INFO spark.SecurityManager: Changing modify acls to: hrt_qa
15/03/04 09:27:53 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui
acls disabled; users with view permissions: Set(hrt_qa); users with modify permissions: Set(hrt_qa)
15/03/04 09:27:54 INFO slf4j.Slf4jLogger: Slf4jLogger started
15/03/04 09:27:54 INFO Remoting: Starting remoting
15/03/04 09:27:54 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@ip-172-31-47-166.ec2.internal:34628]
15/03/04 09:27:54 INFO util.Utils: Successfully started service 'sparkDriver' on port 34628.
15/03/04 09:27:54 INFO spark.SparkEnv: Registering MapOutputTracker
15/03/04 09:27:54 INFO spark.SparkEnv: Registering BlockManagerMaster
15/03/04 09:27:54 INFO storage.DiskBlockManager: Created local directory at /tmp/spark-c3fe89f7-4117-41fc-8a62-01f0451c9060/spark-a209539b-07ae-42ad-a83d-9ad53b1c6adc
15/03/04 09:27:54 INFO storage.MemoryStore: MemoryStore started with capacity 265.4 MB
15/03/04 09:27:55 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your
platform... using builtin-java classes where applicable
15/03/04 09:27:55 INFO spark.HttpFileServer: HTTP File server directory is /tmp/spark-995afa1c-3d9a-453b-a84c-b02b73aab8d7/spark-57d211b1-32a5-453c-b2d0-5f010b4cf74a
15/03/04 09:27:55 INFO spark.HttpServer: Starting HTTP Server
15/03/04 09:27:55 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/03/04 09:27:55 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:52966
15/03/04 09:27:55 INFO util.Utils: Successfully started service 'HTTP file server' on port
52966.
15/03/04 09:27:55 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/03/04 09:27:55 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
15/03/04 09:27:55 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
15/03/04 09:27:55 INFO ui.SparkUI: Started SparkUI at http://ip-172-31-47-166.ec2.internal:4040
15/03/04 09:27:55 INFO spark.SparkContext: Will use custom job execution context org.apache.spark.tez.TezJobExecutionContext
15/03/04 09:27:56 INFO tez.TezJobExecutionContext: Config dir: /etc/hadoop/conf
15/03/04 09:27:56 INFO tez.TezJobExecutionContext: FileSystem: hdfs://ip-172-31-47-165.ec2.internal:8020
15/03/04 09:27:56 INFO tez.TezJobExecutionContext: Error while accessing configuration. Possible
cause - 'version missmatch'
org.apache.hadoop.conf.Configuration is loaded from file:/grid/0/hdp/2.2.2.0-2538/spark/external/spark-native-yarn/lib/hadoop-common-2.6.0.2.2.2.0-2538.jar
org.apache.tez.dag.api.TezConfiguration is loaded from file:/grid/0/hdp/2.2.2.0-2538/spark/external/spark-native-yarn/lib/tez-api-0.5.2.2.2.2.0-2538.jar
15/03/04 09:27:57 WARN shortcircuit.DomainSocketFactory: The short-circuit local reads feature
cannot be used because libhadoop cannot be loaded.
15/03/04 09:27:57 INFO netty.NettyBlockTransferService: Server created on 58099
15/03/04 09:27:57 INFO storage.BlockManagerMaster: Trying to register BlockManager
15/03/04 09:27:57 INFO storage.BlockManagerMasterActor: Registering block manager ip-172-31-47-166.ec2.internal:58099
with 265.4 MB RAM, BlockManagerId(<driver>, ip-172-31-47-166.ec2.internal, 58099)
15/03/04 09:27:57 INFO storage.BlockManagerMaster: Registered BlockManager
15/03/04 09:27:58 INFO tez.TezDelegate: Job: Spark Pi will be submitted to the following YARN
cluster:
15/03/04 09:27:58 INFO tez.TezDelegate: Default FS Address: hdfs://ip-172-31-47-165.ec2.internal:8020
15/03/04 09:27:58 INFO tez.TezDelegate: RM Host Name: ip-172-31-47-165.ec2.internal
15/03/04 09:27:58 INFO tez.TezDelegate: RM Address: ip-172-31-47-165.ec2.internal:8050
15/03/04 09:27:58 INFO tez.TezDelegate: RM Scheduler Address: ip-172-31-47-165.ec2.internal:8030
15/03/04 09:27:58 INFO tez.TezDelegate: RM Resource Tracker Address: null
15/03/04 09:27:58 INFO tez.TezDelegate: Application classpath dir is: hdfs://ip-172-31-47-165.ec2.internal:8020/user/hrt_qa/Spark
Pi/app-classpath
15/03/04 09:28:08 INFO utils.HadoopUtils: Skipping provisioning of tez-api-0.5.2.2.2.2.0-2538.jar
since Tez libraries are already provisioned
15/03/04 09:28:08 INFO utils.HadoopUtils: Skipping provisioning of tez-common-0.5.2.2.2.2.0-2538.jar
since Tez libraries are already provisioned
15/03/04 09:28:08 INFO utils.HadoopUtils: Skipping provisioning of tez-dag-0.5.2.2.2.2.0-2538.jar
since Tez libraries are already provisioned
15/03/04 09:28:08 INFO utils.HadoopUtils: Skipping provisioning of tez-mapreduce-0.5.2.2.2.2.0-2538.jar
since Tez libraries are already provisioned
15/03/04 09:28:08 INFO utils.HadoopUtils: Skipping provisioning of tez-runtime-internals-0.5.2.2.2.2.0-2538.jar
since Tez libraries are already provisioned
15/03/04 09:28:08 INFO utils.HadoopUtils: Skipping provisioning of tez-runtime-library-0.5.2.2.2.2.0-2538.jar
since Tez libraries are already provisioned
15/03/04 09:28:09 INFO client.TezClient: Tez Client Version: [ component=tez-api, version=0.5.2.2.2.2.0-2538,
revision=2d3c6b639d5b1048bd20aad5736823a35edd2485, SCM-URL=scm:git:https://git-wip-us.apache.org/repos/asf/tez.git,
buildTIme=20150304-0248 ]
15/03/04 09:28:09 INFO impl.TimelineClientImpl: Timeline service address: http://ip-172-31-47-165.ec2.internal:8188/ws/v1/timeline/
15/03/04 09:28:10 INFO client.RMProxy: Connecting to ResourceManager at ip-172-31-47-165.ec2.internal/172.31.47.165:8050
15/03/04 09:28:10 INFO tez.Utils: STAGE Result: Stage 0 vertex: 0
15/03/04 09:28:10 INFO tez.Utils: DAG: {0=(stage: 0; vertex:0; input:[])}
15/03/04 09:28:10 INFO tez.DAGBuilder: Submitting generated DAG to YARN/Tez cluster
15/03/04 09:28:10 INFO client.TezClient: Submitting DAG application with id: application_1425459626716_0002
15/03/04 09:28:10 INFO client.TezClientUtils: Using tez.lib.uris value from configuration:
/hdp/apps/2.2.2.0-2538/tez/tez.tar.gz
java.io.FileNotFoundException: File does not exist: /hdp/apps/2.2.2.0-2538/tez/tez.tar.gz
at org.apache.hadoop.hdfs.DistributedFileSystem$19.doCall(DistributedFileSystem.java:1140)
at org.apache.hadoop.hdfs.DistributedFileSystem$19.doCall(DistributedFileSystem.java:1132)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1132)
at org.apache.hadoop.fs.FileSystem.resolvePath(FileSystem.java:750)
at org.apache.tez.client.TezClientUtils.getLRFileStatus(TezClientUtils.java:127)
at org.apache.tez.client.TezClientUtils.setupTezJarsLocalResources(TezClientUtils.java:178)
at org.apache.tez.client.TezClient.getTezJarResources(TezClient.java:721)
at org.apache.tez.client.TezClient.submitDAGApplication(TezClient.java:689)
at org.apache.tez.client.TezClient.submitDAGApplication(TezClient.java:667)
at org.apache.tez.client.TezClient.submitDAG(TezClient.java:353)
at org.apache.spark.tez.DAGBuilder.run(DAGBuilder.java:164)
at org.apache.spark.tez.DAGBuilder.access$000(DAGBuilder.java:60)
at org.apache.spark.tez.DAGBuilder$1.execute(DAGBuilder.java:148)
at org.apache.spark.tez.TezDelegate.submitApplication(TezDelegate.scala:82)
at org.apache.spark.tez.TezJobExecutionContext.runJob(TezJobExecutionContext.scala:168)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1292)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1358)
at org.apache.spark.rdd.RDD.reduce(RDD.scala:882)
at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:41)
at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:367)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:77)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Error: application failed with exception
java.lang.IllegalStateException: Failed to execute DAG
at org.apache.spark.tez.DAGBuilder.run(DAGBuilder.java:177)
at org.apache.spark.tez.DAGBuilder.access$000(DAGBuilder.java:60)
at org.apache.spark.tez.DAGBuilder$1.execute(DAGBuilder.java:148)
at org.apache.spark.tez.TezDelegate.submitApplication(TezDelegate.scala:82)
at org.apache.spark.tez.TezJobExecutionContext.runJob(TezJobExecutionContext.scala:168)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1292)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1358)
at org.apache.spark.rdd.RDD.reduce(RDD.scala:882)
at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:41)
at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:367)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:77)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.io.FileNotFoundException: File does not exist: /hdp/apps/2.2.2.0-2538/tez/tez.tar.gz
at org.apache.hadoop.hdfs.DistributedFileSystem$19.doCall(DistributedFileSystem.java:1140)
at org.apache.hadoop.hdfs.DistributedFileSystem$19.doCall(DistributedFileSystem.java:1132)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1132)
at org.apache.hadoop.fs.FileSystem.resolvePath(FileSystem.java:750)
at org.apache.tez.client.TezClientUtils.getLRFileStatus(TezClientUtils.java:127)
at org.apache.tez.client.TezClientUtils.setupTezJarsLocalResources(TezClientUtils.java:178)
at org.apache.tez.client.TezClient.getTezJarResources(TezClient.java:721)
at org.apache.tez.client.TezClient.submitDAGApplication(TezClient.java:689)
at org.apache.tez.client.TezClient.submitDAGApplication(TezClient.java:667)
at org.apache.tez.client.TezClient.submitDAG(TezClient.java:353)
at org.apache.spark.tez.DAGBuilder.run(DAGBuilder.java:164)
... 16 more
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/grid/0/hdp/2.2.2.0-2538/spark/lib/spark-examples-1.2.1.2.2.2.0-2538-hadoop2.6.0.2.2.2.0-2538.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/grid/0/hdp/2.2.2.0-2538/spark/external/spark-native-yarn/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
{noformat}

Full log is available at http://qelog.hortonworks.com/log/ec2-amb-s11-3-us-1425455713-spark
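Until Ambari provisions the tarball itself, a manual workaround is to copy it to the HDFS location named by {{tez.lib.uris}}. A sketch only: the HDFS target path comes from the log above; the local source path and the {{hdfs}} superuser account are assumptions about a stock HDP 2.2.2.0-2538 install.

{noformat}
# Workaround sketch -- run on a host with the Tez client installed.
# HDFS target path taken from tez.lib.uris in the log above; local
# source path assumes the stock HDP 2.2.2.0-2538 package layout.
sudo -u hdfs hdfs dfs -mkdir -p /hdp/apps/2.2.2.0-2538/tez
sudo -u hdfs hdfs dfs -put /usr/hdp/2.2.2.0-2538/tez/lib/tez.tar.gz \
  /hdp/apps/2.2.2.0-2538/tez/
# Make the archive world-readable so YARN containers can localize it.
sudo -u hdfs hdfs dfs -chmod 444 /hdp/apps/2.2.2.0-2538/tez/tez.tar.gz
{noformat}

The proper fix is for Ambari to perform this copy during Spark (and any other Tez-dependent) service start, rather than relying on the Pig service check or Hive START having run first.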



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
