From: Divya Gehlot
Date: Wed, 9 Mar 2016 17:11:49 +0800
Subject: [Error] Run Spark job as hdfs user from Oozie workflow
To: "user @spark", user@hadoop.apache.org

Hi,

I have a non-secure Hadoop 2.7.2 cluster on EC2 running Spark 1.5.2. I am submitting my Spark Scala script through a
shell script using an Oozie workflow. I am submitting the job as the hdfs user, but it runs as user "yarn", so all the output gets stored under the /user/yarn directory.

When I searched, I found YARN-2424 for non-secure clusters. I changed the settings as per that doc, and when I ran my Oozie workflow as the hdfs user I got the error below:

Application application_1457494230162_0004 failed 2 times due to AM Container for appattempt_1457494230162_0004_000002 exited with exitCode: -1000
For more detailed output, check application tracking page: http://ip-xxx-xx-xx-xxx.ap-southeast-1.compute.internal:8088/cluster/app/application_1457494230162_0004 Then, click on links to logs of each attempt.
Diagnostics: Application application_1457494230162_0004 initialization failed (exitCode=255) with output: main : command provided 0
main : run as user is hdfs
main : requested yarn user is hdfs
Can't create directory /hadoop/yarn/local/usercache/hdfs/appcache/application_1457494230162_0004 - Permission denied
Did not create any app directories
Failing this attempt. Failing the application.

Also, after changing the setting, when I start the Spark shell I get an error saying "Error starting SQLContext - Yarn application has ended".

Has anybody run into this kind of issue? I would really appreciate it if you could guide me to the steps/docs to resolve it.

Thanks,
Divya
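For reference, my submission path is roughly like the sketch below (the workflow name, action name, and script name here are illustrative placeholders, not my real ones):

```xml
<!-- workflow.xml sketch: a shell action that invokes the spark-submit script -->
<workflow-app name="spark-submit-wf" xmlns="uri:oozie:workflow:0.4">
  <start to="spark-submit"/>
  <action name="spark-submit">
    <shell xmlns="uri:oozie:shell-action:0.2">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <!-- hypothetical script name; it wraps the actual spark-submit call -->
      <exec>submit_spark_job.sh</exec>
      <file>${workflowAppUri}/submit_spark_job.sh#submit_spark_job.sh</file>
    </shell>
    <ok to="end"/>
    <error to="fail"/>
  </action>
  <kill name="fail">
    <message>Spark job failed: [${wf:errorMessage(wf:lastErrorNode())}]</message>
  </kill>
  <end name="end"/>
</workflow-app>
```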
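For clarity, the YARN-2424-related settings I changed were along these lines (a sketch of the non-secure LinuxContainerExecutor configuration; the group value shown is illustrative):

```xml
<!-- yarn-site.xml sketch; assumes the LinuxContainerExecutor is in use -->
<property>
  <name>yarn.nodemanager.container-executor.class</name>
  <value>org.apache.hadoop.yarn.server.nodemanager.LinuxContainerExecutor</value>
</property>
<property>
  <name>yarn.nodemanager.linux-container-executor.group</name>
  <value>hadoop</value>
</property>
<property>
  <!-- YARN-2424: when false, containers run as the submitting user
       even on a non-secure (non-Kerberos) cluster -->
  <name>yarn.nodemanager.linux-container-executor.nonsecure-mode.limit-users</name>
  <value>false</value>
</property>
```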