hadoop-common-dev mailing list archives

From "Harsh J (JIRA)" <j...@apache.org>
Subject [jira] [Resolved] (HADOOP-8577) The RPC must have failed proxyUser (auth:SIMPLE) via realUser1@HADOOP.APACHE.ORG (auth:SIMPLE)
Date Sun, 08 Jul 2012 16:27:35 GMT

     [ https://issues.apache.org/jira/browse/HADOOP-8577?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Harsh J resolved HADOOP-8577.
-----------------------------

    Resolution: Invalid

JIRA is for tracking issues with the project, not for user/dev help. Please ask your question
on the common-dev[at]hadoop.apache.org mailing list instead, and refrain from posting general
questions on JIRA. Thanks! :)

P.S. The issue is your OS configuration. Fix your /etc/hosts to use the right format, "IP FQDN ALIAS",
instead of "IP ALIAS FQDN". In any case, please mail the right user/dev group. See http://hadoop.apache.org/mailing_lists.html
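For example, here is what correctly ordered /etc/hosts entries look like (a minimal sketch;
the IP and hostnames below are placeholders, not taken from your machine):

    # /etc/hosts -- IP first, then FQDN, then short alias (placeholder values)
    127.0.0.1      localhost.localdomain   localhost
    192.168.1.10   host.a.b                host

With the reversed "IP ALIAS FQDN" ordering, the resolver treats the short alias as the canonical
name, which is consistent with failures above like expected:<myfs://host.a.b:123> but was:<myfs://host:123>.
You can check what your resolver returns with getent hosts 192.168.1.10 (substituting your own
IP); the first name printed should be the FQDN.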
                
> The RPC must have failed proxyUser (auth:SIMPLE) via realUser1@HADOOP.APACHE.ORG (auth:SIMPLE)
> ----------------------------------------------------------------------------------------------
>
>                 Key: HADOOP-8577
>                 URL: https://issues.apache.org/jira/browse/HADOOP-8577
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: test
>         Environment: Ubuntu 11
> JDK 1.7
> Maven 3.0.4
>            Reporter: chandrashekhar Kotekar
>            Priority: Minor
>   Original Estimate: 12h
>  Remaining Estimate: 12h
>
> Hi,
> I downloaded the Hadoop source code today and tried to test it with Maven. I did the following steps:
> 1) mvn clean
> 2) mvn compile
> 3) mvn test
> After the 3rd step, several tests failed. The output of the failing tests is as follows:
> Failed tests:   testRealUserIPNotSpecified(org.apache.hadoop.security.TestDoAsEffectiveUser): The RPC must have failed proxyUser (auth:SIMPLE) via realUser1@HADOOP.APACHE.ORG (auth:SIMPLE)
>   testWithDirStringAndConf(org.apache.hadoop.fs.shell.TestPathData): checking exist
>   testPartialAuthority(org.apache.hadoop.fs.TestFileSystemCanonicalization): expected:<myfs://host.a.b:123> but was:<myfs://host.a:123>
>   testFullAuthority(org.apache.hadoop.fs.TestFileSystemCanonicalization): expected:<null> but was:<java.lang.IllegalArgumentException: Wrong FS: myfs://host/file, expected: myfs://host.a.b>
>   testShortAuthorityWithDefaultPort(org.apache.hadoop.fs.TestFileSystemCanonicalization): expected:<myfs://host.a.b:123> but was:<myfs://host:123>
>   testPartialAuthorityWithDefaultPort(org.apache.hadoop.fs.TestFileSystemCanonicalization): expected:<myfs://host.a.b:123> but was:<myfs://host.a:123>
>   testShortAuthority(org.apache.hadoop.fs.TestFileSystemCanonicalization): expected:<myfs://host.a.b:123> but was:<myfs://host:123>
>   testIpAuthorityWithOtherPort(org.apache.hadoop.fs.TestFileSystemCanonicalization): expected:<myfs://127.0.0.1:456> but was:<myfs://localhost:456>
>   testAuthorityFromDefaultFS(org.apache.hadoop.fs.TestFileSystemCanonicalization): expected:<myfs://host.a.b:123> but was:<myfs://host:123>
>   testFullAuthorityWithDefaultPort(org.apache.hadoop.fs.TestFileSystemCanonicalization): expected:<null> but was:<java.lang.IllegalArgumentException: Wrong FS: myfs://host/file, expected: myfs://host.a.b:123>
>   testShortAuthorityWithOtherPort(org.apache.hadoop.fs.TestFileSystemCanonicalization): expected:<myfs://host.a.b:456> but was:<myfs://host:456>
>   testPartialAuthorityWithOtherPort(org.apache.hadoop.fs.TestFileSystemCanonicalization): expected:<myfs://host.a.b:456> but was:<myfs://host.a:456>
>   testFullAuthorityWithOtherPort(org.apache.hadoop.fs.TestFileSystemCanonicalization): expected:<null> but was:<java.lang.IllegalArgumentException: Wrong FS: myfs://host:456/file, expected: myfs://host.a.b:456>
>   testIpAuthority(org.apache.hadoop.fs.TestFileSystemCanonicalization): expected:<myfs://127.0.0.1:123> but was:<myfs://localhost:123>
>   testIpAuthorityWithDefaultPort(org.apache.hadoop.fs.TestFileSystemCanonicalization): expected:<myfs://127.0.0.1:123> but was:<myfs://localhost:123>
> Tests in error: 
>   testUnqualifiedUriContents(org.apache.hadoop.fs.shell.TestPathData): `d1': No such file or directory
> I am a newbie in the Hadoop source code world. Please help me build the Hadoop source code.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators: https://issues.apache.org/jira/secure/ContactAdministrators!default.jspa
For more information on JIRA, see: http://www.atlassian.com/software/jira

        
