hadoop-common-issues mailing list archives

From "Anu Engineer (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (HADOOP-12758) Extend CSRF Filter with UserAgent Checks
Date Tue, 02 Feb 2016 23:42:39 GMT

    [ https://issues.apache.org/jira/browse/HADOOP-12758?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15129360#comment-15129360

Anu Engineer commented on HADOOP-12758:

Hi [~lmccay]

The patch looks good, and thank you for addressing this critical security concern. I really appreciate it.
However, I have a higher-level question. With this patch, we are going to allow non-browser
clients (curl, Java, Perl, and wget) to work without the WebHDFS XSRF header.
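To make the concern concrete, the user-agent bypass being discussed would look roughly like the sketch below. The class name, the prefix list, and the method names are illustrative assumptions, not the actual patch: the idea is simply that a request whose User-Agent does not start with a known browser prefix skips the header check entirely.

```java
import java.util.Arrays;
import java.util.List;

// Illustrative sketch only -- names and the default prefix list are
// assumptions, not the actual HADOOP-12758 code.
public class UserAgentCheck {
    // Hypothetical configured list of browser User-Agent prefixes.
    private static final List<String> BROWSER_AGENTS =
            Arrays.asList("Mozilla", "Opera");

    // Treat a request as browser-originated only if its User-Agent
    // starts with one of the configured browser prefixes.
    static boolean isBrowserRequest(String userAgent) {
        if (userAgent == null) {
            return false;
        }
        for (String prefix : BROWSER_AGENTS) {
            if (userAgent.startsWith(prefix)) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        // curl identifies itself as "curl/7.x", which matches no
        // browser prefix, so enforcement would be skipped for it.
        System.out.println(isBrowserRequest("curl/7.43.0"));  // false
        System.out.println(isBrowserRequest("Mozilla/5.0"));  // true
    }
}
```

This is exactly why curl stops being useful as a reproduction tool: once it is classified as a non-browser, it can never trigger the XSRF rejection a browser user is seeing.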

But if a user is not able to connect to WebHDFS from a web page due to an XSRF error, they
can no longer use these tools to reproduce the problem, since the extra code lets the tools
bypass the XSRF check entirely. I am afraid this patch takes away a really good debugging
tool and may create a bunch of confused WebHDFS users.

My question is: is this complexity worth it? If a user enables the XSRF check based on your
older patch, in most cases the overhead is a hash table lookup and parsing the header to check
the verb.
I know this is a cool and logically correct optimization to have, but I am worried that it
would only create pain for users, while the gains are relatively minor.
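The "hash table lookup and verb check" overhead being weighed here amounts to something like the following sketch (names are illustrative assumptions, not the actual HADOOP-12691 filter): only state-changing verbs require the custom header, and the check itself is a set lookup plus a null test.

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

// Illustrative sketch of the simpler, always-on check -- names are
// assumptions, not the actual HADOOP-12691 filter code.
public class CsrfCheck {
    // HTTP verbs that can mutate state and therefore need the header.
    private static final Set<String> ENFORCED_METHODS =
            new HashSet<>(Arrays.asList("PUT", "POST", "DELETE"));

    // Reject only state-changing requests that lack the custom
    // header; its mere presence defeats a cross-site forgery, since
    // a browser will not add a custom header to a forged request.
    static boolean isAllowed(String method, String csrfHeaderValue) {
        if (!ENFORCED_METHODS.contains(method)) {
            return true; // e.g. GET passes through unconditionally
        }
        return csrfHeaderValue != null;
    }
}
```

In this form, every client, browser or not, pays the same small cost and sees the same behavior, which is the consistency the comment is arguing for.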

I do see one use case, though: you want to enable XSRF on a cluster but allow older
curl-based clients to continue to operate. If that is the use case (and you really have
lots of curl-based scripts), then and only then does this make sense. Even then, I would
prefer to modify the older scripts rather than have this kind of surprise. If I am missing
something here, please do let me know.

> Extend CSRF Filter with UserAgent Checks
> ----------------------------------------
>                 Key: HADOOP-12758
>                 URL: https://issues.apache.org/jira/browse/HADOOP-12758
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: security
>            Reporter: Larry McCay
>            Assignee: Larry McCay
>             Fix For: 2.8.0
>         Attachments: HADOOP-12758-001.patch
> To protect against CSRF attacks, HADOOP-12691 introduces a CSRF filter that will require
> a specific HTTP header to be sent with every REST API call. This will affect all API
> consumers, from web apps to CLIs and curl.
> Since CSRF is primarily a browser-based attack, we can try to minimize the impact on
> non-browser clients.
> This enhancement will provide additional configuration for identifying non-browser user
> agents and skipping the enforcement of the header requirement for anything identified as
> a non-browser. When configured appropriately, this will largely limit the impact to
> browser-based PUT and POST calls.

This message was sent by Atlassian JIRA