From: junrao@apache.org
To: commits@kafka.apache.org
Reply-To: dev@kafka.apache.org
Message-Id: <662537b03f2e4777b857ef80618e26a9@git.apache.org>
Subject: kafka git commit: KAFKA-4589; SASL/SCRAM documentation
Date: Thu, 19 Jan 2017 16:36:26 +0000 (UTC)

Repository: kafka
Updated Branches:
  refs/heads/0.10.2 5562d63e8 -> ae5591a3a

KAFKA-4589; SASL/SCRAM documentation

Author: Rajini Sivaram
Reviewers: Ismael Juma, Gwen Shapira, Sriharsha Chintalapani, Jun Rao

Closes #2369 from rajinisivaram/KAFKA-4589

(cherry picked from commit 666abafcc54f8cab64912355dba4c8ada8e44827)
Signed-off-by: Jun Rao

Project: http://git-wip-us.apache.org/repos/asf/kafka/repo
Commit: http://git-wip-us.apache.org/repos/asf/kafka/commit/ae5591a3
Tree: http://git-wip-us.apache.org/repos/asf/kafka/tree/ae5591a3
Diff: http://git-wip-us.apache.org/repos/asf/kafka/diff/ae5591a3

Branch: refs/heads/0.10.2
Commit: ae5591a3af45832d9a50e2aaa31b11c039d8d389
Parents: 5562d63
Author: Rajini Sivaram
Authored: Thu Jan 19 08:35:39 2017 -0800
Committer: Jun Rao
Committed: Thu Jan 19 08:36:09 2017 -0800

----------------------------------------------------------------------
 docs/security.html | 317 +++++++++++++++++++++++++++++++++--------------
 1 file changed, 218 insertions(+), 99 deletions(-)
----------------------------------------------------------------------

http://git-wip-us.apache.org/repos/asf/kafka/blob/ae5591a3/docs/security.html
----------------------------------------------------------------------
diff --git a/docs/security.html b/docs/security.html
index 8cb867e..81d5c40 100644
--- a/docs/security.html
+++ b/docs/security.html
@@ -19,8 +19,12 @@

7.1 Security Overview

In release 0.9.0.0, the Kafka community added a number of features that, used either separately or together, increase security in a Kafka cluster. The following security measures are currently supported:

1. Authentication of connections to brokers from clients (producers and consumers), other brokers and tools, using either SSL or SASL. Kafka supports the following SASL mechanisms:
   • SASL/GSSAPI (Kerberos) - starting at version 0.9.0.0
   • SASL/PLAIN - starting at version 0.10.0.0
   • SASL/SCRAM-SHA-256 and SASL/SCRAM-SHA-512 - starting at version 0.10.2.0
2. Authentication of connections from brokers to ZooKeeper
3. Encryption of data transferred between brokers and clients, between brokers, or between brokers and tools using SSL (note that there is a performance degradation when SSL is enabled, the magnitude of which depends on the CPU type and the JVM implementation)
4. Authorization of read / write operations by clients

@@ -211,117 +215,125 @@

    7.3 Authentication using SASL

1. JAAS configuration

   Kafka uses the Java Authentication and Authorization Service (JAAS) for SASL configuration.

   1. JAAS configuration for Kafka brokers

      KafkaServer is the section name in the JAAS file used by each KafkaServer/Broker. This section provides SASL configuration options for the broker, including any SASL client connections made by the broker for inter-broker communication.

      The Client section is used to authenticate a SASL connection with ZooKeeper. It also allows the brokers to set SASL ACLs on ZooKeeper nodes, which locks these nodes down so that only the brokers can modify them. It is necessary to have the same principal name across all brokers. If you want to use a section name other than Client, set the system property zookeeper.sasl.client to the appropriate name (e.g., -Dzookeeper.sasl.client=ZkClient). A sketch of such a Client section is shown below.

      ZooKeeper uses "zookeeper" as the service name by default. If you want to change this, set the system property zookeeper.sasl.client.username to the appropriate name (e.g., -Dzookeeper.sasl.client.username=zk).
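      As an illustration, assuming the brokers authenticate to ZooKeeper with Kerberos, the Client section in the broker's JAAS file (e.g. kafka_server_jaas.conf) might look like the sketch below; the keytab path and principal are placeholders, not values from this commit:

          Client {
              // illustrative Kerberos login for ZooKeeper authentication
              com.sun.security.auth.module.Krb5LoginModule required
              useKeyTab=true
              storeKey=true
              keyTab="/etc/security/keytabs/kafka_server.keytab"
              principal="kafka/broker1.example.com@EXAMPLE.COM";
          };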

   2. JAAS configuration for Kafka clients

      Clients may configure JAAS using the client configuration property sasl.jaas.config or using the static JAAS config file similar to brokers.
      1. JAAS configuration using client configuration property

         Clients may specify JAAS configuration as a producer or consumer property without creating a physical configuration file. This mode also enables different producers and consumers within the same JVM to use different credentials by specifying different properties for each client. If both the static JAAS configuration system property java.security.auth.login.config and the client property sasl.jaas.config are specified, the client property will be used.

         See GSSAPI (Kerberos), PLAIN or SCRAM for example configurations.
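         For instance, a client using the PLAIN mechanism could supply its login module inline as a single property; the credentials below (alice / alice-secret) are purely illustrative:

             # illustrative PLAIN credentials supplied as a client property
             sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="alice" password="alice-secret";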
      2. JAAS configuration using static config file

         To configure SASL authentication on the clients using static JAAS config file:
         1. Add a JAAS config file with a client login section named KafkaClient. Configure a login module in KafkaClient for the selected mechanism as described in the examples for setting up GSSAPI (Kerberos), PLAIN or SCRAM. For example, GSSAPI credentials may be configured as:

                KafkaClient {
                    com.sun.security.auth.module.Krb5LoginModule required
                    useKeyTab=true
                    storeKey=true
                    keyTab="/etc/security/keytabs/kafka_client.keytab"
                    principal="kafka-client-1@EXAMPLE.COM";
                };
         2. Pass the JAAS config file location as JVM parameter to each client JVM. For example (a usage sketch with the console producer follows this list):

                -Djava.security.auth.login.config=/etc/kafka/kafka_client_jaas.conf
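      As a usage sketch, the JVM parameter is typically passed to the console clients through the KAFKA_OPTS environment variable; the broker address, topic name and config file paths below are illustrative assumptions:

          # pass the client JAAS file to the console producer JVM (illustrative paths)
          export KAFKA_OPTS="-Djava.security.auth.login.config=/etc/kafka/kafka_client_jaas.conf"
          bin/kafka-console-producer.sh --broker-list broker1.example.com:9093 \
              --topic test --producer.config config/producer.properties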
2. SASL configuration

   SASL may be used with PLAINTEXT or SSL as the transport layer using the security protocol SASL_PLAINTEXT or SASL_SSL respectively. If SASL_SSL is used, then SSL must also be configured.

   1. SASL mechanisms

      Kafka supports the following SASL mechanisms:
      • GSSAPI (Kerberos)
      • PLAIN
      • SCRAM-SHA-256
      • SCRAM-SHA-512
   2. SASL configuration for Kafka brokers
      1. Configure a SASL port in server.properties, by adding at least one of SASL_PLAINTEXT or SASL_SSL to the listeners parameter, which contains one or more comma-separated values:

             listeners=SASL_PLAINTEXT://host.name:port

         If you are only configuring a SASL port (or if you want the Kafka brokers to authenticate each other using SASL) then make sure you set the same SASL protocol for inter-broker communication:

             security.inter.broker.protocol=SASL_PLAINTEXT (or SASL_SSL)

      2. Select one or more supported mechanisms to enable in the broker and follow the steps to configure SASL for the mechanism. To enable multiple mechanisms in the broker, follow the steps here. A consolidated server.properties sketch follows this list.
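      Putting these settings together, a server.properties fragment for a broker that uses SASL_SSL for both client and inter-broker traffic might look like the sketch below; the host name, port and the choice of GSSAPI are illustrative assumptions, and SSL itself must also be configured as noted above:

          # illustrative broker SASL settings (host, port and mechanism are assumptions)
          listeners=SASL_SSL://broker1.example.com:9093
          security.inter.broker.protocol=SASL_SSL
          sasl.mechanism.inter.broker.protocol=GSSAPI
          sasl.enabled.mechanisms=GSSAPI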
   3. SASL configuration for Kafka clients

      SASL authentication is only supported for the new Java Kafka producer and consumer; the older API is not supported.

      To configure SASL authentication on the clients, select a SASL mechanism that is enabled in the broker for client authentication and follow the steps to configure SASL for the selected mechanism.
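      For example, a producer.properties or consumer.properties fragment for a client authenticating over SASL_SSL with the PLAIN mechanism could look like the following sketch; the mechanism choice and the credentials are illustrative only:

          # illustrative client SASL settings (mechanism and credentials are assumptions)
          security.protocol=SASL_SSL
          sasl.mechanism=PLAIN
          sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="alice" password="alice-secret";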

3. Authentication using SASL/Kerberos

      @@ -502,6 +514,111 @@
5. Authentication using SASL/SCRAM

   Salted Challenge Response Authentication Mechanism (SCRAM) is a family of SASL mechanisms that addresses the security concerns with traditional mechanisms that perform username/password authentication, like PLAIN and DIGEST-MD5. The mechanism is defined in RFC 5802. Kafka supports SCRAM-SHA-256 and SCRAM-SHA-512, which can be used with TLS to perform secure authentication. The username is used as the authenticated Principal for configuration of ACLs etc. The default SCRAM implementation in Kafka stores SCRAM credentials in Zookeeper and is suitable for use in Kafka installations where Zookeeper is on a private network. Refer to Security Considerations for more details.
   1. Creating SCRAM Credentials

      The SCRAM implementation in Kafka uses Zookeeper as credential store. Credentials can be created in Zookeeper using kafka-configs.sh. For each SCRAM mechanism enabled, credentials must be created by adding a config with the mechanism name. Credentials for inter-broker communication must be created before Kafka brokers are started. Client credentials may be created and updated dynamically, and updated credentials will be used to authenticate new connections.

      Create SCRAM credentials for user alice with password alice-secret:

          bin/kafka-configs.sh --zookeeper localhost:2181 --alter --add-config 'SCRAM-SHA-256=[iterations=8192,password=alice-secret],SCRAM-SHA-512=[password=alice-secret]' --entity-type users --entity-name alice

      The default iteration count of 4096 is used if iterations are not specified. A random salt is created, and the SCRAM identity consisting of salt, iterations, StoredKey and ServerKey is stored in Zookeeper. See RFC 5802 for details on SCRAM identity and the individual fields.

      The following examples also require a user admin for inter-broker communication, which can be created using:

          bin/kafka-configs.sh --zookeeper localhost:2181 --alter --add-config 'SCRAM-SHA-256=[password=admin-secret],SCRAM-SHA-512=[password=admin-secret]' --entity-type users --entity-name admin

      Existing credentials may be listed using the --describe option:

          bin/kafka-configs.sh --zookeeper localhost:2181 --describe --entity-type users --entity-name alice

      Credentials may be deleted for one or more SCRAM mechanisms using the --delete option:

          bin/kafka-configs.sh --zookeeper localhost:2181 --alter --delete-config 'SCRAM-SHA-512' --entity-type users --entity-name alice
   2. Configuring Kafka Brokers
      1. Add a suitably modified JAAS file similar to the one below to each Kafka broker's config directory, let's call it kafka_server_jaas.conf for this example:

             KafkaServer {
                 org.apache.kafka.common.security.scram.ScramLoginModule required
                 username="admin"
                 password="admin-secret";
             };

         The properties username and password in the KafkaServer section are used by the broker to initiate connections to other brokers. In this example, admin is the user for inter-broker communication.
      2. Pass the JAAS config file location as JVM parameter to each Kafka broker:

             -Djava.security.auth.login.config=/etc/kafka/kafka_server_jaas.conf

      3. Configure SASL port and SASL mechanisms in server.properties as described here. For example (a combined start-up sketch follows this list):

             listeners=SASL_SSL://host.name:port
             security.inter.broker.protocol=SASL_SSL
             sasl.mechanism.inter.broker.protocol=SCRAM-SHA-256 (or SCRAM-SHA-512)
             sasl.enabled.mechanisms=SCRAM-SHA-256 (or SCRAM-SHA-512)
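      Tying these steps together, a start-up sequence under the above assumptions might look like the sketch below: the admin credential is created first (inter-broker credentials must exist before brokers start), then the broker is launched with the JAAS file passed as a JVM parameter via KAFKA_OPTS; all paths are illustrative:

          # create the inter-broker credential before starting the broker
          bin/kafka-configs.sh --zookeeper localhost:2181 --alter \
              --add-config 'SCRAM-SHA-256=[password=admin-secret]' \
              --entity-type users --entity-name admin
          # start the broker with the JAAS config as a JVM parameter (illustrative paths)
          export KAFKA_OPTS="-Djava.security.auth.login.config=/etc/kafka/kafka_server_jaas.conf"
          bin/kafka-server-start.sh config/server.properties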
   3. Configuring Kafka Clients
      To configure SASL authentication on the clients:
      1. Configure the JAAS configuration property for each client in producer.properties or consumer.properties. The login module describes how clients such as the producer and consumer connect to the Kafka broker. The following is an example configuration for a client for the SCRAM mechanisms:

             sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
                 username="alice" \
                 password="alice-secret";

         The options username and password are used by clients to configure the user for client connections. In this example, clients connect to the broker as user alice. Different clients within a JVM may connect as different users by specifying different user names and passwords in sasl.jaas.config.

         JAAS configuration for clients may alternatively be specified as a JVM parameter similar to brokers as described here. Clients use the login section named KafkaClient; a sketch of such a file appears after this list. This option allows only one user for all client connections from a JVM.
      2. Configure the following properties in producer.properties or consumer.properties:

             security.protocol=SASL_SSL
             sasl.mechanism=SCRAM-SHA-256 (or SCRAM-SHA-512)
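      If the static JAAS file route is taken instead, a kafka_client_jaas.conf holding a single SCRAM identity for all clients in the JVM might look like the sketch below (the credentials are illustrative); it would then be passed with -Djava.security.auth.login.config=/etc/kafka/kafka_client_jaas.conf as described above:

          KafkaClient {
              // illustrative SCRAM credentials shared by all clients in this JVM
              org.apache.kafka.common.security.scram.ScramLoginModule required
              username="alice"
              password="alice-secret";
          };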
   4. Security Considerations for SASL/SCRAM
      • The default implementation of SASL/SCRAM in Kafka stores SCRAM credentials in Zookeeper. This is suitable for production use in installations where Zookeeper is secure and on a private network.
      • Kafka supports only the strong hash functions SHA-256 and SHA-512 with a minimum iteration count of 4096. Strong hash functions combined with strong passwords and high iteration counts protect against brute force attacks if Zookeeper security is compromised.
      • SCRAM should be used only with TLS-encryption to prevent interception of SCRAM exchanges. This protects against dictionary or brute force attacks and against impersonation if Zookeeper is compromised.
      • The default SASL/SCRAM implementation may be overridden using custom login modules in installations where Zookeeper is not secure. See here for details.
      • For more details on security considerations, refer to RFC 5802.
6. Enabling multiple SASL mechanisms in a broker

   1. Specify configuration for the login modules of all enabled mechanisms in the KafkaServer section of the JAAS config file. For example:

@@ -519,12 +636,14 @@
              user_admin="admin-secret"
              user_alice="alice-secret";
          };
   2. Enable the SASL mechanisms in server.properties:

          sasl.enabled.mechanisms=GSSAPI,PLAIN,SCRAM-SHA-256,SCRAM-SHA-512

   3. Specify the SASL security protocol and mechanism for inter-broker communication in server.properties if required:

          security.inter.broker.protocol=SASL_PLAINTEXT (or SASL_SSL)
          sasl.mechanism.inter.broker.protocol=GSSAPI (or one of the other enabled mechanisms)

   4. Follow the mechanism-specific steps in GSSAPI (Kerberos), PLAIN and SCRAM to configure SASL for the enabled mechanisms. An illustrative combined JAAS sketch follows this list.
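   As an illustration only (the commit's own multi-mechanism JAAS example is truncated by the hunk boundary above), a KafkaServer section enabling both PLAIN and SCRAM could combine the two login modules shown earlier; here the user_* entries list the PLAIN users the broker accepts, and all names and secrets are placeholders:

       KafkaServer {
           // illustrative PLAIN server credentials and accepted users
           org.apache.kafka.common.security.plain.PlainLoginModule required
           username="admin"
           password="admin-secret"
           user_admin="admin-secret"
           user_alice="alice-secret";

           // illustrative SCRAM credentials for inter-broker connections
           org.apache.kafka.common.security.scram.ScramLoginModule required
           username="admin"
           password="admin-secret";
       };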
7. Modifying SASL mechanism in a Running Cluster