Hadoop Security : Kerberos or Knox or both


This topic contains 1 reply, has 2 voices, and was last updated by Vinay Shukla 1 year, 1 month ago.

  • Creator
  • #41301

    Sourabh Potnis

    What is the best approach for setting up a secure Hadoop cluster?

    1. Kerberos
    2. Knox gateway
    3. Encryption of data-in-motion using SSL/TLS
    4. Mixture of these solutions
    5. Any other?




  • Author
  • #49125

    Vinay Shukla

    These are not four or five mutually exclusive choices; the answer may well be all of the above.

    Securing a Hadoop cluster starts with identifying what type of data will be stored in it (PII, security-sensitive data, web logs, or low-value, high-volume data).
    Then consider how users will access the data: through a middleware application (perhaps a BI tool) or directly? What controls are placed in the middleware, and are they sufficient? If those controls aren't sufficient, or the consequences of a breach are high, enable Kerberos. Then consider putting a firewall around your Hadoop cluster. Often an end user does not have a direct line of sight to an enterprise database, and a Hadoop cluster may need to be secured similarly.
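As a hypothetical sketch of what day-to-day use of a kerberized cluster looks like: a user authenticates with `kinit` before touching HDFS. The principal, realm, and paths below are placeholders, and the cluster-dependent commands are shown commented out.

```shell
# Placeholder principal for illustration; a principal is user[/instance]@REALM.
PRINCIPAL="alice@EXAMPLE.COM"

# On a kerberized cluster (commands need a reachable KDC, so commented here):
# kinit "$PRINCIPAL"         # prompts for the password, caches a TGT
# klist                      # verify the ticket cache
# hdfs dfs -ls /user/alice   # HDFS access now authenticated via Kerberos

# Shell parameter expansion splits the principal into its parts:
USER_PART="${PRINCIPAL%@*}"    # everything before the '@'
REALM_PART="${PRINCIPAL#*@}"   # everything after the '@'
echo "user=$USER_PART realm=$REALM_PART"
```

Without a valid ticket, every Hadoop service call on such a cluster fails with an authentication error, which is exactly the point: identity is established before any data access.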
    Then, if you have applications accessing Hadoop services over REST, Apache Knox is your friend: put it between the applications and the Hadoop cluster.
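To make the Knox pattern concrete, here is a sketch comparing a direct WebHDFS call with the same call routed through a Knox gateway. The hostnames are placeholders; the ports and URL shapes follow the usual defaults (NameNode HTTP on 50070, Knox on 8443 with a `/gateway/<topology>/` prefix), and the actual `curl` call is commented out since it needs a live cluster.

```shell
# Direct WebHDFS call: the client needs network line of sight to the NameNode.
DIRECT_URL="http://namenode.example.com:50070/webhdfs/v1/tmp?op=LISTSTATUS"

# Same operation through Knox: the gateway terminates TLS, authenticates the
# caller, and proxies to the cluster, so only Knox is exposed to clients.
KNOX_URL="https://knox.example.com:8443/gateway/default/webhdfs/v1/tmp?op=LISTSTATUS"

# curl -i -k -u guest:guest-password "$KNOX_URL"   # needs a running gateway
echo "direct:  $DIRECT_URL"
echo "gateway: $KNOX_URL"
```

Note that the REST payload is identical in both cases; only the entry point changes, which is why Knox can be added without modifying the applications.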
    Now, there are authorization controls at various layers in Hadoop, from ACLs on MapReduce queues to HDFS file permissions. More access-control improvements are coming.
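As an illustration of the HDFS layer, the commands below (commented out, since they need a cluster; the path and user are placeholders) tighten a directory with POSIX-style permissions and a per-user ACL, and the runnable part decodes the octal mode into its owner/group/other bits.

```shell
# On a live cluster (placeholder path and user):
# hdfs dfs -chmod 750 /data/finance                 # owner rwx, group r-x, other ---
# hdfs dfs -setfacl -m user:bob:r-x /data/finance   # extra read access just for bob
# hdfs dfs -getfacl /data/finance                   # inspect the resulting ACL

# Decoding the octal mode 0750 into its three permission triplets:
MODE=0750
OWNER=$(( (MODE >> 6) & 7 ))   # 7 -> rwx
GROUP=$(( (MODE >> 3) & 7 ))   # 5 -> r-x
OTHER=$((  MODE       & 7 ))   # 0 -> ---
echo "owner=$OWNER group=$GROUP other=$OTHER"
```

The ACL mechanism matters because plain owner/group/other bits cannot express "this one extra user may read"; `setfacl` fills that gap without loosening the directory for everyone.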
    Consider enabling wire encryption to protect data as it moves within Hadoop. Consider using custom or third-party solutions for encrypting data at rest (as it sits in HDFS). In the next couple of releases there will be Hadoop-native solutions for data-at-rest encryption.
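For reference, a sketch of the configuration knobs commonly involved in Hadoop wire encryption; the hostname is a placeholder, and the settings are shown as comments because they live in the cluster's XML config files rather than in a script.

```shell
# In hdfs-site.xml:  dfs.encrypt.data.transfer = true   (DataNode block transfer)
# In core-site.xml:  hadoop.rpc.protection = privacy    (encrypts Hadoop RPC;
#                    'authentication' and 'integrity' are the weaker levels)
RPC_PROTECTION="privacy"
echo "hadoop.rpc.protection=$RPC_PROTECTION"

# A quick external sanity check that an HTTPS endpoint really negotiates TLS:
# openssl s_client -connect namenode.example.com:50470 < /dev/null
```

Wire encryption has a CPU cost, so it is often enabled selectively, which is another reason to start from the data-classification step described above.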

    You may also want to review the Hadoop Security labs (http://hortonworks.com/labs/security/); the page has a good overview of Hadoop security, where it is heading, and links to a growing list of how-tos.
