Apache Knox Gateway

A single point of secure access for Hadoop clusters
With YARN as its architectural center, Apache Hadoop continues to attract new engines to run within the data platform, as organizations want to efficiently store their data in a single repository and interact with it for batch, interactive and real-time streaming use cases. More and more independent software vendors (ISVs) are developing applications to run in Hadoop via YARN. This increases the number of users and processing engines that operate simultaneously across a Hadoop cluster, on the same data, at the same time.

The Apache Knox Gateway (“Knox”) provides perimeter security so that the enterprise can confidently extend Hadoop access to more of those new users while also maintaining compliance with enterprise security policies.

Knox also simplifies Hadoop security for users who access the cluster data and execute jobs. It integrates with prevalent identity management and SSO systems and allows identities from those enterprise systems to be used for seamless, secure access to Hadoop clusters.

Hortonworks Focus for Knox Gateway

The Knox community’s development efforts focus on extending the reach of Hadoop services to users outside of the cluster, while further enhancing security.

Focus Planned Enhancements
REST & HTTP services
  • Provide security to all of Hadoop’s REST & HTTP services
  • Support for REST APIs for Apache Ambari, Apache Falcon and Apache Ranger
Deeper integration for authentication
  • Integrate with authentication tokens such as OAuth, OpenID & SAML
  • Support for advanced authentication features such as multi-factor authentication
Enterprise readiness
  • Deeper integration with Apache Ambari to simplify configuration management
  • SSO services for Hadoop’s web consoles

Recent Progress in Knox Gateway

Recent releases of the Apache Knox Gateway have focused on securely extending access to Apache Hadoop YARN’s rich set of APIs and on improving the developer experience with the Knox API Gateway.

Apache Knox Version Progress
Knox 0.5.0
  • Support for HDFS HA
  • Installation and configuration with Apache Ambari
  • Service-level authorization with Apache Ranger
  • YARN REST API access
Knox 0.4.0
  • Support for ODBC/JDBC calls via Knox to Apache Hive
  • Support for SSL
  • Full support for Kerberized Hadoop clusters
  • Filter for removing web app vulnerabilities
Knox 0.3.0
  • Support for Apache HBase, Hive, Oozie and HDFS REST APIs
  • Initial support for Kerberized Hadoop clusters

What Knox Does

Knox provides perimeter security for Hadoop clusters, with these advantages:

Advantage Description
Simplified access Extend access to Hadoop’s REST/HTTP services by encapsulating Kerberos within the cluster
Enhanced security Expose Hadoop’s REST/HTTP services without revealing network details, with SSL provided out of the box
Centralized control Centrally enforce REST API security and route requests to multiple Hadoop clusters
Enterprise integration Support LDAP, Active Directory, SSO, SAML and other authentication systems

How Knox Works

A fully secure Hadoop cluster requires Kerberos, and Kerberos requires a client-side library and complex client-side configuration. By encapsulating Kerberos within the cluster, Knox eliminates the need for client software or client configuration and thus simplifies the access model. In this way, Knox aggregates REST/HTTP calls to the various components within the Hadoop ecosystem.
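To illustrate that simplified access model, the sketch below builds a Knox-proxied WebHDFS request using nothing but an HTTPS URL and enterprise (e.g. LDAP) credentials. The hostname, port, topology name and credentials are placeholders, not values from any real deployment:

```python
import base64

# Placeholder deployment details (hypothetical, for illustration only).
KNOX_HOST = "knox.example.com"
KNOX_PORT = 8443
TOPOLOGY = "sandbox"  # Knox routes requests per named topology


def webhdfs_url(path, op):
    """Build the Knox-proxied WebHDFS URL for an HDFS path and operation."""
    return (f"https://{KNOX_HOST}:{KNOX_PORT}/gateway/{TOPOLOGY}"
            f"/webhdfs/v1{path}?op={op}")


def basic_auth_header(user, password):
    """Enterprise credentials via HTTP Basic auth -- no Kerberos client,
    no client-side library, no krb5 configuration on the caller's side."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return {"Authorization": f"Basic {token}"}


# The client only needs the gateway endpoint and its usual credentials:
url = webhdfs_url("/tmp", "LISTSTATUS")
headers = basic_auth_header("guest", "guest-password")
```

Compare this with a direct, Kerberized WebHDFS call, which would require a Kerberos client library and a negotiated SPNEGO token instead of the one-line Basic header.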

Knox is a stateless reverse proxy framework and can be deployed as a cluster of Knox instances that route requests to Hadoop’s REST APIs. Because Knox is stateless, it scales linearly by adding more Knox nodes as the load increases. A load balancer can route requests to multiple Knox instances.
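As a sketch of how that routing is configured, a Knox topology file maps the public gateway URL onto internal service addresses. The fragment below is illustrative only: hostnames, ports and provider parameters are placeholders, and a real topology would carry a fuller provider configuration.

```xml
<topology>
  <gateway>
    <!-- Providers supply authentication, authorization, etc.
         Parameters omitted here for brevity. -->
    <provider>
      <role>authentication</role>
      <name>ShiroProvider</name>
      <enabled>true</enabled>
    </provider>
  </gateway>
  <!-- Each service role is routed to an internal cluster endpoint
       (placeholder hosts shown). -->
  <service>
    <role>WEBHDFS</role>
    <url>http://namenode.internal.example.com:50070/webhdfs</url>
  </service>
  <service>
    <role>WEBHCAT</role>
    <url>http://webhcat.internal.example.com:50111/templeton</url>
  </service>
</topology>
```

Each deployed topology defines one logical cluster, so a single Knox instance, or a load-balanced set of them, can front several Hadoop clusters at once.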

Knox also intercepts REST/HTTP calls and provides authentication, authorization, audit, URL rewriting, web vulnerability removal and other security services through a series of extensible interceptor pipelines.
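The interceptor-pipeline idea can be sketched in miniature. The following is an illustrative model of the pattern only, not Knox’s actual Java filter API, and the host names are placeholders: each stage inspects or rewrites the request, and may short-circuit the chain before anything reaches the cluster.

```python
class Request:
    """Minimal stand-in for an incoming REST/HTTP request."""
    def __init__(self, url, headers=None):
        self.url = url
        self.headers = headers or {}


def authenticate(request):
    # Reject unauthenticated requests before they reach the cluster.
    if "Authorization" not in request.headers:
        raise PermissionError("authentication required")
    return request


def rewrite_url(request):
    # Map the public gateway URL onto the internal service address,
    # hiding cluster network details from the caller.
    request.url = request.url.replace(
        "https://knox.example.com:8443/gateway/sandbox/webhdfs",
        "http://namenode.internal.example.com:50070/webhdfs")
    return request


def audit(request):
    # Record the (rewritten) request for compliance review.
    print(f"AUDIT: {request.url}")
    return request


# The pipeline is an ordered, extensible list of interceptors.
PIPELINE = [authenticate, rewrite_url, audit]


def dispatch(request):
    for interceptor in PIPELINE:
        request = interceptor(request)
    return request
```

New security services plug in by inserting another interceptor into the list, which mirrors how the gateway’s pipelines stay extensible without touching existing stages.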


Apache Top-Level Project Since
February 2013
Try Knox Gateway with Sandbox

Hortonworks Sandbox is a self-contained virtual machine with HDP running alongside a set of hands-on, step-by-step Hadoop tutorials.

