Hortonworks Data Cloud provides a quick and easy on-ramp for users looking to combine the agility of Amazon Web Services (“AWS”) with the data processing power of the Hortonworks Data Platform. We are pleased to announce not ONE but TWO new releases available for Hortonworks Data Cloud. Read on to learn more.
The beat goes on with the latest GA release of Hortonworks Data Cloud. Hortonworks Data Cloud is a fast way to harness the agility of cloud to run Data Science (Apache Spark and Apache Zeppelin), Interactive Analytics (Apache Hive LLAP) and ETL (Apache Hive) workloads.
The newest goodie in this release allows you to enable Hortonworks Flex Support Subscriptions. With your Flex Subscription in place, you can connect with the experts at Hortonworks to get the most out of your workload processing in the cloud.
Check out the 1.16 GA release of Hortonworks Data Cloud using the links below. Also be sure to note the upcoming webinar "Powering Big Data Success in the Cloud" to learn more about Hortonworks Flex Support.
|1.16 GA Product Documentation||https://docs.hortonworks.com/HDPDocuments/HDCloudAWS/HDCloudAWS-1.16.0/index.html|
|Data Analytics Powered by Cloud|
|Powering Big Data Success in the Cloud (Upcoming Webinar: June 27, 2017)|
Harnessing the agility of the cloud and running ephemeral workloads doesn't remove the need for robust Authentication, Authorization and Audit ("AAA") capabilities for those workloads. Apache Ranger is a cornerstone technology for AAA, but how do you put Ranger together with ephemeral workloads?
Our answer: “Shared Data Lake Services” for Enterprise Ephemeral Workloads!
We are excited to announce the availability of Hortonworks Data Cloud 2.0 Technical Preview. This Technical Preview allows you to define schema and security policies in a set of Shared Data Lake Services that can be shared across your ephemeral workloads. When you launch a workload, you "attach" it to your Shared Data Lake Services to pick up AAA capabilities. As your workload executes, the security context you defined is applied and enforced for that job.
The resulting deployment architecture gives you workloads that are ephemeral (i.e. "spin up and spin down"), so you right-size your workload infrastructure and consume cloud resources only when needed. It also gives you consistent Authentication, Authorization and Audit controls (the "Shared Data Lake Services") for centrally managing your security policies (via Apache Ranger), which are applied and enforced in the attached ephemeral workloads.
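To make the "attach" step concrete, here is a minimal sketch of the kind of configuration an ephemeral workload cluster would receive when attached to the Shared Data Lake Services. The host names are hypothetical placeholders; the property names are the standard Hive and Ranger Hive-plugin settings, used here purely for illustration:

```python
# Sketch: per-cluster properties that point an ephemeral workload at the
# shared schema, policy, and directory services. Host names are hypothetical.
shared_data_lake = {
    "metastore_uri": "thrift://datalake-master.example.com:9083",
    "ranger_admin_url": "http://datalake-master.example.com:6080",
    "ldap_url": "ldaps://ad.example.com:636",
}

def attach_workload_config(services):
    """Build the properties an attached workload cluster would use."""
    return {
        # Shared Hive Metastore: all attached clusters see the same tables.
        "hive.metastore.uris": services["metastore_uri"],
        # Ranger Hive plugin: pull authorization policies from the shared
        # Ranger Admin so enforcement is consistent across clusters.
        "ranger.plugin.hive.policy.rest.url": services["ranger_admin_url"],
        # Authenticate users against the shared LDAP/AD directory.
        "hive.server2.authentication": "LDAP",
        "hive.server2.authentication.ldap.url": services["ldap_url"],
    }

config = attach_workload_config(shared_data_lake)
for key, value in sorted(config.items()):
    print(f"{key}={value}")
```

Because every attached cluster resolves to the same metastore and Ranger Admin, schema and security policies stay consistent no matter how many workload clusters spin up and down.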
The components of the Shared Data Lake Services include:
|Schema Metastore||Apache Hive||Provides the Hive schema (tables, views, etc.). If you have two or more workloads accessing the same Hive data, you need to share schema across those workloads.|
|Security Policies||Apache Ranger||Defines security policies around the Hive schema. If you have two or more users accessing the same data, you need security policies that are consistently available and enforced.|
|Audit Logging||Apache Ranger||Audits user access. Captures data access activity for the users.|
|User and Group Directory||LDAP/AD||Provides the authentication source for users and the group definitions used for authorization.|
|Protected Gateway||Apache Knox||Provides a single workload endpoint that can be protected with SSL and requires authentication for access to cluster resources.|
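To illustrate the "Security Policies" component, here is a sketch of what a Ranger Hive authorization policy looks like as JSON, in the shape Ranger's policy model uses. The service name, database, and group below are hypothetical placeholders, not values from the product:

```python
import json

# Sketch of a Ranger Hive policy: grant the "analysts" group SELECT on
# every table and column in the "sales" database. All names here are
# hypothetical placeholders for illustration.
policy = {
    "service": "datalake_hive",   # the Ranger service this policy belongs to
    "name": "sales-read-only",
    "isAuditEnabled": True,       # access decisions land in Ranger audit logs
    "resources": {
        "database": {"values": ["sales"], "isExcludes": False},
        "table":    {"values": ["*"],     "isExcludes": False},
        "column":   {"values": ["*"],     "isExcludes": False},
    },
    "policyItems": [
        {
            "groups": ["analysts"],
            "accesses": [{"type": "select", "isAllowed": True}],
        }
    ],
}

print(json.dumps(policy, indent=2))
```

A policy like this lives once in the shared Ranger Admin, so every ephemeral workload cluster attached to the Shared Data Lake Services enforces the same rule and writes to the same audit trail.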
We are excited to have you try this Technical Preview and give us feedback. You can obtain the Technical Preview software at the link below. Please also send us feedback via Hortonworks Community Connection, where Hortonworks cloud subject matter experts are moderating the "Cloud & Operations" track for questions related to this Technical Preview. When posting a question about this Technical Preview, be sure to add the following tag: hortonworks-cloud.
|Hortonworks Data Cloud 2.0 Technical Preview Documentation||http://hortonworks.github.io/hdp-aws/|