Enterprise Open Source Platform with Red Hat

Collaborating to Enable the Future of Enterprise Data Apps

The expanded strategic alliance between Hortonworks and Red Hat includes a comprehensive approach that can address the growing requirements of key enterprise stakeholders as they realize their vision for Big Data.

Together, our engineers partner to deliver integration so that:

  • Data Architects can combine Hadoop data with other enterprise data in a single repository.
  • Operators can elastically scale their Hadoop infrastructure – from physical to virtual and cloud.
  • Developers can quickly build new analytics applications.
  • Data Analysts have improved access to new data types.

Initiative Goals

Seamless Experience
True integration across configuration, deployment, and management of Hadoop and Red Hat products.
On Demand
Elastic Hadoop services across a range of deployment options from physical to virtual to cloud.
Open Leadership
Open source leaders collaborating to develop the best in enterprise data solutions.

What have we accomplished so far?

The engineering relationship has multiple projects focused on the integration of product lines and collaborative customer support.

Apache Hadoop on Red Hat Storage

Working in the community, Hortonworks and Red Hat provide a secure and resilient general-purpose storage pool with multiple interfaces, including Hadoop, POSIX, and Swift. This improves flexibility and speeds the development and deployment of new and existing analytic workflows.

For a seamless experience, Hortonworks and Red Hat have been leveraging the Apache Ambari operational framework to add support for Red Hat Storage. Ambari supports pluggable Stacks and Services and Red Hat has leveraged this extensibility to add Red Hat Storage as an option when installing HDP via Ambari.
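To make the extension point concrete: an Ambari Stack service is declared through a metainfo.xml descriptor that Ambari reads at install time. The fragment below is a simplified, hypothetical sketch of such a descriptor for a storage service — the element layout follows Ambari's stack-definition convention, but the service name, comment, and version shown here are illustrative, not the actual plug-in's contents:

```xml
<!-- Illustrative Ambari stack service descriptor (metainfo.xml) -->
<!-- Names and version below are hypothetical examples -->
<metainfo>
  <schemaVersion>2.0</schemaVersion>
  <services>
    <service>
      <name>EXAMPLE_STORAGE</name>
      <comment>Red Hat Storage as a Hadoop-compatible file system</comment>
      <version>1.0</version>
    </service>
  </services>
</metainfo>
```

Because the descriptor is pluggable, a new storage backend can appear as a selectable option in the Ambari install wizard without changes to Ambari itself.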

The beta version of the Hortonworks Data Platform (HDP) plug-in for Red Hat Storage is now available; you can download it here.

Apache Hadoop on OpenStack

HDP with Red Hat Enterprise Linux OpenStack Platform delivers elastic Hadoop services on a secure, private cloud infrastructure, lowering costs and improving flexibility.

Find more details on Apache Hadoop on OpenStack here.

Apache Hadoop on RHEL with OpenJDK

HDP with Red Hat Enterprise Linux and OpenJDK provides a more open and flexible development environment for enterprise-strength analytic applications so they can be deployed on physical, virtual, or cloud infrastructures.

HDP 2.0 is certified to run in production clusters on OpenJDK 1.7.0_09-icedtea. This simplifies deployment and operational maintenance, as OpenJDK is available as part of the standard RHEL packages. The certification also forges closer links between the Apache Hadoop and OpenJDK communities.

You can install OpenJDK on RHEL here.
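Since OpenJDK ships in the standard RHEL repositories, installation is a single package transaction. A minimal sketch, assuming a RHEL 6-era system (package names vary by release, so check your distribution's channel):

```shell
# Install OpenJDK 7 runtime and development packages from the
# standard RHEL repositories (names assume RHEL 6; adjust per release)
sudo yum install -y java-1.7.0-openjdk java-1.7.0-openjdk-devel

# Confirm which JVM the cluster will pick up
java -version
```

The same packages are then kept current through the normal RHEL update mechanism, which is what simplifies operational maintenance relative to a separately managed JDK.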

Data – specifically data running on Hadoop – is the killer application for the open hybrid cloud. Enterprises are looking to IT solution providers to help with a dramatic reduction in time-to-results for their big data projects. Red Hat’s strategic partnership with Hortonworks is focused on helping customers with efficiency and agility as they embark on big data projects.

Ranga Rangachari, vice president and general manager, Storage and Big Data, Red Hat

Apache Hadoop with JBoss Data Virtualization

JBoss Data Virtualization integrates Hadoop with existing information sources including data warehouses, SQL and NoSQL databases, enterprise and cloud applications, and flat and XML files. The solution creates business-friendly, reusable, virtual data models with unified views by combining and transforming data from multiple sources, including Hadoop. This makes integrated data available on demand to external applications through standard SQL and Web services interfaces.
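Conceptually, such a unified view is expressed in standard SQL against the virtual layer. The sketch below is hypothetical — the schema and table names are invented for illustration — but it shows the idea of one view spanning a warehouse and a Hadoop source:

```sql
-- Hypothetical virtual view; source schemas and names are illustrative only
CREATE VIEW customer_360 AS
SELECT c.customer_id,
       c.name,
       o.order_total,
       w.clickstream_score
FROM warehouse.customers     c
JOIN warehouse.orders        o ON o.customer_id = c.customer_id
JOIN hadoop.web_analytics    w ON w.customer_id = c.customer_id;
```

Client applications would then query customer_360 over the standard SQL or Web services interfaces without needing to know which rows originate in Hadoop and which in the warehouse.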

You can find more information and download JBoss Data Virtualization here.

Try HDP for Red Hat JBoss Data Virtualization

Try HDP for Red Hat Storage

