Enterprise Open Source Platform with Red Hat
The expanded strategic alliance between Hortonworks and Red Hat takes a comprehensive approach to the growing requirements of key enterprise stakeholders as they realize their vision for Big Data.
Together, our engineers partner to deliver integration so that:
- Data Architects can combine Hadoop data with other enterprise data in a single repository.
- Operators can elastically scale their Hadoop infrastructure – from physical to virtual and cloud.
- Developers can quickly build new analytics applications.
- Data Analysts have improved access to new data types.
What have we accomplished so far?
The engineering relationship has multiple projects focused on the integration of product lines and collaborative customer support.
Apache Hadoop on Red Hat Storage
Working in the community, Hortonworks and Red Hat provide a secure and resilient general-purpose storage pool with multiple interfaces, including Hadoop, POSIX and Swift. This improves flexibility and speeds the development and deployment of new and existing analytic workflows.
For a seamless experience, Hortonworks and Red Hat have extended the Apache Ambari operational framework to support Red Hat Storage. Ambari supports pluggable Stacks and Services, and Red Hat has used this extensibility to add Red Hat Storage as an option when installing HDP via Ambari.
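To make the pluggable-Stacks idea concrete, here is a minimal sketch of how a service plugs into an Ambari stack on disk. The directory layout follows Ambari's stack-definition convention, but the paths and the `RHS` service name here are hypothetical, illustrative stand-ins, not the actual Red Hat Storage plug-in:

```shell
# Ambari discovers services from per-stack directories; a sandbox path is
# used here instead of Ambari's real resources directory.
STACK_ROOT="/tmp/demo-ambari/resources/stacks/HDP/2.0"
mkdir -p "$STACK_ROOT/services/RHS"   # "RHS" is a hypothetical service name

# Each service carries a metainfo.xml describing itself to Ambari.
cat > "$STACK_ROOT/services/RHS/metainfo.xml" <<'EOF'
<metainfo>
  <schemaVersion>2.0</schemaVersion>
  <services>
    <service>
      <name>RHS</name>
      <comment>Red Hat Storage as a storage option for HDP</comment>
      <version>2.0</version>
    </service>
  </services>
</metainfo>
EOF

ls "$STACK_ROOT/services/RHS"
```

Once a definition like this is in place, the service shows up as an installable option in the Ambari install wizard alongside the built-in stack services.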
The beta version of the Hortonworks Data Platform (HDP) plug-in for Red Hat Storage is now available; you can download it here.
Apache Hadoop on OpenStack
HDP with Red Hat Enterprise Linux OpenStack Platform delivers elastic Hadoop services on secure, private cloud infrastructure, lowering costs and improving flexibility.
Find more details on Apache Hadoop on OpenStack here.
Apache Hadoop on RHEL with OpenJDK
HDP with Red Hat Enterprise Linux and OpenJDK provides a more open and flexible development environment for enterprise-strength analytic applications so they can be deployed on physical, virtual, or cloud infrastructures.
HDP 2.0 is certified to run in production clusters on OpenJDK 1.7.0_09-icedtea. This simplifies deployment and operational maintenance, as OpenJDK ships in the standard RHEL package repositories. The certification also forges closer links between the Apache Hadoop and OpenJDK communities.
You can install OpenJDK on RHEL here.
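As a rough sketch, installing OpenJDK 7 on a RHEL host and pointing Hadoop at it looks like the following. The package names are the standard RHEL OpenJDK 7 packages; the install step is guarded so the sketch is a no-op on hosts without `yum`, and `JAVA_HOME` is resolved from the `java` on the `PATH` rather than hard-coding a versioned `/usr/lib/jvm` directory:

```shell
# On RHEL, install OpenJDK 7 from the stock repositories (runs only
# where yum is present; needs root).
if command -v yum >/dev/null 2>&1; then
    yum install -y java-1.7.0-openjdk java-1.7.0-openjdk-devel
fi

# HDP components read JAVA_HOME; derive it from the installed java
# binary by resolving symlinks and trimming the /bin/java suffix.
if command -v java >/dev/null 2>&1; then
    JAVA_BIN=$(readlink -f "$(command -v java)")
    export JAVA_HOME=${JAVA_BIN%/bin/java}
fi
echo "JAVA_HOME=${JAVA_HOME:-<not set>}"
```

Because OpenJDK comes from the distribution's own repositories, the JVM is patched through the same `yum update` channel as the rest of the operating system.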
“Data – specifically data running on Hadoop – is the killer application for the open hybrid cloud. Enterprises are looking to IT solution providers to help deliver a dramatic reduction in time-to-results for their big data projects. Red Hat’s strategic partnership with Hortonworks is focused on helping customers with efficiency and agility as they embark on big data projects.”
Ranga Rangachari, vice president and general manager, Storage and Big Data, Red Hat
Apache Hadoop with JBoss Data Virtualization
JBoss Data Virtualization integrates Hadoop with existing information sources, including data warehouses, SQL and NoSQL databases, enterprise and cloud applications, and flat and XML files. The solution creates business-friendly, reusable virtual data models with unified views by combining and transforming data from multiple sources, including Hadoop. The integrated data is then available on demand to external applications through standard SQL and Web services interfaces.
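To illustrate the "unified view" idea, the sketch below writes out the kind of view definition a data architect might create, joining a relational source to a Hive table on Hadoop in ANSI-style SQL. The source names (`crm_db`, `hive_source`) and columns are hypothetical, not taken from an actual JBoss Data Virtualization model:

```shell
# A virtual view combining a relational CRM source with a Hive table;
# written to a scratch file purely so the SQL can be displayed.
cat > /tmp/jdv_view.sql <<'SQL'
CREATE VIEW customer_activity AS
SELECT c.customer_id, c.name, h.click_count
FROM   crm_db.customers   AS c   /* relational source        */
JOIN   hive_source.clicks AS h   /* Hive table on Hadoop     */
  ON   h.customer_id = c.customer_id;
SQL
cat /tmp/jdv_view.sql
```

Consuming applications would then query `customer_activity` like any ordinary table over standard JDBC/ODBC or Web services, without knowing that part of the data lives in Hadoop.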
You can find more information and download JBoss Data Virtualization here.