Pivotal, Hortonworks, and a Shared Vision of Operations for Enterprise Hadoop
Many projects contributed to the Apache Software Foundation (ASF) by vendors and users alike greatly expand Apache Hadoop’s capabilities as an enterprise data platform.
While Hadoop, with YARN at its architectural center, provides the foundational capabilities for managing and accessing data at scale, a broader blueprint for Enterprise Hadoop has emerged. It specifies how this array of Apache projects fits across five distinct pillars to form a complete enterprise data platform: data access, data management, security, operations, and governance.
Our recent news with Pivotal focuses on furthering innovation within the operations pillar, specifically within Apache Ambari, which provides tools and APIs to provision, manage, and monitor Hadoop clusters.
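For readers unfamiliar with how Ambari exposes cluster operations, the sketch below shows what a call against its REST API (v1) can look like. The host name and admin credentials are illustrative placeholders, and the helper function is hypothetical, not part of Ambari itself:

```python
# Minimal sketch of building a request against the Apache Ambari REST API (v1).
# The host, port, and credentials below are placeholder assumptions.
import base64
import urllib.request


def ambari_request(host, path, user="admin", password="admin"):
    """Build an authenticated request for Ambari's REST API at /api/v1."""
    url = f"http://{host}:8080/api/v1{path}"
    req = urllib.request.Request(url)
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req.add_header("Authorization", f"Basic {token}")
    # Ambari expects this header on state-changing (PUT/POST/DELETE) calls.
    req.add_header("X-Requested-By", "ambari")
    return req


# List the clusters Ambari manages (GET /api/v1/clusters):
req = ambari_request("ambari.example.com", "/clusters")
# Against a live Ambari server, the response could then be read with:
# with urllib.request.urlopen(req) as resp:
#     print(resp.read())
```

The same URL pattern extends to services and hosts (for example, `/clusters/<name>/services`), which is how management tooling drives provisioning and monitoring programmatically.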
Working with Pivotal within the Community for the Enterprise
We fundamentally believe in the efforts of a vibrant open source community under the governance of the Apache Software Foundation as the fastest path to innovation for Hadoop, and we are excited to work with the Pivotal team to accelerate the innovation, features, and momentum of Ambari.
We share Hortonworks’ commitment to enabling a modern data architecture with open source, and we look forward to working closely with them and the Apache Software Foundation to further progress Apache Ambari.
—Jamie Buckley, vice president of product management at Pivotal
Collaborating with the team at Pivotal on Enterprise Hadoop moves us closer to solving the data problem I wrote about in my first Hortonworks blog post in early 2012, Solving the Data Problem in a Big Way. On a personal level, it is also something of a homecoming, since I was part of the SpringSource/VMware team from 2008 to 2011!
Collaboration is King
An article earlier this year by Wayne Eckerson on BeyeNetwork does a great job summarizing Hortonworks’ unique approach to the market, including:
partnering with leading commercial data management and analytics vendors to create a data environment that blends the best of Hadoop and commercial software
I’ve stated many times that you don’t make a platform like Enterprise Hadoop easy to use and enterprise-grade by going it alone. You do it by working with the broader ecosystem to enable:
- developers, data workers, and analysts to build applications quickly and easily;
- existing data systems to integrate deeply;
- operators and security administrators to deploy, manage, secure, and govern the platform and the applications deployed on it in a consistent way; and
- enterprises big and small to choose among deployment options that span Linux, Windows, virtualization, cloud, and appliances.
Our approach to partnering is about enabling our customers to embrace Hadoop in a way that makes sense for their business, and about enabling our partners to get value out of an alliance with us while treating them respectfully in the process.
Hortonworks has always been committed to innovating in the open, because it provides the fastest and most transparent path to value for everyone using and investing in Enterprise Hadoop. We look forward to collaborating with the team at Pivotal within the Apache community to deliver on a terrific roadmap for operating Hadoop with Apache Ambari. Pivotal’s commitment to Apache Ambari is a great thing for the community.