March 28, 2016

ODPi Core and HDP Hadoop Core

Hortonworks is proud of and committed to being 100% open: we break down silos, push boundaries, and enable the entire ecosystem to flourish and innovate (read Shaun Connolly’s blog). That belief extends to our commitment to the Open Data Platform initiative (ODPi) as well. We are proud to be part of ODPi because it operates under an open governance model and helps rally a community to drive innovation around shared goals.

Today we are excited by ODPi’s announcement of the ODPi Core Runtime spec release (read the ODPi blog and announcement). The ODPi Core is a great first step toward setting common reference specifications on which the industry can build enterprise-class Big Data solutions.

In our recent March 1st announcement, we unveiled new release cycles for Hadoop Core and Extended Services. Hadoop Core (HDFS, YARN, and MapReduce) follows a slower revision cycle to provide platform stability and scalability, while Extended Services (Spark, Hive, Ambari, and other Apache data access projects) release more frequently to support a more rapid innovation cycle. Hadoop Core is therefore closely aligned with the ODPi Core, and we believe the ODPi Core provides a stable platform on which the ecosystem can build and innovate.

This year marks the 10-year anniversary of Hadoop, which started with Owen O’Malley’s first patch. By developing in the open, Hadoop has seen accelerated growth, and we believe that well-defined common core platform specs from a community-driven open organization like ODPi can help enterprise end users and vendors alike advance their Big Data solutions. Commonality, compatibility, and interoperability among Hadoop-based platforms simplify enterprise adoption and enable Big Data solutions to flourish.

Many participants can benefit from the ODPi Core, from Independent Software Vendors (ISVs) and application developers to end-user organizations that develop their own custom applications or systems. All of them want to build applications that run on as many platforms as possible with minimal effort.

If you are building applications for the Big Data space, this Hadoop platform harmonization eases your testing and validation burden: you can focus on what you do best and integrate with a common, predictable reference platform. You verify once and run anywhere across your preferred Big Data infrastructure deployments.
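As a minimal sketch of what "verify once" can look like at deployment time, the snippet below checks the reported Hadoop version against the minimum version an application was validated on. It assumes the version would normally come from the standard `hadoop version` command; here a sample line is hard-coded so the sketch is self-contained, and the minimum-version value is illustrative, not an ODPi requirement.

```shell
#!/bin/sh
# Minimum Hadoop Core version our hypothetical application was validated
# against (illustrative value only).
MIN_VERSION="2.7.0"

# In a real deployment this line would come from `hadoop version | head -n1`;
# we hard-code a sample so the sketch runs anywhere.
REPORTED_LINE="Hadoop 2.7.1"
REPORTED_VERSION="${REPORTED_LINE#Hadoop }"

# sort -V orders version strings numerically; if MIN_VERSION sorts first
# (or equal), the platform meets our minimum.
LOWEST=$(printf '%s\n%s\n' "$MIN_VERSION" "$REPORTED_VERSION" | sort -V | head -n1)
if [ "$LOWEST" = "$MIN_VERSION" ]; then
    echo "compatible: $REPORTED_VERSION >= $MIN_VERSION"
else
    echo "incompatible: $REPORTED_VERSION < $MIN_VERSION"
fi
```

With a common reference platform, a check like this (and the full validation suite behind it) only has to be maintained against one spec rather than one per vendor distribution.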

Instead of watching from the sidelines, this is your opportunity to reduce development and support costs by getting your applications tested and certified against an industry standard, and to help shape enterprise-class Big Data solutions.

Innovation will flourish and advance even faster in the market with all the ODPi members building upon the same Runtime spec and Operations spec!
