We are living in a hyperconnected world. Digitization has led to massive improvements in human productivity and enabled us to find solutions that would otherwise be simply impossible. Spurring digitization has been a perfect confluence of network, compute, and analytics. Thanks to cloud computing, individuals and enterprises of any scale can continuously collect and process data using dynamic compute resources. Advanced scale-out analytics has enabled enterprises to derive insights and operationalize them for improved outcomes.
Google and Hortonworks are at the forefront of cloud computing and distributed scale-out processing. As enterprises adopt the cloud and Apache Hadoop, they look to leverage the Google Cloud Platform and the Hortonworks Data Platform (HDP), the only 100% open source distribution of Apache Hadoop. Today, we are thrilled to announce the certification and availability of HDP on Google Cloud Platform.
With this new certification, enterprises worldwide can dynamically provision HDP clusters on Google Compute Engine and Google Cloud Storage to store, discover, and analyze a unified collection of structured and unstructured information assets. With Google Cloud Platform and the Hortonworks Data Platform, enterprises benefit from limitless scalability and an enterprise-grade platform backed by community-driven open source innovation.
The joint solution would not be possible without the close collaboration of Google and Hortonworks. Our engineering teams have worked together to integrate “bdutil” with the Apache Ambari Blueprints API, delivering a simple and streamlined provisioning experience for the end user. Key highlights of the joint solution include:
Thanks to this joint engineering by Google and Hortonworks, you can easily provision HDP clusters on Google Cloud Platform and take advantage of the only 100% open source Hadoop distribution.
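To give a feel for what the Ambari Blueprints API works with under the hood, here is a minimal sketch of a blueprint document. This is an illustration only, not the actual payload bdutil generates: the blueprint name, stack version, and host-group layout below are hypothetical examples, though the overall `Blueprints`/`host_groups` structure follows Ambari's blueprint format.

```python
import json

# Hypothetical Ambari blueprint sketch. In practice, the bdutil
# integration generates and submits a blueprint like this for you.
blueprint = {
    "Blueprints": {
        "blueprint_name": "hdp-on-gcp",   # hypothetical name
        "stack_name": "HDP",
        "stack_version": "2.2",           # example stack version
    },
    # Host groups declare which components run on which class of nodes.
    "host_groups": [
        {
            "name": "master",
            "cardinality": "1",
            "components": [
                {"name": "NAMENODE"},
                {"name": "RESOURCEMANAGER"},
            ],
        },
        {
            "name": "workers",
            "cardinality": "3",
            "components": [
                {"name": "DATANODE"},
                {"name": "NODEMANAGER"},
            ],
        },
    ],
}

# A blueprint is registered with Ambari over REST (POST to
# /api/v1/blueprints/<name>), then instantiated with a cluster creation
# template that maps each host group to concrete hosts.
payload = json.dumps(blueprint, indent=2)
```

The appeal of this approach is that the cluster topology lives in one declarative document, so provisioning on Google Compute Engine reduces to generating hosts and handing Ambari the blueprint.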