September 13, 2016

New HDF Product Integration Certification Program

Expanding the Data In Motion Ecosystem of Hortonworks DataFlow

The Hortonworks DataFlow (HDF™) ecosystem continues to accelerate, with 170+ connectors to different systems that support wide-scale dataflows. To further expand the HDF ecosystem, Hortonworks’ Partnerworks program now includes a new HDF Product Integration Certification program designed to simplify integrations of HDF into a company’s IT architecture. The program provides validation of pre-built integrations between leading enterprise technologies and HDF. It also offers testing and certification processes to help ensure that Partner software integrates with HDF, saving end customers time while providing them with an assurance of interoperability.

[Image: Hortonworks DataFlow Partner Certification - Connecting Ecosystems]

Hortonworks is collaborating with leading technology partners to certify joint solutions and deliver end-to-end value to our customers. Key partners collaborating on HDF technology integration include:

  • Attunity
    • Attunity Replicate works with HDF to move data from relational databases to Hadoop in real time, automatically capturing data changes via change data capture (CDC) at scale.
  • HPE Security – Data Security
    • Developed a NiFi processor that provides highly secure encryption for dataflow files.
  • Impetus Technologies – StreamAnalytix™
    • Provides a visual interface for analytics workflows, making it easy to build analytics applications quickly using built-in operators.
  • Kepware
    • Enables smart operations by integrating Kepware's KEPServerEX with HDF to capture, prioritize, process, and deliver time-series data from diverse industrial automation devices to enterprise data lakes.
  • SAS® Event Stream Processing
    • Implemented NiFi processors that integrate with the SAS Event Stream Processing engine, providing complex event processing from the data center to the far edge and back.
  • Syncsort
    • Provides a scalable solution to keep data synchronized between relational databases and Hadoop.

More about HDF: Hortonworks DataFlow (HDF), powered by Apache NiFi, Apache Kafka, and Apache Storm, addresses modern dataflow challenges for the enterprise. HDF provides the ability to collect, mediate, and curate data from different, distributed, and disparate data sources, and provides real-time stream processing to generate insights from Data In Motion.
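The collect/mediate/curate pattern described above can be sketched in plain Python. This is a conceptual illustration only, not HDF or NiFi code (in HDF these stages would be NiFi processors); all record fields and function names here are hypothetical.

```python
# Conceptual sketch of the collect -> mediate -> curate dataflow pattern.
# NOT HDF/NiFi code; all names and fields are illustrative assumptions.

def collect():
    """Gather raw events from disparate sources with differing schemas."""
    yield {"src": "sensor", "temp_c": 41.5, "ts": 1473750000}
    yield {"src": "db_cdc", "TEMPERATURE": "39.0", "TS": "1473750060"}

def mediate(events):
    """Normalize heterogeneous records into one common schema."""
    for e in events:
        if e["src"] == "sensor":
            yield {"temp_c": e["temp_c"], "ts": e["ts"]}
        elif e["src"] == "db_cdc":
            yield {"temp_c": float(e["TEMPERATURE"]), "ts": int(e["TS"])}

def curate(events, threshold=40.0):
    """Keep only events of interest and tag them for downstream use."""
    for e in events:
        if e["temp_c"] >= threshold:
            yield {**e, "alert": True}

alerts = list(curate(mediate(collect())))
print(alerts)  # only the 41.5 C sensor reading passes the threshold
```

In a real HDF deployment, each stage would run as a processor in a NiFi flow, with Kafka buffering the stream and Storm (or another engine) handling the real-time analytics.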

For more detail, see the on-demand webinar about the HDF Certification program.

