How to Accelerate Real Time Data Ingest and Automate Transfer into Apache™ Hadoop® with Hortonworks DataFlow (HDF) and Attunity

Recorded on July 14th, 2016

Big data is transforming the way that organisations use and manage data. They now have more data in motion and at rest than ever before, arriving at higher velocities and from more sources across the organisation. Businesses can't afford to miss opportunities for deeper insight because of time spent "data wrangling". They are also looking for enterprise-class data loading solutions that go beyond simple batch tools such as Sqoop, which is better suited to test and dev environments.
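For context, the batch-oriented workflow the webinar contrasts against looks roughly like the sketch below: a one-shot Sqoop import from a relational database into HDFS. The host, database, table, and path names are illustrative placeholders, not values from the webinar.

```shell
# Illustrative one-shot batch import with Sqoop (placeholder connection details).
# Each run re-reads the source table; there is no continuous change capture,
# which is why tools like Attunity Replicate and HDF/NiFi are positioned
# for real-time ingest instead.
sqoop import \
  --connect jdbc:mysql://dbhost.example.com/sales \
  --username etl_user -P \
  --table orders \
  --target-dir /data/raw/orders
```

Because Sqoop pulls a snapshot on each invocation, keeping Hadoop in sync with a busy operational database means rerunning jobs on a schedule, which is the gap that real-time Change Data Capture is meant to close.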

Join this webinar to learn how Attunity and Hortonworks solutions alleviate those challenges. You will hear:

  • How to ingest real-time data and transfer it into Hadoop
  • Real-life use cases of Hortonworks DataFlow (HDF), powered by Apache™ NiFi
  • How to combine real-time Change Data Capture (CDC) from Attunity with connected data platforms from Hortonworks
  • A live demo of Attunity Replicate and HDF running together, moving operational data collected in real time into Hadoop

You’ll also have the chance to put your own big data questions to the experts live.

