Hortonworks Technical Workshop: Streamline Hadoop Development

Recorded on September 10th, 2015

As developers, we like to build and test applications in our favorite IDE. We prefer to debug applications with breakpoints and to step through our code. This style of development breaks down in a distributed compute environment, where execution is spread across multiple hosts and JVMs. In this session we look at using the Hadoop mini cluster to efficiently develop and test applications in a local environment before deploying them to a distributed cluster. We walk through a real-life example of using the Hadoop mini cluster to develop a streaming application built on Storm, Kafka, and HBase.
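As a rough illustration of the local-testing approach the session describes, the sketch below spins up an in-process HDFS using Hadoop's `MiniDFSCluster` test utility, writes a file, and verifies it exists. This is a minimal sketch, not material from the session itself: it assumes the `hadoop-minicluster` artifact is on the classpath, and the class name `MiniClusterSmokeTest` and the file path are made up for the example.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hdfs.MiniDFSCluster;

// Hypothetical smoke test: requires the hadoop-minicluster dependency.
public class MiniClusterSmokeTest {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Start a single-DataNode HDFS entirely inside this JVM.
        MiniDFSCluster cluster =
                new MiniDFSCluster.Builder(conf).numDataNodes(1).build();
        try {
            FileSystem fs = cluster.getFileSystem();
            Path p = new Path("/tmp/mini-cluster-smoke.txt");

            // Write a small file to the in-process HDFS.
            try (FSDataOutputStream out = fs.create(p)) {
                out.writeBytes("hello from the mini cluster");
            }

            // Assert the write is visible, as a real integration test would.
            System.out.println("file exists: " + fs.exists(p));
        } finally {
            // Tear the cluster down so nothing leaks between test runs.
            cluster.shutdown();
        }
    }
}
```

Because the cluster lives in the same JVM as the test, you can set breakpoints and step through both your code and the Hadoop code paths it exercises, which is exactly what is lost once execution moves to a real distributed cluster.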


Raghavendra says:

Is it possible to have a detailed step-by-step explanation, please?
