December 21, 2016

3 Ways Hortonworks Helps You Have More Time for the Holidays


As the hectic holiday season nears, we’re all looking for ways to have a little more time for friends and family, to enjoy the season and perhaps slow down a little. But as many of us know, business doesn’t always wait, and for some, this is one of the busiest times of the year. Wouldn’t it be wonderful if you could somehow have a few more hours for your personal life? Hortonworks can help. If you are working with big data, Hortonworks offers three ways to make your big data project more efficient.

1) Hortonworks DataFlow: Transform Data Ingest from Months to Minutes

Getting the data you need for analysis can be a long haul. It can take weeks, sometimes months, to gather all the different types of data you need, convert it into the right format, and get it to the right place just so you can start your big data project. With nearly the click of a button, Hortonworks DataFlow makes it easy to collect data and move it where you need it, providing a drag-and-drop user interface with real-time control and management of your dataflow.

Watch how you can move data into HDFS in 30 seconds, and learn more about HDF data ingest.
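To make the ingest step concrete, here is a minimal Python sketch of the kind of HDFS copy that DataFlow automates behind its drag-and-drop interface. It shells out to the standard `hdfs dfs -put` command; the file and directory paths are illustrative, and running it assumes the `hdfs` CLI is on your PATH and the destination directory exists.

```python
import subprocess

def ingest_command(local_path, hdfs_dir):
    """Build the `hdfs dfs -put` command that copies a local file into HDFS.

    `-f` overwrites the destination if it already exists. Paths here are
    placeholders; substitute your own.
    """
    return ["hdfs", "dfs", "-put", "-f", local_path, hdfs_dir]

def ingest_to_hdfs(local_path, hdfs_dir):
    # Execute the copy; raises CalledProcessError if the transfer fails.
    subprocess.run(ingest_command(local_path, hdfs_dir), check=True)
```

DataFlow (built on Apache NiFi) replaces hand-rolled scripts like this with managed processors, adding back-pressure, provenance tracking, and visual control of each flow.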



2) Hortonworks HDCloud: Model and Analyze Data Sets in Minutes

Hortonworks Data Cloud for AWS is a platform for analyzing and processing data, enabling businesses to achieve insights more quickly and with greater flexibility than ever before. Instead of wading through endless configuration options, you can choose from a set of prescriptive cluster configurations (for example, Apache Spark for data processing or Apache Hive for data analytics) and start modeling and analyzing your data sets in minutes. Try it out here:
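The idea behind prescriptive configurations can be sketched as a simple lookup: rather than tuning every option, you pick a named workload profile and get a canned cluster definition. The profile names, instance types, and worker counts below are purely hypothetical illustrations, not actual HDCloud options.

```python
# Hypothetical workload profiles, standing in for HDCloud's prescriptive
# cluster configurations. Values are illustrative only.
PRESCRIPTIVE_CONFIGS = {
    "data-processing": {"engine": "Apache Spark", "workers": 4, "worker_type": "m4.xlarge"},
    "data-analytics":  {"engine": "Apache Hive",  "workers": 2, "worker_type": "m4.2xlarge"},
}

def cluster_config(workload):
    """Return the canned configuration for a workload type."""
    try:
        return PRESCRIPTIVE_CONFIGS[workload]
    except KeyError:
        raise ValueError(
            f"unknown workload {workload!r}; choose from {sorted(PRESCRIPTIVE_CONFIGS)}"
        )
```

The design point is that the user selects a workload, not a parameter sheet; the platform supplies sensible defaults for everything else.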


3) Hortonworks SmartSense: Save Hours of Effort

Monitoring, managing, and tuning a big data deployment takes a lot of effort. Hadoop consists of 27 Apache projects with 87 components, and HDFS alone has over 250 configuration parameters. Layer on everything else a production-grade deployment needs (hardware, cluster management, and the versatility and options of open source innovation) and manually maintaining optimal performance of your production clusters can become a painstaking, never-ending project.

Hortonworks SmartSense helps automate the optimization and maintenance of your Hadoop cluster through tailored recommendations, generated from the latest best practices and an analysis of how you are actually using your cluster. Learn more about the proactive capabilities of SmartSense that save you time and effort here.



