Hortonworks and Hadoop reach the big data summit

Successfully launching a big data project is easier said than done. The immense value provided by a firmly established data analytics program is no doubt worth the effort to deploy, but that doesn't mean organizations won't face challenges along the way. The main obstacle that businesses often run into when trying to get a big data operation off the ground is that their chosen analytics solution is ill-equipped to meet their needs. Proprietary software options are often limiting in regard to how they can be utilized and which data streams they can tap into effectively. In addition, they may not facilitate scalable solutions, resulting in companies investing in analytics platforms that are either too constraining or too complex for their needs. These concerns have driven the open source Apache Hadoop platform to the top of the big data industry. 

SAS recently released a white paper detailing the benefits of the Hadoop framework, noting that the technology provides an optimal solution for organizations that want a powerful, comprehensive, flexible and cost-effective analytics platform. Many highly specialized, proprietary big data options require companies to invest in expensive hardware to meet their storage and computing needs. Hadoop, however, can run on commodity hardware, bringing the investment cost down to roughly a tenth of what its competitors require.

That low cost of investment wouldn't mean much if the Hadoop architecture couldn't back it up with a high level of performance. The platform provides the tools needed to collect, sort and process large volumes of data with ease. According to the report, a large cluster containing 1,400 nodes was able to sort a terabyte of data in 62 seconds during a performance test. As the system ingests incoming data streams, including unstructured data such as video files, it distributes replicated copies of the information across the nodes of the cluster, giving analysts a built-in backup if a component fails.
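As a rough illustration of that replication model, the sketch below uses Hadoop's Java FileSystem API to write a file into HDFS and read back its replication factor. The file path and the three-copy setting are assumptions chosen for the example, not details taken from the SAS report.

// Minimal sketch: write a file to HDFS and check how many copies of it the cluster keeps.
// Assumes a reachable HDFS cluster configured via core-site.xml / hdfs-site.xml on the classpath.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ReplicationExample {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("dfs.replication", "3");            // ask HDFS to keep three copies of each block (assumed value)

        FileSystem fs = FileSystem.get(conf);
        Path file = new Path("/tmp/example.txt");     // hypothetical path used only for this sketch

        try (FSDataOutputStream out = fs.create(file)) {
            out.writeUTF("sample record");             // the blocks behind this file are spread across the cluster's nodes
        }

        FileStatus status = fs.getFileStatus(file);
        System.out.println("Replication factor: " + status.getReplication());
    }
}

Because each block lives on several nodes, losing a single disk or machine does not make the data unavailable; the cluster can serve reads from, and re-replicate, the surviving copies.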

Hortonworks leads the way
Leading the pack in Hadoop advancement is enterprise software company Hortonworks, which was recently recognized for its contributions to the field of big data. Ventana Research announced that Hortonworks would receive the 2013 Technology Innovation award for its ongoing efforts to advance the development and performance of this burgeoning technology.

"We congratulate Hortonworks for its visionary and transformative applications of the Apache Hadoop platform," Ventana Research CEO Mark Smith stated. "The technology delivers real business value to its customers. The company has exhibited a high level of expertise in the Hadoop market and truly advanced the computing potential of its enterprise data platform, making it a clear innovator of big data technology."
