Not that long ago, customers relied on traditional business intelligence tools and data to guide business decisions aimed at improving efficiency or increasing topline growth. Much of that data is structured data capturing transactional exchanges, and it is stored in relational databases.
Capturing new data sources such as geo-location, sensors, social media, and machine-generated data provides opportunities to push innovation beyond its current boundaries. That data, however, is unstructured in nature and doesn't fit the rigid, pre-defined schemas of those relational databases. A new approach is needed to handle the volume, velocity, and variety of these data sources. Apache Hadoop, an open source platform for distributed storage and processing of large data sets, has been selected by many customers to address this big data opportunity.
The strength and maturity of a technology can be measured by the depth and breadth of its ecosystem. Customers want choice when deciding on a strategic platform that will not only address their needs today, but will also set them up to stay ahead of whatever comes next. A wide range of decision factors comes into play when building an infrastructure that maximizes the use of available footprint while lowering CAPEX, power consumption, and management cost.
Hortonworks and Cisco have been working together to deliver the benefits of Hortonworks Data Platform (HDP), a 100% open source Hadoop platform, in combination with Cisco's new UCS S-Series Storage Server, which combines software-defined storage, scale-out object storage, and x86 servers to provide a scalable platform for data-intensive workloads.