Today we and Concurrent announced that the Hortonworks Data Platform has been certified against the Cascading application framework. As Hadoop adoption continues to grow, more organizations are looking to take advantage of new data types and build new applications for the enterprise. By combining our enterprise-grade data platform and its unparalleled, growing ecosystem with the power, maturity and broad platform support of Concurrent’s Cascading application framework, we have now closed the modeling, development and production loop for data-oriented applications.
Cascading and Big Data Applications
For those who aren’t familiar, Cascading is the most widely used and deployed application framework for building robust, enterprise Big Data applications on Hadoop. Well-known companies, including The Climate Corporation, eBay, Etsy, FlightCaster, iCrossing, Razorfish, Trulia, TeleNav and Twitter, use Cascading to streamline data processing, data filtering and workflow optimization for large volumes of unstructured and semi-structured data. Cascading is also at the core of popular language extensions, including PyCascading (Python + Cascading), Scalding (Scala + Cascading) and Cascalog (Clojure + Cascading), all open source projects sponsored by Twitter. Cascading has become the most reliable and repeatable way of building and deploying Big Data applications.
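To give a flavor of what a Cascading application looks like, the sketch below wires up the simplest possible flow: reading a tab-delimited file from HDFS and writing it back out through a pass-through pipe. It follows the pattern of Cascading 2.x's public API (taps, pipes, FlowDef, HadoopFlowConnector); the class name and input/output paths are hypothetical, and running it requires the Cascading and Hadoop jars on the classpath, so treat it as an illustrative sketch rather than a complete project.

```java
import java.util.Properties;

import cascading.flow.FlowDef;
import cascading.flow.hadoop.HadoopFlowConnector;
import cascading.pipe.Pipe;
import cascading.property.AppProps;
import cascading.scheme.hadoop.TextDelimited;
import cascading.tap.Tap;
import cascading.tap.hadoop.Hfs;

// Hypothetical minimal Cascading app: copy a TSV file between HDFS paths.
public class CopyApp {
  public static void main(String[] args) {
    String inPath = args[0];   // input path, e.g. on HDFS
    String outPath = args[1];  // output path

    Properties properties = new Properties();
    AppProps.setApplicationJarClass(properties, CopyApp.class);

    // Taps define where data is read from and written to;
    // TextDelimited(true, "\t") treats the first line as a header.
    Tap inTap = new Hfs(new TextDelimited(true, "\t"), inPath);
    Tap outTap = new Hfs(new TextDelimited(true, "\t"), outPath);

    // A pipe assembly; here a single pass-through pipe named "copy".
    // Real applications chain Each/GroupBy/Every operations onto it.
    Pipe copyPipe = new Pipe("copy");

    // Wire source -> pipe -> sink and run the flow on Hadoop.
    FlowDef flowDef = FlowDef.flowDef()
        .addSource(copyPipe, inTap)
        .addTailSink(copyPipe, outTap);

    new HadoopFlowConnector(properties).connect(flowDef).complete();
  }
}
```

The same pipe-assembly model is what Scalding, Cascalog and PyCascading expose in their host languages: developers describe the data flow, and Cascading plans and runs it as MapReduce jobs on the cluster.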
Cascading and Hortonworks Data Platform
The Hortonworks Data Platform (HDP) is the only 100-percent open source Apache™ Hadoop®-based data management platform. HDP allows users to capture, process and share data in any format and at scale. Built and packaged by the core architects, builders and operators of Hadoop, HDP includes all of the necessary components to manage a cluster at scale and uncover business insights from existing and new big data sources.
Together, the simplicity and flexibility of Cascading and the reliability and stability of HDP let companies rapidly build, test and deploy new data transformation and refinement, data processing, analytics and machine-learning applications. Enterprises can carry existing skill sets, core competencies and product investments over to HDP via standards-based technologies: Java, ANSI SQL and machine-learning standards. Analysts and data scientists familiar with these can run predictive data models at scale and integrate ETL, data preparation and predictive analytics in the same application, greatly reducing time to production and unlocking access to large Hadoop data sets.
You can read more about Modern Data Architecture with Hadoop here.