This guest post is from John Haddad, Director of Product Marketing at Informatica Corporation. He has over 25 years’ experience designing, building, integrating, and marketing enterprise applications. His current focus is helping organizations get the most business value from Big Data by delivering timely, trusted, and relevant data across the extended enterprise.
Why is it so important for companies today to adopt a modern data architecture, and why is next-generation data integration on Apache Hadoop such a critical component?
Businesses are finding that success depends far more on their ability to predict outcomes and proactively engage customers than on reacting to situations once it’s too late to cash in on opportunities. This predictive power increases with access to more data and more types of data, both historical and real-time. Common examples of using predictive analytics to improve business include predictive maintenance in manufacturing to improve customer service and reliability, predicting patient outcomes in healthcare to lower the total cost of care, predicting next-best offers in retail, and detecting fraud in financial services.
Before you can do big data analytics, such as analytics that use predictive algorithms, you need to reliably access, store, integrate, and prepare all types of data at scale. It turns out that accessing and preparing the data accounts for about 80% of the work in a big data project. The demands on today’s data architecture have traditional IT infrastructure bursting at the seams in its ability to store and process growing data volumes from an ever-increasing number of new data sources. Most attempts by IT to address these challenges result in costly hardware and software upgrades. However, one of the biggest challenges is a shortage of people with the skills to understand and implement big data technologies from sandbox to production. But companies don’t need to be paralyzed by high costs and a lack of skills to realize the promise of big data.
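To make the “access, integrate, and prepare” step concrete, here is a minimal, illustrative sketch in plain Python of the kind of work that consumes so much of a big data project: parsing raw extracts, standardizing values, deduplicating, and joining across sources. This is not Informatica’s actual tooling; the record layouts and field names are hypothetical, and real pipelines would do this at scale on Hadoop rather than in memory.

```python
import csv
import io

# Hypothetical raw extracts from two source systems (names are illustrative).
crm_csv = """customer_id,name,region
101,Acme Corp,WEST
102,Globex,east
102,Globex,east
"""

orders_csv = """customer_id,order_total
101,250.00
102,75.50
103,19.99
"""

def load(text):
    """Access: parse a raw CSV extract into a list of dict records."""
    return list(csv.DictReader(io.StringIO(text)))

customers = load(crm_csv)
orders = load(orders_csv)

# Prepare: standardize casing and drop duplicate customer rows.
seen = set()
clean_customers = []
for row in customers:
    if row["customer_id"] in seen:
        continue
    seen.add(row["customer_id"])
    row["region"] = row["region"].upper()
    clean_customers.append(row)

# Integrate: join order totals onto customers, dropping orphan orders.
by_id = {c["customer_id"]: c for c in clean_customers}
joined = []
for o in orders:
    c = by_id.get(o["customer_id"])
    if c is None:
        continue  # order 103 has no matching customer record
    joined.append({**c, "order_total": float(o["order_total"])})

print(joined)
```

Even in this toy form, the cleansing and joining logic dwarfs any analytics code, which is the point: data preparation dominates the effort before any predictive model can run.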
New technologies like Hadoop are challenging the status quo and driving an evolution in information management architectures, process methodologies, and best practices. The traditional data architecture is giving way to a modern data architecture that includes Hadoop as a “+1” to existing systems, opening the door to new ways of thinking and rapid innovation. Hortonworks and Informatica have teamed up to provide the data systems and tools that make up the foundation of the modern data architecture and deliver on the promise of big data. Hortonworks provides a completely open-source data platform to cost-effectively store and process massive amounts of data on Hadoop. Informatica provides the killer app on Hadoop, so you can use your existing skills to access, integrate, and prepare data for analysis at scale natively on Hadoop, addressing one of the key requirements for leveraging this new data. This modern data architecture makes it possible for organizations to use all the data, internal and external to the enterprise, to achieve the full predictive power that drives the success of modern data-driven businesses.
To get started extending your current IT infrastructure into a modern data architecture, read the white paper The Safe On-Ramp to Big Data: Lower Costs, Minimize Risk, and Innovate Faster with a Proven Approach to Big Data.