Healthcare has long been touted as one of the biggest areas of opportunity for big data initiatives – McKinsey's influential 2011 big data report suggested improved analytics could save the industry $300 billion a year in the U.S. – but only now are technologies such as Hadoop enabling the industry to actually realize the potential its data holds. In a recent column for TDWI, cloud computing and big data expert David Linthicum explained how processing technologies such as Hadoop, coupled with scalable cloud resources, are powering cutting-edge healthcare research in predictive analytics.
Linthicum highlighted a recent study from researchers at Indiana University, who found that a pair of predictive modeling techniques can make better treatment recommendations than doctors acting independently. The study found that an artificial intelligence framework combining Markov decision processes and dynamic decision networks – essentially looking at patient attributes and past cases to predict future events – could reduce healthcare costs by nearly 60 percent while improving patient outcomes by more than 41 percent.
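To make the Markov decision process idea concrete: such a model enumerates possible patient states and candidate treatments, attaches probabilities to how each treatment shifts a patient between states, and then searches for the treatment policy that maximizes expected long-term outcome net of cost. The sketch below is a minimal, generic value-iteration example of that idea; every state, probability, reward, and cost here is an invented placeholder for illustration, not the Indiana University study's actual model.

```python
# Toy Markov decision process solved by value iteration.
# All states, actions, probabilities, rewards, and costs are hypothetical.

STATES = ["stable", "deteriorating", "recovered"]
ACTIONS = ["treat_a", "treat_b"]

# TRANSITIONS[action][state] -> {next_state: probability} (invented values)
TRANSITIONS = {
    "treat_a": {
        "stable":        {"recovered": 0.6, "stable": 0.3, "deteriorating": 0.1},
        "deteriorating": {"recovered": 0.2, "stable": 0.5, "deteriorating": 0.3},
        "recovered":     {"recovered": 1.0},
    },
    "treat_b": {
        "stable":        {"recovered": 0.4, "stable": 0.5, "deteriorating": 0.1},
        "deteriorating": {"recovered": 0.4, "stable": 0.3, "deteriorating": 0.3},
        "recovered":     {"recovered": 1.0},
    },
}

# Rewards capture outcome quality; costs capture treatment expense (invented).
REWARDS = {"recovered": 10.0, "stable": 1.0, "deteriorating": -5.0}
COSTS = {"treat_a": 2.0, "treat_b": 1.0}
GAMMA = 0.9  # discount factor weighting future outcomes


def value_iteration(tol=1e-6):
    """Return each state's optimal long-run value and the best treatment in it."""
    values = {s: 0.0 for s in STATES}
    while True:
        new_values, policy = {}, {}
        for s in STATES:
            best = None
            for a in ACTIONS:
                # Expected value of taking treatment a in state s:
                # pay its cost, then average over possible next states.
                q = -COSTS[a] + sum(
                    p * (REWARDS[s2] + GAMMA * values[s2])
                    for s2, p in TRANSITIONS[a][s].items()
                )
                if best is None or q > best[0]:
                    best = (q, a)
            new_values[s], policy[s] = best
        if max(abs(new_values[s] - values[s]) for s in STATES) < tol:
            return new_values, policy
        values = new_values


if __name__ == "__main__":
    values, policy = value_iteration()
    for s in STATES:
        print(f"{s}: value={values[s]:.2f}, recommended action={policy[s]}")
```

The study's framework layers dynamic decision networks on top of this kind of model so that patient attributes and historical cases inform the transition probabilities, rather than hand-set constants as above.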
The study offered a validation of many analytics principles by showing that models based on high-quality data could be used to improve decision making, Linthicum noted.
"What's changed recently to make this technology possible?" he wrote. "The commoditization of data storage and processing, including the rise of technologies such as Hadoop that are able to process petabytes of structured and unstructured data in a very short period of time. Another factor is the rise of 'rental' computing models … Furthermore, the public generally accepts that these types of systems will only enhance our ability to do our jobs, not jeopardize them."
Powered by Hadoop file systems, big data and predictive analytics will continue to be a factor in groundbreaking healthcare research, Linthicum noted, adding that the technology and associated strategies will be significant in many other verticals as well.