Unlocking the human brain with big data

Healthcare professionals have been leveraging big data to enhance clinical research and treatment for several years. For example, predictive analytics applications have been used to gauge a patient's likelihood of developing chronic ailments such as heart disease and to identify early warning signs of traumatic brain injury. An MIT research group has even experimented with data analytics software that predicts heart failure and diagnoses mental illnesses from the results of EEG tests.

Mapping the human brain
The federal government now appears to be throwing its weight behind big data applications in medical research. InformationWeek reported that President Barack Obama recently announced the formation of a data analytics initiative to map out the human brain. The Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative is a collaboration between several organizations involved in the medical and analytics fields, including the National Institutes of Health and the Defense Advanced Research Projects Agency.

The goal of the project is to facilitate the development of new technology that will allow researchers to determine how brain cells and neural circuits interact in real time. Theoretically, that information could be used to produce a near endless list of benefits. For instance, medical researchers are hopeful that by obtaining a better understanding of complex neurological operations, they may be able to develop more effective treatments for degenerative diseases such as Parkinson's and Alzheimer's.

Tapping into more data processing power
The amount of processing power needed to achieve such a feat would be immense. Current technology may not yet be equal to the task, but the initiative's members are optimistic: the rapid growth of big data analytics suggests that the necessary level of data processing is not far off.

"There have been some conversations about whether the amount of data, if you are going to collect data from tens of thousands of neurons in real time, can you process and store it," NIH director Francis Collins said. "This is generally in the direction of the capability where things are headed."
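To put Collins's concern in perspective, a rough back-of-envelope calculation shows why real-time recording at that scale strains storage and processing. All of the figures below are illustrative assumptions, not BRAIN Initiative specifications:

```python
# Back-of-envelope estimate of the raw data rate from a large-scale
# neural recording. All figures are illustrative assumptions, not
# BRAIN Initiative specifications.

NEURONS = 50_000          # "tens of thousands" of neurons
SAMPLE_RATE_HZ = 30_000   # typical extracellular sampling rate (assumed)
BYTES_PER_SAMPLE = 2      # 16-bit samples (assumed)

bytes_per_second = NEURONS * SAMPLE_RATE_HZ * BYTES_PER_SAMPLE
gigabytes_per_hour = bytes_per_second * 3600 / 1e9

print(f"{bytes_per_second / 1e9:.1f} GB/s raw")   # 3.0 GB/s raw
print(f"{gigabytes_per_hour:,.0f} GB per hour")   # 10,800 GB per hour
```

Even under these modest assumptions, a single hour of uncompressed recording runs to roughly ten terabytes, which is exactly the regime where distributed storage and processing become necessary.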

This monumental undertaking will undoubtedly require powerful, highly specialized data analytics processes. Researchers could well turn to the Hadoop architecture to build the tools they need. Hadoop's open source platform allows software engineers to craft custom tools for specific goals, and because Hadoop scales horizontally across clusters of commodity hardware, researchers can grow those tools to whatever size and complexity the work demands.
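As a sketch of how a MapReduce-style Hadoop job might aggregate such recordings, the snippet below simulates the map, shuffle, and reduce phases in plain Python. The record format and field names are hypothetical; in a real Hadoop deployment the phases would run distributed across a cluster:

```python
from collections import defaultdict

# Hypothetical records: (neuron_id, spike_count_in_window). In a real
# Hadoop job the map and reduce functions would run distributed across
# a cluster; here they are simulated in-process for illustration.
records = [("n1", 4), ("n2", 7), ("n1", 3), ("n3", 1), ("n2", 2)]

def map_phase(record):
    neuron_id, count = record
    yield neuron_id, count  # emit key/value pairs, as a Hadoop mapper would

def reduce_phase(key, values):
    return key, sum(values)  # total spikes recorded for one neuron

# Shuffle step: group mapper output by key (Hadoop does this
# automatically between the map and reduce phases).
grouped = defaultdict(list)
for record in records:
    for key, value in map_phase(record):
        grouped[key].append(value)

totals = dict(reduce_phase(k, v) for k, v in grouped.items())
print(totals)  # {'n1': 7, 'n2': 9, 'n3': 1}
```

The appeal of this model for neuroscience workloads is that the per-key logic stays simple while the framework handles distribution, so the same job scales from a test file to cluster-sized recordings without rewriting the analysis.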
