April 01, 2015

The Evolution of the “Internet of Things”: From “Diagnostics and Repair” to “Prescriptive and Proactive”

In this guest blog, Dale Glover, vice president of Industry Consulting at Teradata, discusses the evolution of the Internet of Things and how sensor data are used not only for diagnostics and repair but also for prescriptive and proactive analytics.

The Internet of Things (IOT) is upon us, and we see new participants joining this evolution of technology every day. Whether it is some new consumer device/appliance contributing to the ultimate automation of our homes and vehicles, or the deluge of new smartphone-connected devices (Fitbits, watches, or even smart clothing), the number of “things” generating volumes of data in an automatic and consistent manner increases daily. What is not talked about so much, but is even more important, is the evolution of the IOT in industrial and commercial use.

I use the word “evolution” because the foundation for much of the IOT has been growing for decades. For instance, the diagnostic and control systems embedded into equipment, from engines to shop-floor systems, have matured as rapidly as the computer systems that run them.

What Happened?

In the mid-1980s, GM took analog measurements and input parameters from engines, then used a lookup table in a memory chip to yield pre-computed output values. After those simple first designs, electronics became cheaper and more sophisticated, making way for further innovation. Now, even a simple engine designed for industrial use has hundreds of sensors recording thousands of measurements, along with enough memory to store months or even years of data. Until recently, the primary use of these embedded measurements and associated low-level diagnostics was to help identify the root cause of failure and facilitate the repair process. Essentially, we were looking at “What Happened?”
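The lookup-table approach can be sketched in a few lines: quantize the analog inputs into bands, then index a table of outputs that were pre-computed offline and burned into a memory chip. The sketch below is purely illustrative; the table, field names, and values are invented, not GM's actual calibration data.

```python
# Hypothetical sketch of an early lookup-table engine controller:
# inputs are quantized into bands and used to index pre-computed
# outputs, so no on-board floating-point math is needed.

# Fuel-injection pulse widths (ms), pre-computed offline for each
# (RPM band, throttle band) combination. Values are invented.
FUEL_TABLE = [
    # throttle: 0-25%  25-50%  50-75%  75-100%
    [1.0, 1.4, 1.8, 2.2],   # 0-2000 RPM
    [1.2, 1.6, 2.0, 2.6],   # 2000-4000 RPM
    [1.4, 1.9, 2.4, 3.0],   # 4000-6000 RPM
]

def fuel_pulse_ms(rpm: float, throttle_pct: float) -> float:
    """Quantize the inputs into table bands and return the stored output."""
    row = min(int(rpm // 2000), len(FUEL_TABLE) - 1)
    col = min(int(throttle_pct // 25), len(FUEL_TABLE[0]) - 1)
    return FUEL_TABLE[row][col]

print(fuel_pulse_ms(3200, 60))  # -> 2.0 (2000-4000 RPM, 50-75% throttle)
```

The appeal in the 1980s was exactly this simplicity: a table read is fast, deterministic, and cheap in silicon, while all the hard engineering goes into computing the table offline.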

What’s Happening Now?

The next evolutionary step came from the ability to connect to diagnostics remotely, at a relatively low cost. Most in-home consumer devices now include cheap, embedded Wi-Fi to connect to ubiquitous home broadband connections. Wearable and other mobile devices use Bluetooth to link to a cell phone. During installation, we are often asked for permission for all this equipment to report back its “health”. Most consumers agree, and the information begins to flow. In these cases the cost of communicating data back to the manufacturer is borne by the consumer.

In industrial use cases, factory equipment, medical equipment and computer systems are most often installed with the requirement that the end user/buyer provides the manufacturer with remote access capability for service purposes. In the early days this was a simple modem allowing a service technician to access the same embedded functions to facilitate service and repair. This visibility of “What’s Happening?” across deployed equipment was an exciting and cost-effective step, but what came next brought more meaningful and longer term impact.

What’s Going to Happen?

Manufacturers of equipment for medical, high-tech, automotive, and heavy industrial applications began collecting and storing historical data in traditional DBMS systems years ago. Analytic models were developed to understand basic early warning signals and ultimately to predict part failures in advance. Applications of these models enable prescriptive actions to be taken in advance of failures in order to avoid costly equipment outages.
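A minimal form of such an early-warning model is drift detection: flag a part when its recent readings move outside a tolerance band learned from its own history. The sketch below is illustrative only, not any manufacturer's actual model; the function, readings, and threshold are all invented.

```python
# Illustrative early-warning sketch: flag a part for prescriptive
# replacement when recent sensor readings drift beyond a tolerance
# band derived from the part's own historical baseline.

from statistics import mean, stdev

def early_warning(history, recent, k=3.0):
    """Return True if the mean of `recent` readings deviates more than
    k standard deviations from the long-run baseline in `history`."""
    baseline, spread = mean(history), stdev(history)
    return abs(mean(recent) - baseline) > k * spread

# Vibration readings (arbitrary units): a stable history, then two
# recent windows -- one in-band, one drifting upward.
history = [1.0, 1.1, 0.9, 1.0, 1.05, 0.95, 1.0, 1.1]
print(early_warning(history, [1.02, 0.98, 1.01]))  # False: within band
print(early_warning(history, [1.6, 1.7, 1.65]))    # True: schedule service
```

Production models are of course far richer (survival analysis, multivariate sensor fusion), but the prescriptive pattern is the same: detect the signal early enough that a planned part swap replaces an unplanned outage.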

It is the combination of increasingly dense sensor configurations, low or no cost communication, and the ability to apply analytic models in real/right time that has ultimately and forever changed the IOT. While a few world class companies have implemented predictive analytic programs effectively, the majority still face obstacles.

Massive amounts of historical equipment performance data can now be captured and uploaded by the manufacturer, but costs to load and store all this information in traditional DBMS environments have historically been prohibitive. Often information generated by field equipment becomes nothing more than data exhaust, not recorded and ultimately lost forever before any timely decision-making can take place.

Integration & Proactivity: Keys to Future Success

Enter Apache Hadoop. Using Hadoop as a platform to land and store data in its original form and fidelity on low-cost commodity hardware dramatically lowers the cost bar and allows retention of a vast history of data generated by embedded systems.

In the aircraft industry, a Teradata/Hortonworks client today collects high volumes of sensor data in flight and loads it into a Hortonworks Data Platform (HDP) Hadoop cluster. Combining that data with maintenance history held on the Teradata platform (using Teradata QueryGrid to bridge the systems and join the relevant data sets) has enabled analysis that reveals patterns across aircraft systems and flight conditions. This modeling validates the original design against actual field operation and provides insight into improvement opportunities in field support programs.
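The shape of that cross-system join can be shown with a toy example. In the real deployment the sensor records sit in HDP and the maintenance history in Teradata, bridged by QueryGrid; the plain-Python sketch below only illustrates the join logic, and every field name and record in it is invented.

```python
# Toy sketch of joining in-flight sensor summaries with maintenance
# history, then filtering to the combinations a field support program
# might escalate. All records and thresholds are invented.

sensor_readings = [  # landed from in-flight telemetry (HDP side)
    {"tail": "N101", "flight": "F1", "max_vibration": 0.8},
    {"tail": "N101", "flight": "F2", "max_vibration": 1.9},
    {"tail": "N202", "flight": "F3", "max_vibration": 0.7},
]
maintenance = {  # keyed by tail number (warehouse side)
    "N101": {"last_overhaul_days": 410},
    "N202": {"last_overhaul_days": 35},
}

# Join on tail number; keep flights whose vibration peak exceeded 1.5
# on airframes more than a year past overhaul.
flagged = [
    {**reading, **maintenance[reading["tail"]]}
    for reading in sensor_readings
    if reading["max_vibration"] > 1.5
    and maintenance[reading["tail"]]["last_overhaul_days"] > 365
]
print(flagged)  # one flagged flight: tail N101, flight F2
```

The value comes from the combination: neither the vibration spike nor the overhaul age is alarming alone, but joined together they identify the specific airframe worth a proactive inspection.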

Another client, a major oil and gas producer, gathers sensor data from a network of wells, production, and storage facilities and brings it into a central, integrated data environment. Engineers continuously evaluate field practices and refine predictive models based on real-world conditions. Daily work is assigned on an exception basis to the highest value critical field opportunities covering predictive parts replacement, transportation logistics, and rework of field systems.

Enterprises have realized this potential and adopted Apache Hadoop as a go-to technology to capture, store and process these net new data sources within their modern data architecture. Data exhaust can now be used to make routine and critical operational decisions. And it’s driving bottom line value to the business.

Hortonworks and Teradata have partnered to deliver a clear path to Big Analytics via stable and reliable Hadoop for the enterprise. The Teradata Portfolio for Hadoop is a flexible offering of products and services for customers to integrate Hadoop into their data architecture while taking advantage of their installed DBMS Analytic environment.

The IOT will continue to evolve with machine generated data volumes growing exponentially along with new and exciting ways to exploit this data to better run our companies and serve our customers. World class organizations must have a strategy to leverage this information and for their technology to evolve in support of it.
