Connected Data Platforms for the Public Sector

Public Sector Benefits from the Power of Data

The public sector is charged with protecting citizens, responding to constituents, providing services and maintaining infrastructure. In many instances, the demands of these responsibilities increase while government resources simultaneously shrink under budget pressures.

How can government, defense, and intelligence agencies, along with government contractors, do more with less? The Hortonworks Connected Data Platform is part of the answer.

Use Machine and Sensor Data to Proactively Maintain Public Infrastructure

Metro Transit of St. Louis (MTL) operates the public transportation system for the St. Louis metropolitan region. Hortonworks Data Platform helps MTL meet its mission by storing and analyzing IoT data from the city’s Smart Buses, which helped the agency cut the average cost per mile driven by its buses from $0.92 to $0.43. It achieved that cost reduction while simultaneously doubling the annual miles driven per bus. Hortonworks delivered the MTL solution in partnership with LHP Telematics, an industry leader in creating custom telematics solutions for connected vehicles in the heavy equipment OEM marketplace, transportation, service, and construction fleets. The combined solution is making MTL bus service more reliable, improving the Mean Time Between Failures (MTBF) for metro buses by a factor of five, from four thousand to twenty-one thousand miles.
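
To make the cited metrics concrete, here is a minimal Python sketch of how a fleet’s cost per mile and Mean Time Between Failures could be computed from aggregated telematics totals. The field names and sample values are hypothetical, not MTL’s actual data.

```python
# Hypothetical fleet-year totals; illustrative only, not MTL's real figures.

def cost_per_mile(total_operating_cost: float, total_miles: float) -> float:
    """Average operating cost per mile driven across the fleet."""
    return total_operating_cost / total_miles

def mtbf_miles(total_miles: float, failure_count: int) -> float:
    """Mean Time (here measured in miles) Between Failures across the fleet."""
    return total_miles / failure_count

if __name__ == "__main__":
    print(round(cost_per_mile(4_300_000, 10_000_000), 2))  # ~0.43 dollars/mile
    print(round(mtbf_miles(10_000_000, 476)))              # ~21,000 miles
```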


Understand Public Sentiment About Government Performance

One federal ministry in a European country wanted to better understand the views of its constituents on a major initiative to reduce obesity. Direct outreach might have produced a few high-quality interactions with a small number of citizens or school-age children, but those methods lacked both reach and persistence. So the Ministry began analyzing social media posts related to its obesity-reduction program. Every day, a team uses HDP to analyze tweets, posts and chat sessions and delivers daily sentiment reports to members of parliament, giving them rapid feedback on which policies work and which flop.
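
As a rough illustration of the kind of daily roll-up such a team produces, the following Python sketch scores posts against a tiny keyword lexicon and averages the scores per day. The lexicon, fields, and sample posts are invented for the example; the Ministry’s actual pipeline runs on HDP at far larger scale with more sophisticated models.

```python
from collections import defaultdict
from datetime import date

# Toy lexicon; a production pipeline would use a trained sentiment model.
POSITIVE = {"great", "helpful", "healthy", "love", "works"}
NEGATIVE = {"useless", "confusing", "hate", "waste", "unfair"}

def score(text: str) -> int:
    """Return +1 per positive lexicon hit and -1 per negative hit."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def daily_sentiment(posts):
    """posts: iterable of (date, text) pairs -> {date: mean sentiment score}."""
    totals, counts = defaultdict(int), defaultdict(int)
    for day, text in posts:
        totals[day] += score(text)
        counts[day] += 1
    return {day: totals[day] / counts[day] for day in totals}

if __name__ == "__main__":
    sample = [
        (date(2016, 5, 1), "The school lunch changes are great and healthy"),
        (date(2016, 5, 1), "This obesity campaign is a waste of money"),
        (date(2016, 5, 2), "Love the new exercise programme, it works"),
    ]
    for day, s in sorted(daily_sentiment(sample).items()):
        print(day, round(s, 2))
```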


Protect Critical Networks from Threats (Both Internal and External)

Large IT networks generate server logs that record who accesses the network and the actions they take. Server log data is typically treated as exhaust data and presents a “needle-in-a-haystack” dilemma: the vast majority of log entries have no value, but a few contain information critical to national defense. The challenge is to identify actual risks amid the noise before they lead to the loss of classified information.

Today’s intruders plan long-term, strategic campaigns known as “Advanced Persistent Threats” (APTs). Both internal actors like Edward Snowden and external attackers working for foreign governments conduct sophisticated, multi-year intrusion campaigns. Hadoop’s processing power makes it easier to find the “needles” left by these intruders across the different data “haystacks”. This generalized approach is described in a Lockheed Martin paper entitled “Intelligence-Driven Computer Network Defense Informed by Analysis of Adversary Campaigns and Intrusion Kill Chains”:

“Network defense techniques which leverage knowledge about these adversaries can create an intelligence feedback loop, enabling defenders to establish a state of information superiority which decreases the adversary’s likelihood of success with each subsequent intrusion attempt.”

Apache Hadoop can provide that information superiority, protecting against sustained campaigns by malicious users.
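
As a hedged sketch of that “needle-in-a-haystack” pattern, the following PySpark job scans access logs stored in Hadoop and flags accounts that touch an unusually large number of distinct sensitive resources in a single day. The path, column names, and threshold are assumptions for illustration, not part of any real deployment.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("access-log-triage").getOrCreate()

# Hypothetical HDFS location and schema for raw server access logs.
logs = spark.read.json("hdfs:///security/server_logs/2016/")

flagged = (
    logs.filter(F.col("resource_classification") == "SECRET")
        .groupBy("user_id", F.to_date("timestamp").alias("day"))
        .agg(F.countDistinct("resource_id").alias("distinct_secret_resources"))
        .filter(F.col("distinct_secret_resources") > 50)  # tune against a baseline
        .orderBy(F.desc("distinct_secret_resources"))
)

# The short list of flagged accounts is what analysts review, rather than
# every log line in the haystack.
flagged.show(20, truncate=False)
```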


Prevent Fraud and Waste

One federal agency with a large pool of beneficiaries turned to Apache Hadoop and the Hortonworks Data Platform to discover fraudulent claims for benefits. The implementation reduced ETL processing from 9 hours to 1 hour, which allowed the agency to create new data models around fraud, waste and abuse. With the faster ETL process, the agency tripled the data included in its daily processing. Because Hadoop is a “schema-on-read” system, rather than a traditional “schema-on-load” platform, the agency now plans to search additional legacy systems and include more upstream contextual data (such as social media and online content) in its analysis. All of this makes it easier to identify and stop fraud, waste and abuse.
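
The following PySpark sketch illustrates what “schema on read” looks like in practice, assuming hypothetical file locations and field names: raw claim records land in HDFS with no upfront schema, and structure is applied only when the data is read.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("claims-schema-on-read").getOrCreate()

# Schema is inferred at read time, so newly added upstream fields
# (e.g. social media handles) show up without reloading legacy tables.
claims = spark.read.json("hdfs:///agency/raw/claims/")

suspicious = (
    claims.groupBy("beneficiary_id")
          .agg(F.sum("claim_amount").alias("total_claimed"),
               F.count("*").alias("claim_count"))
          .filter(F.col("claim_count") > 100)   # illustrative threshold
)
suspicious.show()
```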

Analyze Social Media to Identify Terrorist Threats

Terrorist networks attempt to avoid detection by organizing and communicating across diffuse, informal networks. Yet the structure of these social networks contains information that can be used to detect and thwart malicious activity. With social network analysis over huge data sets, intelligence agencies that identify one malicious individual can find accomplices within six degrees of separation of the known bad guy. Apache Hadoop makes this analysis efficient. Of course, not everyone in contact with a known terrorist is complicit; in fact, most are uninvolved in any wrongdoing. This is another needle-in-a-haystack problem. Hadoop can store structured and unstructured data on individuals, their actions and the way they communicate. Analysis of that social data gives agencies actionable intelligence, enabling them to leave innocents alone and focus on those intending harm.
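
A minimal Python sketch of the graph traversal behind “six degrees of separation”: a breadth-first search over a contact graph returns everyone reachable from a known individual within a given number of hops. The toy graph and identifiers are illustrative; real analyses run over communications metadata stored in Hadoop at far larger scale.

```python
from collections import deque

def within_n_degrees(graph, source, max_depth=6):
    """Return {person: degree of separation} for everyone reachable from
    `source` within `max_depth` hops. `graph` maps a person ID to the set
    of IDs they communicate with."""
    seen = {source: 0}
    queue = deque([source])
    while queue:
        person = queue.popleft()
        if seen[person] == max_depth:
            continue
        for contact in graph.get(person, ()):
            if contact not in seen:
                seen[contact] = seen[person] + 1
                queue.append(contact)
    del seen[source]
    return seen

if __name__ == "__main__":
    # Toy contact graph with made-up identifiers.
    contacts = {
        "known_suspect": {"a", "b"},
        "a": {"known_suspect", "c"},
        "b": {"known_suspect"},
        "c": {"a", "d"},
        "d": {"c"},
    }
    print(within_n_degrees(contacts, "known_suspect"))
    # {'a': 1, 'b': 1, 'c': 2, 'd': 3}
```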

Decrease Budget Pressures by Offloading Expensive SQL Workloads

During the recent sequestration standoff within the United States federal government, IT budgets came under increased scrutiny and budgetary pressure. Many agencies turned to a major consulting firm, which recommended Hortonworks Data Platform for offloading certain data sets to Hadoop. The recommendation followed the best practice of putting each data workload in the most appropriate place. HDP interoperates with all of the major relational data warehouse platforms used by federal agencies, and it does not make economic sense to store every type of data in those platforms. Transitioning less structured data sets to Hadoop reduced expenses without disrupting existing data or operations: the same data remains accessible, but it is stored at a lower cost.
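
As a hedged sketch of that offload pattern, the following PySpark snippet copies a cold, lightly structured table out of a relational warehouse via JDBC and stores it in HDFS as Parquet, where it remains queryable through Hive or Spark SQL. The connection details, table name, and paths are placeholders, not a real agency configuration.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("warehouse-offload").getOrCreate()

# Read a rarely queried archive table from the warehouse (placeholder details).
cold_data = (
    spark.read.format("jdbc")
         .option("url", "jdbc:postgresql://warehouse.example.gov:5432/edw")
         .option("dbtable", "archive.server_event_history")
         .option("user", "etl_user")
         .option("password", "********")
         .load()
)

# Same data, still queryable, now stored at lower cost in Hadoop.
cold_data.write.mode("overwrite").parquet("hdfs:///edw_offload/server_event_history")
```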

The MITRE Corporation Transforms Aviation

The MITRE Corporation® is a not-for-profit company that operates multiple federally funded research and development centers (FFRDCs). One of these FFRDCs is the Center for Advanced Aviation System Development (CAASD), which serves the public interest by advancing the safety, security, effectiveness, and efficiency of aerospace in the United States and around the world. Transforming the National…