Hadoop Insights

News about Hadoop in the wild; how Hadoop is being used; how Hadoop can be used.

Changes in technology and customer expectations create new challenges for how insurers engage their customers, manage risk information and control the rising frequency and severity of claims.

Carriers need to rethink traditional models for customer engagement. Advances in technology and the adoption of retail engagement models drive fundamental changes in how customers shop for and purchase insurance coverage. To engage with their customers, our insurance customers seek “omni-channel” insight and the ability to confidently recommend the next best action (NBA).…

Hortonworks provides enterprise Hadoop for telecommunications service providers. Hortonworks Data Platform (HDP) is built from the ground up on a centralized, YARN-based architecture, with core enterprise services for data governance, security and cluster operations that can revolutionize your telecommunications business.

As the originators of Hadoop, leaders in the developer community, and partners for your success, nobody is better positioned than Hortonworks to help you become a data-centric telecommunications enterprise.

Hortonworks supports most of the largest North American carriers.…

This is a unique moment in time. Fueled by open source, Apache Hadoop has become an essential part of the modern enterprise data architecture and the Hadoop market is accelerating at an amazing rate.

The impressive thing about successful open source projects is the pace of the “release early, release often” development cycle, also known as upstream innovation. The process moves through major and minor releases at a regular clip and the downstream users get to pick the releases and versions they want to consume for their specific needs.…

Since our founding in 2011, Hortonworks has had a fundamental belief: the only way to deliver infrastructure platform technology is completely in open source. Moreover, we believe that collaborative open source software development under the governance model of an entity like the Apache Software Foundation (ASF) is the best way to accelerate innovation for enterprise end users: it brings together the largest number of developers, enables innovation to happen far faster than any single vendor could achieve, and does so in a way that is free of friction for the enterprise.…

The Beginning of our Oil and Gas Journey

Hortonworks began working with the Oil & Gas industry in November of 2013, and our involvement accelerated during a very busy 2014 campaign. Our momentum was set against an early-year backdrop of milestones in drilling and production across unconventional shale plays in North America, along with a number of acquisitions, mergers, and divestitures that continued to shape the industry landscape.…

The public sector is charged with protecting citizens, responding to constituents, providing services and maintaining infrastructure. In many instances, the demands of these responsibilities increase while government resources simultaneously shrink under budget pressures.

How can Intelligence, Defense and Civilian agencies do more with less?

Apache Hadoop is part of the answer. Within the public sector, Hadoop delivers data-driven actions in support of IT efficiency and good government.

Download the White Paper

In one example, the United States Internal Revenue Service had to reduce its auditor headcount due to budget cuts.…

Modern retailers collect data from a multitude of consumer engagement channels, including point of sale systems, the web, mobile applications, social media, and more. They hope to use this data to derive greater customer insights, promote increased brand engagement and loyalty, optimize pricing and promotions, streamline the supply chain, and enhance their business models.

Data from the retailer’s transactional systems has historically been stored in an enterprise data warehouse (EDW) or other database, but these traditional data repositories are not well suited for the newer, unstructured data types like log files, social media updates and information from in-store sensors.…
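As a purely illustrative sketch (not drawn from the original post), the Java snippet below shows the schema-on-read pattern this implies: raw, semi-structured log lines are landed in HDFS as-is through the standard Hadoop FileSystem API, and structure is applied later at query time rather than being forced up front as an EDW would require. The NameNode address, target path and sample records are hypothetical placeholders.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

import java.io.BufferedWriter;
import java.io.OutputStreamWriter;
import java.net.URI;
import java.nio.charset.StandardCharsets;

public class RawLogLoader {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Hypothetical NameNode address; in a real deployment this comes from core-site.xml.
        FileSystem fs = FileSystem.get(URI.create("hdfs://namenode:8020"), conf);

        // Land the raw events under a date-partitioned directory, exactly as they arrive.
        // No schema is imposed here; Hive or another engine can interpret the data at read time.
        Path target = new Path("/data/retail/clickstream/2014-07-01/events.log");
        try (BufferedWriter out = new BufferedWriter(
                new OutputStreamWriter(fs.create(target, true), StandardCharsets.UTF_8))) {
            out.write("2014-07-01T12:00:01\tuser-42\tPDP_VIEW\tsku=1234\n");
            out.write("2014-07-01T12:00:05\tuser-42\tADD_TO_CART\tsku=1234\n");
        }
        fs.close();
    }
}

The same raw files could later be exposed to analysts, for example through an external Hive table over the /data/retail/clickstream directory, without ever reshaping them to fit a fixed warehouse schema.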

Merv Adrian couldn’t have said it better. In his blog post over the weekend, he continued his quest to define Hadoop. And it is no easy quest, because Hadoop’s components, and Hadoop itself, are evolving at a pace that is, frankly, astounding.

The continuous evolution of Hadoop has even given rise to sentiments such as ‘Is Hadoop dead?’ The answer to that question is YES. And NO.…

We certainly live in interesting times. About 20 months ago, in an effort to find proprietary differentiation that could be used to monetize and lock customers into their model, Cloudera unveiled Impala, and at that time Mike Olson stated “Our view is that, long-term, this will supplant Hive”. Only 6 months ago, in his Impala v Hive post, Olson defended his “decision to develop Impala from the ground up as a new project, rather than improving the existing Apache Hive project” stating “Put bluntly: We chose to build Impala because Hive is the wrong architecture for real-time distributed SQL processing.”

So, 20 months after abandoning Hive, and after repeated marketing attempts to throw Hive and many other SQL alternatives under the bus in favor of their “better” approach, I’m certainly puzzled as Cloudera unveils their plan to enable Apache Hive to run on Apache Spark; please see HIVE-7292 for details.…
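For context, HIVE-7292 is the community effort to add Apache Spark as a third execution engine for Hive, alongside MapReduce and Tez. As a hedged sketch (assuming a Hive build that includes the Spark engine), the Java/JDBC snippet below shows what this means for an existing workload: the HiveQL itself is unchanged, and only a session-level property selects the engine. The HiveServer2 host, credentials and table name are hypothetical.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveOnSparkSketch {
    public static void main(String[] args) throws Exception {
        // Standard HiveServer2 JDBC driver; host, port and credentials are placeholders.
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:hive2://hiveserver2-host:10000/default", "hive", "");
             Statement stmt = conn.createStatement()) {

            // Switch this session from the default engine (mr or tez) to Spark.
            stmt.execute("SET hive.execution.engine=spark");

            // The same HiveQL query runs unchanged on the newly selected engine.
            try (ResultSet rs = stmt.executeQuery(
                    "SELECT category, COUNT(*) FROM sales GROUP BY category")) {
                while (rs.next()) {
                    System.out.println(rs.getString(1) + "\t" + rs.getLong(2));
                }
            }
        }
    }
}

This is the crux of the Hive-on-Spark approach: the engine is a pluggable property rather than a different SQL dialect, so improving Hive’s execution layer does not require replacing Hive.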

We’re finally catching our breath after a phenomenal Hadoop Summit event last week in San Jose. Thank you to everyone who came to participate in the celebration of Hadoop advances and adoption—from the many organizations that shared with us how their Hadoop journeys have fundamentally transformed their businesses, to those just getting started, to the huge ecosystem of vendors. It is amazing to be part of such a broad and deep community that is contributing to making the market for everyone.…

Hurry, time is running out to join Mike Ferguson, independent analyst and thought leader in Business Analytics, Big Data, Data Management and Smart Business, as he explores how the growing business demand to analyze new sources of data is impacting traditional architectures, and how those architectures need to change to accommodate big data analytical workloads.

In this brief session, Mike looks at new analytical use cases, the types of data that need to be analyzed, and the role of Hadoop in a modern analytical environment.…

If there’s one thing my interactions with our customers have taught me, it’s that Apache Hadoop didn’t disrupt the datacenter, the data did. The explosion of new types of data in recent years has put tremendous pressure on the datacenter, both technically and financially, and an architectural shift is underway in which Enterprise Hadoop plays a key role in the resulting modern data architecture.

Download our Whitepaper: Hadoop and a Modern Data Architecture.

Due to the proliferation of Apache Software Foundation projects that have emerged in recent years in and around the Apache Hadoop project, a common question I get from mainstream enterprises is: What is the definition of Hadoop?

This question goes beyond the Apache Hadoop project itself, since most folks know that it’s an open source technology born out of the experience of web-scale consumer companies such as Yahoo!, Facebook and others who were confronted with the need to store and process massive quantities of data.…

This is the seventh in our series on modern data architectures across industry verticals. Others in the series are:

Any financial services business cares about minimizing risk and maximizing opportunity. Banks weigh the risk of opening accounts versus the opportunity to hold deposits.…

Luminar is one of Hortonworks’ original customers. Apache Hadoop is a pillar of their modern data architecture, and after choosing Hortonworks in 2012, the Luminar team became expert users of Hortonworks Data Platform version 1.

They were eager to migrate to HDP2 after it launched in October 2013.

I recently spoke with Juan Manuel Alonso, Luminar’s Manager of Insights. Juan Manuel worked with the Hortonworks professional services team to plan and execute the migration from HDP1 to HDP2.…