The Hortonworks Blog

Posts categorized by: Hadoop in the Enterprise

Geoff Flood is president of T4G Limited and co-chair of the province of New Brunswick Research & Innovation Council. In this guest blog, Geoff elaborates on why “partnering with Hortonworks was simply a no-brainer for us. It’s a decision that will deliver prized and measurable value to our customers.”

Big data is more than just buzz; it’s a big deal. It’s changing everything in our lives and all around us. As president of a successful technology services firm in Canada, I knew we had to change, too, when it comes to designing, developing and implementing solutions for our customers across North America.…

I can’t believe it’s been 6 months since we first announced our expanded strategic alliance with Red Hat. For those who have been following this partnership, you know our goal is simple — to help organizations adopt enterprise Apache Hadoop as part of their modern data architecture. Our expanded relationship with Red Hat is closely aligned around a strategy of innovating in the open and applying enterprise rigor to open source software, thereby de-risking it for the enterprise and allowing faster adoption of Enterprise Apache Hadoop.…

The open source community, including Hortonworks, has invested heavily in building enterprise-grade security for Apache Hadoop. These efforts include Apache Knox for perimeter security, Kerberos for strong authentication and the recently announced Apache Argus incubator, which brings a central administration framework for authorization and auditing.
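To make the Kerberos piece concrete: a client on a Kerberos-secured cluster typically authenticates through Hadoop’s UserGroupInformation API before touching HDFS or submitting jobs. Below is a minimal sketch of our own; the principal and keytab path are placeholders, not details from any of these posts.

```java
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class KerberosLogin {
  public static void main(String[] args) throws IOException {
    Configuration conf = new Configuration();
    // Tell the Hadoop client libraries that the cluster expects Kerberos.
    conf.set("hadoop.security.authentication", "kerberos");
    UserGroupInformation.setConfiguration(conf);

    // Authenticate as a service principal using its keytab file.
    // Principal and path are hypothetical placeholders.
    UserGroupInformation.loginUserFromKeytab(
        "etl-svc@EXAMPLE.COM", "/etc/security/keytabs/etl-svc.keytab");

    System.out.println("Logged in as: "
        + UserGroupInformation.getCurrentUser().getUserName());
  }
}
```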

Join Hortonworks and Voltage Security in a webinar on August 27 to learn more.

In multi-platform environments with data coming from many different sources, personally identifiable information, credit card numbers, and intellectual property can land in the Hadoop cluster.…

Zettaset is a Hortonworks partner. In this guest blog, John Armstrong, VP of Marketing at Zettaset Inc., shares Zettaset’s security features and explains why data encryption is vital for data in the Hadoop infrastructure.

Comprehensive Security Across the Hadoop Infrastructure

As big data technologies like Hadoop become widely deployed in production environments, the expectation is that they will meet enterprise requirements for data governance, operations and security while integrating with existing data center infrastructure. …

The key to monetizing Big Data is not only capturing and processing information quickly but also analyzing it to derive meaningful insights. Big Data can be complex, and often the expertise of a programmer is needed to create focused and targeted queries.

0xdata, a provider of open source machine learning and predictive analytics for Big Data, helps to facilitate the manipulation and extraction of data with the use of its H2O prediction engine for statisticians. …

With the release of Apache Hadoop YARN in October of last year, more and more solution providers are moving from single-application Hadoop clusters to a versatile, integrated Hadoop 2 data platform. This allows them to host multiple applications — eliminating silos, maximizing resources and bringing true multi-workload capabilities to Hadoop. 

That is why we’re extremely excited to have Paul Kent, Vice President of Big Data at SAS, share his insights on the value of Apache Hadoop YARN and the benefits it brings to SAS and its users. …
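To make the multi-workload point concrete: YARN exposes a client API that any framework, SAS included, can use to see what is sharing the cluster. Here is a minimal sketch using the standard YarnClient API, assuming a ResourceManager reachable via the settings in yarn-site.xml:

```java
import java.util.List;
import org.apache.hadoop.yarn.api.records.ApplicationReport;
import org.apache.hadoop.yarn.client.api.YarnClient;
import org.apache.hadoop.yarn.conf.YarnConfiguration;

public class ListYarnApps {
  public static void main(String[] args) throws Exception {
    // Connects to the ResourceManager configured in yarn-site.xml.
    YarnClient yarnClient = YarnClient.createYarnClient();
    yarnClient.init(new YarnConfiguration());
    yarnClient.start();
    try {
      // One cluster, many engines: MAPREDUCE, TEZ and other
      // application types show up side by side.
      List<ApplicationReport> apps = yarnClient.getApplications();
      for (ApplicationReport app : apps) {
        System.out.printf("%s\t%s\t%s%n",
            app.getApplicationId(), app.getApplicationType(), app.getName());
      }
    } finally {
      yarnClient.stop();
    }
  }
}
```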

Few industries depend as heavily on data as financial services. Insurance companies, retail and investment banks aggregate, price and distribute capital with the aim of increasing their return on assets with an acceptable level of risk.

To do that, financial decision-makers need data. Apache Hadoop helps them store new data sources, then process the larger combined dataset for batch, interactive and real-time analysis. More data and better analysis improves bottom-line results.…

The world’s top telecommunications firms adopt Hadoop to gain competitive advantage and to respond to technology-driven changes like increases in both network traffic and the telemetry data captured by network sensors.

The majority of North America’s and Europe’s telcos have chosen Hortonworks Data Platform (HDP) to meet these challenges. Read the new Hortonworks white paper for a detailed discussion of twenty-one common telco and cable company use cases.

Download the White Paper

With their Modern Data Architectures based on HDP, these firms improve efficiency and capture opportunities in some of these ways:

  • Analyze call detail records (CDRs), as in the sketch below.
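As one illustration of the CDR use case, a batch job might total calls per subscriber across raw CDR files in HDFS. The MapReduce sketch below is ours, not from the white paper; it assumes CSV records whose first field is the calling number.

```java
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class CdrCallCount {
  public static class CdrMapper
      extends Mapper<LongWritable, Text, Text, LongWritable> {
    private static final LongWritable ONE = new LongWritable(1);
    @Override
    protected void map(LongWritable key, Text value, Context ctx)
        throws IOException, InterruptedException {
      // Assumed layout: caller,callee,duration,... (hypothetical schema)
      String[] fields = value.toString().split(",");
      if (fields.length > 0 && !fields[0].isEmpty()) {
        ctx.write(new Text(fields[0]), ONE); // emit (caller, 1)
      }
    }
  }

  public static class SumReducer
      extends Reducer<Text, LongWritable, Text, LongWritable> {
    @Override
    protected void reduce(Text caller, Iterable<LongWritable> counts, Context ctx)
        throws IOException, InterruptedException {
      long total = 0;
      for (LongWritable c : counts) total += c.get();
      ctx.write(caller, new LongWritable(total)); // total calls per caller
    }
  }

  public static void main(String[] args) throws Exception {
    Job job = Job.getInstance(new Configuration(), "cdr call count");
    job.setJarByClass(CdrCallCount.class);
    job.setMapperClass(CdrMapper.class);
    job.setReducerClass(SumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(LongWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```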

ScaleOut joined the Hortonworks Technology Partner Program and recently achieved Hortonworks Certified status for ScaleOut hServer. ScaleOut Software is a pioneer in in-memory data grid software, and ScaleOut hServer installs directly on Hadoop nodes and runs in-memory. In this guest blog, William Bain, Founder and CEO, talks about certification and a use case.

Recently, ScaleOut Software announced technical certification of its ScaleOut hServer® product on Hortonworks Data Platform 2.1.…

This is a guest blog from Voltage Security, a Hortonworks partner.

Data Security for Hadoop is a critical requirement for adoption within the enterprise. Organizations must protect sensitive customer, partner and internal information and adhere to an ever-increasing set of compliance requirements. The security challenges these organizations are facing are diverse and the technology is evolving rapidly to keep pace. 

An Open Community For Platform Security

The open source community, including Hortonworks, has invested heavily in building enterprise-grade security for Apache Hadoop. …

This is a guest post from Hortonworks partner Dataguise, an HDP 2.1-certified technology partner providing sensitive data discovery, protection and reporting in Hadoop.

According to a 2013 Global Data Breach study by the Ponemon Institute, the average cost of data loss exceeds $5.4 million per breach, and the average cost of lost data in the United States approaches $200 per record. No industry is spared from this threat, and all of our data systems, including Hadoop, need to address this security concern.…

A transformation is occurring in the data center. Enterprises are turning to a modern data architecture in order to derive maximum value from both big and small data across their organization. They are building new analytic apps that unlock opportunity and allow them to maintain or create a competitive edge. Apache Hadoop is at the center of this architecture and integrates with the technologies that run your business to augment and extend this new value.…

Many projects contributed to the Apache Software Foundation (ASF) by vendors and users alike greatly expand Apache Hadoop’s capabilities as an enterprise data platform.

While Hadoop – with YARN at its architectural center – provides the foundational capabilities for managing and accessing data at scale, a broader blueprint for Enterprise Hadoop has emerged that specifies how this array of Apache projects fit across five distinct pillars to form a complete enterprise data platform: data access, data management, security, operations and governance.…

Today we are excited to announce a deepening of our strategic partnership with HP. This news builds on the reseller partnership that we established in 2013, which enabled HP to resell the Hortonworks Data Platform. It also builds on the HP AllianceOne ConvergedSystems Partner of the Year Award that we received for our strategic partnership at the recent HP Discover 2014 conference.

Given the rapid adoption of Enterprise Hadoop as a core component of a modern data architecture, combined with the fact that HP is the world’s leading server vendor in terms of both shipments and revenues according to IDC – meaning a significant number of those Hadoop nodes are being deployed on HP technologies – it’s hardly surprising that we’ve been collaborating closely.…

Although Hadoop Summit San Jose 2014 has come and gone, the invaluable content—keynotes, sessions, and tracks—is available here. We’ve selected a few sessions for Hadoop developers, practitioners, and architects, curating them under Apache Hadoop YARN, the architectural center and the data operating system.

In most of the keynotes and tracks, three themes resonated:

  • Enterprises are transitioning from traditional Hadoop to modern Hadoop 2.
  • YARN is an enabler, the central orchestrator that facilitates multiple workloads, runs multiple data engines, and supports multiple access patterns—batch, interactive, streaming, and real-time—in Apache Hadoop 2.