The Hortonworks Blog

Posts categorized by: Other

This guest post is from Vamsi Chemitiganti, chief architect of Red Hat’s Financial Services Vertical. Vamsi is responsible for driving Red Hat’s technology vision from a client standpoint. His areas of focus range from platform, middleware, and storage to big data and cloud (IaaS and PaaS). The clients Vamsi engages with on a daily basis span marquee names on Wall Street, including businesses in capital markets, core banking, wealth management and IT operations.…

Apache Hive is the de facto standard for SQL in Hadoop, with more enterprises relying on this open source project than on any other alternative. Stinger.next, a community-based effort, is delivering true enterprise SQL at Hadoop scale and speed.

With Hive’s prominence in the enterprise, security within Hive has come under greater focus from enterprise users, who have come to expect fine-grained access control and auditing within Hive. Apache Ranger provides centralized security administration for Hadoop, and it enables fine-grained access control and deep auditing for Apache components such as Hive, HBase, HDFS, Storm and Knox.…
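To make the idea of fine-grained access control concrete, here is a minimal sketch (not from the original post) of defining a table-level SELECT policy for Hive through Ranger’s policy administration REST API. The Ranger admin URL, service name ("hivedev"), database and table names, credentials, and group names are illustrative assumptions, and the exact policy model can differ across Ranger versions.

```python
# Hypothetical sketch: create a read-only Hive policy via the Ranger admin REST API.
# Host, credentials, service name ("hivedev"), and group names are assumptions.
import json
import requests

ranger_admin = "http://ranger-admin.example.com:6080"

policy = {
    "service": "hivedev",                      # Ranger service repository for Hive
    "name": "sales_transactions_readonly",
    "isAuditEnabled": True,                    # record every access decision
    "resources": {
        "database": {"values": ["sales"]},
        "table":    {"values": ["transactions"]},
        "column":   {"values": ["*"]},
    },
    "policyItems": [
        {
            "accesses": [{"type": "select", "isAllowed": True}],
            "groups":   ["analysts"],          # grant SELECT to one group only
            "users":    [],
            "delegateAdmin": False,
        }
    ],
}

resp = requests.post(
    ranger_admin + "/service/public/v2/api/policy",
    auth=("admin", "admin-password"),
    headers={"Content-Type": "application/json"},
    data=json.dumps(policy),
)
resp.raise_for_status()
print("Created policy id:", resp.json().get("id"))
```

With a policy like this in place, Ranger’s Hive plugin enforces the rule at query time and writes each allowed or denied access to the central audit store.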

“Start with the business problem!” That’s Sanjay’s advice when it comes to building a successful Big Data solution. For those of you who have missed the first part of this video series, Sanjay Krishnamurthi, SVP and Chief Technology Officer at Informatica, and Shaun Connolly, Vice President Corporate Strategy at Hortonworks, address a number of hot Big Data topics throughout a series of nine videos.

Today, they talk about how Big Data projects need to be driven by the business and how IT solutions and frameworks such as Hadoop have to be integrated with the rest of the data systems.…

HDP 2.2 brings substantial innovations in Apache Hadoop YARN, enabling users of Apache Hadoop to store their data efficiently in a single repository and interact with it simultaneously using a wide variety of engines. This functionality makes YARN particularly attractive for integrating many distributed, long-running services.

In this release, we also introduced a new framework, Apache™ Slider, for easy onboarding of long-running services on top of YARN.…
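To illustrate the onboarding flow, here is a minimal, hypothetical sketch of deploying and then scaling a long-running service with the Slider command-line client. The application name ("demo-service"), component name ("WORKER"), container sizes, and file contents are illustrative assumptions rather than details from the release, and the appConfig.json template is assumed to exist already.

```python
# Hypothetical sketch: onboard a long-running service onto YARN with Apache Slider.
# Application/component names, instance counts, and resource sizes are assumptions.
import json
import subprocess

# resources.json describes, per component, how many YARN containers to run and how
# much memory/CPU each container should receive. (A real resources.json also
# declares the slider-appmaster component and schema/metadata fields.)
resources = {
    "components": {
        "WORKER": {
            "yarn.component.instances": "3",   # start with three worker containers
            "yarn.memory": "1024",             # MB per container
            "yarn.vcores": "1",
        }
    }
}
with open("resources.json", "w") as f:
    json.dump(resources, f, indent=2)

# Create the application instance; Slider asks YARN to allocate the containers and
# keeps the service running, restarting containers if they fail.
subprocess.run(
    ["slider", "create", "demo-service",
     "--template", "appConfig.json",    # application-specific configuration
     "--resources", "resources.json"],  # container counts and sizes
    check=True,
)

# Later, scale the WORKER component out to five containers without redeploying.
subprocess.run(
    ["slider", "flex", "demo-service", "--component", "WORKER", "5"],
    check=True,
)
```

Because Slider drives everything through YARN, a service onboarded this way shares the cluster’s resources with batch and interactive engines running against the same data.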

Today we’re excited to be jointly announcing with EMC that the Isilon OneFS file system has been certified to work with the Hortonworks Data Platform (HDP). Now Isilon customers who are looking for a robust, enterprise-ready, stable Apache Hadoop platform can use HDP on their Isilon implementations.

Joint Engineering Delivering Choice

We’re excited to see the results of months of joint engineering and testing that now give customers even greater deployment choice for their Hadoop projects as they implement a modern data architecture on the path to a data lake.…

OspreyData is a Hortonworks® technology partner whose solution is certified for both the Hortonworks Data Platform and YARN. The company delivers agile big data analytics solutions for the oil and gas industry. In this blog, Al Brown, CTO at OspreyData, shares his thoughts on how the industry is addressing a big problem: unplanned interruptions to production.

A Mandate for Operational Efficiency and Margin Growth

The oil and gas industry is constantly challenged with a mandate to operate more efficiently—both in the oilfield and within the data center.…

This guest post is from Gavin Sherry, Vice President of Engineering, Data, at Pivotal. A long-time contributor to database technology, Gavin was one of the early contributors to the PostgreSQL project, which led him to join the Greenplum Database R&D team. More recently, Gavin launched Pivotal HAWQ, Pivotal’s SQL-on-Hadoop engine.

Intro

In the ten years since Hadoop was first conceived at Yahoo!, the big data market has taken off.…

Recently, the Oracle Data Integrator products were certified on Hortonworks Data Platform version 2.1, and we’re delighted to be working more closely with Oracle engineering on these efforts. We’re happy to bring you this guest blog, written by Alex Kotopoulis, Product Manager for Oracle Data Integration for Big Data at Oracle, to discuss the recent integration and certification initiatives. You can learn more by joining our webinar on November 11; register here.…

Arsalan Tavakoli-Shiraji, customer engagement lead overseeing business development activities at Databricks, is our guest blogger today. In this blog, he discusses our expanded partnership built around Apache Spark on Apache Hadoop in three areas: customers, engineering, and open source.

Today Databricks and Hortonworks are announcing an expanded partnership built around Apache Spark; allow me to explain why we’re thrilled to be embarking on this journey with them.

When we started Databricks last summer, Apache Spark was in the early stages of enterprise adoption.…

Joe Travaglini, director of product marketing at Sqrrl, and Ely Kahn, vice president of business development at Sqrrl, are our guest bloggers. They explain Sqrrl’s integration with the Hortonworks Data Platform (HDP).

There Is No Secure Perimeter

With the dawn of phenomena such as cloud computing and Bring Your Own Device (BYOD), there is no longer a well-defined perimeter to secure and defend. Data can flow inside, outside, and across your network boundaries with limited interference from traditional controls.…

Last week’s release of Hortonworks Data Platform 2.2 is packed with new features for Enterprise Hadoop. These include the results of Hortonworks’ investment in vertical integration with YARN and HDFS, as well as horizontal innovation to ensure that the key enterprise services of governance, security, and operations can be applied consistently and reliably across all the components within the Apache Hadoop platform.

To guide you through these capabilities, Hortonworks is hosting a new series of eight Thursday webinars beginning on October 23 and running to December 18.…

Over the last several months, Oracle and Hortonworks have been working together to bring ETL tools and connectors to the Hortonworks Data Platform (HDP). Hortonworks and Oracle have teamed up to provide comprehensive data integration capabilities and technologies that lay the foundation for a modern data integration architecture, delivering on the promise of big data for customers of all sizes and scale.

We’re happy to announce that the Oracle Data Integrator (ODI) is now certified with HDP 2.1.…

Thanks to all who joined us for our Hortonworks/Voltage webinar, “Securing Hadoop: What are Your Options?” For those who couldn’t attend, we’re sorry we missed you. We’ve included a link to the webinar recording below; please listen in!

On the webinar, Hortonworks’ Vinod Nair presented the recently announced Apache Argus incubator: a central policy administration framework covering security requirements for authentication, authorization, auditing, and data protection. Sudeep Venkatesh of Voltage Security described data-centric protection technologies that integrate easily with Hive, Sqoop, MapReduce, and other Hadoop interfaces.…

In case you missed it: earlier this week, Alan Gates and team provided some insights into the Stinger.next roadmap for delivering enterprise SQL at Hadoop scale. We’re excited to continue the conversation and to share some of our key partners’ excitement about this important initiative. Today’s guest blogger, Michael Hiskey, Chief Product Evangelist & Product Marketing at MicroStrategy, provides some insight into the Stinger.next initiative and how it will benefit MicroStrategy customers and the broader Big Data and Hadoop community.…

Since our founding in mid-2011, our vision for Hadoop has been that “half the world’s data will be processed by Hadoop.” With that long-term vision in mind, we focus on the mission of establishing Hadoop as the foundational technology of the modern enterprise data architecture, unlocking a whole new class of data-driven applications that weren’t previously possible.

We use what we call the “Blueprint for Enterprise Hadoop” to guide how we invest in Hadoop-related open source technologies and how we enable the key integration points that are important for deploying Enterprise Hadoop within a modern data architecture, on-premises or in the cloud, in a way that lets the business and its users maximize the value of their data.…