The Hortonworks Blog

Posts categorized by: Other

Joe Travaglini, director of product marketing at Sqrrl, and Ely Kahn, vice president of business development at Sqrrl, are our guest bloggers. They explain Sqrrl’s integration with Hortonworks Data Platform (HDP).

There Is No Secure Perimeter

With the rise of cloud computing and Bring Your Own Device (BYOD), there is no longer a well-defined perimeter to secure and defend. Data flows inside, outside, and across your network boundaries with limited interference from traditional controls.…

Last week’s release of Hortonworks Data Platform 2.2 is packed with countless new features for Enterprise Hadoop. These include the results of Hortonworks’ investment in vertical integration with YARN and HDFS, as well as horizontal innovation to ensure that the key enterprise services of governance, security, and operations can be applied consistently and reliably across all the components within the Apache Hadoop platform.

To guide you through these capabilities, Hortonworks is hosting a new series of eight Thursday webinars beginning on October 23 and running to December 18.…

Over the last several months, Oracle and Hortonworks have been working together to bring ETL tools and connectors to the Hortonworks Data Platform (HDP). The two companies have teamed up to provide comprehensive data integration capabilities and technologies that lay the foundation for a modern data integration architecture, delivering on the promise of big data for customers of all sizes.

We’re happy to announce that Oracle Data Integrator (ODI) is now certified with HDP 2.1.…

Thanks to all who joined us on our Hortonworks/Voltage webinar, “Securing Hadoop: What are Your Options?” For those who couldn’t attend, we’re sorry we missed you. We’ve included a link to the webinar recording below, so please listen in!

On the webinar, Hortonworks’ Vinod Nair presented the recently announced Apache Argus incubator: a central policy administration framework that spans security requirements for authentication, authorization, auditing, and data protection. Sudeep Venkatesh of Voltage Security described data-centric protection technologies that integrate easily with Hive, Sqoop, MapReduce, and other Hadoop interfaces.…

In case you missed it: earlier this week, Alan Gates and team provided some insights into the Stinger.next roadmap for delivering Enterprise SQL at Hadoop scale. We’re excited to continue the conversation and share the perspectives of some of our key partners on this important initiative. Today’s guest blogger, Michael Hiskey, Chief Product Evangelist & Product Marketing at MicroStrategy, provides some insight into the Stinger.next initiative and how it will benefit MicroStrategy customers and the broader Big Data and Hadoop community.…

Since our founding in mid-2011, our vision for Hadoop has been that “half the world’s data will be processed by Hadoop”. With that long-term vision in mind, we focus on the mission of establishing Hadoop as the foundational technology of the modern enterprise data architecture, unlocking a whole new class of data-driven applications that weren’t previously possible.

We use what we call the “Blueprint for Enterprise Hadoop” to guide how we invest in Hadoop-related open source technologies and how we enable the key integration points for deploying Enterprise Hadoop within a modern data architecture, on-premises or in the cloud, so that the business and its users can maximize the value of their data.…

Few industries depend as heavily on data as financial services. Insurance companies, retail banks, and investment banks aggregate, price, and distribute capital with the aim of increasing their return on assets at an acceptable level of risk.

To do that, financial decision-makers need data. Apache Hadoop helps them store new data sources, then process the larger combined dataset for batch, interactive, and real-time analysis. More data and better analysis improve bottom-line results.…

Earlier this month, the Apache Ambari community released Apache Ambari 1.6.1, which includes multiple improvements for performance and usability. The momentum in and around the Ambari community is unstoppable. Today we saw the Pivotal team lean in to Ambari, and this is the sixth release of this critical component in 2014, proving again that open source is the fastest path to innovation.

Many thanks to the broad Ambari community, whose wealth of contributions resulted in 585 JIRA issues being resolved in this release.…

Last week, Apache Tez graduated to become a top-level project within the Apache Software Foundation (ASF). This is a major step forward for the project and reflects the momentum built by a broad community of developers not only from Hortonworks but also from Cloudera, Facebook, LinkedIn, Microsoft, NASA JPL, Twitter, and Yahoo.

What is Apache Tez and why is it useful?

Apache™ Tez is an extensible framework for building YARN-based, high-performance batch and interactive data processing applications in Hadoop that need to handle TB- to PB-scale datasets.…
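For a sense of what building directly against Tez looks like, here is a minimal sketch using the Tez 0.5-style Java DAG API: it declares a single-vertex DAG and submits it to YARN through a TezClient session. The class com.example.MyProcessor, the application name, and the parallelism are illustrative assumptions for this sketch, not code from any particular release.

import org.apache.hadoop.conf.Configuration;
import org.apache.tez.client.TezClient;
import org.apache.tez.dag.api.DAG;
import org.apache.tez.dag.api.ProcessorDescriptor;
import org.apache.tez.dag.api.TezConfiguration;
import org.apache.tez.dag.api.Vertex;

public class TezDagSketch {
  public static void main(String[] args) throws Exception {
    TezConfiguration tezConf = new TezConfiguration(new Configuration());

    // A single-vertex DAG; com.example.MyProcessor is a hypothetical
    // org.apache.tez.runtime.api.Processor implementation holding the per-task logic.
    DAG dag = DAG.create("example-dag");
    Vertex work = Vertex.create(
        "work",
        ProcessorDescriptor.create("com.example.MyProcessor"),
        4 /* parallelism: number of tasks for this vertex */);
    dag.addVertex(work);

    // Submit the DAG to YARN via a Tez session and wait for it to finish.
    TezClient tezClient = TezClient.create("example-app", tezConf);
    tezClient.start();
    try {
      tezClient.submitDAG(dag).waitForCompletion();
    } finally {
      tezClient.stop();
    }
  }
}

A real application would add more vertices and connect them with Edge definitions that describe how data moves between stages; the framework then handles scheduling and data movement on YARN.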

Merv Adrian couldn’t have said it better. In his blog post from the weekend, he continued his quest to define Hadoop. And it is no easy quest, because the components of Hadoop, and its evolution, are changing at a pace that is, frankly, astounding.

The continuous evolution of Hadoop has even given rise to sentiments such as “Is Hadoop dead?” The answer to that question is YES. And NO.…

We know the saying, “what happens in Vegas, stays in Vegas,” but we can’t keep this all to ourselves. We’re super excited to be among the 12 HP AllianceOne Partners to receive an HP Partner of the Year award at HP Discover 2014.

The HP AllianceOne Award for ConvergedSystems recognizes Hortonworks for its strategic relationship and reseller agreement for the Hortonworks Data Platform (HDP) and for delivering solutions that drive meaningful business for our shared customers.…

Apache Ambari has always given operators the ability to provision an Apache Hadoop cluster using an intuitive Cluster Install Wizard web interface, guiding the user through a series of steps:

  • confirming the list of hosts,
  • assigning master, slave, and client components,
  • configuring services, and
  • installing, starting, and testing the cluster.

With Ambari Blueprints, system administrators and DevOps engineers can expedite the process of provisioning a cluster. Once defined, Blueprints can be reused, simplifying configuration and automation for each successive cluster creation.…
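As a rough illustration of that workflow, the sketch below drives the Ambari Blueprints REST API from Java: it registers a blueprint with POST /api/v1/blueprints/&lt;name&gt; and then creates a cluster from it with POST /api/v1/clusters/&lt;name&gt;. The server address, credentials, component list, and host names are assumptions made for the example; a real blueprint would list the full set of components and configurations your stack requires.

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class AmbariBlueprintDemo {
  // Assumed Ambari server location and credentials; adjust for your environment.
  private static final String AMBARI = "http://ambari.example.com:8080/api/v1";
  private static final String AUTH =
      Base64.getEncoder().encodeToString("admin:admin".getBytes(StandardCharsets.UTF_8));

  static void post(String path, String json) throws Exception {
    HttpURLConnection conn = (HttpURLConnection) new URL(AMBARI + path).openConnection();
    conn.setRequestMethod("POST");
    conn.setDoOutput(true);
    conn.setRequestProperty("Authorization", "Basic " + AUTH);
    conn.setRequestProperty("X-Requested-By", "ambari"); // required by Ambari for write calls
    try (OutputStream out = conn.getOutputStream()) {
      out.write(json.getBytes(StandardCharsets.UTF_8));
    }
    System.out.println(path + " -> HTTP " + conn.getResponseCode());
  }

  public static void main(String[] args) throws Exception {
    // 1. Register a minimal (illustrative) single-node blueprint.
    String blueprint = "{\"host_groups\":[{\"name\":\"host_group_1\","
        + "\"components\":[{\"name\":\"NAMENODE\"},{\"name\":\"DATANODE\"}],"
        + "\"cardinality\":\"1\"}],"
        + "\"Blueprints\":{\"blueprint_name\":\"single-node\","
        + "\"stack_name\":\"HDP\",\"stack_version\":\"2.1\"}}";
    post("/blueprints/single-node", blueprint);

    // 2. Create a cluster from the blueprint by mapping host groups to real hosts.
    String cluster = "{\"blueprint\":\"single-node\","
        + "\"host_groups\":[{\"name\":\"host_group_1\","
        + "\"hosts\":[{\"fqdn\":\"node1.example.com\"}]}]}";
    post("/clusters/demo-cluster", cluster);
  }
}

Because the blueprint itself is just JSON, the same definition can be checked into version control and replayed against any number of environments.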

Last week’s release of HDP 2.1 was packed with countless new features for enterprise Hadoop. These range from new processing capabilities with Tez, Hive on YARN, Solr, and Storm, to operations with Ambari, governance with Falcon, and security with Knox.

To guide you through these capabilities, Hortonworks is hosting a new series of webinars beginning on May 8 and running to June 26.

You can join any or all of the webinars listed below, and we’ve provided a simple way of signing up for all 7.…

The Apache Hive community has voted on and released version 0.13 today. This is a significant release that represents a major effort from over 70 members who worked diligently to close out over 1080 JIRA tickets.

Hive 0.13 also delivers the third and final phase of the Stinger Initiative, a broad, community-based effort to drive the future of Apache Hive by delivering 100x performance improvements at petabyte scale with familiar SQL semantics.…
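To illustrate the “familiar SQL semantics” point, here is a minimal sketch that connects to HiveServer2 over JDBC and runs a query on the Tez execution engine delivered through the Stinger work. It assumes the Hive JDBC driver is on the classpath and that the host, credentials, and web_logs table shown here exist in your environment; they are placeholders, not part of any released sample.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveOnTezQuery {
  public static void main(String[] args) throws Exception {
    // HiveServer2 JDBC endpoint; host, database, and table are illustrative.
    String url = "jdbc:hive2://hiveserver.example.com:10000/default";
    try (Connection conn = DriverManager.getConnection(url, "hive", "");
         Statement stmt = conn.createStatement()) {
      // Switch this session to the Tez execution engine.
      stmt.execute("SET hive.execution.engine=tez");
      try (ResultSet rs = stmt.executeQuery(
          "SELECT category, COUNT(*) FROM web_logs GROUP BY category")) {
        while (rs.next()) {
          System.out.println(rs.getString(1) + "\t" + rs.getLong(2));
        }
      }
    }
  }
}

The point is that the SQL itself is unchanged; the performance gains come from the engine underneath, so existing queries and BI tools continue to work as before.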

As enterprises build new applications with the data they cost-effectively capture and process with Apache Hadoop, it is important for the platform to facilitate the application development process. That’s why we are excited to announce that we’ve expanded our partnership with Concurrent, Inc. to simplify and accelerate application development on Hadoop.

There are two components to this expanded partnership.
