The Hortonworks Blog

Posts categorized by: Administrator

Since the partnership between Hortonworks and Splunk and the release of Hunk last year, we have created some awesome assets (such as the Hunk sandbox tutorial and the 360-degree customer view webinar) that have given Hadoop and Big Data enthusiasts hands-on training with Big Data. You can find more details about our partnership and resources here: http://hortonworks.com/partner/splunk/

As part of our HDP 2.1 certification series, I would like to introduce Brett Sheppard, Director of Product Marketing for Big Data at Splunk.…

We recently hosted the fourth of our seven Discover HDP 2.1 webinars, entitled Apache Hadoop 2.4.0, HDFS and YARN. It was well attended and highly informative. The speakers outlined the new features in YARN and HDFS in HDP 2.1, including the following (a short sketch illustrating the first two items follows the list):

  • HDFS Extended ACLs
  • HTTPS support for WebHDFS and for the Hadoop web UIs
  • HDFS Coordinated DataNode Caching
  • YARN Resource Manager High Availability
  • Application Monitoring through the YARN Timeline Server
  • Capacity Scheduler Preemption
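
As a small illustration of the first two items above, here is a minimal sketch that sets and reads back an extended ACL through WebHDFS over HTTPS. The NameNode host and port, directory path, user, ACL spec, and CA bundle are placeholder assumptions, and the cluster is assumed to already have WebHDFS and SSL configured; the command-line equivalent is hdfs dfs -setfacl.

    # Minimal sketch: manage HDFS extended ACLs through WebHDFS over HTTPS.
    # Host, port, path, user, ACL spec, and CA bundle are illustrative placeholders.
    import requests

    WEBHDFS = "https://namenode.example.com:50470/webhdfs/v1"   # HTTPS WebHDFS endpoint
    PATH = "/data/finance"                                       # directory to protect
    USER = "hdfs"                                                # simple-auth user (no Kerberos here)
    CA = "/etc/security/cluster-ca.pem"                          # CA bundle for the cluster certificates

    # Grant read/execute to user 'alice' on top of the base owner/group/other entries.
    aclspec = "user::rwx,user:alice:r-x,group::r--,other::---"
    resp = requests.put(f"{WEBHDFS}{PATH}",
                        params={"op": "SETACL", "aclspec": aclspec, "user.name": USER},
                        verify=CA)
    resp.raise_for_status()

    # Read the ACL back to confirm the extra entry was applied.
    status = requests.get(f"{WEBHDFS}{PATH}",
                          params={"op": "GETACLSTATUS", "user.name": USER},
                          verify=CA).json()
    print(status["AclStatus"]["entries"])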

Many thanks to our presenters, Rohit Bakhshi (Hortonworks’ senior product manager), Vinod Kumar Vavilapalli (co-author of the YARN Book, PMC, Hadoop YARN Project Lead at Apache and Hortonworks), and Justin Sears (Hortonworks’ Product Marketing Manager).…

Traditionally, HDFS, Hadoop’s storage subsystem, has focused on one kind of storage medium, namely spindle-based disks. However, a Hadoop cluster can contain significant amounts of memory, and with the continued drop in memory prices, customers are willing to add memory dedicated to caching in order to speed up processing.

Recently, HDFS generalized its architecture to include other kinds of storage media, including SSDs and memory [1]. We also added support for caching hot files in memory [2].…
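
To give a concrete flavor of the caching support in [2], the sketch below drives the hdfs cacheadmin command from Python to pin a hot directory into DataNode memory. The pool name, path, and replication count are hypothetical, and the cluster is assumed to have centralized cache management enabled with enough locked memory available on the DataNodes.

    # Minimal sketch: pin a hot dataset into the HDFS centralized cache.
    # Pool name, path, and replication count are illustrative placeholders.
    import subprocess

    def run(cmd):
        """Run an hdfs CLI command, echoing it first."""
        print("$", " ".join(cmd))
        subprocess.run(cmd, check=True)

    # 1. Create a cache pool that groups related cache directives.
    run(["hdfs", "cacheadmin", "-addPool", "hot-reports"])

    # 2. Ask the NameNode to cache every block under the path in DataNode memory.
    run(["hdfs", "cacheadmin", "-addDirective",
         "-path", "/data/reports/current",
         "-pool", "hot-reports",
         "-replication", "1"])

    # 3. List directives to verify what is (or will be) cached.
    run(["hdfs", "cacheadmin", "-listDirectives", "-pool", "hot-reports"])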

Informatica is a Hortonworks Certified Technology Partner. This partnership makes it possible for organizations to use all the data internal and external to an enterprise to achieve the full predictive power that drives the success of modern data-driven businesses. 

That is why we’re excited to have John Haddad, Senior Director at Informatica, as our guest blogger. In this blog, John explores the benefits of certification on HDP 2.1.

When I was in high school, one of my best friends had a water ski boat we often took out on California lakes (what are friends for?).…

Julian Hyde will present the following talks at the Hadoop Summit:

  • “Discardable In-Memory, Materialized Query for Hadoop” (June 3rd, 11:15-11:55 am)
  • “Cost-based Query Optimization in Hive” (June 4th, 4:35-5:15 pm)
What to do with all that memory in a Hadoop cluster? The question is frequently heard. Should we load all of our data into memory to process it? Unfortunately the answer isn’t quite that simple.

The goal should be to put memory into its right place in the storage hierarchy, alongside disk and solid-state drives (SSD).…

We are less than a week away from the start of the seventh annual Hadoop Summit San Jose. With all of the final preparations underway, we wish to highlight some of the not-to-be-missed activities in and around the event. The event is filling fast, but you can still register here.

Here are a few things you don’t want to miss!

  • Great track content: there is more content than ever, with more than 120 informative sessions on Apache Hadoop and related technologies to choose from, as always selected by the community and delivered by the experts themselves.…

The Apache Ambari community is happy to announce last week’s release of Apache Ambari 1.6.0, which includes exciting new capabilities and resolves 288 JIRA issues.

Many thanks to all of the contributors in the Apache Ambari community for collaborating to deliver 1.6.0, and especially for Blueprints, a crucial feature that enables rapid instantiation and replication of clusters.

Each release of Ambari makes substantial strides in providing functionality that simplifies the lives of system administrators and DevOps engineers who deploy, manage, and monitor large Hadoop clusters, including those running on Hortonworks Data Platform 2.1 (HDP).…
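
To show what Blueprints look like in practice, here is a hedged sketch that registers a deliberately tiny blueprint with the Ambari REST API and then instantiates a cluster from it. The Ambari host, credentials, cluster and host names, and the component layout are placeholder assumptions; a real blueprint would include many more components and configurations to pass Ambari’s topology validation.

    # Hedged sketch: create a cluster from an Ambari Blueprint via the REST API.
    # Host, credentials, names, and component layout are illustrative placeholders.
    import requests

    AMBARI = "http://ambari.example.com:8080/api/v1"
    AUTH = ("admin", "admin")                      # default credentials; change in practice
    HEADERS = {"X-Requested-By": "ambari"}         # required header for the Ambari API

    # A deliberately minimal single-host blueprint; real blueprints list many more components.
    blueprint = {
        "Blueprints": {"blueprint_name": "single-node",
                       "stack_name": "HDP", "stack_version": "2.1"},
        "host_groups": [{
            "name": "host_group_1",
            "cardinality": "1",
            "components": [{"name": "NAMENODE"}, {"name": "DATANODE"},
                           {"name": "RESOURCEMANAGER"}, {"name": "NODEMANAGER"},
                           {"name": "ZOOKEEPER_SERVER"}],
        }],
    }

    # 1. Register the blueprint with Ambari.
    requests.post(f"{AMBARI}/blueprints/single-node",
                  json=blueprint, auth=AUTH, headers=HEADERS).raise_for_status()

    # 2. Instantiate a cluster by mapping concrete hosts onto the blueprint's host groups.
    cluster_template = {
        "blueprint": "single-node",
        "host_groups": [{"name": "host_group_1",
                         "hosts": [{"fqdn": "node1.example.com"}]}],
    }
    requests.post(f"{AMBARI}/clusters/demo",
                  json=cluster_template, auth=AUTH, headers=HEADERS).raise_for_status()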

On Wednesday, May 21, Himanshu Bari (Hortonworks’ senior product manager), Venkatesh Seetharam (committer to Apache Falcon), and Justin Sears (Hortonworks’ Product Marketing Manager) hosted the third of our seven Discover HDP 2.1 webinars. Himanshu and Venkatesh discussed data governance in Hadoop through Apache Falcon, which is included in HDP 2.1. As most of you know, ingesting data into Hadoop is one thing; governing that data, by dictating and defining data-pipeline policies, is another, and a necessity in the enterprise.…

MongoDB is an open-source NoSQL database, used by companies of all sizes, across all industries and for a wide variety of applications. MongoDB – the company – is a Hortonworks Certified Technology Partner.

Sheena Badani, Director of Business Development at MongoDB, talks about the value of obtaining HDP 2.1 certification.

MongoDB is thrilled to announce the certification of the MongoDB Hadoop Connector on Hortonworks’ latest release, HDP 2.1. Customers now have validation from both MongoDB, Inc.…

Today we’re delighted to announce our acquisition of XA Secure to provide comprehensive security capabilities for Enterprise Hadoop. Please join us in welcoming XA Secure to the Hortonworks family.

Register for the Webinar

Hortonworks Data Platform has seen phenomenal adoption across an ever-growing number of organizations. As part of that adoption, and thanks to Apache Hadoop YARN, businesses are moving from single-purpose Hadoop clusters to a versatile, integrated data platform hosting multiple business applications – combining data sets with diverse processing needs in one place.…

Last week, Vinay Shukla and Kevin Minder hosted the first of our seven Discover HDP 2.1 webinars. Vinay and Kevin covered three important topics related to new Apache Hadoop security features in HDP 2.1 (a short Knox example follows the list):

  • REST API security with Apache Knox Gateway
  • HDFS security with Access Control Lists (ACLs)
  • SQL security and next-generation Hive authorization
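
Here is the Knox example mentioned above: a minimal sketch that lists an HDFS directory by calling WebHDFS through the Apache Knox Gateway rather than hitting the NameNode directly; Knox terminates SSL and authenticates the caller before proxying the request. The gateway host, topology name, credentials, and certificate path are placeholder assumptions.

    # Minimal sketch: list an HDFS directory through the Apache Knox Gateway.
    # Gateway host, topology, credentials, and certificate path are placeholders.
    import requests

    KNOX = "https://knox.example.com:8443/gateway/default"   # gateway URL + topology name
    AUTH = ("guest", "guest-password")                        # a user known to Knox (e.g., via LDAP)

    resp = requests.get(
        f"{KNOX}/webhdfs/v1/tmp",
        params={"op": "LISTSTATUS"},
        auth=AUTH,
        verify="/etc/knox/gateway.pem",   # Knox's SSL certificate or a CA bundle
    )
    resp.raise_for_status()

    # Knox proxies the call to WebHDFS, so the body is the usual WebHDFS JSON response.
    for entry in resp.json()["FileStatuses"]["FileStatus"]:
        print(entry["type"], entry["pathSuffix"])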

Here is the complete recording of the webinar.

Here are the presentation slides: http://www.slideshare.net/hortonworks/discoverhdp21security

Attend our next Discover HDP 2.1 webinar tomorrow, Thursday, May 15 at 10am Pacific Time: Interactive SQL Query in Hadoop with Apache Hive

We’re grateful to the many participants who joined and asked excellent questions.…

RainStor is a Hortonworks Certified Technology Partner that provides an efficient database which reduces the cost, complexity, and compliance risk of managing enterprise data. RainStor’s patented technology enables customers to cut infrastructure costs and scales anywhere: on-premises, in the cloud, and natively on Hadoop. RainStor’s customers include 20 of the world’s largest communications providers and 10 of the biggest banks and financial services organizations.

RainStor’s Chief Architect, Mark Cusack, writes about the benefits of certification on HDP 2.1.…

Hadoop 2 and its YARN-based architecture have increased interest in new engines running on Hadoop, and one such workload is in-memory computing for machine learning and data science use cases. Apache Spark has emerged as an attractive option for this type of processing, and today we announce the availability of our HDP 2.1 Tech Preview component for Apache Spark. This is a key addition to the platform and brings another workload supported by YARN on HDP.…
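
For readers who want to try the Tech Preview, here is a minimal PySpark sketch of a word count intended to run on YARN. The input and output paths are placeholders, and the launch command shown in the comment is an assumption whose exact flags depend on the Spark version shipped in the preview.

    # Minimal PySpark sketch: a word count intended to run on YARN.
    # Input/output paths are placeholders; launch with something like
    #   spark-submit --master yarn-cluster wordcount.py
    # (exact flags depend on the Spark version in the Tech Preview).
    from pyspark import SparkConf, SparkContext

    conf = SparkConf().setAppName("wordcount-on-yarn")
    sc = SparkContext(conf=conf)

    counts = (sc.textFile("hdfs:///data/input/*.txt")          # read from HDFS
                .flatMap(lambda line: line.split())             # split lines into words
                .map(lambda word: (word, 1))                    # pair each word with 1
                .reduceByKey(lambda a, b: a + b))               # sum counts per word

    counts.saveAsTextFile("hdfs:///data/output/wordcount")      # write results back to HDFS
    sc.stop()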

This is the second post in our series on the motivations and architecture for improvements to Apache Hadoop YARN’s ResourceManager (RM) restart resiliency. Others in the series are:

Introduction: Phase I – Preserve Application-queues

In the introductory blog, we previewed what RM Restart Phase I entails. In essence, we preserve the application-queue state in a persistent store and reread it upon RM restart, eliminating the need for users to resubmit their applications.…
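
One simple way to observe the effect is to list applications from the ResourceManager’s REST API before and after an RM restart; with the Phase I state store in place, the same application IDs should reappear without anyone resubmitting. The RM host and port below are placeholders.

    # Minimal sketch: list applications known to the ResourceManager via its REST API.
    # Run before and after an RM restart; with Phase I recovery enabled, the same
    # application IDs should come back without users resubmitting anything.
    # RM host and port are placeholders.
    import requests

    RM = "http://resourcemanager.example.com:8088/ws/v1/cluster"

    apps = requests.get(f"{RM}/apps").json().get("apps") or {}
    for app in apps.get("app", []):
        print(app["id"], app["state"], app["name"])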

Hortonworks Data Platform 2.1 for Windows is the 100% open source data management platform based on Apache Hadoop and available for the Microsoft Windows Server platform. I have built a helper tool that automates the process of deploying a multi-node Hadoop cluster, utilizing the MSI available in HDP 2.1 for Windows.

Download HDP 2.1 for Windows

HDP on Windows MSI Overview

The HDP on Windows installation package comes as an MSI. Microsoft’s MSI format utilizes Windows Installer, the installation and configuration service provided with Windows.…
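
To give a flavor of what the helper tool automates, here is a hedged sketch of an unattended msiexec invocation driven from Python. The MSI file name, paths, and the HDP_* property names are assumptions to verify against the HDP 2.1 for Windows documentation; only the standard msiexec switches (/i, /qn, /lv) are generic Windows Installer flags.

    # Hedged sketch: unattended single-node HDP for Windows install via msiexec.
    # The MSI file name, paths, and HDP_* property names are assumptions; verify
    # them against the HDP 2.1 for Windows documentation before use.
    import subprocess

    MSI = r"C:\installers\hdp-2.1.0.0.winpkg.msi"        # downloaded HDP MSI (placeholder name)
    LOG = r"C:\installers\hdp-install.log"               # verbose Windows Installer log
    PROPS = {
        "HDP_LAYOUT": r"C:\installers\clusterproperties.txt",  # cluster layout file (assumption)
        "HDP_DIR": r"C:\hdp",                                   # install destination (assumption)
        "DESTROY_DATA": "yes",                                  # format data dirs on install (assumption)
    }

    cmd = ["msiexec", "/qn", "/i", MSI, "/lv", LOG]
    cmd += [f"{key}={value}" for key, value in PROPS.items()]

    # Windows Installer returns 0 on success; check=True raises otherwise.
    subprocess.run(cmd, check=True)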
