Extending Microsoft System Center to Manage Apache Hadoop

We are excited to announce today that Hortonworks is bringing Windows-based Hadoop operational management to Microsoft System Center via a set of management packs. These management packs will enable users to deploy, manage, and monitor Hortonworks Data Platform (HDP) on both Windows and Linux deployments. They will provide management and monitoring of Hadoop from a single System Center Operations Manager console, enabling customers to streamline operations and ensure quality-of-service levels.

System Center will integrate with HDP through Apache Ambari APIs. This System Center integration with HDP is a testament to the strength of the 18-month engineering collaboration between Microsoft and Hortonworks. It is a prime illustration of how Microsoft, Hortonworks, and the open source community are working together to advance enterprise Hadoop into a key part of the modern data architecture.
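
For context, the Ambari interface that this integration builds on is a plain HTTP/JSON REST API. The minimal sketch below simply lists the clusters an Ambari server knows about; the server address, cluster, and admin credentials are placeholders for illustration, and this snippet is not part of the management packs themselves.

```python
import requests

# Placeholder endpoint and credentials, for illustration only.
AMBARI = "http://ambari-host:8080/api/v1"
AUTH = ("admin", "admin")
HEADERS = {"X-Requested-By": "ambari"}  # header Ambari expects on API calls

# List the clusters this Ambari server is managing.
resp = requests.get(f"{AMBARI}/clusters", auth=AUTH, headers=HEADERS)
resp.raise_for_status()
for item in resp.json()["items"]:
    print(item["Clusters"]["cluster_name"])
```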

Extending System Center Operations Manager

For monitoring and managing HDP deployments, Hortonworks will provide a free management pack for Microsoft System Center Operations Manager. The Operations Manager management pack will use Apache Ambari APIs to monitor HDP service uptime, view cluster resource utilization, report on job and operational metrics, and manage all HDP services deployed in the cluster.
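
To make that flow concrete, here is a minimal sketch of the kind of Ambari REST calls such a monitoring integration could issue: reading a service's current state and requesting a start if it is down. The server address, cluster name, and credentials are placeholder assumptions, not details of the management pack.

```python
import requests

AMBARI = "http://ambari-host:8080/api/v1"
CLUSTER = "MyCluster"             # placeholder cluster name
AUTH = ("admin", "admin")         # placeholder credentials
HEADERS = {"X-Requested-By": "ambari"}

def service_state(service):
    """Return the current state of an HDP service (e.g. STARTED, INSTALLED)."""
    url = f"{AMBARI}/clusters/{CLUSTER}/services/{service}"
    resp = requests.get(url, params={"fields": "ServiceInfo/state"},
                        auth=AUTH, headers=HEADERS)
    resp.raise_for_status()
    return resp.json()["ServiceInfo"]["state"]

def start_service(service):
    """Ask Ambari to start a service by setting its desired state to STARTED."""
    url = f"{AMBARI}/clusters/{CLUSTER}/services/{service}"
    body = {"RequestInfo": {"context": f"Start {service}"},
            "Body": {"ServiceInfo": {"state": "STARTED"}}}
    resp = requests.put(url, json=body, auth=AUTH, headers=HEADERS)
    resp.raise_for_status()

# Example: restart HDFS if it is not running.
if service_state("HDFS") != "STARTED":
    start_service("HDFS")
```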

Extending System Center Virtual Machine Manager

Organizations have a choice of infrastructure platform for deploying HDP. Hortonworks will provide a System Center Virtual Machine Manager Management Pack for those organizations that utilize Hyper-V virtualization technology to host their enterprise workloads. Virtual Machine Manager will configure a cluster of virtual machines and deploy HDP in an optimized configuration for that virtualized cluster. Through Virtual Machine templates, customers will be able to seamlessly add slave and master Hadoop nodes to take advantage of the scalability of the virtualized infrastructure.
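
The management pack automates node addition end to end; as a rough sketch of what it involves at the Ambari level, the hypothetical snippet below registers a freshly provisioned VM as a cluster host and assigns it a DataNode role. The host, cluster, and credential names are placeholders, and in a real deployment the Ambari agent must already be running on the new VM.

```python
import requests

AMBARI = "http://ambari-host:8080/api/v1"
CLUSTER = "MyCluster"                  # placeholder cluster name
AUTH = ("admin", "admin")              # placeholder credentials
HEADERS = {"X-Requested-By": "ambari"}
NEW_HOST = "worker-042.example.local"  # hostname of the newly provisioned VM

# Register the new host with the cluster (its Ambari agent must already be up).
requests.post(f"{AMBARI}/clusters/{CLUSTER}/hosts/{NEW_HOST}",
              auth=AUTH, headers=HEADERS).raise_for_status()

# Assign a slave role to the new host, e.g. an HDFS DataNode.
requests.post(
    f"{AMBARI}/clusters/{CLUSTER}/hosts/{NEW_HOST}/host_components/DATANODE",
    auth=AUTH, headers=HEADERS).raise_for_status()
```

A real workflow would then install and start the new component (further calls that set its state), but the point is simply that node scale-out is driven through the same REST interface the management packs build on.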

Hortonworks Data Platform for Windows

HDP for Windows is an enterprise-ready Apache Hadoop-based platform that deploys and runs natively on Windows Server. With this System Center integration, Windows system administrators will be able to use the operating system and the management tools they already know to operate Apache Hadoop.

Expanding the ecosystem around HDP will bring the scalability and processing power of Apache Hadoop to millions of new users. We are excited to release the System Center Management Packs later this year. Follow the Apache Ambari project to see it evolve to enable these features.

To get started with Apache Hadoop on Windows, download the HDP for Windows distribution today!

