Apache Hadoop Cluster Configuration Guide

Version: 1.0
Date posted: June 12, 2013
Categories: Technical Resources, Documents

Description

Sizing a Hadoop cluster correctly matters: the right resources let you optimize the environment for your workload. It is not a simple task, however, because tuning a distributed environment and its related software involves many interdependent choices.

We have packaged years of experience building and optimizing Hadoop clusters into this single guide, which should help walk you through the process. It is important to note that while we outline suggested sizing configurations, no two implementations are alike and mileage may vary. To that end, this guide provides a basic framework for addressing sizing. We hope you find it helpful.
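
To give a flavor of the kind of sizing arithmetic involved, here is a minimal back-of-envelope sketch (not taken from the guide) that estimates raw disk capacity from the amount of data you plan to store. The replication factor of 3 is the HDFS default; the temporary-space overhead and target disk utilization figures are illustrative assumptions you should replace with your own.

# Back-of-envelope HDFS raw-capacity estimate (illustrative only).
# Assumptions (not from the guide): roughly 25% overhead for temporary and
# intermediate data, and a target disk utilization of about 70%; the
# replication factor of 3 is the HDFS default.
def raw_storage_needed_tb(data_tb, replication=3, temp_overhead=0.25, target_utilization=0.70):
    """Return an estimate of raw disk (TB) needed to hold data_tb of HDFS data."""
    return data_tb * replication * (1 + temp_overhead) / target_utilization

# Example: 100 TB of source data works out to roughly 536 TB of raw disk.
print(round(raw_storage_needed_tb(100), 1))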

Integrate with existing systems
Hortonworks maintains an extensive partner ecosystem, from broad enterprise platform vendors to specialized solution providers and systems integrators.
Modern Data Architecture
Tackle the challenges of big data. Hadoop integrates with existing EDW, RDBMS and MPP systems to deliver lower cost, higher capacity infrastructure.
Hortonworks Data Platform
The Hortonworks Data Platform is a 100% open source distribution of Apache Hadoop that is truly enterprise grade, having been built, tested and hardened with enterprise rigor.