Optimizing Hadoop for Microservers

There are plenty of server and storage options for the wave of data being collected and analyzed. New platforms such as Apache™ Hadoop® make all of these new data types useful. However, like any other platform, performance varies depending on the underlying servers. Hadoop holds great promise for delivering business value, and its ecosystem continues to grow as companies make strides toward making Hadoop easier to deploy and manage.

One area that has seen huge advancements is the data center server. Power and cooling requirements have become a critical issue for data centers, and the major vendors are all focused on helping the industry become cleaner and greener. AMD SeaMicro has been a leader in this area: it reimagined the server and pioneered the fabric-based dense microserver, with technology that interconnects pools of resources over a supercompute fabric offering an unprecedented 1.28 Tbps of bisection bandwidth and access to more than five petabytes of direct attached storage. The SeaMicro Freedom™ Fabric removes the constraints of the traditional server and allows data centers to expand in multiple dimensions without adding unneeded hardware and cost. Hadoop does not need the fastest processor, but it does need to be affordable and easy to scale out as the amount of data collected and analyzed increases.
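To make the direct-attached-storage point concrete: HDFS lets each DataNode spread its blocks across every local disk simply by listing the mount points in its configuration. The sketch below is illustrative only; the mount points (`/grid/0` through `/grid/3`) are hypothetical DAS volumes, not taken from any SeaMicro reference configuration.

```xml
<!-- hdfs-site.xml (sketch): point the DataNode at several
     direct-attached disks; paths here are hypothetical. -->
<configuration>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>/grid/0/hdfs/data,/grid/1/hdfs/data,/grid/2/hdfs/data,/grid/3/hdfs/data</value>
  </property>
  <property>
    <!-- Keep the DataNode running even if one of the listed disks fails -->
    <name>dfs.datanode.failed.volumes.tolerated</name>
    <value>1</value>
  </property>
</configuration>
```

Because HDFS handles replication across nodes itself, adding capacity is a matter of attaching more disks or more DataNodes rather than buying faster individual servers, which is exactly the scale-out model dense microservers target.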

The data center server is the key underlying infrastructure that enables all of these new, innovative services. Though the amount of data being collected seems unlimited, data center capacity clearly is not. The industry is realizing that servers need real innovation that extends beyond individual components and takes an end-to-end perspective encompassing compute, storage, and networking. Companies are experiencing problems that simply cannot be solved with traditional servers. It's time to reimagine the data center server and deliver what the industry needs.

To hear more about how microservers can improve your Hadoop performance and minimize operations, join AMD SeaMicro and Hortonworks for a provocative discussion in the June 18 webinar: How to Build an Optimal Hadoop Cluster to Store and Maintain Unlimited Amounts of Data Using Microservers, hosted by the Linux Journal.

To learn more about AMD SeaMicro visit: www.seamicro.com

For more on delivering a modern data architecture for your business, click here.

