We are pleased to announce that Mac Moore and Shane Kumpf will be presenting this month on how to scale Spark workloads on YARN. Spark is seeing wide adoption across a variety of industries as more organizations look to leverage its advanced capabilities. A common challenge with Spark jobs has been sizing – to this point, it has been a bit of an art. Join us to learn how you can achieve better elasticity and multi-tenancy, and enable advanced security, when running Spark applications on Hadoop with YARN. Come out and learn how to get Spark up, running, and scaling on your clusters!
About The Speakers:
Shane Kumpf – Solutions Engineer, Hortonworks
Shane’s relentless curiosity about distributed computing has led him to wear many hats throughout his career: Linux guru, manager of high-volume web properties, data specialist, and software developer. As a Solutions Engineer at Hortonworks, he helps customers get the most value out of the only 100% open source Hadoop distribution, the Hortonworks Data Platform. Through real-world experience, Shane has gained a strong foundation across many verticals, with clients ranging from Fortune 100s to startups. Shane has had the luxury of working with Hadoop full time since 2010, managing everything from single-digit to two-hundred-plus-node Hadoop clusters, and participating in the evolution of the modern data architecture.
Mac Moore – Solutions Engineer, Hortonworks
As a Solutions Engineer at Hortonworks, Mac helps enterprises improve the scale, performance, and cost-effectiveness of their Big Data applications using various components of HDP, the industry-leading open source Hadoop distribution. Prior to joining Hortonworks, Mac served as a Solutions Architect with multiple vendors in the In-Memory/Big Data space, and before that as a Director of Information Systems in the higher-education sector. He has over 13 years of experience designing, developing, and integrating enterprise systems where performance and scalability are essential.