Founders

Alan Gates
Alan is an original member of the engineering team that took Pig from a Yahoo! Labs research project to a successful Apache open source project. Alan also designed HCatalog and guided its adoption as an Apache Incubator project. Alan has a BS in Mathematics from Oregon State University and an MA in Theology from Fuller Theological Seminary. He is also the author of Programming Pig, a book published by O’Reilly. Follow Alan on Twitter: @alanfgates.

Arun C. Murthy
Arun is an Apache Hadoop PMC member and has been a full-time contributor to the project since its inception in 2006. He is also the lead of the MapReduce project and has focused on building NextGen MapReduce (YARN). Prior to co-founding Hortonworks, Arun was responsible for all MapReduce code and configuration deployed across the 42,000+ servers at Yahoo!. In essence, he ran Apache Hadoop’s MapReduce as a service for Yahoo!. He also jointly holds the current world sorting record using Apache Hadoop. Follow Arun on Twitter: @acmurthy.

Devaraj Das
Devaraj is an Apache Hadoop committer and member of the Apache Hadoop PMC, as well as a committer on the Apache HCatalog project. Prior to co-founding Hortonworks, Devaraj was critical to making Apache Hadoop a success at Yahoo!, designing, implementing, leading and managing large, complex core Apache Hadoop and Hadoop-related projects on Yahoo!’s production clusters (including MapReduce, Hadoop Security and HCatalog). Earlier in his career, Devaraj worked as an engineer at HP in Bangalore. He has a Master’s degree from the Indian Institute of Science in Bangalore, India, and a B.E. from the Birla Institute of Technology and Science in Pilani, India. Follow Devaraj on Twitter: @ddraj.

Eric Baldeschwieler
Prior to co-founding Hortonworks, Eric served as VP of Hadoop Software Engineering at Yahoo!, where he led the evolution of Apache Hadoop from a 20-node prototype to a 42,000-node service that is behind every click at Yahoo!. Eric also served as a technology leader for Inktomi’s web service engine, which Yahoo! acquired in 2003. Prior to Inktomi, Eric developed software for video games, video post-production systems and 3D modeling systems. Eric has a Master’s degree in Computer Science from the University of California, Berkeley and a Bachelor’s degree in Mathematics and Computer Science from Carnegie Mellon University. Follow Eric on Twitter: @jeric14.

Mahadev Konar
Mahadev is a core contributor and PMC member of Apache Hadoop and ZooKeeper. Prior to co-founding Hortonworks, Mahadev spent more than five years at Yahoo! working on Hadoop technologies. He holds a Master’s and Bachelor’s degree in Computer Science from SUNY Stony Brook and IIT Bombay (India) respectively. Follow Mahadev on Twitter: @mahadevkonar.

Owen O’Malley
Owen has been contributing patches to Hadoop since before it became an independent Apache project. He was the first committer added and still remains one of the most active contributors to Apache Hadoop. He was also the founding chair of the Apache Hadoop Project Management Committee. Prior to co-founding Hortonworks, Owen worked on Yahoo! Search’s WebMap project, which built a graph of the known web and applied many heuristics to the entire graph. Once ported to Apache Hadoop, it became the single largest known Hadoop application. He has a PhD in Software Engineering from the University of California, Irvine. Follow Owen on Twitter: @owen_omalley.

Sanjay Radia
Sanjay is an Apache Hadoop committer and member of the Apache Hadoop PMC. Prior to co-founding Hortonworks, Sanjay was the architect of the Hadoop HDFS project at Yahoo!. He has also held senior engineering positions at Sun Microsystems and INRIA, where he developed software for distributed systems and grid/utility computing infrastructures. Sanjay has a PhD in Computer Science from the University of Waterloo in Canada. Follow Sanjay on Twitter: @srr.

Suresh Srinivas
Suresh is an Apache Hadoop committer, a member of the Apache Hadoop PMC, and an active contributor to the Apache HDFS project. Prior to co-founding Hortonworks, he served as a software architect at Yahoo! working on Apache Hadoop HDFS. Suresh also worked for Sylantro Systems in various technical leadership roles, developing scalable infrastructure for hosted communications services. He has a Bachelor’s degree in Electronics and Communication from the National Institute of Technology, Karnataka (India). Follow Suresh on Twitter: @suresh_m_s.
