Introducing the Hortonworks Founders

Hi Folks,

Things are going really well at Hortonworks.  We’re in our new office, connected to our data center of nearly 1000 nodes (thanks Yahoo!), and working away on our new computers.  We’ve gotten a lot done in a very short amount of time.  Along with our excellent G&A team, a key reason we’ve gotten so much done is that our founders have really stepped up and are taking responsibility for getting their teams moving.

I wanted to take this opportunity to mention them, because without them Hortonworks wouldn’t be Hortonworks.  These are the team leads and architects I’ve worked with and relied on over the last 4-6 years while we invested in taking Apache Hadoop from an early prototype to what it is today.  Without our founders and their teams, MapReduce, HDFS and Pig would not be what they are today.

Who are they?

* Alan Gates – Pig and HCatalog PMC

* Arun Murthy – Hadoop PMC

* Devaraj Das – Hadoop and HCatalog PMC

* Mahadev Konar – ZooKeeper and Hadoop PMC

* Marco Nicosia – Hadoop operations expert

* Owen O’Malley – Hadoop PMC

* Sanjay Radia – Hadoop PMC

* Suresh Srinivas – Hadoop PMC

Our founders are busy working on the next generation of Hadoop.  Expect to hear more directly from them on this blog over the coming months.  More about our team: About Hortonworks.

Twitter: @jeric14, @hortonworks

P.S. Are you a key contributor to an Apache Hadoop project?  Want to be paid to work on Apache code full-time?  Reach out to us!  We are hiring.
