
Consultant | Consulting | Bangalore, India


Responsibilities:

  • Work directly with the customer’s technical resources to devise and recommend solutions based on the gathered requirements
  • Analyse complex distributed production deployments and make recommendations to optimize performance
  • Document and present complex architectures to the customer’s technical teams
  • Work closely with Hortonworks’ teams at all levels to help ensure the success of consulting engagements with customers
  • Deploy, augment, upgrade, and operate large Hadoop clusters
  • Write technical documentation and knowledge-base articles
  • Keep current with technologies in the Hadoop and Big Data ecosystem


Qualifications:

  • 5+ years of overall experience
  • Experience implementing data transformation and processing solutions using Apache Pig
  • Experience designing queries against data in HDFS using tools such as Apache Hive
  • Experience implementing MapReduce jobs
  • Experience setting up multi-node Hadoop clusters
  • Experience in systems administration or DevOps on one or more open-source operating systems (Big Data developers interested in administration and consulting may also apply)
  • Strong understanding of enterprise security practices and solutions such as LDAP and/or Kerberos
  • Strong understanding of network configuration, devices, protocols, speeds, and optimizations
  • Experience using configuration management tools such as Ansible, Puppet, or Chef
  • 2+ years designing and deploying three-tier architectures or large-scale Hadoop solutions
  • Familiarity with scripting tools such as Bash, Python, and/or Perl
  • Experience with NiFi is desired
  • Significant prior experience writing to network-based APIs, preferably REST/JSON or XML/SOAP
  • Understanding of the Java ecosystem and enterprise offerings, including debugging and profiling tools (e.g. jstack, jmap, jconsole) and logging and monitoring tools (e.g. log4j, JMX)
  • Ability to understand and translate customer requirements into technical requirements
  • Excellent verbal and written communication skills
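As a rough illustration of the scripting skills listed above (this sketch is not part of the role description, and the log lines and function names are hypothetical), a few lines of Python can tally log levels from a log4j-style service log — the kind of ad-hoc tooling such work often involves:

```python
import re
from collections import Counter

# Hypothetical lines in a common log4j layout: "<timestamp> LEVEL component - message"
LOG_LINES = [
    "2018-04-02 10:00:01 INFO  DataNode - Block report sent",
    "2018-04-02 10:00:05 WARN  DataNode - Slow BlockReceiver write",
    "2018-04-02 10:00:09 ERROR NameNode - Failed to roll edit log",
]

LEVEL_RE = re.compile(r"\b(INFO|WARN|ERROR|FATAL)\b")

def count_levels(lines):
    """Count occurrences of each log level across the given lines."""
    counts = Counter()
    for line in lines:
        match = LEVEL_RE.search(line)
        if match:
            counts[match.group(1)] += 1
    return counts

print(count_levels(LOG_LINES))  # one INFO, one WARN, one ERROR
```

In practice the lines would come from a file or a pipe rather than an in-memory list, but the pattern-match-and-aggregate shape is the same.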

Nice to have, but not required:

  • Site Reliability Engineering concepts and practices
  • Ability to understand big data use-cases and recommend standard design patterns commonly used in Hadoop-based deployments
  • Knowledge of the data management ecosystem, including concepts of data warehousing, ETL, and data integration
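The ETL concept mentioned above can be sketched in a few lines; this toy extract-transform-load pass over in-memory records is purely illustrative (the field names and values are invented, not from the posting):

```python
# Toy ETL: extract raw records, transform (normalize + validate), load the survivors.
raw_rows = [
    {"id": "1", "city": " bangalore ", "amount": "120.5"},
    {"id": "2", "city": "Pune", "amount": "bad-data"},  # invalid amount, filtered out
    {"id": "3", "city": "Bangalore", "amount": "80"},
]

def transform(row):
    """Normalize a raw row; return None if the row fails validation."""
    try:
        return {
            "id": int(row["id"]),
            "city": row["city"].strip().title(),
            "amount": float(row["amount"]),
        }
    except ValueError:
        return None

# "Load" step: here just collecting into a list standing in for a warehouse table.
warehouse = [r for r in (transform(row) for row in raw_rows) if r is not None]
print(len(warehouse))  # 2
```

Real pipelines add schemas, incremental loads, and error channels, but the extract/transform/load split is the core idea.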

