
Sr. Consultant I

Consulting | Sydney, Australia

Senior Consultant, Professional Services APAC


Responsibilities:

  • Work directly with customer business and technical teams to understand requirements and develop high-quality solutions
  • Design highly scalable and reliable data pipelines to consume, integrate, and analyze large amounts of data from various sources
  • Understand big data use cases and recommend standard design and implementation patterns used in Hadoop-based deployments
  • Document and present complex architectures to the customer's technical teams
  • Work closely with Hortonworks teams at all levels to ensure project and customer success
  • Design effective data models for optimal storage and retrieval, and deploy comprehensive data quality checks to ensure high-quality data
  • Design, build, tune, and maintain data pipelines using Hadoop, NiFi, or related data integration technologies
  • Install, deploy, augment, upgrade, manage, and operate large Hadoop clusters
  • Write and produce technical documentation, customer status reports, and knowledge-base articles
  • Keep up with current Hadoop, NiFi, and Big Data ecosystem technologies



Qualifications:

  • 8+ years of overall IT experience, including at least 4 years of production data engineering experience with Hadoop and/or NiFi
  • Hands-on experience with all aspects of developing, testing, and implementing low-latency big data pipelines
  • Demonstrated production experience in data engineering, data management, cluster management, and/or analytics domains
  • Experience designing data queries against data in HDFS using tools such as Apache Hive
  • Experience implementing MapReduce and Spark jobs
  • Experience setting up multi-node Hadoop clusters
  • Experience in systems administration or DevOps with one or more open-source operating systems (Big Data developers interested in administration and consulting are also encouraged to apply)
  • Experience with data warehouse design, ETL (Extraction, Transformation & Load), and architecting efficient software designs for data warehouse platforms
  • Experience implementing operational best practices such as alerting, monitoring, and metadata management
  • Strong understanding of enterprise security practices and solutions such as LDAP and/or Kerberos
  • Experience with configuration management tools such as Ansible, Puppet, or Chef
  • Familiarity with scripting tools such as Bash shell scripts, Python, and/or Perl
  • Experience with Apache NiFi is desired
  • Significant previous work writing to network-based APIs, preferably REST/JSON or XML/SOAP
  • Understanding of the Java ecosystem and enterprise offerings, including debugging and profiling tools (e.g. jstack, jmap, jconsole) and logging and monitoring tools (Log4j, JMX)
  • Ability to understand and translate customer requirements into technical requirements
  • Excellent verbal and written communication skills

Nice to have, but not required:

  • Site Reliability Engineering concepts and practices
  • Knowledge of the data management ecosystem, including concepts of data warehousing, ETL, data integration, etc.
  • Experience using a compiled programming language, preferably one that runs on the JVM (Java, Scala, etc.)
  • Experience coding with streaming/micro-batch compute frameworks, preferably Kafka or Spark

All employees are required to adhere to all Hortonworks employment policies, including without limitation information security policies.


Having gone public with its IPO in December 2014, Hortonworks is experiencing extraordinary growth as we deliver essential support to the burgeoning big data and IoT communities. Hortonworks is an industry-leading innovator that creates, distributes, and supports enterprise-ready open data platforms and modern data applications that deliver actionable intelligence from all data: data-in-motion and data-at-rest. Hortonworks is focused on driving innovation in open source communities such as Apache Hadoop, Apache NiFi, and Apache Spark. Along with our 2,100+ partners, Hortonworks provides the expertise, training, and services that allow customers to unlock transformational value for their organizations across any line of business.

For more information, visit

Hortonworks, Powering the Future of Data, HDP and HDF are registered trademarks or trademarks of Hortonworks, Inc. and its subsidiaries in the United States and other jurisdictions.

