Cloudera is seeking experienced Sr. Consultants to join our growing Public Sector Professional Services team. This key role has two major responsibilities: first, working directly with our customers and partners to optimize their plans and objectives for designing and deploying Apache Hadoop environments; and second, assisting in building and designing reference configurations that enable our customers and influence our product.
Sr. Consultants facilitate the flow of communication between Cloudera teams and the customer. For these strategically important roles, we are seeking outstanding talent to join our team.
Work directly with prospective customers’ technical resources to devise and recommend solutions based on their requirements
Represent Cloudera Public Professional Services while on site with clients by demonstrating subject matter expertise in the fields of big data and data modernization.
Analyze complex distributed production deployments, make recommendations, and implement Cloudera solutions to optimize performance
Work closely with Cloudera’s teams at all levels to ensure rapid response to customer questions and project blockers
Help develop reference Hadoop architectures and configurations
Drive delivery projects with customers to ensure successful adoption of Cloudera's technologies
Travel up to 75%
US Citizenship is required.
More than three years of Professional Services (customer-facing) experience architecting large-scale storage, data center, and/or globally distributed solutions
Experience designing and deploying large-scale production Hadoop solutions
Experience installing and administering multi-node Hadoop clusters
Experience designing queries against data in a Hadoop environment using tools such as Apache Hive, Apache Druid, Apache Phoenix or others.
Strong experience implementing software and/or solutions in enterprise Linux or Unix environments
Strong understanding of various enterprise security solutions such as LDAP and/or Kerberos
Strong understanding of network configuration, devices, protocols, speeds, and optimizations
Strong understanding of Java development, debugging, and profiling
Significant prior experience writing to network-based APIs, preferably REST/JSON or XML/SOAP
Solid background in database administration or design
Familiarity with scripting tools such as Bash shell scripts, Python, and/or Perl
Ability to understand and translate customer requirements into technical requirements
Excellent verbal and written communications
Bachelor’s degree in Computer Science or a related field, or equivalent
Nice-to-have, but not required, experience:
Active Secret or Top Secret clearance preferred. Ideal candidates, if not already cleared, should be willing to undergo evaluation for a US Government security clearance.
Ability to understand big data use cases and recommend standard design patterns commonly used in Hadoop-based deployments
Knowledge of the data management ecosystem, including concepts of data warehousing, ETL, data integration, etc.
Experience with cloud platforms and deployment automation
Sr. Consultant – Public Sector