Title: Systems Architect
Location: London, UK
Opening Date: 06/12/18
Closing Date: 30/01/19
Travel Requirement: EMEA wide travel 70%
This is an exciting role at Cloudera in which you will deliver cutting-edge open-source solutions to FTSE 100 customers. The Systems Architect is part of the Professional Services organisation, working across a diverse range of industries and projects to enable our customers on their Big Data journey, and engaging with customers from the Proof of Concept (POC) stage through to the implementation of complex distributed production environments. You will work collaboratively with customers to optimise performance, develop reference architectures, and form part of a team that fosters a long-standing collaborative relationship with our customer group.
Drive Proof of Concepts with customers to successful completion
Analyze complex distributed production deployments, and make recommendations to optimize performance
Help develop reference Hadoop architectures and configurations
Write and produce technical documentation, knowledge base articles
Work directly with prospective customers' technical resources to devise and recommend solutions based on the understood requirements
Participate in the pre- and post-sales process, helping both the sales and product teams to interpret customers' requirements
Work closely with Cloudera teams at all levels to ensure rapid response to customer questions and project needs
Play an active role within the Open Source Community
Professional Services (customer-facing) experience architecting large-scale storage, data centre, and/or globally distributed solutions
Experience designing and deploying 3-tier architectures or large-scale Hadoop solutions
Experience working with Apache Hadoop, including knowledge of how to create and debug Hadoop jobs
Ability to understand big data use-cases, and recommend standard design patterns commonly used in Hadoop-based deployments
Ability to understand and translate customer requirements into technical requirements
Experience implementing data transformation and processing solutions using Apache Pig
Experience designing data queries against data in the HDFS environment using tools such as Apache Hive
Experience implementing MapReduce jobs
Experience setting up multi-node Hadoop clusters
Strong experience implementing software and/or solutions in the enterprise Linux or Unix environment
Strong understanding of the Java ecosystem and enterprise offerings, including debugging and profiling tools (jconsole), logging and monitoring tools (log4j, JMX), and security offerings (Kerberos/SPNEGO).
Familiarity with scripting tools such as bash shell scripts, Python and/or Perl
Demonstrable experience using R and the algorithms provided by Apache Mahout
Hortonworks/Cloudera Certifications are an advantage but not essential
At Cloudera, we believe that data can make what is impossible today, possible tomorrow. We empower people to transform complex data into clear and actionable insights. Cloudera delivers an enterprise data cloud for any data, anywhere, from the Edge to AI. Powered by the relentless innovation of the open source community, Cloudera advances digital transformation for the world’s largest enterprises. Learn more at cloudera.com.
"Please bear with us as we consolidate job openings after the merger. To view legacy Cloudera jobs, visit https://www.cloudera.com/careers/careers-listing.html"
Cloudera is an Equal Opportunity / Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, pregnancy, sexual orientation, gender identity, national origin, age, protected veteran status, or disability status.
If you need assistance with applying for a position, please email our office at email@example.com.