
HDP Certified Spark Developer

Hortonworks University is excited to announce a new hands-on, performance-based certification for Spark on the Hortonworks Data Platform (HDP). Our industry-recognized certifications are unique because candidates perform actual tasks on a live installation of our products, instead of simply guessing at multiple-choice questions. Being able to prove your skills allows you to distinguish yourself as an experienced and capable professional in the big data workplace. The Spark certification exam is designed for developers responsible for developing Spark Core and Spark SQL applications in Scala or Python.

To register for the exam, visit examslocal.com.

Exam Objectives

Core Spark

- Write a Spark Core application in Python or Scala: http://spark.apache.org/docs/latest/programming-guide.html
- Initialize a Spark application: http://spark.apache.org/docs/latest/programming-guide.html#initializing-spark
- Run a Spark job on YARN: https://spark.apache.org/docs/1.1.0/cluster-overview.html
- Create an RDD: http://spark.apache.org/docs/latest/programming-guide.html#resilient-distributed-datasets-rdds
- Create an RDD from a file or directory in HDFS: http://spark.apache.org/docs/latest/programming-guide.html#external-datasets
- Persist an RDD in memory or on disk: http://spark.apache.org/docs/latest/programming-guide.html#rdd-persistence
- Perform Spark transformations on an RDD: http://spark.apache.org/docs/latest/programming-guide.html#rdd-operations
- Perform Spark actions on an RDD: http://spark.apache.org/docs/latest/programming-guide.html#actions
- Create and use broadcast variables and accumulators: http://spark.apache.org/docs/latest/programming-guide.html#shared-variables
- Configure Spark properties: https://spark.apache.org/docs/1.5.2/configuration.html

Spark SQL

- Create Spark DataFrames from an existing RDD: http://spark.apache.org/docs/latest/sql-programming-guide.html#creating-dataframes
- Perform operations on a DataFrame: http://spark.apache.org/docs/latest/sql-programming-guide.html#dataframe-operations
- Write a Spark SQL application: http://spark.apache.org/docs/latest/sql-programming-guide.html
- Use Hive with ORC from Spark SQL: https://hortonworks.com/hadoop-tutorial/using-hive-with-orc-from-apache-spark/
- Write a Spark SQL application that reads and writes data from Hive tables: http://spark.apache.org/docs/latest/sql-programming-guide.html#hive-tables
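For the "Run a Spark job on YARN" and "Configure Spark properties" objectives, a submission might look like the following. The file name, executor counts, and memory sizes are hypothetical, and a working YARN cluster (with `HADOOP_CONF_DIR` set) is assumed.

```shell
# Hypothetical submission of a PySpark application to YARN in cluster mode.
# All resource sizes and the application file name are placeholders.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --num-executors 4 \
  --executor-memory 2g \
  --conf spark.executor.cores=2 \
  my_spark_app.py
```

Properties can equally be set in `spark-defaults.conf` or on a `SparkConf` in code; flags passed via `--conf` take precedence over the defaults file.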