Apache Pig

A scripting platform for processing and analyzing large data sets

With YARN as the architectural center of Apache Hadoop, multiple data access engines such as Apache Pig interact with data stored in the cluster. Apache Pig allows Apache Hadoop users to write complex MapReduce transformations using a simple scripting language called Pig Latin. Pig translates the Pig Latin script into MapReduce jobs that execute within YARN against data stored in the Hadoop Distributed File System (HDFS).

Hortonworks Focus for Pig

The Apache Pig community is working on continued development in two major areas:

  • Performance: Improved performance through further optimizations to existing code and support for Apache Spark as an optional runtime
  • Analytics: Deeper analytics through built-in operators and enhancements to related libraries such as DataFu

Recent Progress in Pig

Pig 0.14.0
  • Improved speed with support for Apache Tez
  • Automatic reducer parallelism
  • Inclusion of the DataFu library
  • Faster UNION and JOIN operations
Pig 0.13.0
  • Support for back ends other than MapReduce
  • Operator blacklisting and whitelisting
  • Lower latency for small jobs
Pig 0.12.0
  • Support for the OVER operator
  • Support for CASE, IN and SUBTRACT expressions

What Pig Does

Pig was designed for performing a long series of data operations, making it ideal for three categories of Big Data jobs:

  • Extract-transform-load (ETL) data pipelines,
  • Research on raw data, and
  • Iterative data processing.

Whatever the use case, Pig will be:

  • Extensible: Pig users can create custom functions to meet their particular processing requirements.
  • Easily programmed: Complex tasks involving interrelated data transformations can be simplified and encoded as data flow sequences. Pig programs accomplish huge tasks, but they are easy to write and maintain.
  • Self-optimizing: Because the system automatically optimizes execution of Pig jobs, the user can focus on semantics.

How Pig Works

Pig runs on Apache Hadoop YARN and makes use of MapReduce and the Hadoop Distributed File System (HDFS). The language for the platform is called Pig Latin, which abstracts the programming away from the Java MapReduce idiom into a notation similar to SQL. While SQL is designed to query data, Pig Latin lets you write a data flow that describes how your data will be transformed (for example aggregated, joined, and sorted).
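As a minimal sketch of such a data flow (the input path, output path, and field names below are hypothetical), a Pig Latin script that aggregates and sorts might look like this:

  -- load a tab-separated file with a declared schema (hypothetical path and fields)
  logs    = LOAD 'input/access_logs' USING PigStorage('\t')
            AS (user:chararray, url:chararray, bytes:long);
  by_user = GROUP logs BY user;                 -- group records by user
  totals  = FOREACH by_user GENERATE group AS user,
            SUM(logs.bytes) AS total_bytes;     -- aggregate within each group
  sorted  = ORDER totals BY total_bytes DESC;   -- sort by the aggregate
  STORE sorted INTO 'output/bytes_by_user';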

Because Pig Latin scripts can describe a graph of operations (rather than requiring a single output), it is possible to build complex data flows involving multiple inputs, transforms, and outputs. Users can also extend Pig Latin with user-defined functions (UDFs), written in Java, Python, Ruby, or other scripting languages, and then call them directly from Pig Latin.
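As a sketch of that extensibility (the file udfs.py and its function normalize_url are hypothetical), a Python UDF can be registered and used in a flow that writes two separate outputs:

  REGISTER 'udfs.py' USING jython AS myfuncs;   -- hypothetical Python (Jython) UDF file
  logs    = LOAD 'input/access_logs'
            AS (user:chararray, url:chararray, bytes:long);
  cleaned = FOREACH logs GENERATE user, myfuncs.normalize_url(url) AS url, bytes;
  bad     = FILTER cleaned BY url IS NULL;      -- records the UDF could not normalize
  good    = FILTER cleaned BY url IS NOT NULL;
  STORE bad  INTO 'output/bad_urls';            -- one script, two outputs
  STORE good INTO 'output/clean_logs';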

The user can run Pig in two modes, using either the “pig” command or the “java” command (example invocations follow the list below):

  • MapReduce Mode. This is the default mode, which requires access to a Hadoop cluster.
  • Local Mode. Everything runs on a single machine, using the local host and local file system.
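For instance, a script saved as script.pig (a hypothetical file name) can be launched in either mode with the pig command; the -x flag selects the execution mode:

  pig -x mapreduce script.pig    # default: run as MapReduce jobs on a Hadoop cluster
  pig -x local script.pig        # run on a single machine against the local file system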

