Hive / HCatalog Forum

Help with Regular Expression

  • #43680
    Zin Zin
    Participant

    I am trying to import some weblog data into a Hive table. I have not worked with regular expressions before. Can someone please point me to a good tutorial on regular expressions?

    I have seen an example like

    CREATE EXTERNAL TABLE logs_20120101 (
      host STRING,
      identity STRING,
      user STRING,
      time STRING,
      request STRING,
      status STRING,
      size STRING)
    ROW FORMAT SERDE 'org.apache.hadoop.hive.contrib.serde2.RegexSerDe'
    WITH SERDEPROPERTIES (
      "input.regex" =
      "([^ ]*) ([^ ]*) ([^ ]*) (-|\\[[^\\]]*\\]) ([^ \"]*|\"[^\"]*\") (-|[0-9]*) (-|[0-9]*)",
      "output.format.string" = "%1$s %2$s %3$s %4$s %5$s %6$s %7$s"
    )
    STORED AS TEXTFILE LOCATION '/data/logs/20120101/';

    The sample weblog I have is

    "6466.6668.666868" "US" "Buy" "Movie" "Enter the Dragon"
    "6688.6668.779779" "CA" "Rent" "Movie" "Matrix"

    The DDL I will be creating for the Hive table is

    CREATE EXTERNAL TABLE logs_20120101 (
      host STRING,
      country STRING,
      action STRING,
      category STRING,
      title STRING
    )
    ROW FORMAT SERDE 'org.apache.hadoop.hive.contrib.serde2.RegexSerDe'
    WITH SERDEPROPERTIES (
      "input.regex" =

      "output.format.string" = "%1$s %2$s %3$s %4$s %5$s"
    )

    I would like some help writing the input regular expression.
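
    For rows like the two samples above, one candidate completion of this DDL is sketched below. It assumes every row is exactly five double-quoted fields separated by single spaces, with no embedded quotes inside a field; each capture group maps to one column in order.

    CREATE EXTERNAL TABLE logs_20120101 (
      host STRING,
      country STRING,
      action STRING,
      category STRING,
      title STRING)
    ROW FORMAT SERDE 'org.apache.hadoop.hive.contrib.serde2.RegexSerDe'
    WITH SERDEPROPERTIES (
      -- Assumption: five double-quoted, space-separated fields per line.
      "input.regex" = "\"([^\"]*)\" \"([^\"]*)\" \"([^\"]*)\" \"([^\"]*)\" \"([^\"]*)\"",
      "output.format.string" = "%1$s %2$s %3$s %4$s %5$s"
    )
    STORED AS TEXTFILE LOCATION '/data/logs/20120101/';

    Rows that do not match the pattern will come back as NULLs, which is a quick way to spot lines the regex does not cover.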


  • #43704
    Carter Shanklin
    Participant

    Paulie, the first thing to note is that RegexSerDe uses Java-style regexes under the covers. There are many different styles of regex, so it's important to know which one you're working with.

    This tutorial is pretty good:
    http://www.vogella.com/articles/JavaRegularExpressions/article.html
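
    Since the SerDe matches each line with Java's java.util.regex, you can sanity-check a candidate pattern against the sample rows before putting it in the DDL. The snippet below is a sketch (the RegexSketch class name and the five-quoted-fields pattern are assumptions based on the sample data in the question, not a confirmed answer):

    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    public class RegexSketch {
        // Candidate pattern: five double-quoted, space-separated fields.
        // Each group corresponds to one Hive column (host, country, ...).
        static final Pattern LOG_LINE = Pattern.compile(
            "\"([^\"]*)\" \"([^\"]*)\" \"([^\"]*)\" \"([^\"]*)\" \"([^\"]*)\"");

        // Returns the five captured fields, or null if the line doesn't match
        // (RegexSerDe similarly yields NULL columns for non-matching lines).
        public static String[] parse(String line) {
            Matcher m = LOG_LINE.matcher(line);
            if (!m.matches()) {
                return null;
            }
            String[] fields = new String[5];
            for (int i = 0; i < 5; i++) {
                fields[i] = m.group(i + 1);
            }
            return fields;
        }

        public static void main(String[] args) {
            String[] f = parse("\"6466.6668.666868\" \"US\" \"Buy\" \"Movie\" \"Enter the Dragon\"");
            System.out.println(String.join("|", f));
        }
    }

    Note that in the Hive DDL the same pattern needs its double quotes escaped as \" inside the "input.regex" string literal.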

