
This topic contains 2 replies, has 2 voices, and was last updated by Subbiah Subramanian 3 months ago.

  • Creator
    Topic
  • #51487

    Scott Sunderland
    Participant

    Hi All,

    When I attempt the following command in the serverlogs tutorial I get the errors below.

    hcat -e "CREATE TABLE FIREWALL_LOGS(time STRING, ip STRING, country STRING, status STRING) ROW FORMAT DELIMITED FIELDS TERMINATED BY '|' LOCATION '/flume/events';"

    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/usr/lib/hive/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]

    This also occurs when I try the following command from the sentiment analysis tutorial (and yes, I already copied the required hiveddl.sql and generate_logs.py files to the host machine).

    hive -f hiveddl.sql

    How can I resolve this? It completely blocks the serverlogs and sentiment analysis tutorials in the sandbox.

    cheers,

    Scott
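For what it's worth, the SLF4J lines quoted above are warnings about duplicate logging bindings on the classpath, not necessarily the failure itself. One common hidden culprit in these sandbox tutorials is that a command copied from a web page picks up curly "smart" quotes instead of plain ASCII quotes. The sketch below (the DDL string mirrors the tutorial's table; treating smart quotes as the cause is only an assumption, since the actual error text isn't shown) flags any non-ASCII characters in a statement before handing it to hcat:

```shell
# Sketch: check a DDL string for non-ASCII characters (e.g. curly quotes
# pasted from a web page) before running it with hcat.
DDL="CREATE TABLE FIREWALL_LOGS(time STRING, ip STRING, country STRING, status STRING) ROW FORMAT DELIMITED FIELDS TERMINATED BY '|' LOCATION '/flume/events';"

# [^ -~] matches anything outside the printable ASCII range.
if printf '%s' "$DDL" | LC_ALL=C grep -q '[^ -~]'; then
  echo "DDL contains non-ASCII characters - retype the quotes by hand"
else
  echo "DDL is plain ASCII"
fi
```

If the string passes the check, it can be run as `hcat -e "$DDL"`; the same test applies to the contents of hiveddl.sql before `hive -f hiveddl.sql`.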

Viewing 2 replies - 1 through 2 (of 2 total)


  • Author
    Replies
  • #65484

    Subbiah Subramanian
    Participant

    I also get the same error. Please let me know how to resolve the problem.

    #51488

    Scott Sunderland
    Participant

    PS I am using windows 7, PuTTY and WinSCP
