
YARN Forum

SLF4J version issue with YARN/Hadoop

  • #52317
    Prabhat Singh
    Participant

    Hi,

    I tried to run Nutch 1.8 with Ambari (2.1), which has the Hadoop 2.4 jars.
    It gives the following error:

    Log Type: stderr
    Log Length: 743
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/hadoop/yarn/local/usercache/hdfs/appcache/application_1398533490859_0006/filecache/10/job.jar/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
    log4j:WARN No appenders could be found for logger (org.apache.hadoop.ipc.Server).
    log4j:WARN Please initialize the log4j system properly.
    log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.

    Log Type: stdout
    Log Length: 0

    Log Type: syslog
    Log Length: 134506
    Showing 4096 bytes of 134506 total.
    org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Calling stop for all the services
    2014-04-26 18:28:04,312 INFO [Thread-64] org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Stopping JobHistoryEventHandler. Size of the outstanding queue size is 0
    2014-04-26 18:28:04,375 INFO [eventHandlingThread] org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Copying hdfs://wsmhcluster/user/hdfs/.staging/job_1398533490859_0006/job_1398533490859_0006_1.jhist to hdfs://wsmhcluster/mr-history/tmp/hdfs/job_1398533490859_0006-1398536845853-hdfs-inject+urls-1398536884224-0-0-FAILED-default-1398536852863.jhist_tmp
    2014-04-26 18:28:04,504 INFO [eventHandlingThread] org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Copied to done location: hdfs://wsmhcluster/mr-history/tmp/hdfs/job_1398533490859_0006-1398536845853-hdfs-inject+urls-1398536884224-0-0-FAILED-default-1398536852863.jhist_tmp
    2014-04-26 18:28:04,513 INFO [eventHandlingThread] org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Copying hdfs://wsmhcluster/user/hdfs/.staging/job_1398533490859_0006/job_1398533490859_0006_1_conf.xml to hdfs://wsmhcluster/mr-history/tmp/hdfs/job_1398533490859_0006_conf.xml_tmp
    2014-04-26 18:28:04,553 INFO [eventHandlingThread] org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Copied to done location: hdfs://wsmhcluster/mr-history/tmp/hdfs/job_1398533490859_0006_conf.xml_tmp
    2014-04-26 18:28:04,569 INFO [eventHandlingThread] org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Moved tmp to done: hdfs://wsmhcluster/mr-history/tmp/hdfs/job_1398533490859_0006.summary_tmp to hdfs://wsmhcluster/mr-history/tmp/hdfs/job_1398533490859_0006.summary
    2014-04-26 18:28:04,575 INFO [eventHandlingThread] org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler: Moved tmp to done: hdfs://wsmhcluster/mr-history/tmp/hdfs/job_1398533490859_0006_conf.xml_tmp to hdfs://wsmhcluster/mr-history/tmp/hdfs/job_1398533490859_0006_conf.xml
    2014-04-26 18
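
The SLF4J lines in the stderr log above are a warning that two copies of the slf4j-log4j12 binding are on the classpath: version 1.7.5 from /usr/lib/hadoop/lib and version 1.6.1 bundled inside the Nutch job.jar. SLF4J binds to whichever binding it finds first on the classpath. As a rough illustration (not from this thread; the class name Slf4jBindingCheck is made up for the example, and it assumes slf4j-api plus one binding are on the classpath), a small Java program run with the same classpath can show which binding actually won:

// Diagnostic sketch (illustrative only, not part of the original post):
// prints which SLF4J binding was actually loaded, so you can tell whether the
// Hadoop-provided slf4j-log4j12-1.7.5 or the 1.6.1 copy bundled in the Nutch
// job.jar is being used.
import org.slf4j.LoggerFactory;
import org.slf4j.impl.StaticLoggerBinder;

public class Slf4jBindingCheck {
    public static void main(String[] args) {
        // The factory class reported here should match the "Actual binding" line in stderr.
        System.out.println("ILoggerFactory in use: "
                + LoggerFactory.getILoggerFactory().getClass().getName());
        // Location of the jar that supplied the binding that was loaded.
        System.out.println("StaticLoggerBinder loaded from: "
                + StaticLoggerBinder.class.getProtectionDomain()
                        .getCodeSource().getLocation());
    }
}

A common way to silence the warning is to keep only one binding on the classpath, for example by not bundling slf4j-log4j12 inside the job.jar so the cluster's 1.7.5 copy is used. Note also that the log4j "No appenders" warnings just mean no log4j configuration file was found on the task classpath; they are separate from the SLF4J multiple-bindings message.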

The forum ‘YARN’ is closed to new topics and replies.
