Jobinfo files?

This topic contains 2 replies, has 2 voices, and was last updated by  Snoot 1 year, 9 months ago.

  • #26413

    Snoot
    Participant

    Hi,

    After executing a job, I want to parse its JobInfo/JobHistory files to find out about the job's tasks – which task attempts ran on which nodes, and so on. We parse each file with the parseJobTasks method.
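    For illustration only – not Hadoop's own DefaultJobHistoryParser – here is a minimal sketch of pulling attempt-to-host mappings out of an MRv1 history file. It assumes the classic one-record-per-line format (a record type followed by KEY="value" pairs, terminated by " ."); field names like TASK_ATTEMPT_ID and HOSTNAME follow that format, but verify them against your actual files:

    ```python
    import re

    # Matches KEY="value" pairs; values may contain escaped quotes.
    KV_RE = re.compile(r'(\w+)="((?:[^"\\]|\\.)*)"')

    def parse_history_line(line):
        """Return (record_type, {key: value}) for one MRv1 history line."""
        line = line.strip().rstrip(" .")
        rec_type, _, rest = line.partition(" ")
        fields = {k: v.replace('\\"', '"') for k, v in KV_RE.findall(rest)}
        return rec_type, fields

    def attempts_by_host(lines):
        """Map hostname -> task attempt ids, from MapAttempt/ReduceAttempt records."""
        hosts = {}
        for line in lines:
            rec, fields = parse_history_line(line)
            if rec in ("MapAttempt", "ReduceAttempt") and "HOSTNAME" in fields:
                hosts.setdefault(fields["HOSTNAME"], []).append(
                    fields.get("TASK_ATTEMPT_ID"))
        return hosts
    ```

    In Java you would instead hand the file path and a FileSystem handle to the parseJobTasks call mentioned above.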

    In CDH 4 we found the files in /var/log/hadoop-0.20-mapreduce/history/done/, but we can't find them on HDP.

    Any idea where I can find them?

    Thanks!

Viewing 2 replies - 1 through 2 (of 2 total)

The topic ‘Jobinfo files?’ is closed to new replies.

  • #26420

    Snoot
    Participant

    Found them on HDFS (I was looking in the local file system) under /mapred/history/done. The location can be configured with mapred.job.tracker.history.completed.location.
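    For reference, that MRv1 property goes in mapred-site.xml; the value below is just the default location found above – adjust it for your cluster:

    ```xml
    <!-- mapred-site.xml (MRv1): HDFS directory where the JobTracker
         moves history files of completed jobs -->
    <property>
      <name>mapred.job.tracker.history.completed.location</name>
      <value>/mapred/history/done</value>
    </property>
    ```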

    #26418

    Larry Liu
    Moderator

    Hi, Saggi

    On my HDP cluster, I found those files under this folder:

    /var/log/hadoop/mapred/history

    Can you please check whether you can find the information you are looking for there?

    Regards.

    Larry
