Files are already split into blocks on data nodes when copying a file from local to HDFS



This topic contains 1 reply, has 2 voices, and was last updated by  Steve Loughran 10 months, 2 weeks ago.

  • #55418

    Hutashan Chandrakar
    Participant

Files are already split into blocks on data nodes when copying a file from local to HDFS, so what is the use of input splits in the MapReduce framework?



  • #55421

    Steve Loughran
    Participant

    Input splits let the mappers work on fractions of a larger file. As there are usually three copies of each block in HDFS, you can run three mappers against different parts of the same file, giving you 3x the bandwidth and hopefully 3x the performance.
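    To make the distinction concrete, here is a minimal, self-contained sketch of how a FileInputFormat-style split calculation divides a file into byte ranges. This is illustrative code, not the actual Hadoop implementation; the class name, the `Split` record, and the 10% slop factor mirror the general approach (split size derived from block size, with a tolerance so the last split isn't tiny), but all identifiers here are made up for the example:

    ```java
    import java.util.ArrayList;
    import java.util.List;

    // Illustrative sketch of split computation; NOT the real Hadoop class.
    public class SplitSketch {
        // Tolerance: allow the final split to be up to 10% larger
        // rather than creating a tiny trailing split.
        static final double SPLIT_SLOP = 1.1;

        // A split is just a byte range of one file (hosts omitted for brevity).
        record Split(long offset, long length) {}

        static List<Split> computeSplits(long fileLength, long blockSize,
                                         long minSize, long maxSize) {
            // Split size defaults to the block size, clamped by min/max settings.
            long splitSize = Math.max(minSize, Math.min(maxSize, blockSize));
            List<Split> splits = new ArrayList<>();
            long remaining = fileLength;
            while ((double) remaining / splitSize > SPLIT_SLOP) {
                splits.add(new Split(fileLength - remaining, splitSize));
                remaining -= splitSize;
            }
            if (remaining > 0) {
                splits.add(new Split(fileLength - remaining, remaining));
            }
            return splits;
        }

        public static void main(String[] args) {
            // A 300 MB file with 128 MB blocks yields three splits,
            // each of which can be processed by a separate mapper.
            List<Split> splits =
                computeSplits(300L << 20, 128L << 20, 1, Long.MAX_VALUE);
            for (Split s : splits) {
                System.out.println("offset=" + s.offset()
                                   + " length=" + s.length());
            }
        }
    }
    ```

    The key point the sketch shows: HDFS blocks are a physical storage unit, while input splits are a logical, per-job view of the same bytes, so the framework can schedule one mapper per split and use any of the block replicas for data locality.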
