When are HFiles written to HDFS?

This topic contains 2 replies, has 2 voices, and was last updated by Craig Clarks 11 months, 3 weeks ago.

  • Creator
    Topic
  • #57194

    Craig Clarks
    Participant

    Hi,

    When are HFiles written to HDFS?

    My understanding is that all minor and major compactions are local to HBase (the region servers). When will these HFiles be written to HDFS?
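
    For context, here is a minimal sketch of how to check where HBase keeps its files, assuming the HBase client jars and hbase-site.xml are on the classpath:

        import org.apache.hadoop.conf.Configuration;
        import org.apache.hadoop.hbase.HBaseConfiguration;

        public class PrintRootDir {
            public static void main(String[] args) {
                // HBaseConfiguration.create() loads hbase-default.xml and
                // hbase-site.xml from the classpath.
                Configuration conf = HBaseConfiguration.create();
                // On a distributed cluster this is normally an HDFS URI such
                // as hdfs://namenode:8020/hbase (host and port here are
                // placeholders), which would mean flushed HFiles land on
                // HDFS rather than on a region server's local disk.
                System.out.println(conf.get("hbase.rootdir"));
            }
        }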

    Question 2: Should region servers always reside where the DataNodes are located?

    Regards
    Craig

  • Author
    Replies
  • #57247

    Craig Clarks
    Participant

    Hi Siva,

    What you sent is correct: HFiles are the files HBase uses to store data.

    My understanding was that whenever the memstore fills, it spills to an HFile, and these HFiles are subsequently subject to minor and major compactions. Does the journey of the HFiles end at major compaction?

    Or are the files produced by a major compaction then moved to HDFS?

    Or are HFiles written directly to HDFS from the memstore? In that case, do all the HFiles get HDFS's default replication?
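
    One way I have been trying to verify this is to force a flush and then list the files, roughly like the sketch below (HBase 1.x client API; the table name 'mytable' and the /hbase/data/default/mytable layout are assumptions, and core-site.xml needs to be on the classpath so FileSystem.get() resolves to HDFS):

        import org.apache.hadoop.conf.Configuration;
        import org.apache.hadoop.fs.FileSystem;
        import org.apache.hadoop.fs.LocatedFileStatus;
        import org.apache.hadoop.fs.Path;
        import org.apache.hadoop.fs.RemoteIterator;
        import org.apache.hadoop.hbase.HBaseConfiguration;
        import org.apache.hadoop.hbase.TableName;
        import org.apache.hadoop.hbase.client.Admin;
        import org.apache.hadoop.hbase.client.Connection;
        import org.apache.hadoop.hbase.client.ConnectionFactory;

        public class FlushAndInspect {
            public static void main(String[] args) throws Exception {
                Configuration conf = HBaseConfiguration.create();
                try (Connection conn = ConnectionFactory.createConnection(conf);
                     Admin admin = conn.getAdmin()) {
                    // Ask the region servers to flush 'mytable': each memstore
                    // is written out as a brand-new HFile under hbase.rootdir.
                    admin.flush(TableName.valueOf("mytable"));
                }

                // Recursively walk the table's directory on the underlying
                // filesystem and print each file with its replication factor.
                FileSystem fs = FileSystem.get(conf);
                RemoteIterator<LocatedFileStatus> files =
                        fs.listFiles(new Path("/hbase/data/default/mytable"), true);
                while (files.hasNext()) {
                    LocatedFileStatus f = files.next();
                    System.out.println(f.getPath() + "  replication=" + f.getReplication());
                }
            }
        }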

    Somehow I am not clear from the documentation, and I am hoping for a good explanation from this forum.

    Regards
    Craig

    #57246

    SambaSivaRao Y
    Participant

    Hi Craig,

    As far as I know, HFiles are the files HBase uses to store data.
    If your question is about committing data into HBase, can you share the sample code you wrote to write the HFiles and commit?
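
    For reference, a typical client-side write and commit looks something like this minimal sketch (the table, row, and column names are placeholders):

        import org.apache.hadoop.conf.Configuration;
        import org.apache.hadoop.hbase.HBaseConfiguration;
        import org.apache.hadoop.hbase.TableName;
        import org.apache.hadoop.hbase.client.Connection;
        import org.apache.hadoop.hbase.client.ConnectionFactory;
        import org.apache.hadoop.hbase.client.Put;
        import org.apache.hadoop.hbase.client.Table;
        import org.apache.hadoop.hbase.util.Bytes;

        public class SimplePut {
            public static void main(String[] args) throws Exception {
                Configuration conf = HBaseConfiguration.create();
                try (Connection conn = ConnectionFactory.createConnection(conf);
                     Table table = conn.getTable(TableName.valueOf("mytable"))) {
                    Put put = new Put(Bytes.toBytes("row1"));
                    put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("col"),
                                  Bytes.toBytes("some value"));
                    // table.put() commits the edit: it is appended to the WAL
                    // and buffered in the region's memstore. The data only
                    // becomes an HFile on HDFS later, when the memstore flushes.
                    table.put(put);
                }
            }
        }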

    Thanks,
    SambaSiva.
