HDP on Linux – Installation Forum

HMC single machine install problems

  • #12519
    Thomas Emge
    Member

    I have a CentOS 6.3 virtual machine running on Hyper-V and I am attempting a single-machine HDP installation. I believe I have followed the pre-deployment steps very carefully, but the deployment fails during the HMC steps when it reaches the final step of the cluster install.

    As suggested, I ran the check script and uploaded the file as test7.aplhive3.out

    Any suggestions on what I am missing, or troubleshooting tips, are very welcome.

    Thanks,
    – Thomas


  • #12520
    Sasha J
    Moderator

    Thomas,
    you have hit a well-known problem.
    HMC discovers mount points incorrectly and, as a result, attempts to create directories under the device file instead of the real mount point:
    mkdir -p /dev/mapper/vg_aplhive3-lv_root/hadoop/hdfs/namenode]/returns (err): change from notrun to 0 failed: mkdir -p /dev/mapper/vg_aplhive3-lv_root/hadoop/hdfs/namenode returned 1 instead of one of [0]
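    The `mkdir` fails because entries under /dev/mapper are device nodes, not directories, so nothing can be created beneath them. A minimal sketch of the same failure, using a temporary file as a stand-in for the device node:

    ```shell
    # mkdir cannot create a path beneath a non-directory node;
    # the device file /dev/mapper/vg_aplhive3-lv_root fails the same way.
    f=$(mktemp)                              # stand-in for the device file
    mkdir -p "$f/hadoop/hdfs/namenode" 2>/dev/null
    echo "mkdir exit status: $?"             # non-zero, matching "returned 1 instead of one of [0]"
    rm -f "$f"
    ```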

    Please rerun the installation and, on the mount-point selection page, uncheck the /dev/mapper locations and type “/” (no quotes!) in the text field.
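    With “/” selected, the Hadoop directories land on the root filesystem rather than under the device path. If you want to sanity-check which device actually backs “/” before rerunning, something like this works (output paths will vary per machine):

    ```shell
    # Show the device backing "/" and its mount point -- this is where
    # the install should create its directories, not under /dev/mapper.
    df -P / | awk 'NR==2 {print "device:", $1, " mount point:", $6}'
    ```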

    Thank you!
    Sasha

    #12522
    Thomas Emge
    Member

    Excellent! After making the change, the install process completed just fine and all the services are running.

    Thanks,
    Thomas

    #12523
    Sasha J
    Moderator

    Thomas,
    as expected :)

    Sasha

