HMC single machine install problems


This topic contains 3 replies, has 2 voices, and was last updated by Sasha J 2 years, 3 months ago.

  • Creator
    Topic
  • #12519

    Thomas Emge
    Member

    I have a CentOS 6.3 virtual machine running on Hyper-V and I am attempting a single-machine HDP installation. I believe I have followed the pre-deployment steps very carefully, but the deployment fails during the HMC steps, when I reach the final step of the cluster install.

    As suggested, I ran the check script and uploaded the output as test7.aplhive3.out.

    Any suggestions on what I am missing, or troubleshooting tips, are very welcome.

    Thanks,
    – Thomas

Viewing 3 replies - 1 through 3 (of 3 total)


  • Author
    Replies
  • #12523

    Sasha J
    Moderator

    Thomas,
    as expected :)

    Sasha

    #12522

    Thomas Emge
    Member

    Excellent. After making the change, the install process completes just fine and all the services are running.

    Thanks,
    Thomas

    #12520

    Sasha J
    Moderator

    Thomas,
    you hit a well-known problem.
    HMC discovers mount points incorrectly and, as a result, attempts to create directories under the device file instead of under the real mount point:
    mkdir -p /dev/mapper/vg_aplhive3-lv_root/hadoop/hdfs/namenode]/returns (err): change from notrun to 0 failed: mkdir -p /dev/mapper/vg_aplhive3-lv_root/hadoop/hdfs/namenode returned 1 instead of one of [0]
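    The underlying failure is generic: mkdir -p cannot create a directory beneath an existing non-directory file, which is exactly what happens when the path runs through the /dev/mapper device node. A minimal stand-in reproduction (using a temporary plain file instead of the actual device, so it is safe to run anywhere):

    ```shell
    # A plain file stands in for the /dev/mapper device node; mkdir -p
    # fails with "Not a directory" when the path passes through it,
    # which is the non-zero exit status HMC reports.
    f=$(mktemp)
    mkdir -p "$f/hadoop/hdfs/namenode" 2>/dev/null || echo "mkdir failed as expected"
    rm -f "$f"
    ```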

    Please rerun the installation, and on the mount-point selection page uncheck the /dev/mapper locations and type “/” (no quotes!) in the text field.
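    For anyone hitting this later, a quick way to confirm that the real mount point is “/” rather than the /dev/mapper path (generic commands, not taken from the original thread):

    ```shell
    # The "Filesystem" column shows the /dev/mapper device, while the
    # "Mounted on" column shows the real mount point ("/") -- that is
    # the value to enter on the HMC mount-point selection page.
    df -h /

    # The /dev/mapper entries are device nodes, not directories, so
    # HDFS directories cannot be created beneath them.
    ls -l /dev/mapper/
    ```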

    Thank you!
    Sasha
