
HDP on Linux – Installation Forum

HMC single machine install problems

  • #12519
    Thomas Emge
    Member

    I have a CentOS 6.3 virtual machine running on Hyper-V and I am attempting a single-machine HDP installation. I believe I have followed the pre-deployment steps very carefully, but the deployment fails during the HMC steps, at the final step of the cluster install.

    As suggested, I ran the check script and uploaded the output as test7.aplhive3.out.

    Any suggestions on what I am missing or how to troubleshoot this are very welcome.

    Thanks,
    – Thomas

  • #12520
    Sasha J
    Moderator

    Thomas,
    you hit a well-known problem.
    HMC discovers the mount points incorrectly and, as a result, tries to create its directories under the device file instead of under the real mount point:
    mkdir -p /dev/mapper/vg_aplhive3-lv_root/hadoop/hdfs/namenode]/returns (err): change from notrun to 0 failed: mkdir -p /dev/mapper/vg_aplhive3-lv_root/hadoop/hdfs/namenode returned 1 instead of one of [0]

    Please rerun the installation and, on the mount-point selection page, uncheck the /dev/mapper locations and type “/” (no quotes!) in the text field. (A quick way to confirm the real mount point is sketched just below this reply.)

    Thank you!
    Sasha
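
    For anyone hitting the same symptom, here is a minimal shell sketch of how to tell a device file apart from its real mount point. The LVM volume name is taken from the log above; the commands are standard CentOS 6 tools:

    # Assumption: the LVM root volume named in this thread.
    DEV=/dev/mapper/vg_aplhive3-lv_root

    # The entry under /dev/mapper is a block-device node, not a directory,
    # which is why HMC's "mkdir -p" under it returns 1:
    ls -ld "$DEV"

    # Look up the real mount point in the kernel's mount table
    # (on some systems the device may be listed as /dev/dm-N instead):
    grep "^$DEV " /proc/mounts | awk '{print $2}'   # expected: /

    # df resolves a mounted device to its filesystem, giving the same answer:
    df -P "$DEV" | awk 'NR==2 {print $6}'           # expected: /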

  • #12522
    Thomas Emge
    Member

    Excellent. After making the change, the install process completes just fine and all the services are running (a quick spot-check is sketched below this reply).

    Thanks,
    Thomas
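
    As a hedged spot-check of a successful install (the directory path is inferred from the error log above, and the daemon names assume an MRv1-era HDP cluster; neither is confirmed in the thread):

    # With "/" selected as the mount point, the directories should land
    # directly under the root filesystem:
    ls -ld /hadoop/hdfs/namenode

    # The Hadoop daemons should appear as running Java processes
    # (assumes the JDK's jps tool is on PATH):
    jps | egrep 'NameNode|DataNode|JobTracker|TaskTracker'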

  • #12523
    Sasha J
    Moderator

    Thomas,
    as expected :)

    Sasha

The forum ‘HDP on Linux – Installation’ is closed to new topics and replies.
