HDP on Linux – Installation Forum

Problem install HDP single host

  • #10450

    Hi ,
    I am trying to evaluate the HDP product (installing on a single host).

    I did not succeed in installing it (the last step, the browser installation wizard, failed):
    log file: http://pastebin.com/MULrS3Qp

    I tried to uninstall, but the uninstall failed as well:
    log file: http://pastebin.com/NGjCTfv4

    I executed the check script from the Hortonworks site (check.sh), and here is the result:
    script result: http://pastebin.com/pQMtiSXa

    Please assist; I am stuck and did not even succeed in uninstalling/reinstalling the product, so I do not know how to move forward with this issue.

    Thanks in advance


  • Author
  • #10451
    Sasha J

    You are attempting to use logical volume device files as a mount point, which is incorrect:
    lv_root/hadoop/hdfs/namenode]/Hdp::Exec[mkdir -p /dev/mapper/vg_robinhood-lv_root/hadoop/hdfs/namenode]/Exec[mkdir -p /dev/mapper/vg_robinhood-lv_root/hadoop/hdfs/namenode]/returns (err): change from notrun to 0 failed: mkdir -p /dev/mapper/vg_robinhood-lv_root/hadoop/hdfs/namenode returned 1 instead of one of [0] at /etc/puppet/agent/modules/hdp/manifests/init.pp:255
    On the mount point selection page, uncheck all the /dev/mapper lines and simply put / (no quotes) into the text field.

    Also, it is not a good idea to use localhost as the host name.
    Put in some meaningful name and a proper IP address for the host.
    Take a look at this post; it has very good step-by-step instructions:
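    For example, a minimal /etc/hosts entry built from a meaningful name and a real IP might look like the following sketch (the IP address and host name here are made-up placeholders, not values from this thread):

    ```shell
    # Hypothetical values -- substitute your machine's real IP and a name of your choosing.
    HDP_IP="192.168.1.50"
    HDP_NAME="hdp-single.localdomain"

    # Compose the /etc/hosts line: IP, fully-qualified name, short name.
    ENTRY="${HDP_IP} ${HDP_NAME} ${HDP_NAME%%.*}"
    echo "$ENTRY"

    # On the real box, as root, you would append it:  echo "$ENTRY" >> /etc/hosts
    ```

    The installer then resolves the meaningful name without relying on DNS or the loopback address.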


    Thank you, Sasha, for the quick response.
    I am really going to follow these instructions, but in my case I should first UNINSTALL my incorrect installation. Am I right?
    I did not succeed in uninstalling it.
    What steps should I take to clean up my environment?

    Sasha J

    Just do:

    yum -y erase hmc puppet
    yum -y install hmc
    service hmc start
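    Sasha's three commands can be wrapped in a small dry-run sketch so you can see what will happen before touching the box (the `run` helper is an illustrative wrapper, not part of HMC; set `DRY_RUN=0` on the real host, as root, to actually execute):

    ```shell
    # Dry-run sketch of the cleanup sequence above; DRY_RUN=1 only prints commands.
    DRY_RUN=1
    run() { if [ "$DRY_RUN" = "1" ]; then echo "+ $*"; else "$@"; fi; }

    run yum -y erase hmc puppet   # remove the failed HMC install and its puppet agent
    run yum -y install hmc        # reinstall a clean HMC
    run service hmc start         # start the management console again
    ```

    Printing first is a cheap safeguard, since `yum -y erase` removes packages without prompting.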


    Hi Sasha.
    I still have a problem.
    The cluster install failed:

    Could this be the problem? And if yes, how do I resolve it?

    $pre_installed_pkgs]/returns (notice): Transaction Check Error:
    Mon Oct 01 00:54:22 +0200 2012 /Stage[1]/Hdp::Pre_install_pkgs/Hdp::Exec[yum install $pre_installed_pkgs]/Exec[yum install $pre_installed_pkgs]/returns (notice): file /usr/lib64/ganglia/modcpu.so from install of ganglia-gmond-3.2.0-99.x86_64 conflicts with file from package ganglia-3.1.7-6.el6.x86_64

    If you see that the problem is somewhere else, please point me to it.

    By the way, I configured my ssh to localhost following the instructions from here:

    These are the steps I did:

    If you cannot ssh to localhost without a passphrase, execute the following commands:
    $ ssh-keygen -t dsa -P '' -f ~/.ssh/id_dsa
    $ cat ~/.ssh/id_dsa.pub >> ~/.ssh/authorized_keys
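    A quick way to sanity-check that pattern without touching your real ~/.ssh is to repeat it in a throwaway directory (a sketch only; rsa is used here because dsa keys are disabled in newer OpenSSH builds, and the file names are placeholders):

    ```shell
    # Generate a throwaway key pair the same way, in a temp dir.
    TMP=$(mktemp -d)
    ssh-keygen -q -t rsa -N '' -f "$TMP/id_test"
    cat "$TMP/id_test.pub" >> "$TMP/authorized_keys"
    chmod 600 "$TMP/authorized_keys"   # sshd rejects group/world-writable auth files

    # Both the private key and the authorized_keys entry must be non-empty.
    RESULT=$([ -s "$TMP/id_test" ] && [ -s "$TMP/authorized_keys" ] && echo "key pair created")
    echo "$RESULT"
    rm -rf "$TMP"
    ```

    Note that the wizard wants the private key file (id_dsa), not the .pub half.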

    and I provided this file to the Hortonworks installation wizard: ~/.ssh/id_dsa.
    Is that correct?

    Thanks in advance

    Sasha J

    Yes, this is a problem.
    Take a look at the following document:
    and make sure your system meets all the prerequisites.
    Remove all the packages listed in there with those exact versions; hmc will install all the needed dependencies.
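    One way to find which installed package to erase is to pull its name straight out of the transaction-check error (a sketch; the log line below is the one quoted earlier in this thread, and the sed pattern is an assumption about that message format):

    ```shell
    # Extract the conflicting package name from the yum transaction-check error.
    LOG='file /usr/lib64/ganglia/modcpu.so from install of ganglia-gmond-3.2.0-99.x86_64 conflicts with file from package ganglia-3.1.7-6.el6.x86_64'
    CONFLICT=$(echo "$LOG" | sed -n 's/.*conflicts with file from package \([^ ]*\).*/\1/p')
    echo "$CONFLICT"

    # On the real host you would then remove it:  yum -y erase "$CONFLICT"
    ```

    Repeat for each "conflicts with file from package" line before re-running the install.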


    Hi Sasha ,

    After fixing all the dependency issues I got past the cluster install step.
    But the step just after it, HDFS start, failed.
    Here is the log file:


    What could be the problem here?
    Please assist.


    Sasha J

    Your pastebin.com link is not accessible.
    Run the script from this post:

    Sasha J

    Is there a way to contact you directly, not through the forums?


    I uploaded the result of the script here:

    Let me know if it is accessible.

    Sure, we can contact outside the forum.
    My Skype: oruchovets.

    Let me know how you would like to get in touch.

The topic ‘Problem install HDP single host’ is closed to new replies.
