Problem install HDP single host


This topic contains 9 replies, has 2 voices, and was last updated by oruchovets 2 years, 10 months ago.

  • Creator
  • #10450

    Hi,
    I am trying to evaluate the HDP product (installing it on a single host).

    I did not succeed in installing it (the last step, the browser installation wizard, failed):
    log file is:

    I tried to uninstall, but the uninstall failed as well:
    log file:

    I executed the script from the hortonworks site ( and here is the result:
    script result:

    Please assist, because I am stuck: I did not even succeed in uninstalling/reinstalling the product, and I do not know how to move forward with this issue.

    Thanks in advance

Viewing 9 replies - 1 through 9 (of 9 total)

The topic ‘Problem install HDP single host’ is closed to new replies.

  • Author
  • #10473

    I uploaded the result of the script here:
    Let me know if it is accessible.

    Sure, we can get in touch outside the forum.
    My Skype: oruchovets.

    Let me know how you would like to make contact.


    Sasha J

    Is there a way to contact you directly, not through the forums?


    Sasha J

    Your paste is not accessible.
    Run the script from this post:


    Hi Sasha,

    After fixing all the dependency issues I passed the cluster install step.
    But the step just after it, HDFS start, failed.
    Here is the log file:

    What could be the problem here?
    Please assist.



    Sasha J

    Yes, this is a problem.
    Take a look at the following document:
    and make sure your system meets all the pre-requisites.
    Remove all packages listed in there with the exact versions; hmc will install all needed dependencies.
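
    The cleanup step above can be sketched in shell. This is a hedged sketch, assuming an RPM-based system; the sample package list below is made up purely to show the filter, and on the real host you would pipe `rpm -qa` into the same grep:

```shell
# Made-up sample of what `rpm -qa` might print on the host.
installed_pkgs="ganglia-3.1.7-6.el6.x86_64
openssl-1.0.0-20.el6.x86_64
nagios-3.2.3-2.el6.x86_64"

# Filter for packages that typically conflict with what hmc installs
# (ganglia/nagios/puppet). On the real host: rpm -qa | grep -Ei ...
conflicts=$(echo "$installed_pkgs" | grep -Ei 'ganglia|nagios|puppet')
echo "$conflicts"

# Then remove whatever the query listed, so hmc can pull in its own
# versions, e.g.:  yum -y erase $conflicts
```

    On the real machine the `yum -y erase` line does the actual removal; it is commented out here because the package names depend on what your query finds.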


    Hi Sasha.
    I still have a problem.
    The cluster install failed:

    Could this be the problem? And if yes, how do I resolve it?

    $pre_installed_pkgs]/returns (notice): Transaction Check Error:\””,
    “\”Mon Oct 01 00:54:22 +0200 2012 /Stage[1]/Hdp::Pre_install_pkgs/Hdp::Exec[yum install $pre_installed_pkgs]/Exec[yum install $pre_installed_pkgs]/returns (notice): file /usr/lib64/ganglia/ from install of ganglia-gmond-3.2.0-99.x86_64 conflicts with file from package ganglia-3.1.7-6.el6.x86_64\””,

    If you see that the problem is somewhere else, please point me to it.

    By the way, I configured my ssh to localhost following the instructions here:

    These are the steps I did:

    If you cannot ssh to localhost without a passphrase, execute the following commands:
    $ ssh-keygen -t dsa -P '' -f ~/.ssh/id_dsa
    $ cat ~/.ssh/id_dsa.pub >> ~/.ssh/authorized_keys

    and I provided this file to the hortonworks installation wizard: ~/.ssh/id_dsa.
    Is that correct?

    Thanks in advance
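
    The passphrase-less key setup quoted above can be sketched end to end. This is a hedged sketch assuming OpenSSH: it writes into a scratch directory so it is safe to try anywhere (on the real host the files live under ~/.ssh), and it uses an RSA key because DSA key generation is disabled in newer OpenSSH releases:

```shell
# Scratch .ssh directory so this sketch does not touch real keys;
# on the real host use ~/.ssh instead.
SSH_DIR="$(mktemp -d)/.ssh"
mkdir -p "$SSH_DIR" && chmod 700 "$SSH_DIR"

# Empty passphrase must be two plain single quotes ('' -- not curly
# quotes pasted from a web page, which ssh-keygen treats as a real
# passphrase).
ssh-keygen -q -t rsa -N '' -f "$SSH_DIR/id_rsa"

# The PUBLIC half (id_rsa.pub) is what gets appended to authorized_keys;
# sshd also insists on tight permissions.
cat "$SSH_DIR/id_rsa.pub" >> "$SSH_DIR/authorized_keys"
chmod 600 "$SSH_DIR/authorized_keys"

# The PRIVATE half (the file without .pub) is the one you hand to an
# installer that asks for an SSH key.
ls "$SSH_DIR"
```

    So the idea in the post above is right: the `.pub` file goes into authorized_keys, and the private key file (id_dsa in the quoted commands) is the one to give to a wizard that asks for a key.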


    Sasha J

    Just do:

    yum -y erase hmc puppet
    yum -y install hmc
    service hmc start


    Thank you, Sasha, for the quick response.
    I am going to follow these instructions, but in my case I should first UNINSTALL my broken installation. Am I right?
    I did not succeed in uninstalling it.
    What steps should I take to clean up my environment?


    Sasha J

    You are attempting to use logical volume device files as a mount point, which is incorrect:
    lv_root/hadoop/hdfs/namenode]/Hdp::Exec[mkdir -p /dev/mapper/vg_robinhood-lv_root/hadoop/hdfs/namenode]/Exec[mkdir -p /dev/mapper/vg_robinhood-lv_root/hadoop/hdfs/namenode]/returns (err): change from notrun to 0 failed: mkdir -p /dev/mapper/vg_robinhood-lv_root/hadoop/hdfs/namenode returned 1 instead of one of [0] at /etc/puppet/agent/modules/hdp/manifests/init.pp:255
    On the mount point selection page, uncheck all the /dev/mapper lines and simply put / (no quotes) into the text field.

    Also, it is not a good idea to use localhost as the host name.
    Use a meaningful hostname and a real (non-loopback) IP address.
    Take a look at this post; it has very good step-by-step instructions:
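
    The hostname advice can be illustrated with an /etc/hosts entry. A hedged sketch: hdp-master and 192.168.1.50 are made-up placeholders, and the sketch writes to a scratch file so it is safe to run (on the real box you would edit /etc/hosts itself, as root):

```shell
# Scratch stand-in for /etc/hosts so this is safe to try.
HOSTS_FILE="$(mktemp)"

# Map a meaningful name to the machine's real (non-loopback) address.
# hdp-master / 192.168.1.50 are made-up examples; substitute your own.
echo "192.168.1.50  hdp-master" >> "$HOSTS_FILE"

cat "$HOSTS_FILE"
```

    With the entry in the real /etc/hosts, the installer can be pointed at hdp-master instead of localhost.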
