HDP on Linux – Installation Forum

Single Node Installation Failure

  • #8468

I am currently installing HDP on a CentOS VMware image.

    Here is what I have done:
1. Set up password-less SSH authentication
    2. Upgrade the browser.

    3. Update the hosts file with localhost defined as node1.
    4. Update the network in /etc/sysconfig
    5. Using HMC to install
    6. Select hosts file
    7. Select Private Key
    8. The Node Discovery step is successful
– It says "Finding reachable nodes – all 5 nodes" (Question: what are the other 4 nodes, as I only have one node configured?)
    – Obtaining information about reachable nodes (all 5 nodes succeeded)
    – Verifying and updating node information (all 5 nodes succeeded)
    – Preparing discovered nodes (all 1 nodes succeeded)
    – Finalizing bootstrapped node (all 1 nodes succeeded)
    9. I select Proceed to Select Services.
    10. I select all the services
11. Then assign master services to hosts (all of them default to 2 GB / 1 core)
    12. Press next
13. Specify Disk Mount Points does not find any mount point. I enter a custom mount point, /mnt/hgfs/hws/ (a shared directory on VMware which I am using)
    14. Then I customise settings, giving the passwords for various services such as Nagios and MySQL.
    15. I also enable WebHDFS
    16. Enable LZO compression
    17. Set up the Java home path
    18. The Deployment progress starts to show

A "Cluster install failed" message appears.

    Am I doing something wrong here?
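    Regarding the "all 5 nodes" question: HMC bootstraps every entry in the hosts file selected in step 6, so a quick sanity check is to count the entries in that file. A minimal sketch (the file name, its contents, and the counting rule are assumptions for illustration):

    ```python
    # Hypothetical sanity check: count the entries in the hosts file handed
    # to HMC in step 6. If this prints more than 1, node discovery will try
    # to bootstrap that many nodes, which may explain "all 5 nodes".
    def count_target_hosts(path):
        """Count non-blank, non-comment lines in an HMC hosts file."""
        with open(path) as f:
            return sum(1 for line in f
                       if line.strip() and not line.lstrip().startswith("#"))

    # Single-node example file (name assumed, not from the post):
    with open("hosts.txt", "w") as f:
        f.write("node1\n")

    print(count_target_hosts("hosts.txt"))  # → 1
    ```

    If the count is larger than expected, stray or duplicate lines in the hosts file are a likely cause of the extra "nodes".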

    "2": {
      "nodeReport": {},
      "nodeLogs": []
    },
    "56": {
      "nodeReport": [],
      "nodeLogs": []
    },
    "123": {
      "nodeReport": [],
      "nodeLogs": []
    },
    "124": {
      "nodeReport": [],
      "nodeLogs": []
    }
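    The fragment above looks like truncated JSON from the deploy log: a map from node ID to a per-node report. A small sketch (the structure and the empty values are assumptions based on the post) that lists nodes whose nodeLogs came back empty, i.e. nodes that returned no diagnostic output:

    ```python
    # Reconstructed sample of the deploy report (values assumed from the post):
    report = {
        "2":   {"nodeReport": {}, "nodeLogs": []},
        "56":  {"nodeReport": [], "nodeLogs": []},
        "123": {"nodeReport": [], "nodeLogs": []},
        "124": {"nodeReport": [], "nodeLogs": []},
    }

    # Nodes that produced no logs at all:
    silent = [node for node, r in report.items() if not r["nodeLogs"]]
    print(silent)  # → ['2', '56', '123', '124']
    ```

    Here every node's log list is empty, which suggests the failure happened before the per-node services could report anything useful.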


  • Author
  • #8576

You may have a slow connection for downloading the packages; please see "Sasha J" … comments under
    Forums » HDP Installation,
    "Puppet Failed : Pre Deploy".

