HBase Forum

HDP single node installation – Cluster install fails

  • #8456
    aabbasi
    Participant

    Hi,
I am currently installing HDP on a CentOS VMware image.

Here is what I have done:
1. Set up password-less SSH authentication (see the sketch after this list).
2. Upgrade the browser.

    3. Update the hosts file with localhost defined as node1.
    4. Update the network in /etc/sysconfig
    5. Using HMC to install
    6. Select hosts file
    7. Select Private Key
    8. The Node Discovery step is successful
– It says Finding reachable nodes – all 5 nodes (question: what are the other 4 nodes, as I only have one node configured?)
    – Obtaining information about reachable nodes (all 5 nodes succeeded)
– Verifying and updating node information (all 5 nodes succeeded)
– Preparing discovered nodes (all 1 nodes succeeded)
– Finalizing bootstrapped node (all 1 nodes succeeded)
    9. I select Proceed to Select Services.
    10. I select all the services
11. Then assign master services to hosts (all of them default to 127.0.0.1 – 2 GB – 1 core)
    12. Press next
13. Specify Disk Mount Points does not find any mount point. I enter a custom mount point, /mnt/hgfs/hws/ (a shared directory on VMware that I am using).
14. Then I customise settings, giving the passwords for various services like Nagios and MySQL.
15. I also enable WebHDFS.
16. Enable LZO compression.
17. Set up the Java home path.
    18. The Deployment progress starts to show
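For reference, here is a minimal sketch of steps 1 and 3 for a single-node setup. The hostname node1 matches the post; the IP address is a placeholder for the VM's real address, and the key path assumes OpenSSH defaults:

    # Step 1: password-less SSH from the HMC host to itself (OpenSSH defaults assumed)
    ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa
    cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
    chmod 600 ~/.ssh/authorized_keys

    # Step 3: map the node name to the VM's real address in /etc/hosts;
    # 192.168.1.10 is a placeholder -- using the real IP rather than 127.0.0.1
    # keeps services from binding to loopback
    echo "192.168.1.10  node1" >> /etc/hosts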

    A cluster install failed message appears.

Am I doing something wrong here?

{
  "2": {
    "nodeReport": {
      "PUPPET_KICK_FAILED": [],
      "PUPPET_OPERATION_FAILED": [
        "127.0.0.1"
      ],
      "PUPPET_OPERATION_TIMEDOUT": [
        "127.0.0.1"
      ],
      "PUPPET_OPERATION_SUCCEEDED": []
    },
    "nodeLogs": []
  },
  "56": {
    "nodeReport": [],
    "nodeLogs": []
  },
  ...
  "123": {
    "nodeReport": [],
    "nodeLogs": []
  },
  "124": {
    "nodeReport": [],
    "nodeLogs": []
  }
}
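A quick way to dig into a PUPPET_OPERATION_FAILED / PUPPET_OPERATION_TIMEDOUT report like the one above is to confirm that the node's hostname resolves to a real address (the node is reporting as 127.0.0.1 here) and then look at the puppet agent's output. These are generic CentOS checks, not HMC-specific steps:

    # Confirm the node has a hostname that does not resolve to loopback
    hostname -f
    getent hosts node1

    # puppet agent logs to syslog by default on CentOS
    grep -i puppet /var/log/messages | tail -n 50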


  • #8481
    Sasha J
    Moderator

    Hi,

Try installing all of the RPMs prior to starting HMC, and let us know if this does not fix the issue.

The error you are seeing is a puppet kick timeout.
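A minimal sketch of what pre-installing the RPMs could look like, assuming the HDP yum repository is already configured; the package list is illustrative and will vary by HDP version:

    # Pre-install the Hadoop packages before launching HMC
    # (package names are examples; check `yum search hadoop` for your repo)
    yum install -y hadoop hadoop-libhdfs hadoop-native hadoop-pipes hadoop-sbin hadoop-lzo

    # Start HMC only after the RPMs are in place
    # (the service name `hmc` assumes the stock HMC packaging)
    service hmc start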

    #8807
    aabbasi
    Participant

    Hi Sasha,

Can you let me know which RPMs to install? Is there a list available?

    Regards,

    #8808
    Sasha J
Moderator

#13212
hama

I have the same error.

After I run:
1) yum erase hmc puppet
2) yum install hmc
3) yum install -y hadoop hadoop-libhdfs.x86_64 hadoop-native.x86_64 hadoop-pipes.x86_64 hadoop-sbin.x86_64 hadoop-lzo hadoop hadoop-libhdfs.x86_64 ……………
and then start hmc, the error is still not fixed (see the cleanup sketch after the error output below).

The error is:
{
  "2": {
    "nodeReport": {
      "PUPPET_KICK_FAILED": [],
      "PUPPET_OPERATION_FAILED": [
        "hadoopserver1"
      ],
      "PUPPET_OPERATION_TIMEDOUT": [
        "hadoopserver1"
      ],
      "PUPPET_OPERATION_SUCCEEDED": []
    },
    "nodeLogs": []
  },
  "25": {
    "nodeReport": [],
    "nodeLogs": []
  },
  "26": {
    "nodeReport": [],
    "nodeLogs": []
  },
  "27": {
    "nodeReport": [],
    "nodeLogs": []
  },
  "30": {
    "nodeReport": [],
    "nodeLogs": []
  },
  "32": {
    "nodeReport": [],
    "nodeLogs": []
  },
  "33": {
    "nodeReport": [],
    "nodeLogs": []
  },
  "35": {
    "nodeReport": [],
    "nodeLogs": []
  },
  "37": {
    "nodeReport": [],
    "nodeLogs": []
  },
  "39": {
    "nodeReport": [],
    "nodeLogs": []
  },
  "40": {
    "nodeReport": [],
    "nodeLogs": []
  },
  "41": {
    "nodeReport": [],
    "nodeLogs": []
  },
  "43": {
    "nodeReport": [],
    "nodeLogs": []
  },
  "44": {
    "nodeReport": [],
    "nodeLogs": []
  },
  "45": {
    "nodeReport": [],
    "nodeLogs": []
  },
  "47": {
    "nodeReport": [],
    "nodeLogs": []
  },
  "48": {
    "nodeReport": [],
    "nodeLogs": []
  }
}
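One cleanup step worth trying between the erase and the reinstall, sketched under the assumption of default puppet paths: stale SSL certificates left over from the previous install are a common cause of puppet kicks failing or timing out.

    # After `yum erase hmc puppet`, clear leftover puppet state
    # (/var/lib/puppet/ssl is puppet's default ssldir on Linux)
    rm -rf /var/lib/puppet/ssl

    # Then reinstall and restart (service name assumes stock HMC packaging)
    yum install -y hmc
    service hmc start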

    #13219
    tedr
    Member

    Hi Hama,

Again, thanks for trying out HDP.

    Please follow the instructions here:
    http://hortonworks.com/community/forums/topic/hmc-installation-support-help-us-help-you
    Also, in the future, please post to only one forum unless it is a different issue.

    Thanks,
    Ted.

