HDP on Linux – Installation Forum

problem in confirming the Hosts

  • #50734

    I am using the Apache Ambari tool to set up a Hadoop cluster environment, following the Hortonworks install guide.
    I am using RHEL 5.7 Linux servers.
    I have added the server hostnames to the /etc/hosts file with their respective IP addresses, and I have set up passwordless SSH to all the servers.
    I am using ambari-agent- version. I am getting the errors below while registering the hosts; manual registration is also failing. Please suggest how to resolve this issue.

    File "/usr/lib/python2.6/site-packages/ambari_agent/security.py", line 65, in connect
    File "/usr/lib64/python2.6/ssl.py", line 338, in wrap_socket
    File "/usr/lib64/python2.6/ssl.py", line 120, in __init__
    File "/usr/lib64/python2.6/ssl.py", line 279, in do_handshake
    SSLError: [Errno 1] _ssl.c:491: error:14090086:SSL routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify failed
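    The "certificate verify failed" error means the agent rejected the certificate the Ambari server presented during registration. A diagnostic sketch follows; the server name `ambari.example.com` is a placeholder, and the one-way SSL registration port 8440 and the agent config path are assumptions based on Ambari defaults:

    ```shell
    # Sketch only: replace ambari.example.com with your Ambari server's FQDN.

    # 1. Confirm the agent points at the right server hostname.
    grep hostname /etc/ambari-agent/conf/ambari-agent.ini

    # 2. Inspect the certificate the server actually presents; an expired
    #    certificate or a hostname mismatch will fail verification.
    echo | openssl s_client -connect ambari.example.com:8440 2>/dev/null \
      | openssl x509 -noout -subject -dates

    # 3. Large clock skew between agent and server can also break the SSL
    #    handshake; compare the output of `date` on both machines.
    date
    ```

    If the certificate dates or subject look wrong, that points at the server-side certificate rather than the agent.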


  • Author
  • #50735
    Upen K

    What do you mean by "I have made passwordless SSH connection to all the servers"? You need to set up the passwordless connection from your Ambari server to all the hosts.
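    Setting up passwordless SSH in the direction Ambari needs (from the Ambari server out to every cluster host, not merely between arbitrary hosts) usually looks like the sketch below; `host1`/`host2` and the `root` user are placeholders:

    ```shell
    # Run on the Ambari server as the user Ambari uses for host bootstrap
    # (often root). host1/host2 stand in for the names in /etc/hosts.

    # Generate a key pair with no passphrase (skip if one already exists).
    ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa

    # Install the public key on every cluster host.
    for host in host1 host2; do
      ssh-copy-id -i ~/.ssh/id_rsa.pub root@"$host"
    done

    # Verify: this must log in without a password prompt.
    ssh root@host1 hostname
    ```

    The private key (`~/.ssh/id_rsa` here) is what you paste into the Ambari install wizard when registering hosts.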

