Ambari Forum

Ambari Agent Registration Failure on RHEL 6.5 Due to OpenSSL

  • #51117
    Pramod Thangali

    Ambari Agent Registration Failure on RHEL 6.5 Due to OpenSSL

    Symptom: If you are deploying HDP using Ambari on RHEL 6.5, you will likely see the Ambari Agents fail to register with the Ambari Server during the Confirm Hosts step of the Cluster Install wizard. Clicking the Failed link on the wizard page displays the Agent logs, where you will see an entry like the following, indicating that the SSL connection between the Agent and Server failed during registration:

    INFO 2014-04-02 04:25:22,669 - Failed to connect to https://<ambari-server>:8440/cert/ca due to [Errno 1] _ssl.c:492: error:100AE081:elliptic curve routines:EC_GROUP_new_by_curve_name:unknown group


    Affected environment:

    • RHEL / CentOS 6.5
    • Ambari 1.4 or later
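Because the Ambari Agent is a Python process, the error above comes from the OpenSSL build that Python links against, not necessarily the one a standalone `openssl` binary reports. A quick way to see which build that is (a sketch, assuming `python` is on the PATH of the agent host):

```shell
# Print the OpenSSL version the local Python links against; on an affected
# RHEL 6.5 host this reports the buggy 1.0.1e build 15.
python -c "import ssl; print(ssl.OPENSSL_VERSION)"
```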

    Root Cause: The OpenSSL library installed by default on RHEL/CentOS 6.5 (openssl-1.0.1e build 15) has a bug in its elliptic curve routines. Refer to the upstream bug report for detailed information on the bug.


    Resolution:

    1. Check the OpenSSL library version installed on your host(s):

      rpm -qa | grep openssl
    2. If the output shows openssl-1.0.1e-15.x86_64 (1.0.1 build 15), upgrade the OpenSSL library by running the following command:

      yum upgrade openssl

    3. Verify you have the newer version of OpenSSL (1.0.1 build 16):

      rpm -qa | grep openssl


    4. Restart the Ambari Agent(s) and click Retry Failed on the wizard.
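The four steps above can be sketched as one script to run on each agent host. This is a sketch, not an official Ambari tool, and it assumes RHEL/CentOS 6.5 and that any openssl build newer than -15 contains the fix; the version check is a plain function so the decision logic can be read (and tested) on its own:

```shell
#!/bin/sh
# Decide from an installed-package string whether the host carries the
# buggy openssl 1.0.1e build 15 (assumption: builds >= 16 are fixed).
needs_openssl_upgrade() {
  case "$1" in
    openssl-1.0.1e-15.*) return 0 ;;  # buggy build, upgrade needed
    *) return 1 ;;
  esac
}

# Real use on an agent host (commented out to keep this sketch side-effect
# free; requires root):
# installed=$(rpm -qa | grep '^openssl-1')
# if needs_openssl_upgrade "$installed"; then
#   yum upgrade -y openssl
#   ambari-agent restart
# fi
```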


  • #53787
    Jie Lu

    I have already installed openssl-1.0.1e-16.el6_5.x86_64 on both ambari-server and ambari-agent,
    but it still prints this error, with no reason shown after “due to”.

    Darius Agharokh

    Hi, not sure if this is the correct thread. I am doing an automated install with Ambari 1.6 and HDP 2.0. During the ‘Install, Start and Test’ step of the Cluster Install Wizard, I get a failure on two nodes of my 5-node test cluster: “Heartbeat lost for cluster” / “Ambari agent process is not heartbeating on the host”. This is happening on one of my datanodes and on my master namenode. I have the correct SSL version. Perhaps this is not the correct thread; can you please direct me?
    Many thanks

    Darius Agharokh

    Hi, this issue is now resolved. I started the agent manually on the two nodes and retried the install. It went much further, then failed again because I had specified two directories for the namenode (comma-delimited); after removing the second directory, the install went through.

    S Siddalingaiah

    Hi, I faced a similar issue. I also noticed my OpenSSL version was openssl-1.0.1e-15.el6.x86_64; after upgrading to openssl-1.0.1e-16.el6.x86_64, Ambari registration went through successfully.

    Ramesh S


    This solution worked perfectly for me. I am using CentOS 6.5 and Ambari 1.7. The original openssl version was openssl.x86_64 0:1.0.1e-15.el6 and the new one is openssl.x86_64 0:1.0.1e-30.el6.8. As this is a cluster with several hosts, to apply the change to all hosts I used SSH like this (from the Ambari server): for i in <list of host names with spaces in between>; do ssh $i "yum upgrade openssl -y"; done
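Ramesh's fan-out loop can be made a bit more reusable for larger clusters. A minimal sketch, assuming a hypothetical hosts.txt with one hostname per line and passwordless SSH from the Ambari server; the runner command is a parameter so the loop can be exercised with a stub instead of real hosts:

```shell
# run_on_hosts FILE RUNNER CMD... : run CMD on every host listed in FILE,
# invoking "RUNNER host CMD...". Pass ssh as the runner in real use.
run_on_hosts() {
  hosts_file=$1; shift
  runner=$1; shift
  while IFS= read -r host; do
    "$runner" "$host" "$@"
  done < "$hosts_file"
}

# Real use (assumption: hosts.txt lists the agent hosts, one per line):
# run_on_hosts hosts.txt ssh "yum upgrade openssl -y"
```

Keeping the runner as a parameter also makes it easy to dry-run the loop with `echo` before letting it touch the cluster.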

    Marco Aurelio Freiberger Monteiro

    Anwar Mian

    I have the following openssl version but it’s still complaining:


    I have CentOS 6.5 (final) and ambari-server-2.0.1-45.noarch

    Anwar Mian

    I’ve solved the problem. My data node (I have only one at the moment) had openssl-1.0.1e-15.el6.x86_64; it had not been upgraded. Once I upgraded it, all my nodes registered successfully. It’s very important to upgrade openssl on every host.

