HDP on Linux – Installation Forum

Error registering.

  • #14291
    Brent Evans
    Member

    I am installing on CentOS 6 and I can't get past the Confirm Hosts phase;
    I get a "Failed" status after the Registering phase completes.
    Here is a tail from the log file displayed by Ambari:

    "osfamily": "RedHat", "sshrsakey": "AAAAB3NzaC1yc2EAAAABIwAAAQEA3FJuuZYynHMv92KkLrntrlp29FPW3Msl7mT3JuBguEYh0+nSb/tbgcFhqovmlhYNit9v++s4VlTrAq6r0vEMRfVdOTqO2nLotkQmJ8a07Qf6X8CdcUcUvyqKi9EU0uXvkiUd/4E3p/nKB0WHLKcB9OGwxrr4CGXPg2D/BXVUC0hrJ//X7JrJKXiZBA85uZH4SaCuhfM8vWuxqgUhQrM50TuAWtsN90x7DEENOnrk3yKHKJL/lS1PVqSBr2yVah1NkpUYyDB5D2R9NDVMO6LM1m9Y3obx7zt6koRXGxtI2nieuF5R4uqhpSRTz3f//bNAjuESffFgOsuObbw37KOoMw==", "interfaces": "bond0,eth0,eth1,lo", "physicalprocessorcount": 2, "path": "/usr/lib/ambari-agent/lib/ruby-1.8.7-p370/bin:/usr/lib/ambari-server/*:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin", "ipaddress": "50.116.196.27", "manufacturer": "Dell", "processor6": "Intel(R) Xeon(R) CPU X5675 @ 3.07GHz", "processor7": "Intel(R) Xeon(R) CPU X5675 @ 3.07GHz", "processor4": "Intel(R) Xeon(R) CPU X5675 @ 3.07GHz", "processor5": "Intel(R) Xeon(R) CPU X5675 @ 3.07GHz", "processor2": "Intel(R) Xeon(R) CPU X5675 @ 3.07GHz", "processor3": "Intel(R) Xeon(R) CPU X5675 @ 3.07GHz", "processor0": "Intel(R) Xeon(R) CPU X5675 @ 3.07GHz", "processor1": "Intel(R) Xeon(R) CPU X5675 @ 3.07GHz", "fqdn": "app017.atl1", "processor8": "Intel(R) Xeon(R) CPU X5675 @ 3.07GHz", "processor9": "Intel(R) Xeon(R) CPU X5675 @ 3.07GHz"}, "timestamp": 1358984460874, "hostname": "app017.atl1.turn.com", "responseId": -1, "publicHostname": "app017.atl1.turn.com"}'
    INFO 2013-01-23 15:41:03,932 Controller.py:103 - Unable to connect to: https://app017.atl1.turn.com:8441/agent/v1/register/app017.atl1
    Traceback (most recent call last):
    File "/usr/lib/python2.6/site-packages/ambari_agent/Controller.py", line 89, in registerWithServer
    ret = json.loads(response)
    File "/usr/lib64/python2.6/json/__init__.py", line 307, in loads
    return _default_decoder.decode(s)
    File "/usr/lib64/python2.6/json/decoder.py", line 319, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
    File "/usr/lib64/python2.6/json/decoder.py", line 338, in raw_decode
    raise ValueError("No JSON object could be decoded")
    ValueError: No JSON object could be decoded
    ', None)

    STDERR
    Connection to app017.atl1.turn.com closed
    Registering with the server…
    Registration with the server failed.
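
    From what I can tell, the ValueError at the end of the traceback just means json.loads was handed something that isn't valid JSON (an empty body or an error page, maybe). A minimal standalone illustration of that error, outside of Ambari:

    import json

    # json.loads raises ValueError for anything that is not valid JSON,
    # which is the same failure Controller.py reports above.
    for payload in ["", "<html>503 Service Unavailable</html>"]:
        try:
            json.loads(payload)
        except ValueError as err:
            print("ValueError: " + str(err))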

  • #14331
    Sasha J
    Moderator

    This is being worked through the support case.

  • #17116
    Robert
    Participant

    Hi Brent,
    This issue is fixed in HDP 1.2.1, which was released around the first week of February. The defect is mentioned here:
    https://issues.apache.org/jira/browse/AMBARI-1255

    Basically, you were able to determine the correct fully qualified domain name to use by running the following Python call:

    import socket
    socket.getfqdn()

    Using the FQDN returned by that call got you past the Registering Hosts phase, but not the installation phase. The HDP 1.2.1 fix should allow the installation phase to go through and finish.
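
    For anyone else hitting this, here is a rough sanity check you can run on each host before registering. It is just a sketch using the standard socket module, nothing Ambari-specific, and the warning at the end is only a heuristic:

    import socket

    # Sketch of a hostname sanity check: make sure the FQDN resolves
    # forward and backward consistently on this host.
    fqdn = socket.getfqdn()
    print("getfqdn(): " + fqdn)

    try:
        addr = socket.gethostbyname(fqdn)
        print("forward lookup: " + fqdn + " -> " + addr)
        name = socket.gethostbyaddr(addr)[0]
        print("reverse lookup: " + addr + " -> " + name)
        if name != fqdn:
            print("Warning: reverse lookup does not match getfqdn(); check /etc/hosts and DNS.")
    except socket.error as err:
        print("lookup failed: " + str(err))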

    Regards,
    Robert
