
HDP on Linux – Installation Forum

How to bypass step 9 error

  • #54243
    William Degnan

    The ambari-agent.log keeps cycling the same messages every 10 seconds, and I am getting 0% progress during the "Install, Start and Test" step (step 9). I can reboot and log in again, but it jumps back to this step. Every 10 seconds a new batch of rows is added to ambari-agent.log:

    WARNING 2014-05-22 00:11:28,047 - Failed to connect to https://localhost.localdomain.localdomain:8440/cert/ca due to [Errno -2] Name or service not known
    INFO 2014-05-22 00:11:28,047 - Server at https://localhost.localdomain.localdomain:8440 is not reachable, sleeping for 10 seconds...

    I need to change the value being pulled, but I don't know much about PostgreSQL, which I believe holds the wrong value and is causing NetUtil to keep polling the wrong URL. As you can see, localhost.localdomain.localdomain is wrong; it should be localhost.localdomain. How do I fix this so I can complete the install without having to start all over again? It seems like I could just fix the database, or maybe even hard-code NetUtil to GET the correct URL. Please help!

    Note: the hosts file is correct, and hostname -f returns localhost.localdomain.
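For context on where that URL comes from: the agent builds its registration URL from the server hostname in its own config file, /etc/ambari-agent/conf/ambari-agent.ini (the [server] section), rather than resolving it fresh each cycle. A minimal Python 3 sketch of that derivation, assuming the standard ini layout (the helper function name and the sample ini text here are illustrative, not Ambari's actual code):

```python
# Illustrative sketch: how the agent derives its registration URL from
# the hostname in its config file. The [server] section keys follow the
# standard ambari-agent.ini layout; the function itself is not Ambari code.
from configparser import ConfigParser

SAMPLE_INI = """
[server]
hostname=localhost.localdomain.localdomain
url_port=8440
"""

def registration_url(ini_text):
    """Build the CA-cert registration URL the agent polls."""
    cfg = ConfigParser()
    cfg.read_string(ini_text)
    host = cfg.get("server", "hostname")
    port = cfg.get("server", "url_port")
    return "https://{}:{}/cert/ca".format(host, port)

print(registration_url(SAMPLE_INI))
# → https://localhost.localdomain.localdomain:8440/cert/ca
```

If the stale value lives in that file rather than in the database, correcting the hostname there and restarting the agent (ambari-agent restart) should stop the bad lookups without reinstalling.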

    I believe the incorrect value "localhost.localdomain.localdomain" is used here, at the top of the agent's NetUtil module:
    from urlparse import urlparse
    import time
    import logging
    import httplib
    from ssl import SSLError
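The retry behavior visible in the log — attempt a connection, warn on failure, sleep 10 seconds, try again — can be sketched as follows (a simplified Python 3 illustration of that loop pattern, not Ambari's actual NetUtil code; the function name and parameters are made up for this example):

```python
# Simplified sketch of a connect-retry loop like the one producing the
# repeating WARNING/INFO pairs in ambari-agent.log. Not Ambari's code.
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger(__name__)

CONNECT_RETRY_DELAY = 10  # seconds between attempts, matching the log cadence

def wait_for_server(probe, max_retries=3, delay=CONNECT_RETRY_DELAY):
    """Poll probe() until it succeeds or retries run out.

    probe is any callable that raises on failure (in the real agent it
    opens an HTTPS connection to the server's /cert/ca endpoint).
    """
    for _attempt in range(max_retries):
        try:
            probe()
            return True
        except Exception as exc:
            log.warning("Failed to connect: %s", exc)
            log.info("Server not reachable, sleeping for %s seconds...", delay)
            time.sleep(delay)
    return False
```

Note that nothing in a loop like this re-reads the configuration between attempts, which is why the agent keeps retrying the same bad hostname indefinitely: the URL has to be fixed at its source before the loop can ever succeed.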

The forum ‘HDP on Linux – Installation’ is closed to new topics and replies.
