HDP on Linux – Installation Forum

Failed to connect

  • #29931

    After completing the installation and configuration and starting the namenode, secondary namenode and datanode, I tried the smoke test by pointing a browser at http://myhostname:50070, and every time it replies:
    “Failed to Connect
    Firefox can’t establish a connection to the server at myhostname:50070”
    I would be very grateful if you could help me sort out this problem as soon as possible.
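
    A quick way to narrow this down, assuming you can log in to the Hadoop box itself, is to check from the server whether anything is actually listening on port 50070 (netstat and curl are standard on the CentOS/RHEL hosts typical for HDP 1.3):

    # Run on the Hadoop box itself
    netstat -tlnp | grep 50070       # should show a java process bound to port 50070
    curl -I http://localhost:50070/  # should return an HTTP response from the NameNode UI

    If the port is listening locally but a browser on another machine cannot reach it, the cause is usually name resolution or a firewall rather than Hadoop itself.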


  • #29937

    Hi Marshal,

    Did you make sure that the namenode and datanode processes are still running? Is this from a manual install? Is it HDP 1.3 on Linux or HDP on Windows?



    Hi Ted,
    Yes, the namenode and datanode processes are running. This is from the manual install, HDP 1.3 on Linux.
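
    As a quick check, one way to confirm those daemons from the command line is jps (available with a standard JDK install), run as the user that started Hadoop:

    # Lists running Java processes; the exact PIDs will differ
    jps
    # Expected output includes entries such as:
    #   12345 NameNode
    #   12346 SecondaryNameNode
    #   12347 DataNode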


    Hi Marshall,

    What are the contents of your /etc/hosts file? Did you make sure to set up passwordless ssh? (Yes, this is necessary even on a single-node cluster – Hadoop uses ssh to communicate.)



    Hi Ted,
    The contents of the /etc/hosts file are:

    # Do not remove the following line, or various programs
    # that require network functionality will fail.
    localhost myhostname

    Also, would you mind helping me set up passwordless ssh?
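
    For comparison, a complete /etc/hosts on a single-node box usually keeps the loopback entry and maps the hostname to the machine's real IP address; the addresses below are placeholders, not values taken from this thread:

    # Do not remove the following line, or various programs
    # that require network functionality will fail.
    127.0.0.1       localhost
    192.168.1.10    myhostname   # replace with the box's actual IP address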


    Sasha J

    First of all, you should have your hosts file configured on your client machine (the one from which you run Firefox).
    You should be able to ping your Hadoop box.
    Also, you should disable all firewalls on your Hadoop box.
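
    On a CentOS/RHEL 5 or 6 host (an assumption about the OS; adjust for your distribution), disabling the firewall for testing typically looks like this:

    # Stop iptables now and keep it from starting on boot
    service iptables stop
    chkconfig iptables off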
    As for the passwordless ssh setup, take a look here:
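
    In short, a typical passwordless-ssh setup for a single-node box (assuming the Hadoop daemons run as one user and ssh-copy-id is installed; otherwise append the public key to ~/.ssh/authorized_keys by hand) is:

    # Generate an RSA key pair with an empty passphrase, accepting the default location
    ssh-keygen -t rsa -P ""
    # Authorize the key for logins to this same machine
    ssh-copy-id -i ~/.ssh/id_rsa.pub localhost
    # Verify: this should print the hostname without asking for a password
    ssh localhost hostname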

    Thank you!
