HDP on Linux – Installation Forum

Failed to connect

  • #30637

    I am trying to install Hadoop manually on Linux (RHEL 5.5) on a single host (acting as both NameNode and DataNode). After completing the installation I was able to start the NameNode, SecondaryNameNode, and DataNode. I have configured the files (core-site.xml, hdfs-site.xml, mapred-site.xml, taskcontroller.cfg) and placed them in the proper locations, and I have edited the hostname and put it into the configuration files as well. But when I try to run the smoke test by opening http://myhostname:50070 in a browser, it replies "failed to connect" every time. Please help me sort out the problem as soon as possible.
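Since the hostname was recently changed, it is worth confirming first that the new name actually resolves on the node itself. A minimal sketch using standard Linux tools (nothing Hadoop-specific assumed):

```shell
# Print the fully-qualified hostname the Hadoop daemons will see.
hostname -f

# Verify it resolves (via /etc/hosts or DNS); warn if it does not.
getent hosts "$(hostname -f 2>/dev/null)" \
  || echo "hostname does not resolve -- check /etc/hosts"
```

If the hostname typed into the browser does not match what resolves here, the browser will fail to connect even though the daemons are running.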


  • #30668
    Sasha J

    Do you have a firewall running on your node?
    Also, did you restart the processes after changing the hostname?

    Thank you!
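Both of these checks can be run directly on the node. A hedged sketch, assuming RHEL 5-era tooling (netstat, iptables, curl); 50070 is the default NameNode web UI port, so adjust if you changed it:

```shell
PORT=50070

# 1. Is any process actually listening on the web UI port?
LISTENERS=$(netstat -tln 2>/dev/null | grep -c ":$PORT" || true)
echo "sockets listening on $PORT: $LISTENERS"

# 2. Is an iptables rule set loaded that could be filtering the port?
if iptables -L -n >/dev/null 2>&1; then
  echo "iptables is loaded; inspect its rules for port $PORT"
else
  echo "iptables not loaded (or not running as root)"
fi

# 3. Can the UI be reached from the node itself, bypassing DNS entirely?
if curl -s -o /dev/null "http://localhost:$PORT/"; then
  echo "web UI reachable locally"
else
  echo "local connection to port $PORT failed"
fi
```

If the local curl succeeds but the browser still fails, the problem is name resolution or the network path, not the daemons themselves.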


    No, a firewall is not running on the node, and I did restart the processes after changing the hostname.

    Seth Lyubich

    Hi Marshal,

    Please make sure that you have set up the HTTP address property correctly in hdfs-site.xml. Please see the example below.

    [root@sandbox conf]# grep -B1 -A2 50070 hdfs-site.xml
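    The grep output did not survive in this archive. For reference, a minimal illustrative entry, assuming Hadoop 1.x (where the NameNode web UI address is controlled by `dfs.http.address`); `myhostname` is a placeholder:

```xml
<!-- Illustrative hdfs-site.xml fragment (Hadoop 1.x property name);
     replace myhostname with the node's resolvable hostname. -->
<property>
  <name>dfs.http.address</name>
  <value>myhostname:50070</value>
</property>
```

    After editing the file, restart the NameNode so the new address takes effect.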

    Hope this helps,


