Hue Forum

How to configure HTTPFS for filebrowser in NamenodeHA cluster

  • #49607

    I’ve installed the latest HDP2 via Ambari, enabled Namenode-HA and installed Hue as described here.

    I can log in to Hue, but the filebrowser isn’t working. In Ambari => Service HDFS => General, WebHDFS is enabled, but none of the cluster nodes is listening on the configured port 14000?! It therefore makes sense that “check for misconfiguration” in Hue shows the error:
    hadoop.hdfs_clusters.default.webhdfs_url || Current value: http://hadoop-pg-7.cluster:14000/webhdfs/v1/ || Failed to access filesystem root

    Why is neither of the NameNodes offering WebHDFS / listening on port 14000?! I restarted the services several times to ensure the latest config has been applied.
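    For reference, one quick way to double-check this is to probe the port and the REST endpoint directly (the host name is taken from the error above; “hue” is just an example user name for pseudo authentication):

        # Check whether anything is listening on the HttpFS port (14000)
        netstat -tlnp | grep 14000

        # Probe the WebHDFS/HttpFS REST API directly
        curl -i "http://hadoop-pg-7.cluster:14000/webhdfs/v1/?op=GETFILESTATUS&user.name=hue"

    If nothing is listening, the curl call fails to connect, which matches the “Failed to access filesystem root” check in Hue.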

    Additionally, I found information that environments using NameNode HA should point the Hue filebrowser at HttpFS instead of WebHDFS. Is this still correct, and how do I configure it in Ambari? I didn’t find any hints about that.

    many thanks in advance, Gerd


  • Author
  • #49608


    Sorry for bothering 😉
    I thought HttpFS would also be started automatically when restarting all the services via Ambari.
    I manually started the service, configured the currently active NameNode as defaultFS in hue.ini, and now the filebrowser is working.
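    For anyone following along, starting HttpFS by hand looks roughly like this; the init-script name below assumes the hadoop-httpfs package from the HDP 2.x repositories and is not taken from this thread:

        # Start the HttpFS daemon (package/init-script name assumed from HDP 2.x packaging)
        service hadoop-httpfs start

        # Verify it answers on its default port 14000 ("hue" is just an example user name)
        curl -i "http://localhost:14000/webhdfs/v1/?op=GETHOMEDIRECTORY&user.name=hue"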

    There is just one open question: if a NameNode failover occurs, the filebrowser will stop working, since the configured server is no longer the active one. Do I have to reconfigure hue.ini every time the active NameNode switches over?


    Hi Gerd,

    In your hue.ini you should have a setting for:

        fs_defaultfs

    When using HA you should set this to the service name that you specified in the HA setup (it can also be found in hdfs-site.xml under <name>dfs.nameservices</name>).

    This will point only to the service, so Hue doesn’t have to worry about going directly to a specific NameNode.
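    Put concretely, a minimal hue.ini sketch of that setup could look like the following; the nameservice name “mycluster” is a placeholder for whatever dfs.nameservices is set to, and the HttpFS host/port are taken from the error message earlier in the thread:

        [hadoop]
          [[hdfs_clusters]]
            [[[default]]]
              # Point at the HA nameservice, not at an individual NameNode
              fs_defaultfs=hdfs://mycluster
              # HttpFS endpoint; Hue talks to HttpFS, which resolves the active NameNode
              webhdfs_url=http://hadoop-pg-7.cluster:14000/webhdfs/v1

    With this in place a failover should be transparent to the filebrowser, since it is HttpFS, not Hue, that finds the active NameNode.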




    That’s perfect, many thanks Dave.

The topic ‘How to configure HTTPFS for filebrowser in NamenodeHA cluster’ is closed to new replies.
