Potential misconfiguration detected. Fix and restart Hue.



This topic contains 3 replies, has 3 voices, and was last updated by Cameron Hunt 1 month, 1 week ago.

  • Creator
  • #57348

    Carolus Holman

    I have been trying to get Hue running on an EC2 cluster. The Filebrowser gives me this error:
    WebHdfsException at /filebrowser/
    <urlopen error [Errno 111] Connection refused>

    I have updated the ini file with my local server DNS name, but the error still occurs. When going to the potential misconfiguration page, I see this:

    hadoop.hdfs_clusters.default.webhdfs_url Current value: http://iocalserverurl.internal:50070/webhdfs/v1/
    Failed to access filesystem root
    hcatalog.templeton_url Current value: http://localhost:50111/templeton/v1/
    Oozie Editor/Dashboard The app won’t work without a running Oozie server

Viewing 3 replies - 1 through 3 (of 3 total)


  • Author
  • #67570

    Cameron Hunt

    PK, a whole bunch of things *don’t* happen when you install the RPM, like adding the service script to /etc/init.d, creating and populating /etc/hadoop-httpfs and /var/lib, etc. I assume that’s caused by the post-install script not running, per this post: http://hortonworks.com/community/forums/topic/hadoop-httpfs-install-issues-in-hdp-2-2/

    To get it working, copy the service file, symlink the config and tomcat-deployment directories from /usr/hdp/, and symlink the tomcat bin and lib directories from bigtable into tomcat-deployment.


    P K

    Cameron, could you please share how you fixed the hadoop-httpfs RPM issues?



    Cameron Hunt

    Carolus, do you have HA enabled? If so, read below.

    Hue needs httpfs for a cluster with HA enabled.

    I had the same problem: although I had enabled High Availability, I had not installed and configured httpfs (a webhdfs proxy). When my active/standby namenodes switched their status, Hue was still pointing at the original active node.

    The Fix:
    (note – I’m running HDP 2.2 on CentOS 6.6, and the post-install scripts for the hadoop-httpfs RPM don’t work properly. If you see the same behavior, you’ll need to install the init.d script and the conf files manually.)

    To configure Hue to work with Namenode HA, we need to use httpfs instead of webhdfs to communicate with the namenode.
    Below is the list of steps after the whole cluster is setup correctly with Namenode HA enabled.

    1. Install the httpfs server on any node within the cluster by running: yum install hadoop-httpfs

    2. Configure httpfs-site.xml and ensure the following properties are correct:
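    The property list itself did not survive in this post. A typical httpfs-site.xml fragment that lets the hue user proxy through httpfs uses the standard hadoop-httpfs proxyuser properties; the wildcard values below are a permissive example, not from the original post, so restrict them to your Hue host and group as needed:

    ```xml
    <!-- Allow the "hue" user to impersonate end users via httpfs.
         "*" is a permissive example value; tighten for production. -->
    <property>
      <name>httpfs.proxyuser.hue.hosts</name>
      <value>*</value>
    </property>
    <property>
      <name>httpfs.proxyuser.hue.groups</name>
      <value>*</value>
    </property>
    ```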


    3. Add the following properties to core-site.xml for httpfs via Ambari Web UI, which requires a restart of HDFS service:
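    The properties were likewise lost here. The standard core-site.xml additions for an httpfs service user are the hadoop.proxyuser.* pair below; again, the wildcard values are an example assumption, not from the original post:

    ```xml
    <!-- Trust the "httpfs" service user as a proxy for other users.
         "*" is a permissive example value; tighten for production. -->
    <property>
      <name>hadoop.proxyuser.httpfs.hosts</name>
      <value>*</value>
    </property>
    <property>
      <name>hadoop.proxyuser.httpfs.groups</name>
      <value>*</value>
    </property>
    ```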

    4. Start hadoop-httpfs service by running: service hadoop-httpfs restart

    5. Configure hue.ini by setting the variable below to:
    webhdfs_url=http://<fqdn of httpfs server>:14000/webhdfs/v1/
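    In context, that variable lives in the HDFS cluster section of hue.ini. A sketch of the surrounding block, where the hostname and the "mycluster" nameservice ID are placeholders for your own values:

    ```ini
    [hadoop]
      [[hdfs_clusters]]
        [[[default]]]
          # Point at the HA nameservice ID, not a single namenode host
          fs_defaultfs=hdfs://mycluster
          # Route HDFS calls through httpfs (port 14000) instead of webhdfs
          webhdfs_url=http://httpfs-host.example.com:14000/webhdfs/v1/
    ```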

    6. Restart Hue.
