WebHdfsException at /filebrowser/


This topic contains 6 replies, has 3 voices, and was last updated by  steve ruegsegger 1 month, 2 weeks ago.

  • #53637

    Steven Trescinski
    Participant

    Hey all,

    We are having some trouble getting the filebrowser working in Hue.

    HDP was installed through Ambari and the installation guide below was used to install Hue:
    http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.0.6.0/bk_installing_manually_book/content/rpm-chap-hue.html

    When we click the filebrowser icon the following error is displayed:
    Failed to obtain user group information: org.apache.hadoop.security.authorize.AuthorizationException: Unauthorized connection for super-user: hue from IP 192.168.0.1 (error 401)

    When we click the ‘Check for misconfiguration’ button the following is shown (the value is in fact correct, as I can visit the URL manually):
    hadoop.hdfs_clusters.default.webhdfs_url Current value: http://hado001d.hadoop.local:50070/webhdfs/v1/
    Failed to access filesystem root
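
    As a quick sanity check, the same URL can also be exercised outside Hue with a plain curl call (a minimal sketch, using the hostname from the value above; op=LISTSTATUS is the standard WebHDFS directory listing operation):

    curl -i "http://hado001d.hadoop.local:50070/webhdfs/v1/?op=LISTSTATUS"

    A 200 response with a JSON FileStatuses listing means WebHDFS itself is reachable; a 401 here points at the security/proxyuser side rather than at the URL value.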

    We assumed it to be related to the proxyuser settings, but these seem to be correct; our core-site.xml contains the following (configured through Ambari, and all services have been restarted afterwards):
    hadoop.proxyuser.hue.groups = *
    hadoop.proxyuser.hue.hosts = *

    We also tried adding more specific values to core-site.xml, but then we get the ‘could not impersonate’ error message:
    hadoop.proxyuser.hue.groups = hue
    hadoop.proxyuser.hue.hosts = 192.168.0.1
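
    The impersonation path itself can also be tested outside Hue; a rough sketch, assuming there is an existing HDFS user to impersonate (admin below is just a placeholder for a user that exists on the cluster):

    curl -i "http://hado001d.hadoop.local:50070/webhdfs/v1/?op=GETFILESTATUS&user.name=hue&doas=admin"

    A 200 response means the NameNode accepts impersonation by hue; a 401 with the same AuthorizationException reproduces the Hue error and confirms the problem is in the proxyuser settings the NameNode actually loaded.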

    Anybody have any ideas?

    Steven



  • #69645

    steve ruegsegger
    Participant

    I have this same problem and have not found a solution yet. I installed HDP using Ambari, manually installed Hue, and then clicked “Check for misconfiguration”. One of the errors is “Failed to access filesystem root” for the config property hadoop.hdfs_clusters.default.webhdfs_url = http://localhost:50070/webhdfs/v1.

    http://localhost:50070 works just fine and takes me to /dfshealth. I can also browse the HDFS cluster on that port via /explorer.html. However, the exact URL above does not work; the webhdfs/v1 part seems to be the problem.

    Do I just remove the webhdfs/v1 part from the Hue HDFS property? Or do I need to (re)configure WebHDFS?
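
    For reference, webhdfs/v1 is just the standard WebHDFS REST prefix, so removing it from the Hue property will not help. Whether WebHDFS is actually enabled can be checked with something like the following, assuming the commands are run on a host with the HDFS client configuration:

    hdfs getconf -confKey dfs.webhdfs.enabled
    curl -i "http://localhost:50070/webhdfs/v1/?op=LISTSTATUS"

    If dfs.webhdfs.enabled comes back false, or the curl call returns a 404, WebHDFS has to be enabled in hdfs-site.xml before Hue can use that URL.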

    Steve

    #54006

    Dave
    Moderator

    Hi Steven,

    Is the hue user in the hadoop group on all the machines in the cluster?
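
    A quick way to check this on each node is with the standard group commands, for example:

    id hue
    groups hue

    which will show whether hadoop appears among the hue user's groups on every node.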

    Thanks

    Dave

    #53974

    Steven Trescinski
    Participant

    Hi Dave,

    Yes, no issues when browsing the filesystem through the NameNode UI.

    Steven

    #53952

    Dave
    Moderator

    Hi Steven,

    If you go to your NameNode UI (http://$NAMENODE:50070/) and click “Browse the Filesystem”, does this work?

    Thanks

    Dave

    #53794

    Steven Trescinski
    Participant

    Hi Dave,

    No, we are not using NameNode HA or Kerberos.

    Telnet is indeed working, so there are no firewall-related issues:

    [root@hopping ~]# telnet hado001d.hadoop.local 8020
    Trying 192.168.0.10...
    Connected to hado001d.hadoop.local.

    On the Hue system we can run all the command-line tools (pig, hadoop, …) without issues.
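
    Since port 8020 only covers the NameNode RPC side, the WebHDFS HTTP port can be checked from the Hue host in a similar way; a quick sketch:

    curl -i "http://hado001d.hadoop.local:50070/webhdfs/v1/?op=GETFILESTATUS"

    If that returns a 200 from the Hue host, the HTTP side is not firewalled either.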

    Steven

    #53733

    Dave
    Moderator

    Hi Steven,

    Are you using NameNode HA or Kerberos?
    Have you checked that WebHDFS is working correctly and is not firewalled (i.e. does telnet work from the Hue machine)?

    Thanks

    Dave
