
HDP on Linux – Installation Forum

Ambari (default "admin" account to browse HDFS file system)

  • #28443
    Pal J
    Participant

Hi,
All these days I have been using the Ambari default account "admin". When I try to browse the HDFS file system via the Ambari portal using this account, I can't access some of the directories and files that were created. Can you please advise how to use a Unix account or the root account via the Ambari portal to browse the HDFS file system?

    thanks
    Pal

  • Author
    Replies
  • #28530
    Sasha J
    Moderator

    Pal,
there is a property in the HDFS configuration named dfs.web.ugi.
By default it is set to "gopher,gopher"; as a result, any folder that is not world-readable cannot be browsed by this user.
If you really want to be able to browse the whole of HDFS, change this property to "hdfs,hdfs" from the UI.

    Hope this helps.
    Thank you!
    Sasha
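For reference, the property Sasha describes is an ordinary entry in hdfs-site.xml. A fragment like the following (values taken from this thread; the snippet itself is illustrative) shows what the setting looks like:

```xml
<!-- hdfs-site.xml: user,group identity the HDFS web UI browses as.
     The default "gopher,gopher" cannot read folders that are not
     world-readable; "hdfs,hdfs" browses as the HDFS superuser. -->
<property>
  <name>dfs.web.ugi</name>
  <value>hdfs,hdfs</value>
</property>
```

As the rest of the thread explains, on an Ambari-managed cluster this change should be made through the Ambari UI, not by editing the file directly.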

    #28532
    Pal J
    Participant

    Hi Sasha,
Thanks for the update. There are two locations of "hdfs-site.xml": one in /etc/alternatives/hadoop-conf/ and one in /usr/bin/hadoop/conf. Which one should I change? I think /usr/bin/hadoop/conf?

    thanks
    Pal

    #28533
    Pal J
    Participant

    Hi Sasha,
I tried to change the value of the property "dfs.web.ugi" in "hdfs-site.xml" in both locations, /etc/alternatives/hadoop-conf/ and /usr/bin/hadoop/conf. I still get the same error, and moreover "hdfs-site.xml" is refreshed each time I stop and start the HDFS service via the Ambari portal, i.e. all my changes are gone and "hdfs,hdfs" is replaced by "gopher,gopher".
Could this be because starting and stopping the HDFS service via the Ambari portal uses the hdfs-site.xml from one of these locations?
/var/lib/ambari-server/resources/stacks/HDP/
/var/lib/ambari-server/resources/stacks/HDPLOCAL/

    thanks
    Pal

    #28648
    Pal J
    Participant

    Hi Sasha,
I changed the property to "hdfs,hdfs" as per your advice; as soon as I restart HDFS, the property reverts back to "gopher,gopher".
Can you please advise?

    thanks
    Pal

    #28659
    tedr
    Moderator

    Hi Pal,

On an Ambari-installed/managed cluster you should make these settings in the Ambari web UI. You can find them on the "Services" tab: select "HDFS", then "Configurations", then expand the "Advanced" area and look for the property. NOTE: saving property changes in this area requires that HDFS and MapReduce be shut down before saving.

    Thanks,
    Ted.

    #28742
    Pal J
    Participant

    Hi Ted,
Thanks, it works.

    thanks
    Pal

    #28860
    tedr
    Moderator

    Hi Pal,

    Thanks for letting us know that it is now working like you want it to.

    Thanks,
    Ted.

    #29051
    Pal J
    Participant

    Hi Ted,
On an Ambari installation, if I have to change the values of a few properties in "core-site.xml", add an additional property like "dfs.support.append" to "hdfs-site.xml", and also make a couple of changes to "hive-site.xml", and so on:
Can you please point me to the physical location of the configuration files for the Hadoop services, so that I don't have to make the changes via the Ambari portal?

    I would appreciate your advice

    thanks
    Pal

    #29071
    Sasha J
    Moderator

    Pal,
any changes made to the configuration files outside of the Ambari UI will be reverted by Ambari on the next service restart.
You have to use the Ambari UI to make any configuration changes.

    Thank you!
    Sasha

    #29120
    Pal J
    Participant

    Hi Sasha,
Thanks for your reply. I wanted to know whether there is a way to make changes to these configuration files manually, without using the Ambari UI, in particular to properties in "core-site.xml" that are not available via the Ambari UI. I also want to add new properties to "hdfs-site.xml".
Basically, I am installing Hue and have to change a couple of configuration files, and I would like to take a backup of them before making changes.

    thanks
    Pal
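The backup step Pal mentions is straightforward to script. A minimal sketch, assuming a generic config directory (CONF_DIR defaults to a demo path here; on an HDP node the active directory might be /etc/hadoop/conf, which is an assumption, not something stated in this thread):

```shell
# Hypothetical sketch: copy a config directory to a timestamped backup
# before making manual edits, so the originals can be restored later.
CONF_DIR=${CONF_DIR:-./demo-conf}
mkdir -p "$CONF_DIR"                          # ensure the demo directory exists
STAMP=$(date +%Y%m%d%H%M%S)
cp -a "$CONF_DIR" "${CONF_DIR}.bak.${STAMP}"  # recursive copy, preserving modes/times
echo "backed up to ${CONF_DIR}.bak.${STAMP}"
```

Note that, as the moderators point out below in the thread, a backup only protects the files themselves; Ambari will still overwrite manual edits on the next service restart.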

    #29134
    tedr
    Moderator

    Hi Pal,

Without using the Ambari UI there is no way to make permanent changes to these configuration files. As Sasha pointed out, they will always be reverted to what Ambari has them set to when the services are restarted. However, if you don't see the properties you need to change listed in the Ambari UI settings, you can add them in the "Custom" area; you will need to enter the property names and values there.

    Thanks,
    Ted.
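As an illustration of Ted's "Custom" area: a property entered there by name and value ends up in the Ambari-generated config file as an ordinary entry. For the dfs.support.append property Pal mentioned earlier, the generated hdfs-site.xml would contain something like (the value "true" is illustrative):

```xml
<!-- Example only: a custom property named dfs.support.append, added via
     the Ambari "Custom" area, as it would appear in the generated hdfs-site.xml. -->
<property>
  <name>dfs.support.append</name>
  <value>true</value>
</property>
```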

The forum ‘HDP on Linux – Installation’ is closed to new topics and replies.
