Starting up HDFS


This topic contains 2 replies, has 2 voices, and was last updated by Chris Rivadeneira 1 year, 8 months ago.

  • Creator
  • #44033

    I’m currently following the manual installation of HDP. I am trying to make it pass the smoke tests for HDFS, and I cannot access my files through the browser. I can access the Hadoop NameNode page, which shows its status, but if I click “Browse the filesystem” I get this error:

    HTTP ERROR 500

    Problem accessing /nn_browsedfscontent.jsp. Reason:

    Can’t browse the DFS since there are no live nodes available to redirect to.

    From what I have read, it is because the datanodes are not started properly. I’ve tried getting them to start, but I don’t get any information back from running hdfs datanode as root. If I run hdfs datanode as a standard hadoop user I get this error:

    13/11/19 15:58:11 WARN common.Util: Path /grid/hadoop/hdfs/dn should be specified as a URI in configuration files. Please update hdfs configuration.
    13/11/19 15:58:11 WARN datanode.DataNode: Invalid /grid/hadoop/hdfs/dn :
    EPERM: Operation not permitted

    I’ve followed the tutorial to the best of my ability, but I must be missing something. Any help would be appreciated! Thanks :)
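    The first WARN in the log above says the data directory is configured as a bare path rather than a URI. A minimal sketch of the relevant hdfs-site.xml entry, assuming the path from the log (the property name dfs.datanode.data.dir is the Hadoop 2.x name; older releases use dfs.data.dir — check which one your HDP version expects):

    ```xml
    <!-- hdfs-site.xml: give the DataNode storage directory as a file:// URI -->
    <property>
      <name>dfs.datanode.data.dir</name>
      <value>file:///grid/hadoop/hdfs/dn</value>
    </property>
    ```

    Note that this only silences the URI warning; the EPERM line is a separate, permissions-related problem.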

Viewing 2 replies - 1 through 2 (of 2 total)


  • Author
  • #44105

    I added some of the users, but I gave permissions to root so the DataNode could start up under it. Will Hadoop not allow the DataNode to start under root, meaning I need to add the users to the system? Also, what’s the best way to go about setting up these users and groups? I’m currently using CentOS 6.



    Hi Chris,

    Did you change the directory permissions and ownership as required in the manual install?
    The DataNode should run as hdfs, not root.
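
    Assuming the directory from the error log (/grid/hadoop/hdfs/dn) and the conventional hdfs user and hadoop group from the HDP manual install, the fix might look like the sketch below. The user/group names and paths are assumptions — adjust them to match your install guide:

    ```shell
    # Create the conventional user and group if they don't exist yet (CentOS 6)
    groupadd hadoop
    useradd -g hadoop hdfs

    # Hand the DataNode storage directory to the hdfs user
    chown -R hdfs:hadoop /grid/hadoop/hdfs/dn
    chmod -R 750 /grid/hadoop/hdfs/dn

    # Start the DataNode as hdfs, not root
    su - hdfs -c "hdfs datanode"
    ```

    The EPERM in the original log is consistent with the hdfs user lacking ownership of that directory, which is why the chown matters more than running as root.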


