HDP on Linux – Installation Forum

Starting up HDFS

  • #44033

    I’m currently following the manual installation of HDP. I am trying to make it pass the smoke tests for HDFS, and I cannot access my files through the browser. I can access the Hadoop NameNode page, which shows its status, but if I click “Browse the filesystem” I get this error:

    HTTP ERROR 500

    Problem accessing /nn_browsedfscontent.jsp. Reason:

    Can’t browse the DFS since there are no live nodes available to redirect to.

    From what I have read, this is because the datanodes are not started properly. I’ve tried getting them to start, but I don’t get any output back from running hdfs datanode as root. If I run hdfs datanode as a standard hadoop user I get this error:

    13/11/19 15:58:11 WARN common.Util: Path /grid/hadoop/hdfs/dn should be specified as a URI in configuration files. Please update hdfs configuration.
    13/11/19 15:58:11 WARN datanode.DataNode: Invalid dfs.datanode.data.dir /grid/hadoop/hdfs/dn :
    EPERM: Operation not permitted

    I’ve followed the tutorial to the best of my ability, but I must be missing something. Any help would be appreciated! Thanks :)
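
    (The EPERM in that log usually means the user running the DataNode cannot write to the configured data directory. A minimal sketch of a check, assuming the /grid/hadoop/hdfs/dn path from the log above:)

    # Show the owner, group, and mode of the DataNode data directory
    ls -ld /grid/hadoop/hdfs/dn
    # Confirm which user you are running the daemon as
    whoami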

  • #44097
    Dave
    Moderator

    Hi Chris,

    Did you change the directory permissions and ownerships as required by the manual install?
    The DataNode should run as hdfs and not root.

    Thanks

    Dave
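
    (For reference, a minimal sketch of the permission step Dave describes, assuming the /grid/hadoop/hdfs/dn data directory from the log above and the conventional hdfs user and hadoop group; adjust paths and names to your setup:)

    # Hand the DataNode data directory over to the hdfs user and hadoop group
    chown -R hdfs:hadoop /grid/hadoop/hdfs/dn
    chmod -R 750 /grid/hadoop/hdfs/dn

    # Then start the DataNode as hdfs rather than root
    su - hdfs -c "hdfs datanode"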

    #44105

    I added some of the users, but I gave permissions to root so the datanode could start up under it. Will Hadoop not allow the datanode to start under root, so that I need to add the users to the system? Also, what’s the best way to go about setting up these users and groups? I’m currently using CentOS 6.
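
    (A minimal sketch of creating those accounts on CentOS 6 with the stock shadow-utils commands; the hdfs user and hadoop group names follow the manual install’s convention and are assumptions here:)

    # Create the hadoop group and an hdfs service user belonging to it
    groupadd hadoop
    useradd -g hadoop hdfs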

