HDP on Linux – Installation Forum

Ambari 1.6 Bug >> NameNode data

  • #58205
    William Lu
    Participant

    In Ambari 1.6, when two or more local directories are specified for “NameNode data”, Ambari does not set file and sub-directory ownership correctly in all of them.

    For example, when both /vol1/hadoop/hdfs/namenode and /vol2/hadoop/hdfs/namenode are specified, Ambari needs to run the following command in BOTH directories, not just one:
    chown -R hdfs:hadoop current
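
    As a manual workaround until Ambari handles this, the ownership can be fixed by hand on the NameNode host while HDFS is stopped. This is only a sketch using the example directories above; adjust the paths to match whatever is configured for dfs.namenode.name.dir:

    # Fix ownership in every configured NameNode data directory, not just the first one.
    for dir in /vol1/hadoop/hdfs/namenode /vol2/hadoop/hdfs/namenode; do
        chown -R hdfs:hadoop "${dir}/current"
    done

    After that, starting the NameNode through Ambari should succeed.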

    Here are some sample error messages caused by this bug:

    2014-07-22 17:26:54,993 WARN namenode.FSNamesystem (FSNamesystem.java:loadFromDisk(640)) - Encountered exception loading fsimage
    java.io.FileNotFoundException: /opt/hadoop/hdfs/namenode/current/VERSION (Permission denied)
    at java.io.RandomAccessFile.open(Native Method)
    at java.io.RandomAccessFile.<init>(RandomAccessFile.java:241)
    at org.apache.hadoop.hdfs.server.common.StorageInfo.readPropertiesFile(StorageInfo.java:241)
    at org.apache.hadoop.hdfs.server.common.StorageInfo.readProperties(StorageInfo.java:227)
    at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverStorageDirs(FSImage.java:310)
    at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverTransitionRead(FSImage.java:202)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFSImage(FSNamesystem.java:891)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFromDisk(FSNamesystem.java:638)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.loadNamesystem(NameNode.java:480)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:536)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:695)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:680)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1329)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1395)
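
    To confirm that this is the cause on a given host, check the ownership of the current directory under each configured NameNode data directory (again using the example paths above):

    ls -ld /vol1/hadoop/hdfs/namenode/current /vol2/hadoop/hdfs/namenode/current
    # The directory Ambari skipped will show an owner other than hdfs (e.g. root),
    # which matches the "Permission denied" error above.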
