Home Forums HDP on Linux – Installation HDP 2.4.0 datanode startup fail: FATAL – Exception in secureMain

This topic contains 5 replies, has 4 voices, and was last updated by Jan Peters 2 months ago.

  • Creator
    Topic
  • #54182

    Katrina Poon
    Participant

    I installed HDP 2.4 following the Manual Install (RPMs) – http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.1.2/bk_installing_manually_book/content/rpm-chap1.html.

    The NameNode and SecondaryNameNode start up with no problem. However, I am unable to start the DataNodes.
    Here is the error I get when I try to start a DataNode:
    ——-
    2014-05-21 13:23:42,149 INFO datanode.DataNode (DataNode.java:initDataXceiver(544)) – Listening on UNIX domain socket: /var/lib/hadoop-hdfs/dn_socket
    2014-05-21 13:23:42,165 FATAL datanode.DataNode (DataNode.java:secureMain(2002)) – Exception in secureMain
    java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.SharedFileDescriptorFactory.createDescriptor0(Ljava/lang/String;Ljava/lang/String;I)Ljava/io/FileDescriptor;
    at org.apache.hadoop.io.nativeio.SharedFileDescriptorFactory.createDescriptor0(Native Method)
    at org.apache.hadoop.io.nativeio.SharedFileDescriptorFactory.create(SharedFileDescriptorFactory.java:87)
    at org.apache.hadoop.hdfs.server.datanode.ShortCircuitRegistry.<init>(ShortCircuitRegistry.java:169)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initDataXceiver(DataNode.java:548)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:736)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:281)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1885)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1772)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1819)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1995)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2019)
    2014-05-21 13:23:42,170 INFO util.ExitUtil (ExitUtil.java:terminate(124)) – Exiting with status 1
    2014-05-21 13:23:42,172 INFO datanode.DataNode (StringUtils.java:run(640)) – SHUTDOWN_MSG:
    /************************************************************
    SHUTDOWN_MSG: Shutting down DataNode at e5/10.10.123.85
    ************************************************************/
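
    A note on that stack trace: createDescriptor0 is a native (JNI) method, so an UnsatisfiedLinkError at this point generally means the libhadoop.so that got loaded does not contain the short-circuit-read support that ShortCircuitRegistry needs (or that no native library could be loaded at all). Assuming the stock HDP layout under /usr/lib/hadoop, a quick first check would be:
    —–
    # report which native libraries this Hadoop build can actually load
    hadoop checknative -a
    # list the native libraries shipped with this install (default HDP path assumed)
    ls -l /usr/lib/hadoop/lib/native/
    —–
    If checknative prints hadoop: false, or the native directory holds a libhadoop.so left over from an older install, that would line up with this failure.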


  • Author
    Replies
  • #58875

    Jan Peters
    Participant

    I’m facing the same issue and posted it here: http://hortonworks.com/community/forums/topic/kerberos-implementation-datanodes-wont-start/
    so I am looking for the same solution. Thanks in advance for your help.
    With kind regards (mit freundlichen Grüßen),
    Jan Peters

    #58336

    Hi Katrina, I have the same problem!
    Did you solve it?

    Thanks

    #54235

    Dave
    Moderator

    Hi Katrina,

    Is there any way you could attach or share your hdfs-site.xml and core-site.xml?

    Thanks

    Dave
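
    The DataNode dies right after the "Listening on UNIX domain socket" line, i.e. while wiring up short-circuit reads, so the hdfs-site.xml entries most likely to matter are the two below (shown purely as an illustration, with the socket path copied from the log above):
    —–
    <property>
      <name>dfs.client.read.shortcircuit</name>
      <value>true</value>
    </property>
    <property>
      <name>dfs.domain.socket.path</name>
      <value>/var/lib/hadoop-hdfs/dn_socket</value>
    </property>
    —–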

    #54206

    Katrina Poon
    Participant

    I modified the core-site.xml and hdfs-site.xml files according to the manual.

    Here is the hadoop version and rpm info:
    —–
    bash-4.1$ hadoop version
    Hadoop 2.4.0.2.1.2.0-402
    Subversion git@github.com:hortonworks/hadoop.git -r 9e5db004df1a751e93aa89b42956c5325f3a4482
    Compiled by jenkins on 2014-04-27T22:28Z
    Compiled with protoc 2.5.0
    From source with checksum 9e788148daa5dd7934eb468e57e037b5
    This command was run using /usr/lib/hadoop/hadoop-common-2.4.0.2.1.2.0-402.jar
    bash-4.1$
    bash-4.1$ rpm -qa |grep hadoop
    hadoop-client-2.4.0.2.1.2.0-402.el6.x86_64
    hadoop-mapreduce-2.4.0.2.1.2.0-402.el6.x86_64
    hadoop-hdfs-2.4.0.2.1.2.0-402.el6.x86_64
    hadoop-yarn-2.4.0.2.1.2.0-402.el6.x86_64
    hadoop-libhdfs-2.4.0.2.1.2.0-402.el6.x86_64
    hadoop-2.4.0.2.1.2.0-402.el6.x86_64
    bash-4.1$
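
    All of those RPMs are at the same 2.4.0.2.1.2.0-402 build as the jar reported by hadoop version, so a mismatched package set looks unlikely; a stale libhadoop.so picked up from java.library.path is still possible. Purely as a guess, something like the following would show whether more than one copy of the native library is on the box and which package owns the one in the HDP install (standard /usr/lib/hadoop/lib/native path assumed):
    —–
    # look for every copy of the native Hadoop library on the machine
    find / -name 'libhadoop.so*' 2>/dev/null
    # confirm the copy under the HDP install belongs to the installed hadoop RPM
    rpm -qf /usr/lib/hadoop/lib/native/libhadoop.so*
    —–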

    #54202

    Dave
    Moderator

    Hi Katrina,

    Can you run:

    hadoop version
    rpm -qa | grep hadoop

    and paste the output here?

    Also, check that your hdfs-site.xml and core-site.xml are populated correctly from the helper files.

    Thanks

    Dave
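
    For the helper-files check, one rough way to compare (with <helper-files-dir> standing in for wherever the companion files from the install guide were unpacked, and assuming the active configs live in /etc/hadoop/conf) would be:
    —–
    diff /etc/hadoop/conf/hdfs-site.xml <helper-files-dir>/hdfs-site.xml
    diff /etc/hadoop/conf/core-site.xml <helper-files-dir>/core-site.xml
    —–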
