Home Forums HDP on Linux – Installation FATAL DataNode Exception in secureMain

This topic contains 1 reply, has 2 voices, and was last updated by Sanjeev 4 months, 1 week ago.

  • Creator
    Topic
  • #58319

    Hi all, I have the following problem when I try to start my DataNode (version 2.4, Java version 1.7.0_03):

    2014-08-05_15:36:08.40926 14/08/05 15:36:08 INFO datanode.DataNode: Starting DataNode with maxLockedMemory = 0
    2014-08-05_15:36:08.43122 14/08/05 15:36:08 INFO datanode.DataNode: Opened streaming server at /192.168.106.204:50010
    2014-08-05_15:36:08.43413 14/08/05 15:36:08 INFO datanode.DataNode: Balancing bandwith is 6250000 bytes/s
    2014-08-05_15:36:08.44498 14/08/05 15:36:08 FATAL datanode.DataNode: Exception in secureMain
    2014-08-05_15:36:08.44501 java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.SharedFileDescriptorFactory.createDescriptor0(Ljava/lang/String;Ljava/lang/String;I)Ljava/io/FileDescriptor;
    2014-08-05_15:36:08.44501 at org.apache.hadoop.io.nativeio.SharedFileDescriptorFactory.createDescriptor0(Native Method)
    2014-08-05_15:36:08.44501 at org.apache.hadoop.io.nativeio.SharedFileDescriptorFactory.create(SharedFileDescriptorFactory.java:87)
    2014-08-05_15:36:08.44501 at org.apache.hadoop.hdfs.server.datanode.ShortCircuitRegistry.<init>(ShortCircuitRegistry.java:169)
    2014-08-05_15:36:08.44502 at org.apache.hadoop.hdfs.server.datanode.DataNode.initDataXceiver(DataNode.java:548)
    2014-08-05_15:36:08.44502 at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:736)
    2014-08-05_15:36:08.44502 at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:281)
    2014-08-05_15:36:08.44502 at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1878)
    2014-08-05_15:36:08.44502 at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1772)
    2014-08-05_15:36:08.44503 at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1812)
    2014-08-05_15:36:08.44503 at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1988)
    2014-08-05_15:36:08.44504 at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2012)
    2014-08-05_15:36:08.44607 14/08/05 15:36:08 INFO util.ExitUtil: Exiting with status 1
    2014-08-05_15:36:08.44739 14/08/05 15:36:08 INFO datanode.DataNode: SHUTDOWN_MSG:
    2014-08-05_15:36:08.44741 /************************************************************
    2014-08-05_15:36:08.44741 SHUTDOWN_MSG: Shutting d…
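
    The FATAL line is an UnsatisfiedLinkError on a native method (the call comes in through ShortCircuitRegistry, i.e. the short-circuit read setup), which as far as I understand usually means the libhadoop.so that gets loaded does not contain the createDescriptor0 symbol, for example an older native library mixed with newer jars. A quick way to check the native libraries is sketched below (assuming the checknative subcommand is available in this Hadoop build; the library path is just a guess for where libhadoop.so might live on this install):

    # List the native libraries Hadoop can actually load, and from where
    hadoop checknative -a
    # Check which libhadoop.so is present on the node (path is an assumption, adjust for your layout)
    ls -l /usr/lib/hadoop/lib/native/libhadoop.so*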

    When I try to start it manually:

    [root@root sbin]# ./hadoop-daemon.sh --config /etc/hadoop --script hdfs start datanode
    starting datanode, logging to /opt/rb/var/hadoop/logs/hadoop-root-datanode-pablo04.out
    ./hadoop-daemon.sh: line 123: 7648 Killed nohup nice -n $HADOOP_NICENESS $hdfsScript --config $HADOOP_CONF_DIR $command "$@" > "$log" 2>&1 < /dev/null
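
    The "Killed" here means the DataNode process received SIGKILL, so the .out file and the kernel log are the places to look. A rough check, assuming root access on the node (the grep patterns are just what I would try first):

    # Show whatever the daemon wrote to its .out file before it died
    cat /opt/rb/var/hadoop/logs/hadoop-root-datanode-pablo04.out
    # See whether the kernel OOM killer (or anything else logged by the kernel) killed the java process
    dmesg | grep -iE 'killed process|out of memory'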

    Could anyone help me please?
    Thanks!

Viewing 1 reply (of 1 total)

  • Author
    Replies
  • #58816

    Sanjeev
    Participant

    Hi Sergio,

    Can you provide the error from the datanode .out log file from the time when you start the datanode?
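
    Something like this should show the relevant part (using the .out path from the output you pasted above; adjust the line count as needed):

    # Tail the DataNode .out file from the failed start
    tail -n 200 /opt/rb/var/hadoop/logs/hadoop-root-datanode-pablo04.out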

    Thanks
    Sanjeev
