HDP2.0: Datanode does not work, DataXceiver error processing unknown operation

This topic contains 3 replies, has 2 voices, and was last updated by Sasha J 1 year, 4 months ago.

  • #30875

    Member

    Hi,

I am evaluating HDP 2.0 on CentOS 6.3 running on Windows Azure. However, I cannot get the datanode running on my one-node cluster installation.

    The datanode log file contains the following entry:

    2013-08-06 08:16:24,813 ERROR datanode.DataNode (DataXceiver.java:run(225)) – hdp201:50010:DataXceiver error processing unknown operation src: /100.86.80.150:63034 dest: /100.86.80.26:50010
    java.io.EOFException
    at java.io.DataInputStream.readShort(DataInputStream.java:298)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.readOp(Receiver.java:50)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:198)
    at java.lang.Thread.run(Thread.java:662)

I have opened the following ports in the Windows Azure firewall: 50010, 50020, 50070, 50075, 50090, 8020, 8010. The CentOS machine does not have a firewall (iptables is disabled).

Datanode registration did not work at first, but after I added "100.86.80.26 hdp201" to the /etc/hosts file, it did. So http://hdp201.cloudapp.net:50070/dfsnodelist.jsp?whatNodes=LIVE now states that the datanode is running.

However, when I try to run the following commands from http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.0.4.0/bk_installing_manually_book/content/rpm-chap4-2.html, they fail:

    su $HDFS_USER

    hadoop fs -ls
    ls: `.': No such file or directory

    hadoop fs -mkdir /user/hdfs
    mkdir: `/user/hdfs': No such file or directory

    Note that the environment variables, like HDFS_USER, are correctly set.

    Thank you in advance for your help!

Viewing 3 replies - 1 through 3 (of 3 total)


  • #30950

    Sasha J
    Moderator

    I see your point…
    Please look at the datanode logs on both mentioned machines: src: /100.86.80.150:58756 dest: /100.86.80.26:50010
Also, take a look at the namenode log.

People usually do not use Azure for such things; they use VMware virtualization, VirtualBox, or EC2.

    Thank you!
    Sasha

    #30945


    Member

    Hi Sasha,

Thank you for your reply! I have tried "hadoop fs -ls /" and it returns nothing (at least no error), which should be OK for an empty HDFS. However, the HDP 2.0 documentation says that I should run "hadoop fs -mkdir /user/hdfs" after formatting HDFS. Maybe the beta documentation is misleading in that respect (see http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.0.4.0/bk_installing_manually_book/content/rpm-chap4-2.html).

I am using the manual installation because the Sandbox is not available for Hyper-V, which is what could be uploaded to Azure. I want to use Azure because it allows me to set up 10 or more virtual machines.

Please note that the datanode log still lists "hdp201:50010:DataXceiver error processing unknown operation src: /100.86.80.150:58756 dest: /100.86.80.26:50010", although the "/user" directory could be created with "hadoop fs -mkdir /user". You are right: running "hadoop fs -mkdir /user/hdfs" requires "/user" to exist first.
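    (A note on the recurring error: one plausible cause, not confirmed anywhere in this thread, is a peer that opens a TCP connection to the datanode's port 50010 and closes it without sending anything, for example a port scan or an Azure load-balancer health probe. DataXceiver's first read on a new connection is a readShort() for the protocol version, so an empty connection surfaces as java.io.EOFException. A minimal Python sketch of that interaction:

    ```python
    import socket
    import struct
    import threading

    # Sketch of why the DataNode logs java.io.EOFException. The first thing
    # DataXceiver does on a new connection is readShort() -- two bytes holding
    # the data-transfer protocol version. If the peer connects and then closes
    # without writing anything (hypothetically, a port scan or an Azure
    # load-balancer health probe -- an assumption, not confirmed here), that
    # read hits end-of-stream, which Java reports as EOFException.

    def read_short(conn):
        """Mimic java.io.DataInputStream.readShort(): two bytes or EOF."""
        data = conn.recv(2)
        if len(data) < 2:
            raise EOFError("connection closed before any opcode arrived")
        return struct.unpack(">h", data)[0]

    def silent_probe(port):
        """Open a TCP connection and close it without writing, like a health check."""
        socket.create_connection(("127.0.0.1", port)).close()

    server = socket.socket()
    server.bind(("127.0.0.1", 0))   # stand-in for the DataNode's port 50010
    server.listen(1)
    port = server.getsockname()[1]

    t = threading.Thread(target=silent_probe, args=(port,))
    t.start()
    conn, _ = server.accept()
    try:
        read_short(conn)
        result = "got opcode"
    except EOFError:
        result = "EOF"              # what the DataNode logs as EOFException
    finally:
        conn.close()
        server.close()
    t.join()
    print(result)
    ```

    If the source addresses in those log entries belong to Azure infrastructure rather than to your own cluster nodes, the entries may be harmless noise.)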

    Thank you!

    Toni

    #30923

    Sasha J
    Moderator

    Toni,
    try this command:
    hadoop fs -ls /

    It will show you which folders actually exist in your HDFS.
    Most likely, your /user folder does not exist.

    Also, make sure that your datanode process is actually running.

    And as a last recommendation: forget about Azure; just use the Sandbox (downloadable from the Hortonworks web site) at your pleasure on your local computer.

    Thank you!
    Sasha

    PS: this is actually the wrong thread for questions on HDP 2.0…
