I am evaluating HDP 2.0 on CentOS 6.3 running on Windows Azure, but I cannot get the DataNode running on my single-node cluster installation.
The datanode log file contains the following entry:
2013-08-06 08:16:24,813 ERROR datanode.DataNode (DataXceiver.java:run(225)) - hdp201:50010:DataXceiver error processing unknown operation src: /100.86.80.150:63034 dest: /100.86.80.26:50010
I have opened the following ports in the Windows Azure firewall: 50010, 50020, 50070, 50075, 50090, 8020, and 8010. The CentOS machine itself has no firewall (iptables is disabled).
DataNode registration did not work at first, but after I added "100.86.80.26 hdp201" to the /etc/hosts file, it succeeded. The page at http://hdp201.cloudapp.net:50070/dfsnodelist.jsp?whatNodes=LIVE now lists the DataNode as running.
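For completeness, my /etc/hosts now contains the following entry (100.86.80.26 is the machine's internal IP, hdp201 is its hostname):

```
100.86.80.26    hdp201
```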
However, when I try to run the following commands from http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-188.8.131.52/bk_installing_manually_book/content/rpm-chap4-2.html, they fail:
hadoop fs -ls
ls: `.': No such file or directory
hadoop fs -mkdir /user/hdfs
mkdir: `/user/hdfs': No such file or directory
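My understanding is that `hadoop fs -ls` with no argument lists the current user's HDFS home directory (/user/<username>), which may simply not exist yet on a fresh install. To narrow this down, I am considering the checks below; this is just a sketch assuming the default fs.defaultFS configuration and that the hdfs user is the HDFS superuser:

```shell
# List the filesystem root explicitly instead of the (possibly missing) home directory
hadoop fs -ls /

# Create the directory with -p so missing parents (here /user) are created as well;
# plain -mkdir fails if /user does not exist. Run as the hdfs superuser.
sudo -u hdfs hadoop fs -mkdir -p /user/hdfs
```

Would this be the right way to verify whether HDFS itself is healthy, or does the DataXceiver error above point to a deeper problem?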
Note that the environment variables (such as HDFS_USER) are set correctly.
Thank you in advance for your help!