HDFS Forum: Unhealthy Data Nodes

This topic contains 1 reply, has 2 voices, and was last updated by Robert Molina 7 months, 3 weeks ago.

  • Creator
    Topic #45527

    Hello,

    I have followed the steps documented here to install HDP-2.0.6.0 manually.

    http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.0.6.0/bk_installing_manually_book/content/rpm-chap1.html

    I am able to upload files to HDFS via the CLI and have 3 files uploaded successfully. Now, at step “5. Smoke Test MapReduce”, I executed the command to generate the 10 GB of test data. However, the job does not seem to be responding, and the DataNodes now show as being in an error state. I am still able to upload files to HDFS via the CLI, though.
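
    For reference, the TeraGen step I ran is of roughly this form; the exact jar path and size arguments depend on the install, so treat this as a sketch rather than the literal command:

        # generate ~10 GB of test data (100,000,000 rows x 100 bytes each) into HDFS
        hadoop jar /usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-*.jar teragen 100000000 /tmp/teragenout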

    This is the error I get on the http://dubhe:8088/cluster page (the ResourceManager web UI):

    application_1387145389246_0002  hdfs  TeraGen  MAPREDUCE  default  Sun, 15 Dec 2013 22:47:07 GMT  N/A  ACCEPTED  UNDEFINED  UNASSIGNED
    application_1387145389246_0001  hdfs  TeraGen  MAPREDUCE  default  Sun, 15 Dec 2013 22:20:09 GMT  N/A  ACCEPTED  UNDEFINED  UNASSIGNED

Viewing 1 reply (of 1 total)


  • Author
    Reply #46358

    Robert Molina
    Moderator

    Hi Atul,
    This seems to indicate an issue on the YARN side rather than HDFS, so you would probably get better responses in the YARN forums. The error suggests that there are no resources available as seen by the ResourceManager. On the first page of the ResourceManager web UI, do you see “RUNNING” as the state for your nodes?
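
    If it helps, you can also check this from the command line; the following is just a rough sketch, and the exact options can vary slightly between Hadoop versions:

        # list NodeManagers and their states as seen by the ResourceManager
        yarn node -list -all

        # report DataNode health as seen by the NameNode (the HDFS side)
        hdfs dfsadmin -report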

    Regards,
    Robert
