WRF with Hadoop

This topic contains 2 replies, has 2 voices, and was last updated 3 months, 3 weeks ago.

  • Creator
    Topic
  • #54423


    Participant

    Hi,
    Is it possible to build the Weather Research and Forecasting (WRF) model on Hadoop? Currently we are using a computer cluster with one head node and 10 compute nodes; when a job is submitted to the system through the head node, all 10 compute nodes work on that same job. The whole system acts like a supercomputer, where the total combined processing power is used to perform a single job. Is this possible with HDP?

    Thanks,
    Saj


  • Author
    Replies
  • #54428


    Participant

    Thanks for the reply. I am new to Hadoop and have very little understanding of it. Based on that limited knowledge: when a job is submitted, the Hadoop master node distributes it across many data nodes; each data node performs its own portion of the work and sends its result back once that portion is complete. In this approach there is no communication between the data nodes themselves. However, a job may sometimes require the nodes to communicate with each other directly, through what is called a node-to-node message passing layer. What I need is to combine the total processing power to perform one single job; the job itself cannot be divided, only the computation needed to complete it can be distributed. Please help me understand how this works in Hadoop.
    Thanks
    Saj.

    #54424

    Sheetal Dolas
    Participant

    Absolutely! A Hadoop cluster is, in a sense, a supercomputer in which multiple nodes work together to achieve a common goal. That is how MapReduce and Tez work, and that is how machine learning algorithms are executed using libraries like Mahout and R on Hadoop.
    Quite a few weather researchers are already using Hadoop for this purpose.
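
    To make that concrete, here is a minimal MapReduce sketch in the spirit of the classic max-temperature example: each map task processes its own block of input independently on a data node, and the reduce step combines the partial results into a single answer. The input format (one "station,temperature" record per line) and the class names are illustrative assumptions only, nothing taken from WRF or HDP.

    // Minimal, illustrative MapReduce job: maximum temperature per weather
    // station. Input format and class names are assumptions for this sketch.
    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class MaxTemperature {

        // Each map task runs on a data node against its own block of input,
        // with no communication with the other map tasks.
        public static class TempMapper
                extends Mapper<LongWritable, Text, Text, IntWritable> {
            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                String[] parts = value.toString().split(",");
                if (parts.length == 2) {
                    context.write(new Text(parts[0].trim()),
                                  new IntWritable(Integer.parseInt(parts[1].trim())));
                }
            }
        }

        // The framework shuffles all values for a given station to one reducer,
        // which combines the partial results into the final answer.
        public static class MaxReducer
                extends Reducer<Text, IntWritable, Text, IntWritable> {
            @Override
            protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                    throws IOException, InterruptedException {
                int max = Integer.MIN_VALUE;
                for (IntWritable v : values) {
                    max = Math.max(max, v.get());
                }
                context.write(key, new IntWritable(max));
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "max temperature");
            job.setJarByClass(MaxTemperature.class);
            job.setMapperClass(TempMapper.class);
            job.setReducerClass(MaxReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

    A job like this is packaged into a jar and submitted with the hadoop jar command; the framework then schedules the map and reduce tasks across the compute nodes of the cluster, which is the "many nodes working on one job" behaviour described above.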
