WRF with Hadoop


This topic contains 2 replies, has 2 voices, and was last updated 10 months ago.

  • Creator
  • #54423


    Is it possible to build the Weather Research and Forecasting (WRF) model on Hadoop? Currently we are using a computer cluster with one head node and 10 compute nodes; when a job is submitted through the head node, the 10 compute nodes all work on that same job. The whole system acts like a supercomputer, applying its combined processing power to a single job. Is this possible with HDP?


Viewing 2 replies - 1 through 2 (of 2 total)


  • Author
  • #54428


    Thanks for the reply. I am new to Hadoop and have very little understanding of it. Based on that knowledge: when a job is submitted, the Hadoop master node distributes it across many data nodes; each data node performs its own portion of the work and sends its result back when it is done. In this approach there is no communication between the data nodes themselves. Sometimes, however, a node needs to communicate directly with another node — what is called a node-to-node message-passing layer. What I need is to combine the total processing power to perform one single job: the job itself cannot be divided, but the computation needed to complete it can be distributed. Please help me understand how this works in Hadoop.


    Sheetal Dolas

    Absolutely! A Hadoop cluster is, in a sense, a supercomputer: multiple nodes of the cluster work together to achieve a common goal. That is how MapReduce and Tez work, and that is how machine learning algorithms are executed using libraries like Mahout and R on Hadoop.
    Quite a few weather researchers are already using Hadoop for this purpose.
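For reference, the MapReduce pattern mentioned above can be sketched in a few lines of plain Python (a local simulation of the model, not the actual Hadoop API): each map task processes its input split independently, the framework groups the intermediate key/value pairs, and reducers combine them — no map task ever talks to another.

```python
from collections import defaultdict

def map_task(split):
    # Runs independently on one input split; emits (word, 1) pairs.
    return [(word, 1) for word in split.split()]

def shuffle(pairs):
    # Groups intermediate values by key, as Hadoop does between map and reduce.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_task(key, values):
    return key, sum(values)

splits = ["wrf model on hadoop", "hadoop model of mapreduce"]
intermediate = [pair for split in splits for pair in map_task(split)]
counts = dict(reduce_task(k, v) for k, v in shuffle(intermediate).items())
print(counts["hadoop"])
```

The combined processing power of the cluster is applied to one job precisely because the framework splits the input and merges the partial results for you.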
