WRF with Hadoop
This topic contains 2 replies, has 2 voices, and was last updated 9 months, 1 week ago.
Is it possible to build the Weather Research and Forecasting (WRF) model on Hadoop? Currently we use a computer cluster with one head node and 10 compute nodes; when a job is submitted through the head node, all 10 compute nodes work on that same job. The whole system acts like a supercomputer, with the combined processing power applied to a single job. Is this possible with HDP?
Thanks for the reply. I am new to Hadoop and have very little understanding of it. As far as I know, when a job is submitted in Hadoop, the master node distributes it across many data nodes; each data node performs its own portion of the work and sends its result back when it finishes. In this approach there is no communication between the data nodes. Sometimes, however, a job requires the nodes to communicate with each other, i.e. a node-to-node message-passing layer. What I need is to combine the total processing power to perform one single job: the job itself cannot be split up, but the computation needed to complete it can be distributed. Please help me understand how this works in Hadoop.
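To make the distinction in this thread concrete, here is a minimal sketch in plain Python (not real Hadoop or MPI code; the function names and the toy 3-point-average update are illustrative assumptions). The first part shows shared-nothing MapReduce-style parallelism, where tasks never talk to each other; the second shows a halo exchange, the kind of node-to-node message passing a grid model like WRF performs at every timestep.

```python
# Style 1: MapReduce-like "shared-nothing" parallelism.
# Each task processes its own split and never communicates with the others.
def map_task(split):
    # Independent work on one split, e.g. a partial sum.
    return sum(split)

def run_mapreduce(data, n_tasks):
    size = (len(data) + n_tasks - 1) // n_tasks
    splits = [data[i:i + size] for i in range(0, len(data), size)]
    partials = [map_task(s) for s in splits]  # map tasks run independently
    return sum(partials)                      # reduce combines the results

# Style 2: halo exchange, the node-to-node messaging a tightly coupled
# solver needs. Each "node" owns a slice of a 1-D grid and must receive
# boundary cells from its neighbours before it can advance a step.
def halo_exchange_step(slices):
    new = []
    for i, s in enumerate(slices):
        left = slices[i - 1][-1] if i > 0 else 0.0                 # value from left neighbour
        right = slices[i + 1][0] if i < len(slices) - 1 else 0.0   # value from right neighbour
        padded = [left] + s + [right]
        # Toy 3-point average, standing in for a finite-difference update.
        new.append([(padded[j - 1] + padded[j] + padded[j + 1]) / 3.0
                    for j in range(1, len(padded) - 1)])
    return new
```

Hadoop MapReduce only provides the first style: tasks are isolated and exchange no messages while running. The second style is why WRF is normally built against an MPI library rather than on top of MapReduce.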