As part of Map, the input problem is broken into small chunks of data, which are distributed to the worker nodes for processing. Each worker node accepts its chunk, processes it independently, and returns its intermediate results.
Reduce is the process of collecting the results from the data nodes and combining them into a final output.
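The two phases above can be sketched in plain Python. This is only a hypothetical in-memory illustration (a word count, the classic example); in real Hadoop the map tasks run on separate worker nodes and the framework shuffles the intermediate pairs to the reducers:

```python
from collections import defaultdict

def map_phase(chunks):
    """Each 'worker' independently turns its chunk into (word, 1) pairs."""
    pairs = []
    for chunk in chunks:
        for word in chunk.split():
            pairs.append((word, 1))
    return pairs

def reduce_phase(pairs):
    """Collect the intermediate pairs and combine them into a final result."""
    counts = defaultdict(int)
    for word, count in pairs:
        counts[word] += count
    return dict(counts)

# The input problem, already broken into small chunks of data:
chunks = ["big data is big", "data is distributed"]
result = reduce_phase(map_phase(chunks))
print(result)  # {'big': 2, 'data': 2, 'is': 2, 'distributed': 1}
```

Each call to `map_phase` on a single chunk is independent of the others, which is what lets Hadoop run them in parallel across nodes.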
The Hadoop framework works on the concept of a distributed file system, modeled on Google's GFS (Google File System). That is, data is distributed across the cluster in the form of blocks (chunks).
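A minimal sketch of that block splitting, assuming a toy block size (HDFS uses a much larger default block size, e.g. 128 MB; the function name here is hypothetical):

```python
def split_into_blocks(data: bytes, block_size: int):
    """Split a byte string into fixed-size blocks, the way a
    distributed file system splits a file before spreading the
    blocks across the cluster's nodes."""
    return [data[i:i + block_size] for i in range(0, len(data), block_size)]

payload = b"0123456789" * 3      # 30 bytes standing in for a large file
blocks = split_into_blocks(payload, 8)
print(len(blocks))               # 4: three full 8-byte blocks plus a 6-byte tail
```

Each block would then be stored on several different nodes for redundancy, so the loss of one node does not lose the data.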
In short: a kind of MPP (massively parallel processing) setup with data redundancy, built on a clustered distributed file system.