Hadoop is an open source framework from Apache. Hadoop is developed in the Java programming language and enables distributed processing of large datasets across clusters of computers using simple programming models.
An application using the Hadoop framework works in an environment that provides distributed storage and computation across clusters of computers. Hadoop is designed to scale from a single server up to thousands of machines.
The Hadoop framework includes the following four modules:
Hadoop Common: The collection of Java libraries and utilities required by the other Hadoop modules to run.
Hadoop YARN: Framework for job scheduling and managing resources on clusters.
Hadoop Distributed File System (HDFS): A distributed file system that splits data into blocks and stores them, replicated, across the nodes of the cluster.
Hadoop MapReduce: A YARN-based system for processing very large datasets in parallel.
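To make the MapReduce model concrete, here is a toy word-count sketch in plain Python. It is not the Hadoop API (real Hadoop jobs implement Mapper and Reducer classes in Java and run in parallel across the cluster); the three phases simply run in sequence to show how map, shuffle, and reduce fit together.

```python
from collections import defaultdict

# Toy illustration of the MapReduce model (plain Python, not the Hadoop API).
# In a real Hadoop job these phases run in parallel across many machines;
# here they run in sequence on one machine to show the data flow.

def map_phase(documents):
    """Map: emit a (word, 1) pair for every word in every document."""
    for doc in documents:
        for word in doc.split():
            yield (word.lower(), 1)

def shuffle_phase(pairs):
    """Shuffle: group all values by key, as the framework does between phases."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: combine the values for each key -- here, sum the counts."""
    return {key: sum(values) for key, values in groups.items()}

documents = ["Hadoop stores data", "Hadoop processes data in parallel"]
counts = reduce_phase(shuffle_phase(map_phase(documents)))
print(counts["hadoop"])  # 2
print(counts["data"])    # 2
```

The same word-count logic, written against Hadoop's Java API, is the standard first example in the official MapReduce tutorial; the framework handles the shuffle and the parallel execution automatically.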
Advantages of Hadoop
Hadoop makes it quick to write and test distributed applications. The framework is efficient and scalable, automatically distributing data and work across the cluster and running processing in parallel.
The Hadoop framework does not rely on hardware for fault tolerance and high availability. Hadoop detects and handles failures at the application layer and maintains redundant copies of data.
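A minimal sketch of this application-level redundancy, in plain Python with hypothetical names (not actual HDFS code): each data block is placed on several distinct nodes, so any single node can fail without data loss. HDFS itself uses this idea with a default replication factor of 3.

```python
# Hypothetical sketch of application-level block replication (not HDFS code).
# Each block is placed on `replication` distinct nodes, so the loss of any
# single node leaves every block recoverable from its other replicas.

def place_replicas(blocks, nodes, replication=3):
    placement = {}
    for i, block in enumerate(blocks):
        # Rotate the starting node so load spreads evenly across the cluster.
        chosen = [nodes[(i + r) % len(nodes)] for r in range(replication)]
        placement[block] = chosen
    return placement

nodes = ["node1", "node2", "node3", "node4"]
placement = place_replicas(["blk_0", "blk_1"], nodes)
print(placement["blk_0"])  # ['node1', 'node2', 'node3']

# If node1 fails, every block still has live replicas elsewhere.
failed = "node1"
survivors = {b: [n for n in reps if n != failed] for b, reps in placement.items()}
assert all(len(reps) >= 1 for reps in survivors.values())
```

Real HDFS placement is more sophisticated (it is rack-aware, for example), but the principle is the same: redundancy is managed in software, so commodity hardware can fail without taking data with it.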
Servers can be added to or removed from the cluster dynamically without interrupting its operation.
Since Hadoop is written in Java, it is compatible across all platforms. It is also open source.