The Apache™ Hadoop™ project develops open-source software for reliable, scalable, distributed computing.
The Apache Hadoop software library is a framework that allows for the distributed processing of large data sets across clusters of computers using a simple programming model. It is designed to scale up from single servers to thousands of machines, each offering local computation and storage. Rather than rely on hardware to deliver high availability, the library itself is designed to detect and handle failures at the application layer, thereby delivering a highly available service on top of a cluster of computers, each of which may be prone to failure.
The project includes these subprojects:
Hadoop Common: The common utilities that support the other Hadoop subprojects.
Hadoop Distributed File System (HDFS™): A distributed file system that provides high-throughput access to application data (a short usage sketch follows this list).
Hadoop MapReduce: A software framework for distributed processing of large data sets on compute clusters (a minimal word-count example follows this list).
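To make the HDFS item concrete, here is a minimal sketch of writing and reading a file through Hadoop's FileSystem abstraction. It assumes a configured cluster whose core-site.xml and hdfs-site.xml are on the classpath; the path /tmp/hello.txt is hypothetical, chosen only for illustration.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsExample {
    public static void main(String[] args) throws Exception {
        // Reads core-site.xml / hdfs-site.xml from the classpath to locate the NameNode.
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        // Hypothetical path; any HDFS location the user may write to works.
        Path file = new Path("/tmp/hello.txt");

        // Write a small file; HDFS replicates its blocks across DataNodes.
        try (FSDataOutputStream out = fs.create(file, true)) {
            out.write("Hello, HDFS!\n".getBytes(StandardCharsets.UTF_8));
        }

        // Read it back through the same FileSystem abstraction.
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(fs.open(file), StandardCharsets.UTF_8))) {
            System.out.println(in.readLine());
        }
    }
}
```

The same FileSystem API works against local disk or HDFS, depending only on configuration, which is what lets applications move from a single server to a cluster without code changes.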
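The "simple programming model" mentioned above is MapReduce: a map phase that emits key/value pairs and a reduce phase that aggregates them, with the framework handling distribution and failure recovery. Below is the classic word-count job against the org.apache.hadoop.mapreduce API; input and output paths are supplied on the command line.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    // Map phase: emit (word, 1) for every token in the input split.
    public static class TokenizerMapper
            extends Mapper<Object, Text, Text, IntWritable> {
        private final static IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, ONE);
            }
        }
    }

    // Reduce phase: sum the counts collected for each word.
    public static class IntSumReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {
        private final IntWritable result = new IntWritable();

        @Override
        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            result.set(sum);
            context.write(key, result);
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class); // local pre-aggregation on each mapper
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

The framework splits the input across the cluster, runs a mapper per split close to the data, shuffles intermediate pairs by key, and re-executes any failed tasks, which is how the library handles failures at the application layer rather than relying on the hardware.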