Applications that process large volumes of data (such as search engines, grid computing applications, and data mining applications) require a backend infrastructure for storing that data. The distributed file system is the central component of this storage infrastructure. There have been many projects focused on network computing that have designed and …
A distributed file system (DFS) provides a mechanism by which a file can be stored across several physical computer nodes while ensuring replication transparency and failure transparency. To achieve this, one important feature of a DFS's data structure is a suitable way of organizing large-scale data and file namespaces so as to provide fast access …
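As a rough illustration of the namespace organization this abstract alludes to, the sketch below maps each file path to a set of replica nodes, so the client resolves a single path and never sees which node actually serves it. All names here (Namespace, FileEntry, locate, and so on) are hypothetical and are not taken from the paper.

```python
# Hypothetical sketch: a flat namespace mapping file paths to replica locations.
# It only illustrates how a DFS can hide replication and node failures behind a
# single path-based lookup; it is not the data structure from the cited work.

import random


class FileEntry:
    """Metadata for one file: its size and the nodes holding replicas."""

    def __init__(self, size, replica_nodes):
        self.size = size
        self.replica_nodes = list(replica_nodes)


class Namespace:
    """Maps a path such as '/logs/app.log' to its replica set."""

    def __init__(self):
        self._entries = {}

    def create(self, path, size, replica_nodes):
        self._entries[path] = FileEntry(size, replica_nodes)

    def locate(self, path, alive_nodes):
        """Return one live replica for the path, hiding failed nodes
        from the caller (failure transparency)."""
        entry = self._entries[path]
        candidates = [n for n in entry.replica_nodes if n in alive_nodes]
        if not candidates:
            raise IOError(f"no live replica for {path}")
        return random.choice(candidates)


if __name__ == "__main__":
    ns = Namespace()
    ns.create("/logs/app.log", size=4096, replica_nodes=["node1", "node2", "node3"])
    # node2 has failed; the client still resolves the path to a live replica.
    print(ns.locate("/logs/app.log", alive_nodes={"node1", "node3"}))
```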
In a distributed file system, operations such as read, write, send, and receive should be efficient and should improve the overall performance of the system. In particular, every remote read and write operation incurs network and secondary-storage latencies. To reduce this latency and improve the performance of the distributed file system, we …
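The abstract is truncated before naming its approach, so the following is only a generic illustration of one common way to cut remote read latency: a client-side LRU block cache that serves repeated reads locally instead of paying the network and disk round trip again. The names (CachingClient, read_block, slow_remote_read) are invented for this sketch and are not claimed to be the paper's method.

```python
# Hypothetical sketch of a client-side block cache that serves repeated reads
# locally instead of going back over the network. This is a generic latency-
# reduction technique, not the specific approach taken in the cited paper.

from collections import OrderedDict


class CachingClient:
    def __init__(self, remote_read, capacity=64):
        self._remote_read = remote_read   # function: (path, block_no) -> bytes
        self._cache = OrderedDict()       # (path, block_no) -> bytes, in LRU order
        self._capacity = capacity

    def read_block(self, path, block_no):
        key = (path, block_no)
        if key in self._cache:                    # cache hit: no network round trip
            self._cache.move_to_end(key)
            return self._cache[key]
        data = self._remote_read(path, block_no)  # cache miss: pay the latency once
        self._cache[key] = data
        if len(self._cache) > self._capacity:     # evict the least recently used block
            self._cache.popitem(last=False)
        return data


if __name__ == "__main__":
    def slow_remote_read(path, block_no):
        print(f"remote read: {path} block {block_no}")
        return b"x" * 4096

    client = CachingClient(slow_remote_read)
    client.read_block("/logs/app.log", 0)   # goes to the remote node
    client.read_block("/logs/app.log", 0)   # served from the local cache
```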