Sudheer Kumar Battula

The Hadoop Distributed File System (HDFS) is the core storage component of the Apache Hadoop project. In HDFS, computation is carried out on the nodes where the relevant data is stored. Hadoop also implements a parallel computation paradigm named MapReduce. In this paper, we measure the performance of read and write operations in HDFS by considering small and …
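
As a rough illustration of how such read/write measurements can be taken, the sketch below times one sequential write and one sequential read through the standard Hadoop FileSystem Java API. The file path, payload size, and timing approach are assumptions for illustration, not the paper's actual benchmark.

```java
// Minimal sketch (assumed setup, not the paper's benchmark): time one HDFS
// write and one HDFS read using the standard Hadoop FileSystem API.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsReadWriteTiming {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();  // picks up core-site.xml / hdfs-site.xml
        FileSystem fs = FileSystem.get(conf);
        Path path = new Path("/tmp/bench.dat");    // hypothetical test file
        byte[] block = new byte[4 * 1024 * 1024];  // 4 MB payload, an assumed size

        long t0 = System.nanoTime();
        try (FSDataOutputStream out = fs.create(path, true)) {
            out.write(block);                      // single sequential write
        }
        long writeNs = System.nanoTime() - t0;

        t0 = System.nanoTime();
        try (FSDataInputStream in = fs.open(path)) {
            byte[] buf = new byte[block.length];
            in.readFully(0, buf);                  // sequential read back
        }
        long readNs = System.nanoTime() - t0;

        System.out.printf("write: %.2f ms, read: %.2f ms%n",
                writeNs / 1e6, readNs / 1e6);
        fs.delete(path, false);                    // clean up the test file
    }
}
```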
Distributed file systems are used in distributed systems to provide scalable storage and to enable parallel and distributed processing of data. Sharing the stored data among authorized users is a major requirement of a distributed file system. Popular distributed file systems use session semantics to share data among …
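
To make the session-semantics model concrete, the sketch below simulates it with a tiny in-memory store; this is an illustration of the general semantics, not any particular distributed file system. Under session semantics, a client's writes go to a private session copy and become visible to other clients only when the file is closed.

```java
// Minimal in-memory sketch of session semantics (illustrative assumption):
// writes update a private session buffer and are published only on close().
import java.util.HashMap;
import java.util.Map;

public class SessionSemanticsDemo {
    // committed file contents, visible to all clients
    private final Map<String, String> store = new HashMap<>();

    class Session {
        private final String name;
        private String buffer;                          // private working copy
        Session(String name) {
            this.name = name;
            this.buffer = store.getOrDefault(name, ""); // snapshot taken at open
        }
        void write(String data) { buffer += data; }     // local to this session
        String read() { return buffer; }
        void close() { store.put(name, buffer); }       // publish on close
    }

    Session open(String name) { return new Session(name); }

    public static void main(String[] args) {
        SessionSemanticsDemo dfs = new SessionSemanticsDemo();
        Session writer = dfs.open("f.txt");
        writer.write("v1");

        Session reader = dfs.open("f.txt");
        System.out.println(reader.read());              // "" -- write not yet visible
        writer.close();                                 // session ends, update committed
        System.out.println(dfs.open("f.txt").read());   // "v1" -- visible after close
    }
}
```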