Omar AlQudah

MapReduce is a parallel, distributed computing framework used to process large-scale datasets on a cluster. Because the data handled in a MapReduce problem is typically huge, several problems of great importance have arisen. Task scheduling is considered one of these major problems that …
The majority of large-scale data-intensive applications executed by data centers are based on MapReduce or its open-source implementation, Hadoop. For processing huge amounts of data in parallel, the Hadoop programming framework provides the Hadoop Distributed File System (HDFS) [2] and the MapReduce programming model [3]. Job scheduling is an imperative process in Hadoop …
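The MapReduce programming model mentioned above can be illustrated with a toy in-memory sketch (not the cited papers' implementation): a map phase emits key/value pairs, a shuffle groups them by key, and a reduce phase aggregates each group. The function names here are illustrative, not part of any Hadoop API.

```python
from collections import defaultdict

def map_phase(documents):
    """Map: emit a (word, 1) pair for each word in each document."""
    for doc in documents:
        for word in doc.split():
            yield word, 1

def shuffle(pairs):
    """Shuffle: group intermediate values by key."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: sum the counts for each word."""
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["map reduce", "reduce again"]
print(reduce_phase(shuffle(map_phase(docs))))
# {'map': 1, 'reduce': 2, 'again': 1}
```

In a real Hadoop cluster, each phase runs as many parallel tasks spread across machines, which is precisely what makes task scheduling a central concern.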