Ekasit Kijsipongse

Data centers regularly face the challenge of peak and fluctuating resource demand. Building a data center large enough to meet peak demand is not cost-effective. The emergence of the cloud computing model allows a data center to dynamically acquire additional resources on demand and pay only for the resources actually used. So, the …
The calculation of pairwise correlation coefficients on a dataset, known as the correlation matrix, is often used in data analysis, signal processing, pattern recognition, image processing, and bioinformatics. With state-of-the-art Graphics Processing Units (GPUs), which consist of massive numbers of cores each capable of processing at up to several Gflops, the calculation …
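The abstract above does not include the papers' GPU code, but the standard formulation it refers to can be sketched: after centering and normalizing each variable, the whole correlation matrix reduces to a single matrix product, which is exactly the operation that maps well onto GPU matrix-multiply kernels. A minimal NumPy sketch (function name and data layout are illustrative assumptions, not from the paper):

```python
import numpy as np

def correlation_matrix(data):
    """Pairwise Pearson correlation of the rows of `data`.

    Each row is one variable; each column is one observation.
    Centering and normalizing the rows turns the correlation
    matrix into one matrix product, the dense-linear-algebra
    form that GPU implementations typically exploit.
    """
    centered = data - data.mean(axis=1, keepdims=True)
    norms = np.linalg.norm(centered, axis=1, keepdims=True)
    normalized = centered / norms          # each row now has unit norm
    return normalized @ normalized.T       # (k x k) correlation matrix

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 100))              # 5 variables, 100 observations
r = correlation_matrix(x)                  # matches np.corrcoef(x)
```

The `normalized @ normalized.T` step is the O(k²·n) hotspot; replacing it with a GPU GEMM call is what gives the large speedups this line of work targets.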
Rendering is a crucial process in the production of computer-generated animation movies. It executes computer programs to produce a series of images which are then sequenced into a movie. However, the rendering process on a single machine can be tedious, time-consuming, and unproductive, especially for 3D animation. To resolve these problems, animation rendering is …
The MapReduce framework has commonly been used to perform large-scale data processing, such as social network analysis, data mining, and machine learning, on cluster computers. However, building a large dedicated cluster for MapReduce is not cost-effective if the system is underutilized. To speed up the MapReduce computation at low cost, the computing …
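For readers unfamiliar with the programming model named above, the map/shuffle/reduce pipeline can be illustrated with the classic word-count example. This is a single-process sketch of the general pattern, not code from the paper:

```python
from collections import defaultdict
from itertools import chain

def map_phase(document):
    # Map: emit a (word, 1) pair for every word in the document.
    return [(word, 1) for word in document.split()]

def shuffle(pairs):
    # Shuffle: group all values by key, as the framework does
    # between the map and reduce phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: combine each key's values into one result.
    return {key: sum(values) for key, values in groups.items()}

docs = ["cloud cluster cloud", "cluster node"]
pairs = chain.from_iterable(map_phase(d) for d in docs)
counts = reduce_phase(shuffle(pairs))
# counts == {"cloud": 2, "cluster": 2, "node": 1}
```

In a real deployment the map calls run in parallel across cluster nodes and the shuffle moves data over the network; the per-record independence of the map phase is what lets idle, non-dedicated machines contribute cycles cheaply.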
K-Means is a clustering algorithm widely used in many areas such as information retrieval, computer vision, and pattern recognition. With recent advances in General-Purpose Graphics Processing Units (GPGPU), a modern GPU capable of computation at up to Tflops can be used to calculate K-Means clustering on average-sized problems. However, due …
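The algorithm referenced above is standard Lloyd's iteration; a minimal NumPy sketch follows (function name, initialization scheme, and iteration count are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def kmeans(points, k, iters=20, seed=0):
    """Plain Lloyd's iteration for K-Means clustering."""
    points = np.asarray(points, dtype=float)
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), size=k, replace=False)].copy()
    for _ in range(iters):
        # Point-to-center distance matrix: the data-parallel step
        # that GPU implementations accelerate.
        dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            members = points[labels == j]
            if len(members):
                centers[j] = members.mean(axis=0)
    # Final assignment against the converged centers.
    labels = np.linalg.norm(
        points[:, None, :] - centers[None, :, :], axis=2
    ).argmin(axis=1)
    return centers, labels
```

The assignment step computes n·k distances per iteration with no dependencies between points, which is why it is the natural candidate for offloading to the GPU's many cores.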
In recent years, high performance computing (HPC) resources have grown rapidly and diversely. The next generation of HPC platforms is assembled from resources of various types, such as multi-core CPUs and GPUs. Thus, developing a parallel program that fully utilizes heterogeneously distributed resources in an HPC environment is a challenge. A …
The Internet has formed the largest computing resource and data repository on this planet. Its advent has greatly benefited us in creating an important building block of the collaborative environment. However, some problems impede the success of creating it effectively. From the computing resources view, it is very difficult to permit users to …
Due to the computational demand of data-intensive applications, parallel computer hardware such as an HPC cluster system is required to execute such applications. However, building large HPC clusters for this sole purpose is not always feasible, or even cost-effective, since the purchasing, operational, and maintenance costs of the dedicated systems …
Several scientific applications, such as 3D Jacobi iteration (Rivera and Tseng, 2000) and LQCD (Gupta, 1996), demand high computing power and run on parallel systems. Such applications mostly operate on high-dimensional data, and partitioning the data into smaller units can reduce execution time considerably. Many algorithms such as CBP (Beaumont, …
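The partitioning idea above, at its simplest, is a contiguous block decomposition: split the grid rows of an iterative stencil computation evenly among workers so each holds one block. A minimal sketch of such a 1D block partition (the helper name is an illustrative assumption; the cited algorithms handle far more general, heterogeneous cases):

```python
def block_partition(n, workers):
    """Split n grid rows into contiguous blocks, one per worker.

    Returns a list of (start, end) half-open row ranges. The first
    `n % workers` workers get one extra row, so block sizes differ
    by at most one.
    """
    base, extra = divmod(n, workers)
    blocks = []
    start = 0
    for w in range(workers):
        size = base + (1 if w < extra else 0)
        blocks.append((start, start + size))
        start += size
    return blocks

# block_partition(10, 3) -> [(0, 4), (4, 7), (7, 10)]
```

For a stencil code like Jacobi iteration, each worker then updates only its block and exchanges boundary rows with its neighbors each sweep; more sophisticated schemes weight the block sizes by each processor's speed.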