Antoine Vandecreme

The goal of this survey paper is to provide an overview of cellular measurements obtained with optical microscopy imaging followed by automated image segmentation. The cellular measurements of primary interest are taken from mammalian cells and their components. They are denoted as two- or three-dimensional (2D or 3D) image objects of biological interest. In our applications, …
We present a characterization of four basic terabyte-sized image computations on a Hadoop cluster in terms of their relative efficiency according to the modified Amdahl's law. The work is motivated by the lack of standard benchmarks and stress tests for big image processing operations on a Hadoop computer cluster platform. Our benchmark design and …
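The abstract does not spell out the paper's modification of Amdahl's law, but the classic form it builds on is easy to sketch: the achievable speedup of a parallelized image computation is bounded by its serial fraction. The function name and the example figures below are illustrative assumptions, not values from the paper.

```python
def amdahl_speedup(parallel_fraction: float, workers: int) -> float:
    """Classic Amdahl's law: speedup is capped by the serial fraction,
    no matter how many workers the Hadoop cluster provides."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / workers)

# If 95% of an image computation parallelizes across 64 nodes,
# the speedup is only about 15.4x, not 64x:
print(round(amdahl_speedup(0.95, 64), 2))  # → 15.42
```

This bound is what makes such benchmarks informative: measuring relative efficiency on real terabyte-sized workloads reveals how large the effective serial fraction (I/O, scheduling, data shuffling) actually is.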
We present a framework for the execution and dissemination of customizable content-based file comparison methods. Given digital objects such as files, database entries, or in-memory data structures, we are interested in establishing their proximity (i.e. similarity or dissimilarity) based on the information encoded within the files (text, images, 3D, video, …)
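The framework's own comparison methods are customizable and not detailed in this snippet; as one hypothetical example of a content-based proximity measure, two files can be compared by the cosine similarity of their normalized byte histograms. All names below are illustrative, not the framework's API.

```python
import math
from collections import Counter

def byte_histogram(data: bytes) -> list:
    """Normalized 256-bin histogram of byte values: a simple content signature."""
    counts = Counter(data)
    total = len(data) or 1
    return [counts.get(b, 0) / total for b in range(256)]

def cosine_similarity(a: list, b: list) -> float:
    """Cosine of the angle between two signatures: 1.0 = identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

# Near-identical byte content yields a similarity close to 1.0:
sim = cosine_similarity(byte_histogram(b"abcabc"), byte_histogram(b"abcabd"))
```

A histogram signature ignores byte order, so it is cheap but coarse; richer comparison methods (text, image, or 3D aware) plug into the same pairwise-proximity interface.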
Identification and quantification of the characteristics of stem cell preparations is critical for understanding stem cell biology and for the development and manufacturing of stem cell based therapies. We have developed image analysis and visualization software that allows effective use of time-lapse microscopy to provide spatial and dynamic information …
This article introduces readers to a web-based solution useful for interactive nanoscale measurements of centimeter-sized specimens. This solution is a client-server system that promotes collaborative measurements and discovery. The system consists of multiple computational modules that enable uploading microscopy images, extracting metadata, assembling …
Microscopes can now cover large spatial areas and capture stem cell behavior over time. However, without statistically reliable, quantitative measures of stem cell quality, products cannot be released to market. A web-based measurement system overcomes desktop limitations by leveraging cloud and cluster computing for offline computations and by …
Our objective is to lower the barrier of executing spatial image computations in a computer cluster/cloud environment instead of in a desktop/laptop computing environment. We research two related problems encountered during an execution of spatial computations over terabyte-sized images using Apache Hadoop running on distributed computing resources. The two …
This work addresses the problem of re-projecting a terabyte-sized 3D data set represented as a set of 2D Deep Zoom pyramids. In general, a re-projection for small 3D data sets is executed directly in RAM. However, RAM becomes a limiting factor for terabyte-sized 3D volumes formed by a stack of hundreds of megapixel to gigapixel 2D frames. We have …
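The core idea of RAM-bounded re-projection can be sketched independently of the paper's Deep Zoom implementation: stream the stack one 2D frame at a time and copy out only the rows needed for the re-projected slice, so the full 3D volume is never resident in memory. The function and reader below are a simplified illustration, not the paper's method.

```python
# Minimal out-of-core sketch: build the XZ slice at a fixed y from a stack
# of XY frames, reading one frame (one z) at a time.
def reproject_xz(frame_reader, num_z, y):
    """frame_reader(z) returns one 2D frame as a list of rows (row-major)."""
    xz_slice = []
    for z in range(num_z):
        frame = frame_reader(z)    # only one frame resident in RAM at a time
        xz_slice.append(frame[y])  # keep just the row at the chosen y
    return xz_slice                # num_z rows, one per frame

# Synthetic 2-frame stack where voxel value encodes (z, y, x):
def reader(z):
    return [[z * 100 + y * 10 + x for x in range(4)] for y in range(3)]

slice_y1 = reproject_xz(reader, num_z=2, y=1)
```

In practice the frames themselves are tiled gigapixel pyramids, so the same streaming pattern is applied per tile rather than per whole frame.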