Songkran Jarusirisawad

We present a novel 3D display that can show arbitrary 3D content in free space using laser-plasma scanning in the air. The laser-plasma technology can generate a point illumination at an arbitrary position in free space. By scanning the position of the illumination, we can display a set of point illuminations in space, which realizes a 3D display in the …
This paper proposes a novel method for synthesizing free-viewpoint video captured by uncalibrated, purely rotating and zooming cameras. Neither the intrinsic nor the extrinsic parameters of our cameras are known. Projective grid space (PGS), which is the 3D space defined by the epipolar geometry of two basis cameras, is employed for weak camera calibration. Trifocal …
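
As a rough illustration of how a projective grid space can be used, the sketch below projects a PGS point into an arbitrary camera by intersecting the two epipolar lines induced by the basis views. It assumes the common convention that a PGS point has coordinates (p, q, r), with (p, q) a pixel in the first basis camera and r the x-coordinate of the corresponding point on its epipolar line in the second basis camera; the papers listed here may use a different parameterization, and all function names are hypothetical.

```python
import numpy as np

def epipolar_line(F, pt):
    """Epipolar line l = F @ [x, y, 1]^T in the other image, as (a, b, c) with a*x + b*y + c = 0."""
    x, y = pt
    l = F @ np.array([x, y, 1.0])
    return l / np.linalg.norm(l[:2])  # normalize so (a, b) is a unit vector

def intersect_lines(l1, l2):
    """Intersection of two image lines via the cross product of their homogeneous coordinates."""
    p = np.cross(l1, l2)
    return p[:2] / p[2]

def project_pgs_point(p, q, r, F12, F1k, F2k):
    """
    Project a PGS point (p, q, r) into camera k.
    F12: fundamental matrix from basis camera 1 to basis camera 2.
    F1k, F2k: fundamental matrices from basis camera 1 and basis camera 2 to camera k.
    """
    # Recover the pixel in basis camera 2: the point with x = r on the
    # epipolar line of (p, q) in basis camera 2.
    a, b, c = epipolar_line(F12, (p, q))
    y2 = -(a * r + c) / b  # solve a*r + b*y + c = 0 for y

    # The image of the PGS point in camera k lies on both epipolar lines,
    # one induced by each basis view, so it is their intersection.
    l1 = epipolar_line(F1k, (p, q))
    l2 = epipolar_line(F2k, (r, y2))
    return intersect_lines(l1, l2)
```
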
In this paper, we present a new online video-based rendering (VBR) method that creates new views of a scene from uncalibrated cameras. Our method does not require information about the cameras' intrinsic parameters. To obtain a geometrical relation among the cameras, we use projective grid space (PGS), which is a 3D space defined by the epipolar geometry …
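
Because no intrinsic or extrinsic parameters are available, the image-to-image relations that PGS-based methods rely on come from fundamental matrices estimated directly from point correspondences. Below is a minimal sketch of that weak-calibration step using SIFT matches and RANSAC in OpenCV; it is an illustrative pipeline, not necessarily the one used in these papers.

```python
import numpy as np
import cv2

def weak_calibration(img1, img2):
    """Estimate the fundamental matrix between two views from feature matches."""
    sift = cv2.SIFT_create()
    k1, d1 = sift.detectAndCompute(img1, None)
    k2, d2 = sift.detectAndCompute(img2, None)

    matcher = cv2.BFMatcher(cv2.NORM_L2)
    matches = matcher.knnMatch(d1, d2, k=2)
    good = [m for m, n in matches if m.distance < 0.7 * n.distance]  # Lowe's ratio test

    pts1 = np.float32([k1[m.queryIdx].pt for m in good])
    pts2 = np.float32([k2[m.trainIdx].pt for m in good])

    # RANSAC-based estimation; the inlier mask flags correspondences
    # consistent with the recovered epipolar geometry.
    F, mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, 1.0, 0.999)
    return F, mask
```
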
In most conventional systems for new-viewpoint video synthesis of a moving object, multiple calibrated, fixed cameras are used. The field of view of every camera must cover the whole area in which the object moves. If the area is large, the object's size in the captured images, and therefore in the new-viewpoint video, becomes very small. In this paper, we propose a novel …
This paper proposes a novel method for calibrating multiple hand-held cameras, targeting diminished reality applications. Our method does not require any special markers or information about the camera parameters. Projective grid space (PGS), which is a 3D space defined by the epipolar geometry of two basis cameras, is used for dynamic camera calibration. Geometrical …
In most previous systems for free-viewpoint video synthesis of a moving object, cameras are calibrated once at the initial setup and cannot zoom or change view direction during capture. The field of view of each camera in those systems must cover the whole area in which the object moves. If the area is large, the object's resolution in the captured video and …
We present an online rendering system that removes occluding objects in front of the objective scene from an input video, using multiple videos taken with multiple cameras. To obtain geometrical relations between all cameras, we use projective grid space (PGS), defined by the epipolar geometry between two basis cameras. Then we apply a plane-sweep algorithm for …
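
A minimal sketch of a plane-sweep step of this kind is given below, assuming per-plane homographies from each input view to the reference view are already available; the data layout and the photo-consistency measure (color variance across views) are illustrative choices, not taken from the paper.

```python
import numpy as np
import cv2

def plane_sweep(ref_shape, images, homographies_per_plane):
    """
    For every candidate plane, warp each input view into the reference view
    with that plane's homography, then keep, per pixel, the plane where the
    warped colors agree best (lowest variance across views).
    homographies_per_plane[d][i] maps image i onto the reference view
    through plane d (hypothetical layout).
    """
    h, w = ref_shape
    best_score = np.full((h, w), np.inf)
    best_color = np.zeros((h, w, 3), dtype=np.float32)

    for homs in homographies_per_plane:  # one entry per swept plane
        warped = [cv2.warpPerspective(img, H, (w, h)).astype(np.float32)
                  for img, H in zip(images, homs)]
        stack = np.stack(warped)                  # (n_views, h, w, 3)
        score = stack.var(axis=0).mean(axis=-1)   # photo-consistency per pixel
        mean_color = stack.mean(axis=0)

        better = score < best_score
        best_score[better] = score[better]
        best_color[better] = mean_color[better]

    return best_color, best_score
```

In a PGS-based setting, the swept planes would be planes of the projective grid space rather than metric depth planes, but the per-pixel consistency test proceeds in the same way.
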