Jan-Friso Evers-Senne

In this paper we present a novel approach for interactive rendering of virtual views from real image sequences. Combining the concepts of light fields, depth-compensated image warping and view-dependent texture mapping, this plenoptic modeling approach can handle large and complex scenes. A portable, handheld multi-camera system has been developed that …
In this work we present a novel approach for image based rendering (IBR) of complex real scenes that have been recorded with freely moving hand-operated cameras. The images are automatically calibrated and 3D scene depth maps are computed for each real view. To render a new virtual view, the depth maps of the nearest real views are fused in a scalable …
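The abstract above only names the view-selection step; below is a minimal sketch of how the "nearest real views" for a virtual camera might be ranked. It is an illustration, not the paper's actual criterion: the function name, the position/angle mixing weight alpha and the normalisation are assumptions.

```python
import numpy as np

def rank_nearest_views(virt_center, virt_dir, cam_centers, cam_dirs, k=4, alpha=1.0):
    """Rank real cameras by proximity to the virtual view.

    Mixes the Euclidean distance between camera centres with the angle
    between viewing directions; alpha weights the angular term (an assumption).
    Returns the indices of the k best real views.
    """
    d_pos = np.linalg.norm(cam_centers - virt_center, axis=1)      # (N,) centre distances
    cos_ang = np.clip(cam_dirs @ virt_dir, -1.0, 1.0)              # (N,) direction agreement
    d_ang = np.arccos(cos_ang)
    score = d_pos / (d_pos.max() + 1e-9) + alpha * d_ang / np.pi   # normalised mix of both terms
    return np.argsort(score)[:k]
```

The selected views can then be blended with weights inversely proportional to their score, which is one common way to realise view-dependent fusion.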
We propose a system for robust modeling and visualisation of complex outdoor scenes from multi-camera image sequences and additional sensor information. A camera rig with one or more FireWire cameras is used in conjunction with a 3-axis rotation sensor to robustly obtain a calibration of the scene with an uncalibrated structure-from-motion approach. Dense …
In this paper we will present a novel approach to using surface patches for Image Based Rendering. Based on image sequences acquired with a freely moving portable multi-camera rig, we can extrapolate novel views of complex real scenes in real time. The cameras are calibrated from the image sequence itself and dense depth maps are computed for each camera view …
A distributed real-time system for immersive visualization is presented which uses distributed interaction for control. We will focus on user tracking with fixed and pan-tilt-zoom cameras, synchronization of multiple interaction devices, and distributed synchronized visualization. The system uses only standard hardware and standard network protocols. …
In the last decade, the visualization of virtual environments and interaction within them was possible only with specialized hardware. This hardware was very expensive, lacked scalability, and relied on specific protocols, buses, and networks for communication, as well as specialized graphics hardware for visualization. We will present a simple protocol for synchronized …
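To make the idea of such a protocol concrete, here is a minimal sketch, not the paper's actual protocol: a master process barriers render clients over plain TCP so that all displays swap buffers in the same frame. The port number, message strings, and the commented-out rendering helpers are assumptions for illustration.

```python
import socket

PORT = 5000  # assumed port, not from the paper

def master(client_addrs):
    """Per frame: wait until every client reports READY, then broadcast SWAP."""
    conns = [socket.create_connection(addr) for addr in client_addrs]
    frame = 0
    while True:
        for c in conns:                       # collect READY from all clients
            c.recv(16)                        # expect b"READY\n"
        for c in conns:                       # release the swap on all displays at once
            c.sendall(b"SWAP %d\n" % frame)
        frame += 1

def client(listen_port=PORT):
    """Render a frame, report READY, block until the master allows the swap."""
    srv = socket.socket()
    srv.bind(("", listen_port))
    srv.listen(1)
    conn, _ = srv.accept()
    while True:
        # render_frame()                      # application-specific rendering (placeholder)
        conn.sendall(b"READY\n")
        conn.recv(32)                         # wait for b"SWAP <n>", then swap buffers
        # swap_buffers()                      # placeholder
```

The point of the barrier is that no display presents frame n until every node has finished rendering it, which is all that "synchronized visualization" requires over a standard network.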
For AR applications the 3D position and direction of the user's view have to be determined in real time. At the same time, the augmentation, one or more 3D objects, has to be rendered. This article describes a distributed mobile AR system designed for industrial service applications. By distributing the different tasks involved in AR, the computational load …
This paper describes a visual markerless real-time tracking system for Augmented Reality applications. The system uses a FireWire camera with a fisheye lens running at 10 fps. Visual tracking of 3D scene points is performed simultaneously with 3D camera pose estimation without any prior scene knowledge. All visual-geometric data is acquired using a …
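As an illustration of the pose-estimation step only, and not the system's actual estimator (which would have to account for the fisheye projection), here is a minimal sketch using OpenCV's solvePnPRansac on tracked 2D features whose 3D positions are already known:

```python
import numpy as np
import cv2

def estimate_pose(pts3d, pts2d, K, dist=None):
    """Recover the camera pose from tracked 2D features with known 3D positions.

    Returns (R, t) mapping world points into the camera frame, or None on failure.
    Assumes a pinhole camera matrix K; a fisheye lens needs its own projection model.
    """
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        pts3d.astype(np.float32), pts2d.astype(np.float32), K, dist)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)   # rotation vector -> 3x3 rotation matrix
    return R, tvec
```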
In this paper we present an approach to generate novel views from multiple images without known geometry at interactive frame rates. Starting with a set of calibrated images, coarse depth information is computed for the images using a plane-sweep algorithm. During interactive rendering, the geometry approximation is used to guide the photo-consistent …
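A minimal sketch of a fronto-parallel plane sweep for one reference/neighbour pair is given below (NumPy/OpenCV). The absolute-difference cost, the box-filter window and the pose convention are assumptions for illustration; the paper's actual cost formulation and GPU implementation are not reproduced here.

```python
import numpy as np
import cv2

def plane_sweep_depth(ref_img, nbr_img, K_ref, K_nbr, R, t, depths, win=5):
    """Coarse per-pixel depth for ref_img from one neighbour view.

    For every depth hypothesis d, the neighbour image is warped into the
    reference view via the homography induced by the plane Z = d (reference
    frame), and the box-filtered absolute colour difference serves as the
    photo-consistency cost; the cheapest depth per pixel wins.
    Pose convention (assumed): X_nbr = R @ X_ref + t.
    """
    h, w = ref_img.shape[:2]
    ref_f = ref_img.astype(np.float32)
    best_cost = np.full((h, w), np.inf, dtype=np.float32)
    best_depth = np.zeros((h, w), dtype=np.float32)
    e_z = np.array([[0.0, 0.0, 1.0]])
    K_ref_inv = np.linalg.inv(K_ref)
    for d in depths:
        # Plane-induced homography mapping reference pixels to neighbour pixels.
        H = K_nbr @ (R + (t.reshape(3, 1) / d) @ e_z) @ K_ref_inv
        # H is the dst(ref) -> src(nbr) map, hence WARP_INVERSE_MAP.
        warped = cv2.warpPerspective(nbr_img.astype(np.float32), H, (w, h),
                                     flags=cv2.INTER_LINEAR | cv2.WARP_INVERSE_MAP)
        cost = cv2.blur(np.abs(ref_f - warped), (win, win))
        if cost.ndim == 3:                     # average colour channels
            cost = cost.mean(axis=2)
        better = cost < best_cost
        best_cost[better] = cost[better]
        best_depth[better] = d
    return best_depth
```

In practice the sweep is run over several neighbour views and the costs accumulated, which is what makes the coarse geometry stable enough to guide photo-consistent rendering.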