Corpus ID: 236956656

3D Human Reconstruction in the Wild with Collaborative Aerial Cameras

Cherie Ho, Andrew Jong, Harry Freeman, Rohan Rao, Rogerio Bonatti, Sebastian A. Scherer
Aerial vehicles are revolutionizing applications that require capturing the 3D structure of dynamic targets in the wild, such as sports, medicine, and entertainment. The core challenges in developing a motion-capture system that operates in outdoor environments are: (1) 3D inference requires multiple simultaneous viewpoints of the target, (2) occlusion caused by obstacles is frequent when tracking moving targets, and (3) the camera and vehicle state estimation is noisy. We present a real-time…
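The first challenge above, fusing multiple simultaneous viewpoints into a 3D estimate, is commonly solved with linear triangulation. The sketch below is a generic direct linear transform (DLT) triangulator, not the paper's implementation; the function name and interface are illustrative assumptions.

```python
import numpy as np

def triangulate_dlt(points_2d, proj_mats):
    """Linear (DLT) triangulation of one 3D point from >= 2 views.

    points_2d: list of (x, y) pixel observations, one per camera.
    proj_mats: list of 3x4 camera projection matrices P = K [R | t].
    Returns the least-squares 3D point in world coordinates.
    """
    rows = []
    for (x, y), P in zip(points_2d, proj_mats):
        # Each view contributes two linear constraints on the homogeneous
        # point X:  x * P[2] - P[0] = 0  and  y * P[2] - P[1] = 0.
        rows.append(x * P[2] - P[0])
        rows.append(y * P[2] - P[1])
    A = np.asarray(rows)
    # The homogeneous solution is the right singular vector associated
    # with the smallest singular value of A.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]
```

With noisy detections and noisy vehicle state (challenge 3), this linear estimate typically serves only as an initialization for a nonlinear refinement such as bundle adjustment.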



Towards a Robust Aerial Cinematography Platform: Localizing and Tracking Moving Targets in Unstructured Environments
This work proposes a complete system for aerial cinematography that combines a vision-based algorithm for target localization; a real-time incremental 3D signed-distance map algorithm for occlusion and safety computation; and a real-time camera motion planner that optimizes smoothness, collisions, occlusions, and artistic guidelines.
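A signed-distance map supports cheap occlusion queries: the ray from camera to target can be sphere-traced through the field, advancing at each step by the distance value sampled there. The toy sketch below illustrates the idea with a single analytic sphere obstacle; it is an illustrative assumption, not the cited system's actual mapping code.

```python
import numpy as np

def sphere_sdf(p, center, radius):
    # Signed distance to a sphere: negative inside, positive outside.
    return np.linalg.norm(p - center) - radius

def ray_occluded(cam, target, sdf, eps=1e-3, max_steps=128):
    """Sphere-trace from the camera toward the target.

    Returns True if a surface (sdf <= eps) is hit before the ray
    reaches the target, i.e. the view of the target is occluded.
    """
    d = target - cam
    dist = np.linalg.norm(d)
    d = d / dist
    t = 0.0
    for _ in range(max_steps):
        s = sdf(cam + t * d)
        if s <= eps:
            return t < dist - eps  # surface hit before the target
        t += s                     # safe step: nothing closer than s
        if t >= dist:
            return False
    return False
```

In a full system, `sdf` would be a lookup into the incrementally built signed-distance map rather than an analytic shape.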
Autonomous Aerial Cinematography In Unstructured Environments With Learned Artistic Decision-Making
A complete system for real-time aerial cinematography that for the first time combines vision-based target estimation; 3D signed-distance mapping for occlusion estimation; efficient trajectory optimization for long time-horizon camera motion; and learning-based artistic shot selection is proposed.
Markerless Outdoor Human Motion Capture Using Multiple Autonomous Micro Aerial Vehicles
This work describes the first fully autonomous outdoor capture system based on flying vehicles, which combines multiple state-of-the-art 2D joint detectors with a 3D human body model and a powerful prior on human pose to robustly fit the 2D measurements.
Flycon: real-time environment-independent multi-view human pose estimation with aerial vehicles
We propose a real-time method for the infrastructure-free estimation of articulated human motion. The approach leverages a swarm of camera-equipped flying robots and jointly optimizes the swarm's …
FlyCap: Markerless Motion Capture Using Multiple Autonomous Flying Cameras
  • Lan Xu, Yebin Liu, +4 authors Lu Fang
  • Computer Science, Medicine
  • IEEE Transactions on Visualization and Computer Graphics
  • 2018
This paper proposes a novel non-rigid surface registration method to track and fuse the depth of the three flying cameras for surface motion tracking of the moving target, and simultaneously calculate the pose of each flying camera.
Human Motion Capture Using a Drone
This work argues that, besides the capability of tracking a moving subject, a flying drone also provides fast varying viewpoints, which is beneficial for motion reconstruction, and introduces a drone-based system for 3D human MoCap.
ActiveMoCap: Optimized Viewpoint Selection for Active Human Motion Capture
This paper introduces an algorithm that predicts which viewpoints should be chosen to capture future frames so as to maximize 3D human pose estimation accuracy; it yields improved 3D body pose estimates and outperforms or matches existing approaches based on person following and orbiting.
Domes to Drones: Self-Supervised Active Triangulation for 3D Human Pose Reconstruction
This work introduces ACTOR, an active triangulation agent for 3D human pose reconstruction: a fully trainable agent consisting of a 2D pose estimation network and a deep reinforcement learning-based policy for camera viewpoint selection that produces significantly more accurate 3D pose reconstructions.
Shot type constraints in UAV cinematography for autonomous target tracking
The interplay between cinematography and computer vision in the area of autonomous UAV filming is explored, with UAV target-tracking trajectories formalized and geometrically modeled, so as to analytically compute maximum allowable focal length per scenario. Expand
Real-time planning for automated multi-view drone cinematography
The online nature of the method enables incorporation of feedback into the planning and control loop and makes the algorithm robust to disturbances; the approach is extended to include coordination between multiple drones, enabling dynamic multi-view shots typical of action sequences and live TV coverage.