A Collaborative Visual SLAM Framework for Service Robots
@article{Ouyang2021ACV,
  title={A Collaborative Visual SLAM Framework for Service Robots},
  author={Ming Ouyang and Xuesong Shi and Yujie Wang and Yuxin Tian and Yingzhe Shen and Dawei Wang and Peng Wang},
  journal={2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
  year={2021},
  pages={8679-8685}
}
We present a collaborative visual simultaneous localization and mapping (SLAM) framework for service robots. With an edge server maintaining a map database and performing global optimization, each robot can register to an existing map, update the map, or build new maps, all with a unified interface and low computation and memory cost. We design an elegant communication pipeline to enable real-time information sharing between robots. With a novel landmark organization and retrieval method on the…
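The client/server split described in the abstract lends itself to a small illustration. Below is a minimal Python sketch, not the authors' code, of the architecture it outlines: lightweight robot clients register with an edge server, stream keyframes into a shared map database, and the server runs global optimization over everything it has received. All class names, the message layout, and the planar pose representation are illustrative assumptions.

```python
# Hypothetical sketch of the robot/edge-server split; not the paper's implementation.
from dataclasses import dataclass, field
from typing import Dict, List, Tuple


@dataclass
class Keyframe:
    robot_id: int
    frame_id: int
    pose: Tuple[float, float, float]            # planar pose (x, y, yaw) for brevity
    landmark_ids: List[int] = field(default_factory=list)


class EdgeMapServer:
    """Maintains the shared map database and runs global optimization."""

    def __init__(self) -> None:
        self.keyframes: Dict[Tuple[int, int], Keyframe] = {}

    def register(self, robot_id: int) -> None:
        # A robot may register against an existing map or start a new one.
        print(f"robot {robot_id} registered")

    def update_map(self, kf: Keyframe) -> None:
        self.keyframes[(kf.robot_id, kf.frame_id)] = kf

    def global_optimize(self) -> None:
        # Placeholder for pose-graph / bundle-adjustment over all robots' keyframes.
        print(f"optimizing {len(self.keyframes)} keyframes")


class RobotClient:
    """Lightweight front end: tracks locally, sends keyframes to the server."""

    def __init__(self, robot_id: int, server: EdgeMapServer) -> None:
        self.robot_id = robot_id
        self.server = server
        server.register(robot_id)

    def on_new_keyframe(self, frame_id: int, pose: Tuple[float, float, float]) -> None:
        self.server.update_map(Keyframe(self.robot_id, frame_id, pose))


if __name__ == "__main__":
    server = EdgeMapServer()
    robots = [RobotClient(i, server) for i in range(2)]
    robots[0].on_new_keyframe(0, (0.0, 0.0, 0.0))
    robots[1].on_new_keyframe(0, (1.0, 0.5, 0.1))
    server.global_optimize()
```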
5 Citations
Time-Critical IoT Applications Enabled by Wi-Fi 6 and Beyond
- Computer Science · IEEE Internet of Things Magazine
- 2022
Presents next-generation Wi-Fi technologies and describes how they can be leveraged to enable three time-critical Industry 4.0 use cases: wireless industrial automation control, remote rendering for extended reality applications, and cooperative simultaneous localization and mapping by autonomous mobile robots in a factory plant.
DynNetSLAM: Dynamic Visual SLAM Network Offloading
- Computer Science · IEEE Access
- 2022
With its hysteresis mechanism, DynNetSLAM substantially reduces the probability of track-loss events compared with running the state-of-the-art ORB-SLAM2 entirely on the mobile device and with an enhanced static edge SLAM, while nearly attaining the low absolute position error.
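As a rough illustration of the hysteresis idea this summary refers to, the sketch below (not from the DynNetSLAM paper) switches frame processing between the mobile device and the edge only when measured latency crosses two separated thresholds, which avoids rapid toggling; the thresholds and the latency source are hypothetical.

```python
# Hypothetical hysteresis rule for edge offloading; not the paper's implementation.
class OffloadController:
    def __init__(self, offload_below_ms: float = 30.0, local_above_ms: float = 60.0) -> None:
        self.offload_below_ms = offload_below_ms    # switch to edge below this latency
        self.local_above_ms = local_above_ms        # fall back to local above this latency
        self.offloading = False

    def update(self, latency_ms: float) -> bool:
        """Return True if the next frame should be processed on the edge."""
        if self.offloading and latency_ms > self.local_above_ms:
            self.offloading = False                 # network degraded: process locally
        elif not self.offloading and latency_ms < self.offload_below_ms:
            self.offloading = True                  # network good: offload to the edge
        return self.offloading


if __name__ == "__main__":
    ctrl = OffloadController()
    for latency in (20, 25, 45, 70, 50, 25):
        print(latency, "ms ->", "edge" if ctrl.update(latency) else "local")
```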
CORB2I-SLAM: An Adaptive Collaborative Visual-Inertial SLAM for Multiple Robots
- Computer Science · Electronics
- 2022
A collaborative SLAM framework, CORB2I-SLAM, in which each participating robot carries a camera and an inertial sensor to run visual-inertial odometry, and which can be adapted to use Visual Odometry (VO) when the measurements from the inertial sensors are noisy.
Using a Two-Stage Method to Reject False Loop Closures and Improve the Accuracy of Collaborative SLAM Systems
- Computer Science · Electronics
- 2021
A two-stage false-positive loop-closure rejection method based on three types of consistency checks that improves the accuracy and robustness of back-end pose-graph optimization, with a strong ability to reject false positive loop closures.
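A hedged sketch of what a two-stage rejection pipeline of this kind might look like: cheap per-candidate checks first, then a consistency vote among the survivors before they reach pose-graph optimization. The specific checks shown (geometric inliers, odometry consistency, group agreement) and their thresholds are illustrative assumptions, not the authors' exact criteria.

```python
# Hypothetical two-stage loop-closure gate; not the paper's implementation.
from dataclasses import dataclass
from typing import List


@dataclass
class LoopCandidate:
    query_kf: int
    match_kf: int
    inlier_matches: int            # feature inliers from geometric verification
    odom_gap_m: float              # discrepancy against accumulated odometry


def stage_one(cands: List[LoopCandidate], min_inliers: int = 30) -> List[LoopCandidate]:
    """Cheap per-candidate check: require enough geometric inliers."""
    return [c for c in cands if c.inlier_matches >= min_inliers]


def stage_two(cands: List[LoopCandidate], max_odom_gap_m: float = 2.0,
              min_group: int = 2) -> List[LoopCandidate]:
    """Consistency checks: reject odometry-inconsistent and isolated closures."""
    consistent = [c for c in cands if c.odom_gap_m <= max_odom_gap_m]
    # Group consistency: only accept closures if enough mutually consistent
    # candidates agree with each other.
    return consistent if len(consistent) >= min_group else []


if __name__ == "__main__":
    candidates = [
        LoopCandidate(120, 10, inlier_matches=55, odom_gap_m=0.8),
        LoopCandidate(121, 11, inlier_matches=48, odom_gap_m=1.1),
        LoopCandidate(300, 40, inlier_matches=20, odom_gap_m=0.5),   # too few inliers
        LoopCandidate(310, 90, inlier_matches=60, odom_gap_m=9.0),   # odometry-inconsistent
    ]
    accepted = stage_two(stage_one(candidates))
    print([(c.query_kf, c.match_kf) for c in accepted])
```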
References
Showing 1-10 of 27 references
CCM‐SLAM: Robust and efficient centralized collaborative monocular simultaneous localization and mapping for robotic teams
- Computer Science · J. Field Robotics
- 2019
CCM-SLAM is presented, a centralized collaborative SLAM framework in which each robotic agent, equipped with a monocular camera, a communication unit, and a small processing board, retains its autonomy as an individual, while a central server with potentially larger computational capacity enables their collaboration.
OpenVSLAM: A Versatile Visual SLAM Framework
- Computer Science · ACM Multimedia
- 2019
OpenVSLAM is introduced, a visual SLAM framework designed to be easily used and extended, incorporating several useful features and functions for research and development.
CVI-SLAM—Collaborative Visual-Inertial SLAM
- Computer Science · IEEE Robotics and Automation Letters
- 2018
A thorough analysis of CVI-SLAM attests to its accuracy and to the improvements arising from collaboration, and evaluates its scalability in the number of participating agents and its applicability in terms of network requirements.
C2TAM: A Cloud framework for cooperative tracking and mapping
- Computer Science · Robotics Auton. Syst.
- 2014
DXSLAM: A Robust and Efficient Visual SLAM System with Deep Features
- Computer Science · 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
- 2020
This paper shows that feature extraction with deep convolutional neural networks (CNNs) can be seamlessly incorporated into a modern SLAM framework, and the full system achieves much lower trajectory errors and much higher correct rates on all evaluated data.
ORB-SLAM3: An Accurate Open-Source Library for Visual, Visual–Inertial, and Multimap SLAM
- Computer Science · IEEE Transactions on Robotics
- 2021
This article presents ORB-SLAM3, the first system able to perform visual, visual-inertial and multimap SLAM with monocular, stereo and RGB-D cameras, using pin-hole and fisheye lens models, resulting in real-time robust operation in small and large, indoor and outdoor environments.
Voxel Map for Visual SLAM
- Computer Science · 2020 IEEE International Conference on Robotics and Automation (ICRA)
- 2020
This work argues that keyframes are not the optimal choice for this task, due to several inherent limitations, such as weak geometric reasoning and poor scalability, and proposes a voxel-map representation to efficiently retrieve map points for visual SLAM.
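A minimal sketch of the voxel-map retrieval idea this summary describes: map points are hashed into fixed-size voxels so the tracker queries only voxels near the current camera position instead of gathering points from covisible keyframes. The voxel size, query radius, and API below are illustrative assumptions, not the paper's implementation.

```python
# Hypothetical voxel map for map-point retrieval; not the paper's implementation.
from collections import defaultdict
from typing import Dict, List, Tuple

Point = Tuple[float, float, float]


class VoxelMap:
    def __init__(self, voxel_size: float = 0.5) -> None:
        self.voxel_size = voxel_size
        self.voxels: Dict[Tuple[int, int, int], List[int]] = defaultdict(list)
        self.points: Dict[int, Point] = {}

    def _key(self, p: Point) -> Tuple[int, int, int]:
        # Integer voxel coordinates of a 3D point.
        x, y, z = p
        s = self.voxel_size
        return (int(x // s), int(y // s), int(z // s))

    def insert(self, point_id: int, p: Point) -> None:
        self.points[point_id] = p
        self.voxels[self._key(p)].append(point_id)

    def query_nearby(self, cam_pos: Point, radius_voxels: int = 1) -> List[int]:
        """Return ids of map points in voxels within `radius_voxels` of the camera."""
        cx, cy, cz = self._key(cam_pos)
        ids: List[int] = []
        for dx in range(-radius_voxels, radius_voxels + 1):
            for dy in range(-radius_voxels, radius_voxels + 1):
                for dz in range(-radius_voxels, radius_voxels + 1):
                    ids.extend(self.voxels.get((cx + dx, cy + dy, cz + dz), []))
        return ids


if __name__ == "__main__":
    vm = VoxelMap()
    vm.insert(1, (0.1, 0.2, 0.0))
    vm.insert(2, (0.4, 0.1, 0.3))
    vm.insert(3, (5.0, 5.0, 5.0))               # far away, should not be retrieved
    print(vm.query_nearby((0.0, 0.0, 0.0)))     # -> [1, 2]
```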
Redesigning SLAM for Arbitrary Multi-Camera Systems
- Computer Science · 2020 IEEE International Conference on Robotics and Automation (ICRA)
- 2020
This work proposes an adaptive initialization scheme, a sensor-agnostic, information-theoretic keyframe selection algorithm, and a scalable voxel-based map that can adapt to a wide range of camera setups without the need for sensor-specific modifications or tuning.
Collaborative visual SLAM for multiple agents: A brief survey
- Computer Science · Virtual Real. Intell. Hardw.
- 2019
Are We Ready for Service Robots? The OpenLORIS-Scene Datasets for Lifelong SLAM
- Computer Science · 2020 IEEE International Conference on Robotics and Automation (ICRA)
- 2020
The term lifelong SLAM is used here to address SLAM problems in an ever-changing environment over a long period of time, and the OpenLORIS-Scene datasets are released: collected in real-world indoor scenes, multiple times in each place, to include the scene changes of real life.