Corpus ID: 236772184

Semantic-aware plant traversability estimation in plant-rich environments for agricultural mobile robots

@article{Matsuzaki2021SemanticawarePT,
  title={Semantic-aware plant traversability estimation in plant-rich environments for agricultural mobile robots},
  author={Shigemichi Matsuzaki and Jun Miura and Hiroaki Masuzawa},
  journal={ArXiv},
  year={2021},
  volume={abs/2108.00759}
}
This paper describes a method of estimating the traversability of plant parts covering a path and navigating through them in greenhouses for agricultural mobile robots. Conventional mobile robots rely on scene recognition methods that consider only the presence of objects; such methods therefore cannot recognize paths covered by flexible plants as traversable. In this paper, we present a novel framework of scene recognition based on image-based semantic segmentation for robot navigation…
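The central idea in the abstract is that a class-aware decision, rather than a purely geometric occupancy check, determines whether plant pixels block the path. The snippet below is a minimal sketch of that idea only, not the paper's actual pipeline: the class IDs, the per-pixel plant confidence, and the threshold are hypothetical placeholders introduced for illustration.

# Minimal sketch (not the authors' implementation): turning a per-pixel
# semantic segmentation into a traversability mask, assuming hypothetical
# class IDs in which flexible plant parts get their own "traversable plant"
# label instead of being lumped in with rigid obstacles.
import numpy as np

# Hypothetical label set for illustration only.
PATH = 0               # bare ground / path
TRAVERSABLE_PLANT = 1  # leaves or stems the robot may push through
OBSTACLE = 2           # rigid objects: poles, pipes, people, ...

def traversability_mask(seg: np.ndarray,
                        plant_confidence: np.ndarray,
                        threshold: float = 0.5) -> np.ndarray:
    """Return a boolean HxW mask that is True where the robot may drive.

    seg              : HxW array of integer class IDs from a segmentation model.
    plant_confidence : HxW array in [0, 1]; assumed per-pixel confidence that
                       a plant pixel is flexible enough to be pushed aside.
    """
    drivable = seg == PATH
    # Plant pixels count as traversable only when the assumed "pushable"
    # confidence clears the threshold; otherwise they are treated as obstacles.
    drivable |= (seg == TRAVERSABLE_PLANT) & (plant_confidence >= threshold)
    return drivable

# Example usage on a dummy 4x4 segmentation.
seg = np.array([[0, 0, 1, 2],
                [0, 1, 1, 2],
                [0, 1, 2, 2],
                [0, 0, 2, 2]])
conf = np.full(seg.shape, 0.8)
print(traversability_mask(seg, conf).astype(int))

An occupancy-style recognizer would map every plant pixel to the obstacle class, which is exactly the limitation the abstract describes.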

References

Showing 1-10 of 41 references
Traversability classification using unsupervised on-line visual learning for outdoor robot navigation
A novel on-line learning method that makes accurate predictions of the traversability of complex terrain, based on autonomous training-data collection that exploits the robot's experience in navigating its environment to train classifiers without human intervention.
LiDAR-only based navigation algorithm for an autonomous agricultural robot
Additional filters and refinements of the PEARL algorithm are presented in the context of crop detection; adding those modifications improved crop detection and thus robot navigation.
Multi-source Pseudo-label Learning of Semantic Segmentation for the Scene Recognition of Agricultural Mobile Robots
This paper describes a novel method of training a semantic segmentation model for environment recognition of agricultural mobile robots by unsupervised domain adaptation exploiting publicly available …
The learning and use of traversability affordance using range images on a mobile robot
This paper studies how a mobile robot equipped with a 3D laser scanner can learn to perceive the traversability affordance and use it to wander in a room filled with spheres, cylinders, and boxes.
Combining Hector SLAM and Artificial Potential Field for Autonomous Navigation Inside a Greenhouse
This study shows that the wheeled mobile robot is safe to operate autonomously with a human presence and that, in contrast to classical odometry methods, no calibration is needed for repositioning the robot over repetitive runs.
Find your own way: Weakly-supervised segmentation of path proposals for urban autonomy
A weakly-supervised approach to segmenting proposed drivable paths in images, with the goal of autonomous driving in complex urban environments, is presented, and it is illustrated how the method can generalise to multiple path proposals at intersections.
Curiosity-driven learning of traversability affordance on a mobile robot
This paper studies the learning of traversability affordance on a mobile robot, investigates how the number of interactions required can be minimized with minimal degradation of the learning process, and proposes a two-step learning process consisting of bootstrapping and curiosity-based learning phases.
A Novel Autonomous Robot for Greenhouse Applications
A novel agricultural robot that allows for autonomous operation both in open environments and on rails using only low-cost sensors, and is used for UV treatment of cucumber plants.
Enhancing Supervised Terrain Classification with Predictive Unsupervised Learning
This paper describes a method for classifying the traversability of terrain by combining unsupervised learning of color models that predict scene geometry with supervised learning of the relationship between geometric features and traversability, and presents results from DARPA-conducted tests that demonstrate its effectiveness in a variety of outdoor environments.
Learning Long-range Terrain Perception for Autonomous Mobile Robots
A statistical prediction framework to enhance long-range terrain perception for autonomous mobile robots that incorporates spatial relationships between terrain regions in a principled way and outperforms existing approaches in terms of accuracy, robustness, and adaptability to dynamic unstructured outdoor environments.