Autonomous land vehicle project at CMU

@inproceedings{Kanade1986AutonomousLV,
  title={Autonomous land vehicle project at CMU},
  author={Takeo Kanade and Charles E. Thorpe and William Whittaker},
  booktitle={CSC '86},
  year={1986}
}
1. Introduction. This paper provides an overview of the Autonomous Land Vehicle (ALV) Project at CMU. The goal of the CMU ALV Project is to build vision and intelligence for a mobile robot capable of operating in the real world outdoors. We are attacking this on a number of fronts: building appropriate research vehicles, exploiting high-speed experimental computers, and building software for reasoning about the perceived world. Research topics include: • Construction of research vehicles…

Stereo Vision-based Autonomous Vehicle Navigation

TLDR
Tests in a controlled environment show promising results, and the interfaces between the server and the Robocart have been defined, so that the proposed method can be used on the golf cart as soon as the mechanical systems are fully functional.

Mobile Robot Navigation: The CMU System

TLDR
The current status of autonomous land vehicle (ALV) research at Carnegie Mellon University’s Robotics Institute is described, with an autonomous mobile robot system capable of operating in outdoor environments and a navigation system working at two test sites and on two experimental vehicles.

1987 year end report for road following at Carnegie Mellon

TLDR
Progress during 1987 in vision and navigation for outdoor mobile robots at the Carnegie Mellon Robotics Institute, centered on guiding outdoor autonomous vehicles, is described.

The CMU system for mobile robot navigation

  • Y. Goto, A. Stentz
  • Computer Science
    Proceedings. 1987 IEEE International Conference on Robotics and Automation
  • 1987
TLDR
The various perception, planning, and control components are described, along with the CODGER software system that integrates them into a single system and synchronizes the data flow between them to maximize parallelism.

An architecture for sensor fusion in a mobile robot

TLDR
This paper describes sensor fusion in the context of an autonomous mobile robot, and describes the software architecture of the NAVLAB, consisting of a "whiteboard" system called CODGER that is similar to a blackboard but supports parallelism in the knowledge source modules, and an organized collection of perceptual and navigational modules tied together by the CODGER system.

Sensors and Sensor Fusion in Autonomous Vehicles

TLDR
The current state of the art in this area is presented, including a 3D object detection method that leverages both image and 3D point cloud information, a moving object detection and tracking system, and occupancy grid mapping used for navigation and localization in dynamic environments.

Model-directed mobile robot navigation

TLDR
The authors report on the system and methods used by the UMass Mobile Robot Project, which integrates perception, planning, and execution of actions, and describe experiments that demonstrate the performance of its components.

Road obstacle detection and tracking by an active and intelligent sensing strategy

TLDR
A sensor composed of a range finder coupled with a charge-coupled-device (CCD) camera is mounted in front of a vehicle to determine 2D visual targets in the camera's intensity images.

Vision and Navigation for the CMU Navlab

TLDR
The first system that uses the CMU Blackboard for scheduling, geometric transformations, and inter- and intra-machine communications is completed; perception now uses adaptive color classification for road tracking and scanning laser rangefinder data for obstacle detection.

Development Of Autonomous Systems

  • T. Kanade
  • Computer Science
    Defense, Security, and Sensing
  • 1989
TLDR
The Navlab is a four-wheeled vehicle (van) for road and open terrain navigation, and the Ambler is a six-legged locomotor for Mars exploration.
...

References

SHOWING 1-10 OF 14 REFERENCES

An architecture for autonomous vehicle navigation

  • A. Stentz, C. Thorpe
  • Computer Science
    Proceedings of the 1985 4th International Symposium on Unmanned Untethered Submersible Technology
  • 1985
TLDR
The main part of the paper is a discussion of the blackboard architecture that ties all the processes together, and of experiences with a real implementation of the sensor interpretation processes.

Module Programmer's Guide to Local Map Builder for ALVan

TLDR
This document describes the local map blackboard component of a mobile robot system, the Autonomous Land Vehicle (ALV), under construction at CMU, and provides detailed specifications for programmers implementing these types of modules.

A sonar-based mapping and navigation system

  • A. Elfes
  • Computer Science
    Proceedings. 1986 IEEE International Conference on Robotics and Automation
  • 1986
This paper describes a sonar-based mapping and navigation system for autonomous mobile robots operating in unknown and unstructured surroundings. The system uses sonar range data to build a…

Progress in robot road-following

TLDR
In test runs of an outdoor robot vehicle, the Terregator, under control of the Warp computer, continuous-motion, vision-guided road following is demonstrated at speeds up to 1.08 km/hour, with image-processing and steering servo loop times of 3 sec.

First Results in Robot Road-Following

TLDR
The new Carnegie Mellon Autonomous Land Vehicle group has produced the first demonstrations of road-following robots, and the vision system of the CMU ALV is described, including a simple and stable control scheme for visual servoing.

Warp as a machine for low-level vision

TLDR
This paper shows how the Warp architecture can be used to fulfill the computational needs of low-level vision, studies the characteristics of low-level vision algorithms, and shows how they lead to requirements for computer architecture.

Global operations on the CMU Warp machine

TLDR
It is described how Warp can efficiently implement global operations such as the fast Fourier transform, component labeling, and image warping, and an efficient parallel algorithm for component labeling is proposed.

Outdoor scene analysis using range data

  • M. Hebert
  • Computer Science
    Proceedings. 1986 IEEE International Conference on Robotics and Automation
  • 1986
TLDR
Techniques for outdoor scene analysis using range data are described for building a 3-D representation of the environment of a mobile robot equipped with a range sensor; they have been successfully applied to the problem of path planning through obstacles.

Multiple Levels of Representation and Problem-Solving Using Maps From Sonar Data

Technical Report (being prepared), CMU Robotics Institute

  • 1985