Exploring big volume sensor data with Vroom

Oscar Moll, Aaron Zalewski, S. Pillai, S. Madden, M. Stonebraker, V. Gadepally. Proc. VLDB Endow.
State-of-the-art sensors within a single autonomous vehicle (AV) can produce video and LIDAR data at rates greater than 30 GB/hour. Unsurprisingly, even small AV research teams can accumulate tens of terabytes of sensor data from multiple trips and multiple vehicles. AV practitioners would like to extract information about specific locations or specific situations for further study, but are often unable to. Queries over AV sensor data are different from generic analytics or spatial queries…
Citations

Challenges and Opportunities for Autonomous Vehicle Query Systems
Laser2Vec: Similarity-based Retrieval for Robotic Perception Data
  • Samer B. Nashed. 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2020.
Efficient User Guidance for Validating Participatory Sensing Data
traj2bits: indexing trajectory data for efficient query
SLAM-aware, self-supervised perception in mobile robots
Technical Report: Optimizing Human Involvement for Entity Matching and Consolidation
Accuracy and Performance Comparison of Video Action Recognition Approaches

References

A High-rate, Heterogeneous Data Set From The DARPA Urban Challenge
Building, Curating, and Querying Large-Scale Data Repositories for Field Robotics Applications
1 year, 1000 km: The Oxford RobotCar dataset
A Demonstration of the BigDAWG Polystore System
OctoMap: an efficient probabilistic 3D mapping framework based on octrees
Building Rome in a day
3D is here: Point Cloud Library (PCL)
  • R. Rusu, S. Cousins. 2011 IEEE International Conference on Robotics and Automation, 2011.
The BigDAWG polystore system and architecture
YOLO9000: Better, Faster, Stronger
A Framework for Estimating Driver Decisions Near Intersections