Cost-Bounded Active Classification Using Partially Observable Markov Decision Processes
@article{Wu2018CostBoundedAC, title={Cost-Bounded Active Classification Using Partially Observable Markov Decision Processes}, author={B. Wu and Mohamadreza Ahmadi and Suda Bharadwaj and Ufuk Topcu}, journal={2019 American Control Conference (ACC)}, year={2019}, pages={1216-1223} }
Active classification, i.e., the sequential decision-making process aimed at data acquisition for classification purposes, arises naturally in many applications, including medical diagnosis, intrusion detection, and object tracking. In this work, we study the problem of actively classifying dynamical systems with a finite set of Markov decision process (MDP) models. We are interested in finding strategies that actively interact with the dynamical system and observe its reactions so that the…
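The abstract is cut off above. As a rough, hypothetical illustration of the setting it describes, the sketch below maintains a Bayesian belief over a finite set of candidate MDP models and updates it after each observed transition; the transition matrices and the indexing convention are made-up placeholders, not anything taken from the paper.

```python
import numpy as np

def update_model_belief(belief, transition_probs, s, a, s_next):
    """Bayesian update of the belief over candidate MDP models.

    belief           -- 1-D array, prior probability of each candidate model
    transition_probs -- list of arrays T_i[s, a, s'] giving P(s' | s, a) under model i
    s, a, s_next     -- the observed transition
    Returns the posterior belief (renormalized).
    """
    likelihoods = np.array([T[s, a, s_next] for T in transition_probs])
    posterior = belief * likelihoods
    total = posterior.sum()
    if total == 0.0:
        # The observed transition is impossible under every model still in the belief;
        # in practice one would fall back to a uniform or smoothed belief.
        return np.full_like(belief, 1.0 / len(belief))
    return posterior / total

# Two hypothetical 2-state, 2-action candidate models (illustrative numbers only).
T0 = np.array([[[0.9, 0.1], [0.2, 0.8]],
               [[0.5, 0.5], [0.6, 0.4]]])
T1 = np.array([[[0.4, 0.6], [0.7, 0.3]],
               [[0.5, 0.5], [0.1, 0.9]]])

belief = np.array([0.5, 0.5])
belief = update_model_belief(belief, [T0, T1], s=0, a=0, s_next=0)
print(belief)  # the model that makes the observed transition likelier gains mass
```

A belief of this kind over the hidden model index is the sort of quantity a POMDP-based active-classification strategy would condition its next action on.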
6 Citations
Constrained Active Classification Using Partially Observable Markov Decision Processes
- Computer Science, ArXiv
- 2020
This work presents a decision-theoretic framework based on partially observable Markov decision processes (POMDPs) that relies on assigning a classification belief (a probability distribution) to the attributes of interest and presents two different algorithms to compute such strategies.
On the Detection of Markov Decision Processes
- Mathematics, Computer Science
- 2021
This work investigates whether it is possible to asymptotically detect the ground-truth MDP model perfectly based on a single observed history (state-action sequence) and develops an algorithm that efficiently determines the existence of such policies and synthesizes one when they exist (a naive likelihood-comparison sketch appears after this citation list).
Automated Verification and Synthesis of Stochastic Hybrid Systems: A Survey
- Computer Science, Autom.
- 2022
An integrated methodology to control the risk of cardiovascular disease in patients with hypertension and type 1 diabetes
- Medicine, Comput. Intell.
- 2021
This work aims to develop an integrated methodology combining Markov decision processes (MDPs) and a genetic algorithm (GA) to control the risk of cardiovascular disease in patients with hypertension and type 1 diabetes.
Operations research and health systems: A literature review
- Medicine, Political Science
- 2020
The present study aimed to evaluate the application of operations research models in health systems, including Markov decision processes (MDPs) and partially observable Markov decision processes (POMDPs), and to compare these methods with one another.
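As noted after the detection citation above, here is a naive likelihood-comparison sketch of deciding which candidate MDP best explains a single observed state-action history. It only illustrates the general idea of model detection by likelihood; the cited work's actual contribution concerns asymptotic detectability and policy synthesis, which this sketch does not implement. All model and history data below are hypothetical.

```python
import math

def log_likelihood(history, transitions):
    """Log-likelihood of one observed trajectory under a candidate MDP.

    history     -- list of (s, a, s_next) triples from a single run
    transitions -- dict mapping (s, a, s_next) to P(s_next | s, a) under the model
    """
    total = 0.0
    for s, a, s_next in history:
        p = transitions.get((s, a, s_next), 0.0)
        if p == 0.0:
            return float("-inf")  # this model cannot have generated the history
        total += math.log(p)
    return total

def most_likely_model(history, candidate_models):
    """Index of the candidate MDP assigning the highest likelihood to the history."""
    scores = [log_likelihood(history, m) for m in candidate_models]
    return max(range(len(scores)), key=scores.__getitem__)

# Hypothetical two-model example over states {0, 1} and a single action 0.
model_a = {(0, 0, 0): 0.9, (0, 0, 1): 0.1, (1, 0, 0): 0.5, (1, 0, 1): 0.5}
model_b = {(0, 0, 0): 0.3, (0, 0, 1): 0.7, (1, 0, 0): 0.5, (1, 0, 1): 0.5}
history = [(0, 0, 0), (0, 0, 0), (0, 0, 1)]
print(most_likely_model(history, [model_a, model_b]))  # 0: model_a fits better
```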
References
Showing 1-10 of 26 references
An Adaptive Sampling Algorithm for Solving Markov Decision Processes
- Computer Science, Oper. Res.
- 2005
The algorithm adaptively chooses which action to sample as the sampling process proceeds and generates an asymptotically unbiased estimator whose bias is bounded by a quantity that converges to zero at rate (ln N)/N.
Markov Decision Processes: Discrete Stochastic Dynamic Programming
- Computer Science, Wiley Series in Probability and Statistics
- 1994
Markov Decision Processes covers recent research advances in such areas as countable state space models with the average reward criterion, constrained models, and models with risk-sensitive optimality criteria, and explores several topics that have received little or no attention in other books.
Simulation-Based Algorithms for Markov Decision Processes
- Computer Science
- 2013
The self-contained approach of this book will appeal not only to researchers in MDPs, stochastic modeling and control, and simulation, but will also be a valuable source of tuition and reference for students of control and operations research.
OR Forum - A POMDP Approach to Personalize Mammography Screening Decisions
- Medicine, Oper. Res.
- 2012
This work proposes a personalized mammography screening policy based on the prior screening history and personal risk characteristics of women, formulated as a finite-horizon partially observable Markov decision process (POMDP) model that outperforms existing guidelines with respect to total expected quality-adjusted life years.
SARSOP: Efficient Point-Based POMDP Planning by Approximating Optimally Reachable Belief Spaces
- Computer Science, Robotics: Science and Systems
- 2008
This work has developed a new point-based POMDP algorithm that exploits the notion of optimally reachable belief spaces to improve computational efficiency and substantially outperformed one of the fastest existing point-based algorithms.
POMDP Model Learning for Human Robot Collaboration
- Computer Science, 2018 IEEE Conference on Decision and Control (CDC)
- 2018
This work adopts a partially observable Markov decision process (POMDP) model, which provides a general modeling framework for sequential decision making where states are hidden and actions have stochastic outcomes.
Planning and Acting in Partially Observable Stochastic Domains
- Mathematics, Artif. Intell.
- 1998
Cooperative Active Perception using POMDPs
- Computer Science
- 2008
A decision-theoretic approach to cooperative active perception is presented, by formalizing the problem as a Partially Observable Markov Decision Process (POMDP).
Bounded Policy Synthesis for POMDPs with Safe-Reachability Objectives
- Computer Science, AAMAS
- 2018
The method compactly represents a goal-constrained belief space, which only contains beliefs reachable from the initial belief under desired executions that can achieve the given safe-reachability objective, and employs an incremental Satisfiability Modulo Theories solver to efficiently search for a valid policy over it.
Active Classification: Theory and Application to Underwater Inspection
- Computer Science, ISRR
- 2011
The problem in which an autonomous vehicle must classify an object based on multiple views is formulated as an extension to Bayesian active learning, and the benefit of acting adaptively as new information becomes available is formally analyzed.