Pablo H. Ibargüengoytia

We propose a novel approach for solving continuous and hybrid Markov Decision Processes (MDPs) based on two phases. In the first phase, an initial approximate solution is obtained by partitioning the state space based on the reward function, and solving the resulting discrete MDP. In the second phase, the initial abstraction is refined and improved. States …
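The first phase described above ends with solving a discrete MDP. A minimal sketch of that step, using standard value iteration on an illustrative transition tensor and reward vector (the abstraction itself, i.e. how states are partitioned, is not reproduced here):

```python
import numpy as np

def value_iteration(P, R, gamma=0.9, tol=1e-6):
    """Solve a discrete MDP by value iteration.

    P: (A, S, S) transition tensor, P[a, s, s'] = Pr(s' | s, a).
    R: (S,) reward vector over the abstract states.
    Returns the value function and a greedy policy (one action per state).
    """
    n_actions, n_states, _ = P.shape
    V = np.zeros(n_states)
    while True:
        Q = R + gamma * (P @ V)        # shape (A, S): one row per action
        V_new = Q.max(axis=0)          # Bellman optimality backup
        if np.max(np.abs(V_new - V)) < tol:
            return V_new, Q.argmax(axis=0)
        V = V_new
```

The refinement phase would then split abstract states where this coarse solution is inaccurate and re-solve; that loop is omitted here.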
This paper develops a new theory and model for information and sensor validation. The model represents relationships between variables using Bayesian networks and utilizes probabilistic propagation to estimate the expected values of variables. If the estimated value of a variable differs from the actual value, an apparent fault is detected. The fault is …
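The core validation idea, comparing a sensor's measured value against an estimate propagated from related variables, can be sketched as follows. Here a simple linear-Gaussian predictor stands in for full Bayesian-network propagation, and all names, weights, and the tolerance are illustrative assumptions rather than the paper's actual model:

```python
import numpy as np

def validate_reading(measured, related, weights, bias, tol):
    """Flag an apparent fault when a sensor reading deviates from
    the value estimated from correlated sensors.

    related: readings of the sensors the model conditions on.
    weights, bias: stand-in linear predictor (assumed, not from the paper).
    tol: maximum residual before the reading is declared inconsistent.
    """
    expected = float(np.dot(weights, related) + bias)
    residual = abs(measured - expected)
    return {"expected": expected, "apparent_fault": residual > tol}
```

Distinguishing a real fault from an apparent one (e.g. a failure in a related sensor) is the part the full model handles and is not shown in this sketch.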
The validation of data from sensors has become an important issue in the operation and control of modern industrial plants. One approach is to use knowledge-based techniques to detect inconsistencies in measured data. This article presents a probabilistic model for the detection of such inconsistencies. Based on probability propagation, this method is …
For many real-time applications, it is important to validate the information received from the sensors before entering higher levels of reasoning. This paper presents an anytime probabilistic algorithm for validating the information provided by sensors. The system consists of two Bayesian network models. The first one is a model of the dependen…
This paper introduces a novel approach for fault diagnosis based on probabilistic models. This approach is suitable for applications where reliable measurements are unlikely to occur or where a deterministic analytical model is difficult to obtain. In particular, a combination of two Bayesian networks is used to detect and isolate faulty components. One …
Modern control systems and other monitoring systems require the acquisition of the values of most of the parameters involved in the process. Examples of such processes include industrial procedures, medical treatments, and financial forecasts. However, some parameters are sometimes inaccessible to traditional instrumentation. One example is the blades …
Markov decision processes (MDPs) have become a standard for representing uncertainty in decision-theoretic planning. However, MDPs require an explicit representation of the state space and the probabilistic transition model which, in continuous or hybrid continuous-discrete domains, are not always easy to define. Even when this representation is …