Shane Strasser

In this paper we propose several approximation algorithms for the problems of full and partial abductive inference in Bayesian belief networks. Full abductive inference is the problem of finding the k most probable state assignments to all non-evidence variables in the network, while partial abductive inference is the problem of finding the k most probable …
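For a tiny network, the full abductive inference problem described above can be illustrated by brute-force enumeration. This is only a sketch of the problem being solved, not of the paper's approximation algorithms; the two-variable network and all probability values below are invented for illustration.

```python
import heapq
import itertools

def k_most_probable(variables, joint_prob, evidence, k):
    """Brute-force full abductive inference: enumerate every assignment to
    the non-evidence (binary) variables and keep the k most probable.
    Exponential in the number of free variables, so feasible only for
    tiny networks."""
    free = [v for v in variables if v not in evidence]
    scored = []
    for values in itertools.product([0, 1], repeat=len(free)):
        assignment = dict(evidence)
        assignment.update(zip(free, values))
        scored.append((joint_prob(assignment), assignment))
    return heapq.nlargest(k, scored, key=lambda t: t[0])

# Toy two-variable network A -> B (all numbers invented for illustration):
# P(A=1) = 0.6, P(B=1 | A=1) = 0.9, P(B=1 | A=0) = 0.2.
def joint_prob(a):
    p_a = 0.6 if a["A"] else 0.4
    p_b1 = 0.9 if a["A"] else 0.2
    return p_a * (p_b1 if a["B"] else 1 - p_b1)

# Observing B = 1, the most probable explanation sets A = 1
# (probability 0.6 * 0.9 = 0.54 versus 0.4 * 0.2 = 0.08).
top = k_most_probable(["A", "B"], joint_prob, {"B": 1}, k=2)
```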
Bayesian networks are powerful probabilistic models that have been applied to a variety of tasks. When applied to classification problems, Bayesian networks have shown competitive performance compared to other state-of-the-art classifiers. However, structure learning of Bayesian networks has been shown to be NP-hard. In this paper, we propose a novel …
Bayesian networks are probabilistic graphical models that have proven to be able to handle uncertainty in many real-world applications. One key issue in learning Bayesian networks is parameter estimation, i.e., learning the local conditional distributions of each variable in the model. While parameter estimation can be performed efficiently when complete …
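The complete-data case mentioned above reduces to relative-frequency counting: each conditional distribution is estimated directly from the observed counts. A minimal sketch, assuming binary variables; the `mle_cpt` helper and the toy Rain/WetGrass dataset are hypothetical, not from the paper.

```python
from collections import Counter

def mle_cpt(data, child, parents):
    """Maximum-likelihood estimate of P(child | parents) from complete data,
    computed as relative frequencies of the observed counts."""
    joint = Counter()     # counts of (parent values, child value) pairs
    marginal = Counter()  # counts of parent values alone
    for row in data:
        pa = tuple(row[p] for p in parents)
        joint[(pa, row[child])] += 1
        marginal[pa] += 1
    return {key: count / marginal[key[0]] for key, count in joint.items()}

# Toy complete dataset for a Rain -> WetGrass edge (values invented):
data = [
    {"Rain": 1, "WetGrass": 1},
    {"Rain": 1, "WetGrass": 1},
    {"Rain": 1, "WetGrass": 0},
    {"Rain": 0, "WetGrass": 0},
]
cpt = mle_cpt(data, "WetGrass", ["Rain"])
# cpt[((1,), 1)] estimates P(WetGrass=1 | Rain=1) as 2/3
```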
Particle Swarm Optimization (PSO) has been shown to perform very well on a wide range of optimization problems. One drawback of PSO is that the base algorithm assumes continuous variables. In this paper, we present a version of PSO that is able to optimize over discrete variables. This new PSO algorithm, which we call Integer and Categorical PSO …
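The continuous base algorithm the snippet refers to can be sketched with the standard velocity and position updates. This is generic PSO minimizing a toy sphere function, not the authors' discrete variant; all parameter values (inertia `w`, acceleration coefficients `c1`, `c2`, swarm size, bounds) are illustrative defaults.

```python
import random

def pso(objective, dim, n_particles=20, iters=200,
        w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0)):
    """Standard continuous PSO (minimization). Each particle tracks its
    personal best; the swarm shares a single global best."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Velocity update: inertia + cognitive pull + social pull.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

random.seed(0)  # fixed seed so the run is reproducible
# Minimize the sphere function sum(x_d^2); the optimum is the origin.
best, best_val = pso(lambda x: sum(v * v for v in x), dim=3)
```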
Factored Evolutionary Algorithms (FEA) have proven to be fast and efficient optimization methods, often outperforming established methods that use single populations. One restriction of FEA is that it requires a central communication point between all of the factors, making FEA difficult to use in completely distributed settings. The Distributed Factored …