Global likelihood maximization is an important aspect of many statistical analyses. Often the likelihood function is highly multi-extremal. This presents a significant challenge to standard search procedures, which often settle too quickly into an inferior local maximum. We present a new approach based on the cross-entropy (CE) method, and illustrate its …
The cross-entropy (CE) method is a new generic approach to combinatorial and multi-extremal optimization and rare event simulation. The purpose of this tutorial is to give a gentle introduction to the CE method. We present the CE methodology, the basic algorithm and its modifications, and discuss applications in combinatorial optimization and machine learning.
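As a rough illustration of the basic CE algorithm these two snippets refer to, the sketch below maximizes a deliberately multi-extremal one-dimensional function (a stand-in for a multimodal log-likelihood) by repeatedly sampling from a normal distribution, keeping the elite samples, and refitting the sampling parameters to them. The objective `multimodal` and all tuning constants are illustrative assumptions, not taken from the papers.

```python
import numpy as np

rng = np.random.default_rng(0)

def multimodal(x):
    # Deliberately multi-extremal objective, standing in for a multimodal log-likelihood.
    return np.sin(3 * x) + 0.5 * np.cos(7 * x) - 0.05 * x**2

def ce_maximize(f, mu=0.0, sigma=5.0, n=200, elite_frac=0.1, iters=50):
    """Basic cross-entropy maximization with a normal sampling distribution:
    sample, keep the elite fraction, refit (mu, sigma), repeat until the
    sampling distribution degenerates."""
    n_elite = max(1, int(elite_frac * n))
    for _ in range(iters):
        x = rng.normal(mu, sigma, size=n)          # candidate solutions
        elite = x[np.argsort(f(x))[-n_elite:]]     # best-performing samples
        mu, sigma = elite.mean(), elite.std()      # CE parameter update
        if sigma < 1e-6:                           # distribution has collapsed
            break
    return mu

x_star = ce_maximize(multimodal)
print(f"approximate maximizer: {x_star:.4f}, value: {multimodal(x_star):.4f}")
```

Because whole populations are sampled and only the elite drive the update, the procedure is far less prone to stalling at the first local maximum it encounters than a single hill-climbing trajectory.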
The RESTART method is a widely applicable simulation technique for the estimation of rare event probabilities. The method is based on the idea of restarting the simulation in certain system states in order to generate more occurrences of the rare event. One of the main questions for any RESTART implementation is how and when to restart the simulation, in …
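To make the restarting idea concrete, here is a minimal fixed-effort splitting sketch in the same spirit (not the RESTART algorithm of the paper): the probability that a biased random walk climbs to a high level before returning to 0 is estimated as a product of level-crossing probabilities, with trajectories restarted from each intermediate level. The walk, the levels and the budget are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def run_until(x, up_level, p_up=0.4):
    """Biased +/-1 random walk from state x until it hits up_level or 0."""
    while 0 < x < up_level:
        x += 1 if rng.random() < p_up else -1
    return x

def fixed_effort_splitting(levels=tuple(range(2, 21, 2)), budget=1000, p_up=0.4):
    """Estimate P(walk started at 1 reaches levels[-1] before 0) as a product of
    conditional level-crossing probabilities, restarting from each level reached."""
    states = [1] * budget                       # all trajectories start at state 1
    estimate = 1.0
    for lvl in levels:
        hits = [s for s in (run_until(s, lvl, p_up) for s in states) if s == lvl]
        if not hits:                            # no trajectory made it this far
            return 0.0
        estimate *= len(hits) / len(states)     # conditional crossing probability
        states = [hits[i % len(hits)] for i in range(budget)]   # restart / split
    return estimate

# gambler's-ruin value for these parameters: (1 - 1.5) / (1 - 1.5**20) ≈ 1.5e-4
print(f"splitting estimate: {fixed_effort_splitting():.3e}")
```

Restarting concentrates simulation effort on trajectories that are already making progress toward the rare event, which is the essential idea behind RESTART-style methods.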
We present a new adaptive kernel density estimator based on linear diffusion processes. The proposed estimator builds on existing ideas for adaptive smoothing by incorporating information from a pilot density estimate. In addition, we propose a new plug-in bandwidth selection method that is free from the arbitrary normal reference rules used by existing methods.
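For contrast, the sketch below is a plain fixed-bandwidth Gaussian kernel density estimator using a normal-reference (Silverman-type) rule, i.e. exactly the kind of rule-of-thumb baseline that the diffusion-based estimator and its plug-in bandwidth selector aim to improve on. It is not the paper's estimator, and the bimodal test sample is an illustrative assumption.

```python
import numpy as np

def gaussian_kde(data, grid, bandwidth=None):
    """Fixed-bandwidth Gaussian KDE; the default bandwidth is a normal-reference
    (Silverman-type) rule of thumb."""
    data = np.asarray(data, dtype=float)
    n = data.size
    if bandwidth is None:
        bandwidth = 1.06 * data.std(ddof=1) * n ** (-0.2)
    z = (grid[:, None] - data[None, :]) / bandwidth     # (grid point, data point)
    return np.exp(-0.5 * z**2).sum(axis=1) / (n * bandwidth * np.sqrt(2 * np.pi))

rng = np.random.default_rng(2)
# bimodal sample: one narrow mode and one wide mode
sample = np.concatenate([rng.normal(-2, 0.3, 500), rng.normal(3, 1.0, 500)])
xs = np.linspace(-5, 7, 400)
density = gaussian_kde(sample, xs)
print("approximate integral of the estimate:", (density * (xs[1] - xs[0])).sum())
```

On data like this, a single global bandwidth chosen by a normal-reference rule tends to oversmooth the narrow mode, which is the kind of behaviour that adaptive, pilot-informed smoothing is designed to correct.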
The two-node tandem Jackson network serves as a convenient reference model for the analysis and testing of different methodologies and techniques in rare event simulation. In this paper we consider a new approach to efficiently estimate the probability that the content of the second buffer exceeds some high level L before it becomes empty, starting …
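As a baseline for comparison, here is a crude Monte Carlo sketch of that overflow probability for the two-node tandem Jackson network, simulated through its embedded jump chain. It is the brute-force estimator, not the efficient method considered in the paper; the arrival and service rates, the level L, and the choice to start with a single customer in the second buffer are illustrative assumptions (the snippet's own starting condition is cut off).

```python
import numpy as np

rng = np.random.default_rng(3)

def hits_level_before_empty(L, lam=1.0, mu1=2.0, mu2=3.0, q1=0, q2=1):
    """One path of the two-node tandem Jackson network, followed via its embedded
    jump chain until buffer 2 either reaches L or empties."""
    while 0 < q2 < L:
        rates = [lam, mu1 if q1 > 0 else 0.0, mu2]  # arrival, service at 1, service at 2
        u = rng.random() * sum(rates)
        if u < rates[0]:
            q1 += 1                     # external arrival joins the first queue
        elif u < rates[0] + rates[1]:
            q1 -= 1                     # service completion at node 1 ...
            q2 += 1                     # ... feeds node 2
        else:
            q2 -= 1                     # departure from node 2
    return q2 >= L

L, n = 8, 50_000
hits = sum(hits_level_before_empty(L) for _ in range(n))
print(f"crude MC estimate of P(buffer 2 reaches {L} before emptying): {hits / n:.2e}")
```

Even at this modest level the crude estimator sees the event only rarely, and its relative error deteriorates quickly as L grows; that inefficiency is what importance-sampling and splitting approaches to this model are meant to address.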
This chapter describes how difficult statistical estimation problems can often be solved efficiently by means of the cross-entropy (CE) method. The CE method can be viewed as an adaptive importance sampling procedure that uses the cross-entropy or Kullback–Leibler divergence as a measure of closeness between two sampling distributions. The CE method is …
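A textbook-style sketch of CE used this way, on a problem that is not taken from the chapter: estimating the small probability that a sum of independent exponential random variables exceeds a high threshold. The exponential means are tilted iteratively with the CE updating rule (a weighted maximum-likelihood step on the elite samples), and the final tilted density is used for importance sampling. All constants are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

def lr(x, u, v):
    """Likelihood ratio f(x; u) / f(x; v) for independent exponentials with
    nominal mean u (scalar) and importance-sampling means v (vector)."""
    return np.prod(v / u) * np.exp(-x.sum(axis=1) / u + (x / v).sum(axis=1))

def ce_rare_event(gamma=30.0, u=1.0, dim=5, n=2000, rho=0.1, n_final=10**5):
    """Estimate P(X1 + ... + Xdim >= gamma), Xi ~ Exp(mean u) iid, by adaptively
    tilting the exponential means with the CE updating rule."""
    v = np.full(dim, u)
    for _ in range(100):                              # safety cap on CE iterations
        x = rng.exponential(v, size=(n, dim))
        s = x.sum(axis=1)
        level = min(gamma, np.quantile(s, 1 - rho))   # adaptive intermediate level
        w = lr(x, u, v)
        elite = s >= level
        # CE update: weighted MLE of the exponential means on the elite samples
        v = (w[elite][:, None] * x[elite]).sum(axis=0) / w[elite].sum()
        if level >= gamma:
            break
    x = rng.exponential(v, size=(n_final, dim))       # final importance-sampling run
    w = lr(x, u, v)
    return np.mean(w * (x.sum(axis=1) >= gamma))

# Erlang tail for comparison: exp(-30) * sum(30**k / k! for k < 5) ≈ 3.6e-9
print(f"CE importance-sampling estimate: {ce_rare_event():.2e}")
```

The same Kullback–Leibler projection step that drives CE optimization here steers the sampling density toward the rare event, which is why the resulting estimator typically has far lower relative error than crude Monte Carlo.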
We present new theoretical convergence results on the Cross-Entropy method for discrete optimization. Our primary contribution is to show that a popular implementation of the Cross-Entropy method converges, and finds an optimal solution with probability arbitrarily close to 1. We also give necessary conditions and sufficient conditions under which an …
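The kind of implementation such convergence results concern is usually the discrete CE algorithm with smoothed parameter updates. A minimal sketch on a toy binary problem (maximizing the number of ones) follows; the objective, the dimensions and the constant smoothing parameter alpha are illustrative assumptions, and analyses like the one in this snippet typically hinge on how that smoothing sequence is chosen.

```python
import numpy as np

rng = np.random.default_rng(5)

def ce_binary(f, dim=20, n=100, elite_frac=0.1, alpha=0.7, iters=60):
    """Discrete CE: sample binary vectors from independent Bernoulli(p_j),
    refit p to the elite samples with a smoothed update."""
    p = np.full(dim, 0.5)                       # Bernoulli success probabilities
    n_elite = max(1, int(elite_frac * n))
    best, best_score = None, -np.inf
    for _ in range(iters):
        x = (rng.random((n, dim)) < p).astype(int)
        scores = np.array([f(row) for row in x])
        order = np.argsort(scores)
        elite = x[order[-n_elite:]]             # best-performing samples
        p = alpha * elite.mean(axis=0) + (1 - alpha) * p   # smoothed CE update
        if scores[order[-1]] > best_score:
            best, best_score = x[order[-1]], scores[order[-1]]
    return best, best_score

best, score = ce_binary(lambda b: int(b.sum()))   # optimum is the all-ones vector
print("best score found:", score, "out of", 20)
```

With a fixed alpha the Bernoulli parameters tend to freeze at a single binary vector; whether and when that vector is optimal is the kind of question the convergence conditions address.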