The long computational time required to construct optimal designs for computer experiments has limited their use in practice. In this paper, a new algorithm for constructing optimal experimental designs is developed. There are two major developments involved in this work. One is an efficient global optimal search algorithm, named …
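The abstract is cut off before the algorithm is named, so the following is only a generic illustration of the kind of search such work involves: a minimal sketch, assuming a maximin distance criterion and a simple random swap move within a Latin hypercube design, neither of which is claimed to be the paper's algorithm.

```python
import numpy as np

def min_intersite_distance(X):
    """Smallest pairwise Euclidean distance among the design points."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    return d.min()

def random_lhd(n, k, rng):
    """Random Latin hypercube design: n runs, k factors, bin midpoints on [0, 1]."""
    return np.column_stack([(rng.permutation(n) + 0.5) / n for _ in range(k)])

def maximin_swap_search(n, k, iters=2000, seed=0):
    """Greedy improvement of a random LHD: swap two levels within one column
    and keep the swap only if the minimum intersite distance increases."""
    rng = np.random.default_rng(seed)
    X = random_lhd(n, k, rng)
    best = min_intersite_distance(X)
    for _ in range(iters):
        j = rng.integers(k)
        a, b = rng.choice(n, size=2, replace=False)
        X[[a, b], j] = X[[b, a], j]            # propose the swap
        score = min_intersite_distance(X)
        if score > best:
            best = score                       # improvement: keep it
        else:
            X[[a, b], j] = X[[b, a], j]        # otherwise undo it
    return X, best

design, score = maximin_swap_search(n=20, k=3)
print("minimum intersite distance after search:", round(score, 4))
```

The swap move preserves the Latin hypercube structure of each column, so the search only ever visits valid designs; a global search algorithm of the kind the abstract refers to would replace this greedy acceptance rule with something able to escape local optima.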
Approximation models (also known as metamodels) have been widely used in engineering design to facilitate analysis and optimization of complex systems that involve computationally expensive simulation programs. The accuracy of metamodels is directly related to the sampling strategies used. Our goal in this paper is to investigate the general applicability …
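As a hedged illustration of why sampling strategy matters for metamodel accuracy (the toy response function, sample size, and quadratic surrogate below are my own choices, not the paper's study design), this sketch fits the same surrogate to a plain random sample and to a Latin hypercube sample and compares prediction error on held-out points.

```python
import numpy as np

def toy_simulation(X):
    """Cheap stand-in for an expensive simulation: a smooth 2-D response."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.sin(3 * x1) + 0.5 * x2 ** 2 + x1 * x2

def latin_hypercube(n, k, rng):
    """Latin hypercube sample of n points in [0, 1]^k."""
    return np.column_stack(
        [(rng.permutation(n) + rng.random(n)) / n for _ in range(k)]
    )

def quadratic_features(X):
    """Full quadratic basis: 1, x1, x2, x1^2, x2^2, x1*x2."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1**2, x2**2, x1 * x2])

def fit_and_test(X_train, rng):
    """Fit a least-squares quadratic surrogate and report test RMSE."""
    y_train = toy_simulation(X_train)
    beta, *_ = np.linalg.lstsq(quadratic_features(X_train), y_train, rcond=None)
    X_test = rng.random((1000, 2))
    err = quadratic_features(X_test) @ beta - toy_simulation(X_test)
    return np.sqrt(np.mean(err ** 2))

rng = np.random.default_rng(1)
n = 12
print("random sampling RMSE:  ", fit_and_test(rng.random((n, 2)), rng))
print("Latin hypercube RMSE:  ", fit_and_test(latin_hypercube(n, 2, rng), rng))
```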
We propose symmetric Latin hypercubes for designs of computer experiments. The goal is to offer a compromise between computing effort and design optimality. The proposed class of designs has some advantages over the regular Latin hypercube design with respect to criteria such as entropy and the minimum intersite distance. An exchange algorithm is proposed for …
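A minimal sketch of what a symmetric Latin hypercube is, assuming an even run size and the simple mirror-the-top-half construction shown below (the paper's exchange algorithm itself is not reproduced): every run x has a reflected partner (n + 1) - x, and each column is still a permutation of 1..n.

```python
import numpy as np

def symmetric_lhd(n, k, seed=0):
    """Symmetric Latin hypercube design with n runs (n even) and k factors.

    For every design point x the reflected point (n + 1) - x is also a run,
    so the design is invariant under reflection about its centre.
    """
    assert n % 2 == 0, "this simple construction assumes an even run size"
    rng = np.random.default_rng(seed)
    half = n // 2
    D = np.empty((n, k), dtype=int)
    for j in range(k):
        # for each mirror pair {i, n+1-i} pick one member for the top half ...
        picks = np.where(rng.random(half) < 0.5,
                         np.arange(1, half + 1),
                         n + 1 - np.arange(1, half + 1))
        rng.shuffle(picks)                     # ... and place it in a random row
        D[:half, j] = picks
        D[half:, j] = (n + 1) - picks[::-1]    # mirrored bottom half
    return D

def min_intersite_distance(D):
    d = np.linalg.norm(D[:, None, :] - D[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    return d.min()

D = symmetric_lhd(n=8, k=3)
print(D)
print("each column is a permutation of 1..n:",
      all(sorted(col) == list(range(1, 9)) for col in D.T))
print("minimum intersite distance:", round(min_intersite_distance(D), 3))
```

An exchange algorithm of the kind the abstract mentions would then swap mirror pairs between rows while preserving this symmetry, accepting exchanges that improve a criterion such as entropy or the minimum intersite distance.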
Large financial institutions such as Bank of America handle hundreds of thousands of wire transactions per day. Although most transactions are legitimate, these institutions have legal and financial obligations to discover those that are suspicious. With the methods of fraudulent activity ever changing, searching on pre-defined patterns is often …
Kriging is a popular analysis approach for computer experiments, used to create a cheap-to-compute "metamodel" as a surrogate for a computationally expensive engineering simulation model. The maximum likelihood approach is employed to estimate the parameters in the Kriging model. However, the likelihood function near the optimum may be flat in …
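To make the flat-likelihood issue concrete, here is a minimal sketch, assuming a one-dimensional input, a Gaussian correlation function, a constant mean, and a small numerical nugget (all my own choices): it profiles the concentrated Kriging log-likelihood over the correlation parameter, the quantity whose flatness near the optimum makes maximum likelihood estimation difficult.

```python
import numpy as np

def concentrated_loglik(theta, X, y):
    """Concentrated (profile) log-likelihood of an ordinary Kriging model
    with Gaussian correlation R_ij = exp(-theta * (x_i - x_j)^2)."""
    n = len(y)
    R = np.exp(-theta * (X[:, None] - X[None, :]) ** 2)
    R += 1e-8 * np.eye(n)                          # small nugget for stability
    L = np.linalg.cholesky(R)
    solve = lambda b: np.linalg.solve(L.T, np.linalg.solve(L, b))
    ones = np.ones(n)
    beta = ones @ solve(y) / (ones @ solve(ones))  # GLS estimate of the constant mean
    r = y - beta * ones
    sigma2 = r @ solve(r) / n                      # profiled process variance
    logdetR = 2 * np.sum(np.log(np.diag(L)))
    return -0.5 * (n * np.log(sigma2) + logdetR)

rng = np.random.default_rng(0)
X = np.sort(rng.random(10))
y = np.sin(6 * X)                                  # toy computer-experiment output

# profile the likelihood over the correlation parameter theta
for theta in [0.5, 1, 5, 10, 50, 100]:
    print(f"theta = {theta:6.1f}   log-likelihood = {concentrated_loglik(theta, X, y):9.3f}")
```

Printing the profile over several orders of magnitude of theta is the simplest way to see how little the likelihood can change near its maximum for small samples.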
In this work, we propose an integrated framework for optimization under uncertainty that takes both design objective robustness and probabilistic design constraints into account. The fundamental development of this work is the employment of an inverse reliability strategy that uses percentile performance for assessing both the objective …
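As a hedged sketch of percentile-based assessment (the objective, constraint, noise model, and percentile levels below are hypothetical, not the paper's formulation), the code estimates a high percentile of the objective as a robustness measure and a low percentile of a constraint margin as a probabilistic feasibility check, both by Monte Carlo.

```python
import numpy as np

def percentile_performance(design, g, p, n_samples=20000, noise_sd=0.05, seed=0):
    """Monte Carlo estimate of the p-th percentile of g(design + noise),
    treating the design variables as subject to random (e.g. manufacturing) error."""
    rng = np.random.default_rng(seed)
    x = np.asarray(design) + rng.normal(0.0, noise_sd, size=(n_samples, len(design)))
    return np.percentile(g(x), p)

# hypothetical objective and constraint for a two-variable design
objective = lambda x: (x[:, 0] - 1.0) ** 2 + (x[:, 1] - 2.0) ** 2
constraint = lambda x: 3.0 - x[:, 0] - x[:, 1]          # feasible when >= 0

design = np.array([0.9, 1.9])

# robustness of the objective: assess its 95th percentile rather than its mean
print("95th-percentile objective:", percentile_performance(design, objective, 95))

# probabilistic constraint: require the 5th percentile of the margin to stay >= 0,
# i.e. the constraint holds with at least 95% probability
print("5th-percentile constraint margin:", percentile_performance(design, constraint, 5))
```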
The metamodeling approach has been widely used due to the high computational cost of using high-fidelity simulations in engineering design. Interpretation of metamodels for the purpose of design, especially design under uncertainty, becomes important. The computational expenses associated with metamodels and the random errors introduced by sample-based …
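The abstract truncates mid-sentence, but the pairing of metamodel cost with sample-based random error can be illustrated with a minimal sketch, assuming a hypothetical quadratic metamodel and normally distributed inputs: propagating the uncertainty by Monte Carlo gives an estimated mean response whose standard error shrinks only as the square root of the sample size.

```python
import numpy as np

# hypothetical fitted metamodel: a simple quadratic surrogate standing in for
# one identified from expensive simulations
metamodel = lambda x: 1.0 + 2.0 * x[:, 0] + 0.5 * x[:, 1] ** 2 - 0.3 * x[:, 0] * x[:, 1]

def mean_by_monte_carlo(n_samples, seed=0):
    """Propagate uncertain inputs through the metamodel and report the estimated
    mean response together with its Monte Carlo standard error."""
    rng = np.random.default_rng(seed)
    x = rng.normal(loc=[1.0, 2.0], scale=[0.1, 0.2], size=(n_samples, 2))
    y = metamodel(x)
    mean = y.mean()
    std_err = y.std(ddof=1) / np.sqrt(n_samples)   # random error of the estimate
    return mean, std_err

for n in (100, 1000, 10000):
    mean, se = mean_by_monte_carlo(n)
    print(f"n = {n:6d}   estimated mean = {mean:.4f}   standard error = {se:.5f}")
```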