Corpus ID: 236318535

A local approach to parameter space reduction for regression and classification tasks

@article{Romor2021ALA,
  title={A local approach to parameter space reduction for regression and classification tasks},
  author={Francesco Romor and Marco Tezzele and Gianluigi Rozza},
  journal={ArXiv},
  year={2021},
  volume={abs/2107.10867}
}
Frequently, the parameter space chosen for shape design, or for other applications that involve the definition of a surrogate model, presents subdomains where the objective function of interest is highly regular or well behaved, so it could be approximated more accurately if restricted to those subdomains and studied separately. The drawback of this approach is the possible scarcity of data in some applications, but in those where a quantity of data, moderately abundant considering the parameter…
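As a toy illustration of the local idea sketched in the abstract (not the authors' algorithm), one can compare a single global surrogate against separate surrogates fitted on two subdomains of a piecewise-smooth function. The data, the subdomain split, and the plain linear models below are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))
# Piecewise-linear target: well behaved on each half of the domain,
# but poorly captured by one global linear model.
y = np.where(X[:, 0] < 0, X @ [1.0, 2.0], X @ [-3.0, 0.5] + 1.0)

def fit_linear(X, y):
    """Least-squares affine surrogate; returns a prediction function."""
    A = np.hstack([X, np.ones((len(X), 1))])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return lambda Z: np.hstack([Z, np.ones((len(Z), 1))]) @ coef

# Global surrogate vs. local surrogates on the subdomains x0 < 0 and x0 >= 0.
global_model = fit_linear(X, y)
mask = X[:, 0] < 0
local = {True: fit_linear(X[mask], y[mask]),
         False: fit_linear(X[~mask], y[~mask])}

def local_predict(Z):
    out = np.empty(len(Z))
    m = Z[:, 0] < 0
    out[m] = local[True](Z[m])
    out[~m] = local[False](Z[~m])
    return out

err_global = np.sqrt(np.mean((global_model(X) - y) ** 2))
err_local = np.sqrt(np.mean((local_predict(X) - y) ** 2))
```

Restricting each surrogate to a subdomain where the target is regular drives the local error far below the global one, which is the motivating observation of the paper.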
ATHENA: Advanced Techniques for High Dimensional Parameter Spaces to Enhance Numerical Analysis
ATHENA is an open source Python package for reduction in parameter space, intended as a tool for regression, sensitivity analysis, and in general to enhance existing numerical simulation pipelines tackling the curse of dimensionality.

References

Showing 1-10 of 50 references
Learning nonlinear level sets for dimensionality reduction in function approximation
This work exploits reversible networks (RevNets) to learn nonlinear level sets of a high-dimensional function and to parameterize those level sets in low-dimensional spaces, alleviating the over-fitting caused by data insufficiency.
Dimensionality Reduction of Multimodal Labeled Data by Local Fisher Discriminant Analysis
A new linear supervised dimensionality reduction method called local Fisher discriminant analysis (LFDA), which effectively combines the ideas of FDA and LPP and can be computed simply by solving a generalized eigenvalue problem.
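The generalized eigenvalue step mentioned in this reference can be sketched for classical FDA (the building block that LFDA localizes); the two-class synthetic data and dimensions below are illustrative assumptions, not the paper's experiments:

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(3)
# Two labeled classes in 2-D, separated along the first axis.
X0 = rng.standard_normal((100, 2)) * [0.3, 1.0] + [-2.0, 0.0]
X1 = rng.standard_normal((100, 2)) * [0.3, 1.0] + [2.0, 0.0]
X, labels = np.vstack([X0, X1]), np.array([0] * 100 + [1] * 100)

mu = X.mean(axis=0)
Sw = np.zeros((2, 2))  # within-class scatter
Sb = np.zeros((2, 2))  # between-class scatter
for c in (0, 1):
    Xc = X[labels == c]
    mc = Xc.mean(axis=0)
    Sw += (Xc - mc).T @ (Xc - mc)
    Sb += len(Xc) * np.outer(mc - mu, mc - mu)

# Discriminant direction from the generalized eigenproblem Sb v = lam Sw v;
# scipy's eigh solves it directly when given both matrices.
eigvals, eigvecs = eigh(Sb, Sw)
direction = eigvecs[:, -1] / np.linalg.norm(eigvecs[:, -1])
```

LFDA replaces the global scatter matrices with locally weighted versions (in the spirit of LPP) but keeps exactly this eigensolver structure.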
Active Subspaces - Emerging Ideas for Dimension Reduction in Parameter Studies
Scientists and engineers use computer simulations to study relationships between a model's input parameters and its outputs. However, thorough parameter studies are challenging, if not impossible…
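The active subspace construction from this reference can be sketched in a few lines: eigendecompose the average outer product of gradients and keep the dominant eigenvectors. The ridge function below is an invented test case:

```python
import numpy as np

rng = np.random.default_rng(1)

# f(x) = (w . x)^2 varies only along the direction w (a ridge function),
# so its active subspace is the span of w.
w = np.array([3.0, 1.0, 0.0]) / np.sqrt(10.0)
grad = lambda x: 2.0 * (w @ x) * w  # analytic gradient of f

# Monte Carlo estimate of the gradient covariance C = E[grad f grad f^T].
X = rng.standard_normal((500, 3))
G = np.array([grad(x) for x in X])
C = G.T @ G / len(G)

# eigh returns eigenvalues in ascending order; the last eigenvector
# spans the (one-dimensional) active subspace.
eigvals, eigvecs = np.linalg.eigh(C)
active_dir = eigvecs[:, -1]
```

In practice the gradients come from adjoint solvers or finite differences rather than an analytic formula, and one keeps as many eigenvectors as there are dominant eigenvalues.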
A non-intrusive approach for the reconstruction of POD modal coefficients through active subspaces
The enhanced ROM reaches the desired accuracy with a reduced number of input solutions by coupling the proper orthogonal decomposition with interpolation (PODI), a data-driven reduced order method, with the active subspace (AS) property, an emerging tool for reduction in parameter space.
Active Manifolds: A non-linear analogue to Active Subspaces
Overall, AM represents a novel technique for analyzing functional models, with benefits including: reducing $m$-dimensional analysis to a 1-D analogue, permitting more accurate regression than AS (at more computational expense), enabling more informative sensitivity analysis, and granting accessible visualizations of parameter sensitivity along the AM.
Model order reduction assisted by deep neural networks (ROM-net)
This paper introduces the concept of dictionary-based ROM-nets, where deep neural networks recommend a suitable local reduced-order model from a dictionary, constructed from a clustering of simplified simulations that identifies the subspaces in which the solutions evolve for different input tensors.
Gradient-Based Dimension Reduction of Multivariate Vector-Valued Functions
A gradient-based method for detecting and exploiting low-dimensional input parameter dependence of multivariate functions, which reveals that the choice of norm on the codomain of the function can have a significant impact on the function's low-dimensional approximation.
Sliced Inverse Regression for Dimension Reduction
Abstract: Modern advances in computing power have greatly widened scientists' scope in gathering and investigating information from many variables, information which might have been ignored in the…
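The classical SIR estimator from this reference fits in a short numpy sketch: standardize the inputs, slice the sorted response, average the inputs within each slice, and eigendecompose the weighted covariance of the slice means. The single-index model and dimensions below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 1000, 4
X = rng.standard_normal((n, p))
beta = np.array([1.0, -1.0, 0.0, 0.0]) / np.sqrt(2.0)
y = np.sin(X @ beta) + 0.05 * rng.standard_normal(n)

# Standardize inputs, then partition observations into 10 slices of y.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
slices = np.array_split(np.argsort(y), 10)

# Weighted covariance of the within-slice means of Z.
means = np.array([Z[s].mean(axis=0) for s in slices])
weights = np.array([len(s) for s in slices]) / n
M = (means * weights[:, None]).T @ means

# Leading eigenvectors of M estimate the effective dimension-reduction
# (e.d.r.) directions; eigh orders eigenvalues ascending.
eigvals, eigvecs = np.linalg.eigh(M)
edr_dir = eigvecs[:, -1]
```

The localized variant cited two entries below replaces the global slice means with neighborhood-based averages, which is the bridge to the local approach of the present paper.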
Localized Sliced Inverse Regression
We develop a supervised dimension reduction method that integrates the idea of localization from manifold learning with the sliced inverse regression framework. We call our method localized sliced…
Combined Parameter and Model Reduction of Cardiovascular Problems by Means of Active Subspaces and POD-Galerkin Methods
In this chapter we introduce a combined parameter and model reduction methodology and present its application to the efficient numerical estimation of a pressure drop in a set of deformed carotids.