Corpus ID: 251253378

Block sparsity and gauge mediated weight sharing for learning dynamical laws from data

@inproceedings{Gotte2022BlockSA,
  title={Block sparsity and gauge mediated weight sharing for learning dynamical laws from data},
  author={Martin G{\"o}tte and Jan Fuksa and Ingo Roth and Jens Eisert},
  year={2022}
}
Recent years have witnessed an increased interest in recovering dynamical laws of complex systems in a largely data-driven fashion under meaningful hypotheses. In this work, we propose a method for scalably learning dynamical laws of classical dynamical systems from data. As a novel ingredient, to achieve an efficient scaling with the system size, block-sparse tensor trains – instances of tensor networks applied to function dictionaries – are used, and the self-similarity of the problem is…
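The core construction hinted at in the abstract – a coefficient tensor over per-coordinate function dictionaries stored as a tensor train so that storage scales linearly rather than exponentially in the number of coordinates – can be sketched roughly as follows. This is an illustrative toy, not the paper's method: the monomial dictionary, ranks, and random cores are all assumptions.

```python
import numpy as np

def dictionary(xi, p=3):
    """Per-coordinate feature vector (assumed monomial dictionary of degree < p)."""
    return np.array([xi**k for k in range(p)])

def tt_evaluate(cores, x):
    """Contract TT cores G_i of shape (r_{i-1}, p, r_i) with the
    per-coordinate dictionary vectors to evaluate the model at x."""
    v = np.ones(1)
    for i, core in enumerate(cores):
        phi = dictionary(x[i], core.shape[1])
        v = v @ np.einsum('apb,p->ab', core, phi)  # absorb one site
    return v[0]

rng = np.random.default_rng(0)
d, p, r = 4, 3, 2   # coordinates, dictionary size, TT rank (illustrative)
cores = [rng.standard_normal((1 if i == 0 else r, p,
                              1 if i == d - 1 else r)) for i in range(d)]
x = rng.standard_normal(d)
print(tt_evaluate(cores, x))   # scalar model output f(x)
```

The full coefficient tensor would have p**d entries, while the TT cores hold only on the order of d * p * r**2 parameters, which is the scaling advantage the abstract refers to.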

References

Showing 1–10 of 35 references

Tensor network approaches for learning non-linear dynamical laws

This work demonstrates that efficient rank-adaptive optimization algorithms can be used to learn optimal tensor network models without requiring a priori knowledge of the exact tensor ranks, and provides a physics-informed approach to recovering structured dynamical laws from data.

Tensor network approaches for data-driven identification of non-linear dynamical laws

It is demonstrated that efficient rank-adaptive optimization algorithms can be used to learn optimal tensor network models without requiring a priori knowledge of the exact tensor ranks.

Hand-waving and interpretive dance: an introductory course on tensor networks

The curse of dimensionality associated with the Hilbert space of spin systems provides a significant obstruction to the study of condensed matter systems. Tensor networks have proven an important tool for overcoming this obstruction.

On the Expressive Power of Deep Learning: A Tensor Analysis

It is proved that, besides a negligible set, all functions that can be implemented by a deep network of polynomial size require exponential size in order to be realized (or even approximated) by a shallow network.

Stable ALS approximation in the TT-format for rank-adaptive tensor completion

This article introduces a singular-value-based regularization of the standard alternating least squares (ALS) algorithm, motivated by averaging in microsteps, proves its stability, and derives a natural semi-implicit rank-adaptation strategy.
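The singular-value mechanics behind such rank adaptation can be illustrated in miniature: split a matrix into two low-rank factors via SVD and keep only the singular values above a threshold delta, so the retained rank adapts to the data. This is a generic sketch of SVD-based rank truncation, not the regularized ALS scheme of the paper.

```python
import numpy as np

def truncated_split(M, delta):
    """Split M into low-rank factors L, R, keeping singular values > delta."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    r = max(1, int(np.sum(s > delta)))       # adapted rank
    return U[:, :r], s[:r, None] * Vt[:r, :]

rng = np.random.default_rng(1)
A = rng.standard_normal((20, 3)) @ rng.standard_normal((3, 15))  # rank-3 matrix
L, R = truncated_split(A, delta=1e-10)
print(L.shape[1])                 # adapted rank (3 here)
print(np.linalg.norm(L @ R - A))  # reconstruction error, near zero
```

In a TT sweep, the same splitting is applied to a matricized pair of neighbouring cores, which is how the bond rank between them can grow or shrink during optimization.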

Multidimensional Approximation of Nonlinear Dynamical Systems

The method of multidimensional approximation of nonlinear dynamical systems (MANDy) is proposed, which combines data-driven methods with tensor network decompositions; the efficiency of the approach is illustrated with several high-dimensional nonlinear dynamical systems.

Tensor-based algorithms for image classification

It is shown that tensor-based methods developed for learning the governing equations of dynamical systems from data can, in the same way, be used for supervised learning problems, and two novel approaches for image classification are proposed.

Supervised Learning with Tensor Networks

It is demonstrated how algorithms for optimizing tensor networks can be adapted to supervised learning tasks by using matrix product states (tensor trains) to parameterize non-linear kernel learning models.
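The idea of parameterizing a classifier by a matrix product state can be sketched as follows: each input component is lifted by a local feature map, and an MPS whose last core carries an extra label leg contracts the resulting product state into one score per class. The shapes, ranks, and random cores here are illustrative assumptions, not the trained models of the paper.

```python
import numpy as np

def local_map(xi):
    """Local feature map lifting xi in [0, 1] to a unit 2-vector."""
    return np.array([np.cos(np.pi * xi / 2), np.sin(np.pi * xi / 2)])

def mps_scores(cores, x):
    """Contract MPS cores (r_{i-1}, 2, r_i) with the product-state features.
    The last core's right leg has size n_classes, so the result is a score
    vector with one entry per class."""
    v = np.ones(1)
    for xi, core in zip(x, cores):
        v = v @ np.einsum('apb,p->ab', core, local_map(xi))
    return v

rng = np.random.default_rng(3)
d, r, n_classes = 6, 3, 10   # illustrative sizes
cores = [rng.standard_normal((1 if i == 0 else r, 2,
                              n_classes if i == d - 1 else r))
         for i in range(d)]
x = rng.uniform(0, 1, d)
scores = mps_scores(cores, x)
print(int(np.argmax(scores)))   # predicted class index
```

Training would then sweep over the cores, optimizing them against a classification loss, with the bond ranks controlling model capacity.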

Tensor network decompositions in the presence of a global symmetry

This work discusses how to incorporate a global symmetry, given by a compact, completely reducible group G, in tensor network decompositions and algorithms, by considering tensors that are invariant under the action of the group G.

Discovering governing equations from data by sparse identification of nonlinear dynamical systems

This work develops a novel framework to discover the governing equations underlying a dynamical system simply from data measurements, leveraging advances in sparsity techniques and machine learning, and uses sparse regression to determine the fewest terms in the governing equations required to accurately represent the data.
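The sparse-regression step can be sketched in a few lines: build a library of candidate terms, fit by least squares, and repeatedly zero out small coefficients and refit. This is a minimal toy in the spirit of sequential thresholded least squares, with an assumed monomial library and a synthetic system dx/dt = -2x.

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(-1, 1, 200)
dxdt = -2.0 * x                      # synthetic "measured" derivatives

# Candidate library Theta(x) = [1, x, x^2, x^3] (assumed choice)
Theta = np.stack([np.ones_like(x), x, x**2, x**3], axis=1)

# Sequential thresholded least squares: fit, zero small coefficients, refit
xi = np.linalg.lstsq(Theta, dxdt, rcond=None)[0]
for _ in range(10):
    small = np.abs(xi) < 0.1         # sparsity threshold (assumed value)
    xi[small] = 0.0
    big = ~small
    if big.any():
        xi[big] = np.linalg.lstsq(Theta[:, big], dxdt, rcond=None)[0]

print(xi)   # only the x term survives, with coefficient close to -2
```

The surviving nonzero coefficients directly name the terms of the recovered governing equation, which is what makes the identified model interpretable.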