Data-driven soliton mappings for integrable fractional nonlinear wave equations via deep learning with Fourier neural operator
@article{Zhong2022DatadrivenSM,
  title={Data-driven soliton mappings for integrable fractional nonlinear wave equations via deep learning with Fourier neural operator},
  author={Ming Zhong and Zhenya Yan},
  journal={ArXiv},
  year={2022},
  volume={abs/2209.14291}
}
In this paper, we first extend the Fourier neural operator (FNO) to discover the soliton mapping between two function spaces, where one is the fractional-order index space { ε | ε ∈ (0, 1) } appearing in the fractional integrable nonlinear wave equations, while the other is the solitonic solution function space. To be specific, the fractional nonlinear Schrödinger (fNLS), fractional Korteweg-de Vries (fKdV), fractional modified Korteweg-de Vries (fmKdV) and fractional sine-Gordon (fsineG…
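The abstract describes learning an operator from the scalar fractional index ε to the corresponding soliton profile. The following is a minimal, self-contained sketch of such a parameter-to-solution map with a 1D Fourier neural operator in PyTorch; the channel layout (ε broadcast over the grid plus the coordinate x), the widths, mode count, and layer count are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch (not the paper's code): learn eps -> u_eps(x) with a 1D FNO.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SpectralConv1d(nn.Module):
    """Fourier-space linear layer: keep only the lowest `modes` frequencies."""
    def __init__(self, in_ch, out_ch, modes):
        super().__init__()
        self.modes = modes
        scale = 1.0 / (in_ch * out_ch)
        self.weight = nn.Parameter(
            scale * torch.randn(in_ch, out_ch, modes, dtype=torch.cfloat))

    def forward(self, x):                                # x: (batch, in_ch, n)
        x_ft = torch.fft.rfft(x)                         # (batch, in_ch, n//2 + 1)
        out_ft = torch.zeros(x.size(0), self.weight.size(1), x_ft.size(-1),
                             dtype=torch.cfloat, device=x.device)
        out_ft[..., :self.modes] = torch.einsum(
            "bim,iom->bom", x_ft[..., :self.modes], self.weight)
        return torch.fft.irfft(out_ft, n=x.size(-1))

class FNO1d(nn.Module):
    """Input channels: the fractional index eps broadcast on the grid, and x itself."""
    def __init__(self, width=32, modes=16, layers=4):
        super().__init__()
        self.lift = nn.Linear(2, width)
        self.spectral = nn.ModuleList([SpectralConv1d(width, width, modes)
                                       for _ in range(layers)])
        self.pointwise = nn.ModuleList([nn.Conv1d(width, width, 1)
                                        for _ in range(layers)])
        self.proj = nn.Sequential(nn.Linear(width, 128), nn.GELU(),
                                  nn.Linear(128, 1))

    def forward(self, eps, grid):                        # eps: (batch, 1), grid: (n,)
        n = grid.numel()
        feat = torch.stack([eps.expand(-1, n),
                            grid.expand(eps.size(0), n)], dim=-1)
        h = self.lift(feat).permute(0, 2, 1)             # (batch, width, n)
        for spec, pw in zip(self.spectral, self.pointwise):
            h = F.gelu(spec(h) + pw(h))
        return self.proj(h.permute(0, 2, 1)).squeeze(-1)  # (batch, n)
```

In this reading, the model would be trained on pairs (ε_i, u_{ε_i}(x)) produced by a numerical solver and then queried at unseen ε; the Fourier layers act on the whole grid, which is what makes the learned map resolution-independent.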
One Citation
Data-driven forward and inverse problems for chaotic and hyperchaotic dynamic systems based on two machine learning architectures
- Physica D: Nonlinear Phenomena
- 2023
References
Showing 1-10 of 63 references
Fourier Neural Operator for Parametric Partial Differential Equations
- Computer Science · ICLR
- 2021
This work forms a new neural operator by parameterizing the integral kernel directly in Fourier space, allowing for an expressive and efficient architecture, and shows state-of-the-art performance compared to existing neural network methodologies.
Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations
- Computer Science · J. Comput. Phys.
- 2019
Learning nonlinear operators via DeepONet based on the universal approximation theorem of operators
- Computer Science · Nat. Mach. Intell.
- 2021
A new deep neural network called DeepONet can learn various mathematical operators with small generalization error, including explicit operators such as integrals and fractional Laplacians, as well as implicit operators that represent deterministic and stochastic differential equations.
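For contrast with the FNO sketch above, here is a minimal branch-trunk DeepONet in the same spirit; the sensor count, layer sizes, and the single-coordinate trunk input are illustrative assumptions rather than the cited implementation.

```python
# Minimal DeepONet sketch: branch net encodes the input function at m sensors,
# trunk net encodes a query coordinate y, and the output is their inner product.
import torch
import torch.nn as nn

class DeepONet(nn.Module):
    def __init__(self, m_sensors=100, p=64):
        super().__init__()
        self.branch = nn.Sequential(nn.Linear(m_sensors, 128), nn.Tanh(),
                                    nn.Linear(128, p))
        self.trunk = nn.Sequential(nn.Linear(1, 128), nn.Tanh(),
                                   nn.Linear(128, p))
        self.bias = nn.Parameter(torch.zeros(1))

    def forward(self, u_sensors, y):
        # u_sensors: (batch, m_sensors) values of the input function at fixed sensors
        # y:         (batch, n_query, 1) coordinates where the output is evaluated
        b = self.branch(u_sensors)                # (batch, p)
        t = self.trunk(y)                         # (batch, n_query, p)
        return torch.einsum("bp,bqp->bq", b, t) + self.bias
```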
A two-stage physics-informed neural network method based on conserved quantities and applications in localized wave solutions
- Physics · J. Comput. Phys.
- 2022
Data-driven vector localized waves and parameters discovery for Manakov system using deep learning approach
- Physics · Chaos, Solitons & Fractals
- 2022
New integrable multi-Lévy-index and mixed fractional nonlinear soliton hierarchies
- Mathematics · Chaos, Solitons & Fractals
- 2022
Physics-informed neural networks method in high-dimensional integrable systems
- Physics · Modern Physics Letters B
- 2021
In this paper, the physics-informed neural networks (PINNs) are applied to a high-dimensional system to solve the [Formula: see text]-dimensional initial-boundary value problem with [Formula: see text]…
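As a generic illustration of the PINN idea summarized here (not the cited paper's setup), the sketch below assembles a PDE-residual plus initial-condition loss for an assumed toy equation u_t + u u_x = 0, with derivatives obtained by automatic differentiation; the network size and the equation itself are assumptions.

```python
# Generic PINN loss sketch on an assumed toy PDE u_t + u u_x = 0.
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(2, 64), nn.Tanh(),
                    nn.Linear(64, 64), nn.Tanh(),
                    nn.Linear(64, 1))

def pinn_loss(t, x, t0, x0, u0):
    # t, x: (N, 1) collocation points; (t0, x0, u0): initial-condition data.
    t = t.requires_grad_(True)
    x = x.requires_grad_(True)
    u = net(torch.cat([t, x], dim=-1))
    u_t, u_x = torch.autograd.grad(u, (t, x), torch.ones_like(u),
                                   create_graph=True)
    residual = u_t + u * u_x                     # PDE residual at collocation points
    ic = net(torch.cat([t0, x0], dim=-1)) - u0   # initial-condition mismatch
    return residual.pow(2).mean() + ic.pow(2).mean()
```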
The Random Feature Model for Input-Output Maps between Banach Spaces
- Computer Science, Mathematics · SIAM J. Sci. Comput.
- 2021
The random feature model is viewed as a non-intrusive data-driven emulator, a mathematical framework for its interpretation is provided, and its ability to efficiently and accurately approximate the nonlinear parameter-to-solution maps of two prototypical PDEs arising in physical science and engineering applications is demonstrated.
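A hedged sketch of the random-feature idea in its simplest discretized form follows: random Fourier features of the input vector, with output coefficients fit by ridge regression. The feature count, bandwidth, and regularization are illustrative assumptions, and this finite-dimensional version only hints at the function-space construction analyzed in the cited work.

```python
# Random-feature emulator sketch: fit a map from discretized inputs X to outputs Y.
import numpy as np

def fit_random_feature_map(X, Y, n_features=512, sigma=1.0, reg=1e-6, seed=0):
    # X: (n_samples, d_in), Y: (n_samples, d_out)
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=1.0 / sigma, size=(X.shape[1], n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    Phi = np.cos(X @ W + b)                         # (n_samples, n_features)
    A = Phi.T @ Phi + reg * np.eye(n_features)      # ridge-regularized normal equations
    C = np.linalg.solve(A, Phi.T @ Y)               # (n_features, d_out)
    return lambda X_new: np.cos(X_new @ W + b) @ C
```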
Choose a Transformer: Fourier or Galerkin
- Computer Science · NeurIPS
- 2021
It is demonstrated for the first time that the softmax normalization in scaled dot-product attention is sufficient but not necessary, and the newly proposed simple attention-based operator learner, the Galerkin Transformer, shows significant improvements in both training cost and evaluation accuracy over its softmax-normalized counterparts.
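One common reading of this softmax-free attention is sketched below: layer-normalize the keys and values, contract Kᵀ V first, and skip the softmax, so no n × n attention matrix is ever formed and the cost is linear in sequence length. The placement of the normalization and the 1/n scaling are assumptions for illustration, not the reference implementation.

```python
# Sketch of a softmax-free, Galerkin-type attention layer.
import torch
import torch.nn as nn

class GalerkinAttention(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)
        self.norm_k = nn.LayerNorm(dim)
        self.norm_v = nn.LayerNorm(dim)

    def forward(self, x):                        # x: (batch, n, dim)
        n = x.size(1)
        q = self.q(x)
        k = self.norm_k(self.k(x))
        v = self.norm_v(self.v(x))
        # Contract K^T V first: a (dim x dim) product, no softmax, no (n x n) matrix.
        return q @ (k.transpose(1, 2) @ v) / n
```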
Neural Operator: Graph Kernel Network for Partial Differential Equations
- Computer Science, Mathematics · ICLR 2020
- 2020
The key innovation in this work is that a single set of network parameters, within a carefully designed network architecture, may be used to describe mappings between infinite-dimensional spaces and between different finite-dimensional approximations of those spaces.