Corpus ID: 239009575

Is it time to swish? Comparing activation functions in solving the Helmholtz equation using physics-informed neural networks

@inproceedings{AlSafwan2021IsIT,
  title={Is it time to swish? Comparing activation functions in solving the Helmholtz equation using physics-informed neural networks},
  author={Ali Al-Safwan and Chao Song and Umair bin Waheed},
  year={2021}
}
Solving the wave equation numerically constitutes the majority of the computational cost for applications like seismic imaging and full waveform inversion. An alternative approach is to solve the frequency-domain Helmholtz equation, since it offers a reduction in dimensionality as it can be solved per frequency. However, computational challenges with classical Helmholtz solvers, such as the need to invert a large stiffness matrix, can make these approaches computationally infeasible for large…
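
As a rough illustration of the setup the abstract describes (and not the authors' implementation), the sketch below trains a small physics-informed neural network with the swish activation, x·sigmoid(βx), to satisfy a 2D homogeneous Helmholtz equation on the unit square. Everything in it is an assumption made for the example: the PyTorch stack (the references below point to SciANN, a Keras/TensorFlow wrapper, as one alternative), the wavenumber, the plane-wave boundary data, the network size, and the optimizer settings.

```python
# Illustrative sketch only (assumed setup, not the authors' code): a physics-informed
# neural network with a swish activation for the 2D homogeneous Helmholtz equation
#     u_xx + u_yy + k^2 u = 0   on the unit square,
# with Dirichlet data taken from the exact plane-wave solution
#     u(x, y) = sin(k (x + y) / sqrt(2)).
import torch
import torch.nn as nn

k = 4.0 * torch.pi  # assumed wavenumber for this toy problem


class Swish(nn.Module):
    """Swish activation x * sigmoid(beta * x) with a trainable beta."""
    def __init__(self, beta: float = 1.0):
        super().__init__()
        self.beta = nn.Parameter(torch.tensor(beta))

    def forward(self, x):
        return x * torch.sigmoid(self.beta * x)


def make_network(width: int = 64, depth: int = 4) -> nn.Sequential:
    layers, dim = [], 2  # input is (x, y)
    for _ in range(depth):
        layers += [nn.Linear(dim, width), Swish()]
        dim = width
    layers.append(nn.Linear(dim, 1))  # output is the scalar wavefield u(x, y)
    return nn.Sequential(*layers)


def exact(xy: torch.Tensor) -> torch.Tensor:
    # Analytical plane wave satisfying the Helmholtz equation for wavenumber k.
    return torch.sin(k * (xy[:, :1] + xy[:, 1:]) / 2 ** 0.5)


def helmholtz_residual(model: nn.Module, xy: torch.Tensor) -> torch.Tensor:
    # Second derivatives of the network output via automatic differentiation.
    xy = xy.requires_grad_(True)
    u = model(xy)
    du = torch.autograd.grad(u, xy, torch.ones_like(u), create_graph=True)[0]
    u_x, u_y = du[:, :1], du[:, 1:]
    u_xx = torch.autograd.grad(u_x, xy, torch.ones_like(u_x), create_graph=True)[0][:, :1]
    u_yy = torch.autograd.grad(u_y, xy, torch.ones_like(u_y), create_graph=True)[0][:, 1:]
    return u_xx + u_yy + k ** 2 * u


def boundary_points(n: int) -> torch.Tensor:
    # Uniform samples on the four edges of the unit square.
    t, zeros, ones = torch.rand(n, 1), torch.zeros(n, 1), torch.ones(n, 1)
    return torch.cat([
        torch.cat([zeros, t], 1), torch.cat([ones, t], 1),
        torch.cat([t, zeros], 1), torch.cat([t, ones], 1),
    ], 0)


model = make_network()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(5000):
    interior = torch.rand(1024, 2)   # random collocation points for the PDE residual
    edge = boundary_points(256)      # boundary points for the Dirichlet data
    loss = helmholtz_residual(model, interior).pow(2).mean() \
        + (model(edge) - exact(edge)).pow(2).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

Swapping Swish() for nn.Tanh() or nn.ReLU() in make_network is the kind of activation comparison the title refers to; ReLU is piecewise linear, so its second derivatives vanish almost everywhere, which is one reason smooth activations are generally preferred when the loss contains a PDE residual with Laplacian terms.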

References

Eikonal Solution Using Physics-Informed Neural Networks
The eikonal equation is utilized across a wide spectrum of science and engineering disciplines. In seismology, it regulates seismic wave traveltimes needed for applications like source localization…
Solving the frequency-domain acoustic VTI wave equation using physics-informed neural networks
Frequency-domain wavefield solutions corresponding to the anisotropic acoustic wave equation can be used to describe the anisotropic nature of the Earth. To solve a frequency-domain wave equation…
Seismic waveform inversion in the frequency domain; Part 1, Theory and verification in a physical scale model
Seismic waveforms contain much information that is ignored under standard processing schemes; seismic waveform inversion seeks to use the full information content of the recorded wavefield…
On the Spectral Bias of Neural Networks
This work shows that deep ReLU networks are biased towards low frequency functions, and studies the robustness of the frequency components with respect to parameter perturbation, to develop the intuition that the parameters must be finely tuned to express high frequency functions.
Searching for Activation Functions
The experiments show that the best discovered activation function, $f(x) = x \cdot \text{sigmoid}(\beta x)$, which is named Swish, tends to work better than ReLU on deeper models across a number of challenging datasets.
Wavefield solutions from machine learned … (2020)
Towards Understanding the Spectral Bias of Deep Learning
It is proved that the training process of neural networks can be decomposed along different directions defined by the eigenfunctions of the neural tangent kernel, where each direction has its own convergence rate and the rate is determined by the corresponding eigenvalue.
SciANN: A Keras/TensorFlow wrapper for scientific computations and physics-informed deep learning using artificial neural networks
In this paper, we introduce SciANN, a Python package for scientific computing and physics-informed deep learning using artificial neural networks. SciANN uses the widely used deep-learning packages…