Optimizing differential equations to fit data and predict outcomes

@article{Frank2022OptimizingDE,
  title={Optimizing differential equations to fit data and predict outcomes},
  author={Steven A. Frank},
  journal={ArXiv},
  year={2022},
  volume={abs/2204.07833}
}
  • S. Frank
  • Published 16 April 2022
  • Computer Science
  • ArXiv
Many scientific problems focus on observed patterns of change or on how to design a system to achieve particular dynamics. Those problems often require fitting differential equation models to target trajectories. Fitting such models can be difficult because each evaluation of the fit must calculate the distance between the model and target patterns at numerous points along a trajectory. Computing the gradient of the fit with respect to the model parameters can also be challenging. Recent technical advances in…
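
Below is a minimal Julia sketch of the kind of fitting problem the abstract describes: solve the model, measure its distance from a target trajectory at many time points, and take the gradient of that distance with respect to the parameters. It uses DifferentialEquations.jl (cited in the references) and ForwardDiff.jl, one of several automatic-differentiation options in Julia; the two-species model, parameter values, and loss are illustrative assumptions, not the paper's own examples.

    using DifferentialEquations, ForwardDiff

    # Hypothetical two-species model; du holds the time derivatives.
    function f!(du, u, p, t)
        du[1] =  p[1] * u[1] - p[2] * u[1] * u[2]
        du[2] = -p[3] * u[2] + p[4] * u[1] * u[2]
    end

    u0     = [1.0, 1.0]
    tspan  = (0.0, 10.0)
    tsteps = range(tspan[1], tspan[2]; length = 50)

    # Synthetic target trajectory generated from assumed "true" parameters.
    p_true = [1.5, 1.0, 3.0, 1.0]
    target = Array(solve(ODEProblem(f!, u0, tspan, p_true), Tsit5(); saveat = tsteps))

    # Fit: squared distance between model and target at each sampled time point.
    function loss(p)
        prob = ODEProblem(f!, eltype(p).(u0), tspan, p)  # promote u0 so dual numbers propagate
        sum(abs2, Array(solve(prob, Tsit5(); saveat = tsteps)) .- target)
    end

    # Gradient of the fit with respect to the parameters via forward-mode AD;
    # this is what a gradient-based optimizer such as Adam would consume.
    g = ForwardDiff.gradient(loss, [1.2, 0.8, 2.5, 0.8])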

Citations

Optimization of Transcription Factor Genetic Circuits

TLDR
A computational method to optimize TF networks is introduced, discovering a four-dimensional TF network that maintains a circadian rhythm over many days, successfully buffering strong stochastic perturbations in molecular dynamics and entraining to an external day-night signal that randomly turns on and off at intervals of several days.

Automatic differentiation and the optimization of differential equation models in biology

TLDR
How automatic differentiation of trajectories is achieved, and why such computational breakthroughs are likely to advance theoretical and statistical studies of biological problems, are discussed.

References

Showing 10 of 14 references

Neural ordinary differential equations for ecological and evolutionary time‐series analysis

Inferring the functional shape of ecological and evolutionary processes from time‐series data can be challenging because processes are often not describable with simple equations. The dynamical…

DifferentialEquations.jl – A Performant and Feature-Rich Ecosystem for Solving Differential Equations in Julia

TLDR
DifferentialEquations.jl offers a unified user interface to solve and analyze various forms of differential equations while not sacrificing features or performance, and is an algorithm testing and benchmarking suite which is feature-rich and highly performant.
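
As a quick illustration of that unified interface, a decay equation can be set up and solved in a few lines; the right-hand side, initial value, and solver choice below are arbitrary examples, not drawn from the paper.

    using DifferentialEquations

    # Exponential decay du/dt = -p*u.
    f(u, p, t) = -p * u
    prob = ODEProblem(f, 1.0, (0.0, 5.0), 0.5)  # (rhs, initial value, time span, parameter)
    sol  = solve(prob, Tsit5())                 # Tsit5: a good general-purpose Runge-Kutta method
    sol(2.5)                                    # continuous (interpolated) solution at t = 2.5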

A review of automatic differentiation and its efficient implementation

  • C. Margossian
  • Computer Science
  • WIREs Data Mining Knowl. Discov.
  • 2019
TLDR
Automatic differentiation is a powerful tool to automate the calculation of derivatives and is preferable to more traditional methods, especially when differentiating complex algorithms and mathematical functions.
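
The core idea can be sketched with dual numbers, the mechanism behind forward-mode automatic differentiation; this toy Julia implementation is for illustration only and covers just the operations it needs.

    # Each Dual carries a value and its derivative; every operation updates both,
    # so derivatives are exact (to machine precision) rather than finite-differenced.
    struct Dual
        val::Float64  # primal value
        der::Float64  # derivative with respect to the chosen input
    end
    Base.:+(a::Dual, b::Dual) = Dual(a.val + b.val, a.der + b.der)
    Base.:*(a::Dual, b::Dual) = Dual(a.val * b.val, a.der * b.val + a.val * b.der)
    Base.sin(a::Dual)         = Dual(sin(a.val), cos(a.val) * a.der)

    h(x) = sin(x) * x + x * x
    h(Dual(2.0, 1.0)).der     # exact h'(2.0) = 2cos(2) + sin(2) + 4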

DiffEqFlux.jl - A Julia Library for Neural Differential Equations

TLDR
This work demonstrates the ability to incorporate DifferentialEquations.jl-defined differential equation problems into a Flux-defined neural network, and vice versa, and discusses the complementary nature between machine learning models and differential equations.

Automatic differentiation in machine learning: a survey

TLDR
By precisely defining the main differentiation techniques and their interrelationships, this work aims to bring clarity to the usage of the terms “autodiff”, “automatic differentiation”, and “symbolic differentiation” as these are encountered more and more in machine learning settings.

Neural Ordinary Differential Equations

TLDR
This work shows how to scalably backpropagate through any ODE solver, without access to its internal operations, which allows end-to-end training of ODEs within larger models.
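
A hand-rolled Julia sketch of the idea: parameterize the right-hand side of an ODE with a small network and differentiate a trajectory-based loss with respect to its weights. Forward-mode AD is used here only to keep the example self-contained; the cited paper's contribution is a memory-efficient adjoint (reverse-mode) scheme, and libraries such as DiffEqFlux.jl package this machinery.

    using DifferentialEquations, ForwardDiff

    # Toy "neural ODE": du/dt is a one-hidden-layer network in u,
    # with all weights packed into the parameter vector p.
    function nn_rhs(u, p, t)
        W1 = reshape(p[1:4], 2, 2);  b1 = p[5:6]
        W2 = reshape(p[7:10], 2, 2); b2 = p[11:12]
        W2 * tanh.(W1 * u .+ b1) .+ b2
    end

    u0, tspan = [2.0, 0.0], (0.0, 1.0)
    loss(p) = sum(abs2, Array(solve(ODEProblem(nn_rhs, eltype(p).(u0), tspan, p),
                                    Tsit5(); saveat = 0.1)))
    ForwardDiff.gradient(loss, 0.1 .* randn(12))  # gradients flow through the solver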

Universal Differential Equations for Scientific Machine Learning

TLDR
The UDE model augments scientific models with machine-learnable structures for scientifically-based learning and shows how UDEs can be utilized to discover previously unknown governing equations, accurately extrapolate beyond the original data, and accelerate model simulation, all in a time and data-efficient manner.
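
A UDE-style right-hand side can be sketched as a known mechanistic term plus a machine-learnable correction; the particular split below is an illustrative assumption, and the function would be fit with the same solve-and-differentiate pattern shown above.

    # Known decay dynamics plus a small learnable correction whose weights
    # live in p[2:5]; fitting p recovers the missing part of the model.
    function ude_rhs(u, p, t)
        known      = -p[1] .* u                        # mechanistic part
        correction = reshape(p[2:5], 2, 2) * tanh.(u)  # machine-learnable structure
        known .+ correction
    end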

Preconditioned Stochastic Gradient Langevin Dynamics for Deep Neural Networks

TLDR
This work proposes combining adaptive preconditioners with Stochastic Gradient Langevin Dynamics and gives theoretical properties on asymptotic convergence and predictive risk; empirical results for Logistic Regression, Feedforward Neural Nets, and Convolutional Neural Nets demonstrate that the preconditioned SGLD method gives state-of-the-art performance.
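
One preconditioned SGLD step can be sketched as an RMSProp-style rescaling of both the gradient step and the injected Gaussian noise; the Julia function below omits the paper's small curvature-correction term, and the names and default values are illustrative.

    # θ: parameters; v: running second-moment estimate of the gradient;
    # grad: stochastic gradient of the negative log posterior.
    function psgld_step(θ, v, grad; ϵ = 1e-3, α = 0.99, λ = 1e-5)
        v = α .* v .+ (1 - α) .* grad .^ 2           # RMSProp-style accumulator
        G = 1 ./ (λ .+ sqrt.(v))                     # diagonal preconditioner
        θ = θ .- (ϵ / 2) .* G .* grad .+ sqrt.(ϵ .* G) .* randn(length(θ))
        θ, v
    end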

Adam: A Method for Stochastic Optimization

TLDR
This work introduces Adam, an algorithm for first-order gradient-based optimization of stochastic objective functions, based on adaptive estimates of lower-order moments, and provides a regret bound on the convergence rate that is comparable to the best known results under the online convex optimization framework.
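
The update itself is compact; one Adam step is sketched below in Julia (the moment estimates m and v and the β hyperparameters follow the paper's notation, while the learning-rate name η and the defaults are the usual choices).

    # m, v: exponential moving averages of the gradient and squared gradient;
    # t: step count, used for bias correction.
    function adam_step(θ, m, v, grad, t; η = 1e-3, β1 = 0.9, β2 = 0.999, ϵ = 1e-8)
        m = β1 .* m .+ (1 - β1) .* grad
        v = β2 .* v .+ (1 - β2) .* grad .^ 2
        mhat = m ./ (1 - β1^t)
        vhat = v ./ (1 - β2^t)
        θ = θ .- η .* mhat ./ (sqrt.(vhat) .+ ϵ)
        θ, m, v
    end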

Adapting machine-learning algorithms to design gene circuits

TLDR
This work adapted machine-learning algorithms to significantly accelerate gene circuit discovery, and found that they could rapidly design circuits capable of executing a range of different functions, including those that recapitulate important in vivo phenomena, such as oscillators, and perform complex tasks for synthetic biology, such as counting noisy biological events.