Quasi-convergence of an implementation of optimal balance by backward-forward nudging

Gerhard Masur, Haidar Mohamad, Marcel Oliver

Optimal balance is a non-asymptotic numerical method to compute a point on the slow manifold for certain two-scale dynamical systems. It works by solving a modified version of the system as a boundary value problem in time, where the nonlinear terms are adiabatically ramped up from zero to the fully nonlinear dynamics. A dedicated boundary value solver, however, is often not directly available. The most natural alternative is a nudging solver, where the problem is repeatedly solved forward and…
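The backward-forward iteration described above can be sketched on a toy fast-slow system. Everything below — the model, the ramp function, the step counts and tolerances — is an illustrative assumption made for this sketch, not the setting analyzed in the paper:

```python
import math

# Toy fast-slow system (illustrative, not the paper's model):
#   x'  = rho(t/T) * (-x + y1)            slow variable
#   y1' =  y2 / EPS                        fast rotating pair
#   y2' = -y1 / EPS + rho(t/T) * x**2
# The ramp rho goes from 0 at t=0 (linear dynamics, slow manifold y=0)
# to 1 at t=T (full nonlinear dynamics).

EPS, T, STEPS = 0.05, 1.0, 2000

def rho(s):
    """Smooth ramp: 0 at s=0, 1 at s=1, flat to all orders at both ends."""
    if s <= 0.0:
        return 0.0
    if s >= 1.0:
        return 1.0
    a, b = math.exp(-1.0 / s), math.exp(-1.0 / (1.0 - s))
    return a / (a + b)

def rhs(t, u):
    x, y1, y2 = u
    r = rho(t / T)
    return (r * (-x + y1), y2 / EPS, -y1 / EPS + r * x * x)

def rk4(u, t0, t1):
    """Classical RK4 from t0 to t1 (t1 < t0 integrates backward)."""
    h = (t1 - t0) / STEPS
    t = t0
    for _ in range(STEPS):
        k1 = rhs(t, u)
        k2 = rhs(t + h / 2, tuple(ui + h / 2 * ki for ui, ki in zip(u, k1)))
        k3 = rhs(t + h / 2, tuple(ui + h / 2 * ki for ui, ki in zip(u, k2)))
        k4 = rhs(t + h, tuple(ui + h * ki for ui, ki in zip(u, k3)))
        u = tuple(ui + h / 6 * (a + 2 * b + 2 * c + d)
                  for ui, a, b, c, d in zip(u, k1, k2, k3, k4))
        t += h
    return u

def optimal_balance_nudging(x_star, iters=20, tol=1e-10):
    """Backward-forward nudging for the optimal-balance BVP:
    enforce y = 0 at the linear end t=0 and x = x_star at the
    nonlinear end t=T by alternating sweeps."""
    yT = (0.0, 0.0)  # initial guess for the fast variables at t = T
    for _ in range(iters):
        # Backward sweep from the nonlinear end to the linear end.
        x0, _, _ = rk4((x_star,) + yT, T, 0.0)
        # Linear-end condition: fast variables vanish at t = 0.
        _, y1T, y2T = rk4((x0, 0.0, 0.0), 0.0, T)
        # Nonlinear-end condition: slow variable equals x_star at t = T.
        change = math.hypot(y1T - yT[0], y2T - yT[1])
        yT = (y1T, y2T)
        if change < tol:
            break
    return (x_star,) + yT
```

For this toy problem the fast components of the balanced state come out at size O(EPS), consistent with the picture of a point near the slow manifold; the iteration is the fixed-point map whose quasi-convergence is the subject of the paper.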

Optimal Balance via Adiabatic Invariance of Approximate Slow Manifolds

It is shown that, provided the ramp function defining the homotopy is of Gevrey class 2 and satisfies vanishing conditions to all orders at the temporal endpoints, the solution of the optimal balance boundary value problem yields a point on the approximate slow manifold that is exponentially close to the approximation obtained via exponential asymptotics, albeit with a smaller power of the small parameter in the exponent.
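A standard example of a ramp with these properties (an illustrative choice, not necessarily the one used in the paper) is built from the bump function $e^{-1/s}$:

```latex
\rho(s) = \frac{e^{-1/s}}{e^{-1/s} + e^{-1/(1-s)}},
\qquad s \in (0,1), \qquad \rho(0) = 0, \quad \rho(1) = 1.
```

This function is smooth and monotone, all of its derivatives vanish at $s = 0$ and $s = 1$, and it is of Gevrey class 2 at the endpoints.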

Optimal balance for rotating shallow water in primitive variables

Optimal balance is a near-optimal computational algorithm for nonlinear mode decomposition of geophysical flows into balanced and unbalanced components. It was first proposed as “optimal…

The Back and Forth Nudging algorithm for data assimilation problems: theoretical results on transport equations

In this paper, we consider the back and forth nudging algorithm that has been introduced for data assimilation purposes. It consists of iteratively and alternately solving the model equations forward and backward in time…
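The alternating sweeps of back-and-forth nudging can be illustrated on a minimal scalar example. The model, the gain, and the synthetic observations below are assumptions made for this sketch, not the transport-equation setting of the paper:

```python
import math

# Back-and-forth nudging (BFN) for the scalar model x' = -x, with
# noise-free observations from the true trajectory x(t) = 2 e^{-t}.
# Each sweep relaxes the state toward the observations: forward with
# +GAIN*(obs - x), backward with the sign of the nudging term flipped
# so that the reversed-time integration is stable.

T, STEPS, GAIN = 1.0, 1000, 5.0
DT = T / STEPS

def obs(t):
    # Observations generated from the true initial condition x(0) = 2.
    return 2.0 * math.exp(-t)

def forward(x0):
    # Forward pass: x' = -x + GAIN * (obs - x), from t = 0 to t = T.
    x, t = x0, 0.0
    for _ in range(STEPS):
        x += DT * (-x + GAIN * (obs(t) - x))
        t += DT
    return x

def backward(xT):
    # Backward pass: x' = -x - GAIN * (obs - x), from t = T down to t = 0.
    x, t = xT, T
    for _ in range(STEPS):
        x -= DT * (-x - GAIN * (obs(t) - x))
        t -= DT
    return x

def bfn(x0_guess, sweeps=10):
    # Iterate backward(forward(.)) to estimate the initial condition.
    x0 = x0_guess
    for _ in range(sweeps):
        x0 = backward(forward(x0))
    return x0
```

Starting from a wrong initial guess, the iterated sweeps recover the true initial condition up to discretization error, since the noise-free truth is a fixed point of the composed map.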

Data assimilation on the exponentially accurate slow manifold

  • Colin Cotter
  • Environmental Science
    Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences
  • 2013
An approach to data assimilation making use of an explicit map that defines a coordinate system on the slow manifold in the semi-geostrophic scaling in Lagrangian coordinates is described, and it is shown that, if initial conditions for the system are chosen as image points of the map, then the fast components of the system have exponentially small magnitude for exponentially long times as ε→0.

The Contour-Advective Semi-Lagrangian Algorithm for the Shallow Water Equations

A new method for integrating shallow water equations, the contour-advective semi-Lagrangian (CASL) algorithm, is presented. This is the first implementation of a contour method to a system of…

Optimal potential vorticity balance of geophysical flows

A method to decompose geophysical flows into a balanced flow (defined by its potential vorticity, PV) and an imbalanced component (inertia–gravity waves, IGWs) is introduced. The balanced flow…

Long-Time Accuracy for Approximate Slow Manifolds in a Finite-Dimensional Model of Balance

This paper addresses the long time validity of the slow limit equations in the simplest nontrivial case and shows that the first-order reduced model remains O(ε) accurate over a long 1/ε timescale.

Semigeostrophic Particle Motion and Exponentially Accurate Normal Forms

The normal form approach extends to numerical approximations via backward error analysis and extends to particle methods for the shallow-water equations, where the result shows that particles stay close to balance over long times in the semigeostrophic limit.

Slow dynamics via degenerate variational asymptotics

  • G. Gottwald, M. Oliver
  • Mathematics
    Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences
  • 2014
We introduce the method of degenerate variational asymptotics for a class of singularly perturbed ordinary differential equations in the limit of strong gyroscopic forces. Such systems exhibit…

A contour‐advective semi‐Lagrangian numerical algorithm for simulating fine‐scale conservative dynamical fields

This paper describes a novel numerical algorithm for simulating the evolution of fine‐scale conservative fields in layer‐wise two‐dimensional flows, the most important examples of which are the…