Koopman Theory for Partial Differential Equations

July 24, 2016


We consider the application of Koopman theory to nonlinear partial differential equations. We demonstrate that the observables chosen for constructing the Koopman operator are critical for enabling an accurate approximation to the nonlinear dynamics. If such observables can be found, then the dynamic mode decomposition algorithm can be enacted to compute a finite-dimensional approximation of the Koopman operator, including its eigenfunctions, eigenvalues and Koopman modes. Judiciously chosen observables lead to physically interpretable spatio-temporal features of the complex system under consideration and provide a connection to manifold learning methods. We demonstrate the impact of observable selection, including kernel methods, and construction of the Koopman operator on two canonical, nonlinear PDEs: Burgers’ equation and the nonlinear Schrödinger equation. These examples serve to highlight the most pressing and critical challenge of Koopman theory: a principled way to select appropriate observables.

The DMD and Koopman algorithms

The DMD algorithm underlies the computation of the Koopman eigenvalues and modes directly from data. Its effectiveness depends sensitively on the choice of observables. Rowley et al. [6] showed that DMD approximates the Koopman operator for the set of observables g(x) = x. We will use this fact in constructing a DMD algorithm for observables g(x) rather than the state variable itself. To start, we use the following definition of the DMD decomposition.
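For concreteness, a minimal sketch of the exact DMD algorithm (in the formulation of Tu et al.) applied to snapshot data is given below. The function name `dmd`, the snapshot-matrix variable names, and the test system are illustrative choices, not notation from the text; the algorithm fits a best-fit linear operator A with X′ ≈ AX and extracts its leading eigenvalues and modes.

```python
import numpy as np

def dmd(X, Xprime, r):
    """Exact DMD for the linear observables g(x) = x.

    X, Xprime : snapshot matrices whose columns are states at
                successive times, so Xprime ≈ A X for some linear A.
    r         : truncation rank of the approximation.
    Returns the DMD eigenvalues and modes.
    """
    # Rank-r truncated SVD of the first snapshot matrix
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    Ur, sr, Vr = U[:, :r], s[:r], Vh[:r, :]
    # Project the best-fit linear operator onto the r POD modes
    Atilde = Ur.conj().T @ Xprime @ Vr.conj().T @ np.diag(1.0 / sr)
    evals, W = np.linalg.eig(Atilde)
    # Lift the eigenvectors back to state space (exact DMD modes)
    Phi = Xprime @ Vr.conj().T @ np.diag(1.0 / sr) @ W
    return evals, Phi
```

For linear data the DMD eigenvalues coincide with those of the underlying operator, which provides a simple sanity check before turning to nonlinear examples.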

Fig. 2. (a) Evolution dynamics of Burgers’ equation with initial condition u(x, 0) = exp(−(x + 2)²). (b) Fifteen-mode DMD approximation of the Burgers’ evolution.

Koopman Observables and Kernel Methods

The effectiveness of Koopman theory hinges on one thing: selecting appropriate observables. Once a set of observables is selected, the algorithm of the previous section computes a finite-dimensional approximation of the Koopman operator whose spectral decomposition completely characterizes the approximation. In the machine learning literature, observables are often thought of as features, and we will build upon this concept to generate appropriate observables.
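The role of observable selection can be illustrated on a standard toy example (not one of the PDEs treated here): a nonlinear map whose Koopman operator becomes exactly finite-dimensional once the state is augmented with a single nonlinear feature. The parameter values, variable names, and snapshot count below are illustrative assumptions.

```python
import numpy as np

mu, lam = 0.9, 0.5   # assumed parameters for illustration

def step(z):
    # Nonlinear map: z1 <- mu z1, z2 <- lam z2 + (mu^2 - lam) z1^2.
    # On the observables g(z) = (z1, z2, z1^2) it acts linearly.
    return np.array([mu * z[0], lam * z[1] + (mu ** 2 - lam) * z[0] ** 2])

def lift(z):
    # Judiciously chosen observables: augment the state with z1^2
    return np.array([z[0], z[1], z[0] ** 2])

# Collect snapshots of the lifted state along one trajectory
z = np.array([1.0, 0.3])
Y = []
for _ in range(12):
    Y.append(lift(z))
    z = step(z)
Y = np.array(Y).T

# DMD in observable space: least-squares fit Y' ≈ K Y
K = Y[:, 1:] @ np.linalg.pinv(Y[:, :-1])
evals = np.linalg.eigvals(K)
```

Because the chosen observables close under the dynamics, the fitted operator K is exactly linear on the lifted data and its eigenvalues are μ, λ, and μ²; with the linear observables g(z) = z alone, no such finite-dimensional linear representation exists.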