Data-driven learning of differential equations: combining data and model uncertainty
Abstract
Data-driven discovery of differential equations relies on estimating model parameters from information about a solution that is often incomplete and corrupted by noise. Moreover, the sizes of the uncertainties in the model and in the data are usually unknown as well. This paper develops a likelihood-type cost function that incorporates both sources of uncertainty and provides a theoretically justified way of optimizing the balance between them. The approach accommodates missing information about model solutions, allows for considerable noise in the data, and is demonstrated to provide estimates that are often superior to those from regression methods currently used for model discovery and calibration. Practical implementation and optimization strategies are discussed for systems of both ordinary and partial differential equations. Numerical experiments using synthetic data are performed for a variety of test problems, including those exhibiting chaotic or complex spatiotemporal behavior.
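To make the idea of balancing the two uncertainty sources concrete, the following is a minimal illustrative sketch, not the paper's actual formulation: parameters and a latent state are fitted jointly by weighing the data misfit against the ODE-model misfit. The logistic equation u' = r·u·(1 − u/K), the noise scales `sigma_d` and `sigma_m`, and all variable names are assumptions for illustration; for simplicity the noise scales are fixed here, whereas the paper optimizes that balance.

```python
# Hypothetical sketch (not the paper's method): joint estimation of
# parameters and latent state by balancing data and model residuals.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import minimize

rng = np.random.default_rng(0)
r_true, K_true = 1.0, 2.0          # assumed "true" logistic parameters
t = np.linspace(0.0, 8.0, 50)

# Synthetic solution of u' = r*u*(1 - u/K) plus observation noise.
sol = solve_ivp(lambda t, u: r_true * u * (1.0 - u / K_true),
                (t[0], t[-1]), [0.2], t_eval=t, rtol=1e-8)
y = sol.y[0] + 0.05 * rng.standard_normal(t.size)

sigma_d, sigma_m = 0.05, 0.05      # assumed data / model noise scales

def objective(z):
    """Weighted sum of data residuals and ODE residuals."""
    r, K = z[:2]
    u = z[2:]                      # latent state at the sample times
    du = np.gradient(u, t)         # finite-difference derivative
    res_data = y - u
    res_model = du - r * u * (1.0 - u / K)
    return (np.sum(res_data**2) / sigma_d**2
            + np.sum(res_model**2) / sigma_m**2)

# Jointly optimize parameters and latent state, initialized at the data.
z0 = np.concatenate([[0.5, 1.5], y])
fit = minimize(objective, z0, method="L-BFGS-B")
r_hat, K_hat = fit.x[:2]
```

With this weighting, the recovered `r_hat` and `K_hat` land near the true values even though the state itself is only observed with noise; changing the ratio `sigma_m / sigma_d` shifts how strongly the fit trusts the model versus the data.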
| Original language | English (US) |
|---|---|
| Article number | 36 |
| Journal | Computational and Applied Mathematics |
| Volume | 42 |
| Issue number | 1 |
| State | Published - Feb 2023 |
| Externally published | Yes |
Keywords
- Learning differential equations
- Model error
- Parameter estimation
ASJC Scopus subject areas
- Computational Mathematics
- Applied Mathematics