TY - JOUR
T1 - Magnus' expansion for time-periodic systems
T2 - Parameter-dependent approximations
AU - Butcher, Eric A.
AU - Sari, Ma'en
AU - Bueler, Ed
AU - Carlson, Tim
N1 - Funding Information:
The authors were supported in part by National Science Foundation Grant CMS #0114500.
PY - 2009/12
Y1 - 2009/12
N2 - Magnus' expansion solves the nonlinear Hausdorff equation associated with a linear time-varying system of ordinary differential equations by forming the matrix exponential of a series of integrated commutators of the matrix-valued coefficient. That is, instead of expanding the fundamental solution itself, its logarithm is expanded. Within some finite interval in the time variable, such an expansion converges faster than direct methods like Picard iteration, and it preserves symmetries of the ODE system if present. For time-periodic systems, the Magnus expansion in some cases allows one to symbolically approximate the logarithm of the Floquet transition matrix (monodromy matrix) in terms of parameters. Although the Magnus expansion has been used successfully as a numerical tool, this use is new. Here we use a version of Magnus' expansion due to Iserles [Iserles A. Expansions that grow on trees. Not Am Math Soc 2002;49:430-40], who reordered the terms of Magnus' expansion for more efficient computation. Though much about the convergence of the Magnus expansion is not known, we explore its convergence and apply known convergence estimates. We discuss the possible benefits of using it for time-periodic systems, and we demonstrate the expansion on several examples of periodic systems using a computer algebra system, showing how convergence depends on parameters.
AB - Magnus' expansion solves the nonlinear Hausdorff equation associated with a linear time-varying system of ordinary differential equations by forming the matrix exponential of a series of integrated commutators of the matrix-valued coefficient. That is, instead of expanding the fundamental solution itself, its logarithm is expanded. Within some finite interval in the time variable, such an expansion converges faster than direct methods like Picard iteration, and it preserves symmetries of the ODE system if present. For time-periodic systems, the Magnus expansion in some cases allows one to symbolically approximate the logarithm of the Floquet transition matrix (monodromy matrix) in terms of parameters. Although the Magnus expansion has been used successfully as a numerical tool, this use is new. Here we use a version of Magnus' expansion due to Iserles [Iserles A. Expansions that grow on trees. Not Am Math Soc 2002;49:430-40], who reordered the terms of Magnus' expansion for more efficient computation. Though much about the convergence of the Magnus expansion is not known, we explore its convergence and apply known convergence estimates. We discuss the possible benefits of using it for time-periodic systems, and we demonstrate the expansion on several examples of periodic systems using a computer algebra system, showing how convergence depends on parameters.
KW - Chebyshev polynomials
KW - Magnus expansion
KW - Time-periodic systems
UR - http://www.scopus.com/inward/record.url?scp=67349085340&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=67349085340&partnerID=8YFLogxK
U2 - 10.1016/j.cnsns.2009.02.030
DO - 10.1016/j.cnsns.2009.02.030
M3 - Article
AN - SCOPUS:67349085340
SN - 1007-5704
VL - 14
SP - 4226
EP - 4245
JO - Communications in Nonlinear Science and Numerical Simulation
JF - Communications in Nonlinear Science and Numerical Simulation
IS - 12
ER -
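
The abstract above describes approximating the logarithm of the Floquet transition matrix by truncating the Magnus expansion and evaluating its integrated commutators symbolically. The following sketch, which is not taken from the paper, illustrates that idea in SymPy for a Mathieu-type periodic system; the system, the symbols delta and epsilon, and the helper names are assumptions made for this illustration only.

```python
# Illustrative sketch (not the authors' code): first two Magnus terms, computed
# symbolically, for a Mathieu-type time-periodic system x'' + (delta + eps*cos(t)) x = 0
# written in first-order form x' = A(t) x. All names here are choices of this sketch.
import sympy as sp

t, s, sigma = sp.symbols('t s sigma', real=True)
delta, eps = sp.symbols('delta epsilon', real=True)

def A(tau):
    # Periodic coefficient matrix of the first-order form of the Mathieu equation
    return sp.Matrix([[0, 1],
                      [-(delta + eps * sp.cos(tau)), 0]])

def entrywise_integral(M, var, lower, upper):
    # Integrate a matrix entry by entry
    return M.applyfunc(lambda e: sp.integrate(e, (var, lower, upper)))

# First Magnus term: Omega1(t) = int_0^t A(s) ds
Omega1 = entrywise_integral(A(s), s, 0, t)

# Second Magnus term: Omega2(t) = (1/2) int_0^t int_0^s [A(s), A(sigma)] dsigma ds
commutator = A(s) * A(sigma) - A(sigma) * A(s)
inner = entrywise_integral(commutator, sigma, 0, s)
Omega2 = sp.Rational(1, 2) * entrywise_integral(inner, s, 0, t)

# Truncated logarithm of the fundamental solution; substituting t = 2*pi (the
# period of A) gives an approximation to the logarithm of the Floquet transition
# (monodromy) matrix as an explicit function of delta and epsilon.
Omega = (Omega1 + Omega2).applyfunc(sp.simplify)
print(Omega.subs(t, 2 * sp.pi))
```

Higher-order terms involve nested commutators and deeper iterated integrals; the paper's point is that, because the integrands remain symbolic in the system parameters, the truncated logarithm of the monodromy matrix can be obtained as a parameter-dependent expression rather than recomputed numerically for each parameter value.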