Dynamical Systems
A dynamical system is a system whose state changes over time according to fixed rules, or a mathematical model describing such a system. In general, a model is created by extracting as variables the elements that affect changes of state and describing the interactions between those elements with differential equations or difference equations.
In a dynamical system, the state of the system is specified by a set of real numbers; two states differ only in the values of the variables that represent them. The change of state over time is given by a function, so that the future state is uniquely determined by the current state. This function is called the evolution rule.
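As a minimal sketch of an evolution rule, consider a difference equation. The logistic map used here is a standard illustration chosen for this example, not one named in the text; the point is that applying the rule repeatedly determines every future state from the current one.

```python
# Illustrative evolution rule: the logistic map x_{n+1} = r * x * (1 - x).
# The parameter r = 2.5 and initial state 0.1 are arbitrary choices.
def evolution_rule(x, r=2.5):
    """Map the current state x to the state one time step later."""
    return r * x * (1 - x)

state = 0.1
trajectory = [state]
for _ in range(50):
    # The future is uniquely determined: just apply the rule again.
    state = evolution_rule(state)
    trajectory.append(state)

# For r = 2.5 the orbit settles toward the fixed point 1 - 1/r = 0.6.
print(trajectory[-1])
```

Running the loop longer only refines the same deterministic trajectory; no randomness enters at any step.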
Examples of dynamical systems include the oscillation of a pendulum, fluctuations in the populations of organisms in nature, and the orbits of planets; indeed, virtually any phenomenon that changes over time can be regarded as a dynamical system. The behavior of a system varies widely depending on the phenomenon being modeled and the level of description.
Concrete examples of dynamical systems:
- Oscillating motion (harmonic oscillator, nonlinear oscillator, van der Pol oscillator)
- Clock reactions (Brusselator, Oregonator)
The concept of a dynamical system originates in Newtonian mechanics. As in other fields of natural science and engineering, a dynamical system is modeled by extracting as variables the elements that influence changes of state and describing the interactions between them. A differential or difference equation then gives the state immediately after the present, and the state at any future time can be obtained by applying this step repeatedly. In this sense, given the current state of a dynamical system, all of its future states are determined.
However, only a small class of dynamical systems can be solved analytically, and doing so requires advanced mathematics. Before the advent of computers, therefore, only very simple systems could be treated as objects of research.
The behavior of simple dynamical systems is easy to understand, but as a system becomes more complex its behavior does too, and the future state cannot be predicted without detailed analysis.
Even for a well-understood system, the model may not capture every variable that affects the system's behavior. It is also necessary to verify whether a numerical solution is really an adequate approximation to the true solution of the system. To address these problems, notions of stability, such as Lyapunov stability and structural stability, are used in the study of dynamical systems. With the concept of stability, it becomes possible to explain why different initial conditions can lead to large differences in a system's behavior even though the model is the same.
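The link between initial conditions and diverging behavior can be made concrete with a small experiment. The logistic map at r = 4 is an illustrative choice, not one taken from the text; it shows two initial states differing by only 1e-10 drifting far apart within a few dozen steps.

```python
# Sensitive dependence on initial conditions in the logistic map
# x_{n+1} = 4 * x * (1 - x). Parameters are illustrative choices.
def step(x):
    return 4.0 * x * (1.0 - x)

a, b = 0.2, 0.2 + 1e-10   # two almost identical initial states
gaps = []
for _ in range(60):
    a, b = step(a), step(b)
    gaps.append(abs(a - b))

# The gap starts microscopic but grows by many orders of magnitude.
print(gaps[0], max(gaps))
```

With an unstable system like this, any uncertainty in the initial state, including rounding error in the numerical solution itself, is amplified exponentially, which is exactly why stability analysis matters when judging numerical results.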
Since the behavior of a system depends on its initial conditions, examining the behavior under a single initial condition is of limited value. Under some conditions a system may behave cyclically; under others it may settle into a particular state. What matters is which kinds of behavior arise under which conditions, and in dynamical systems theory the possible types of behavior are classified mathematically. An example of a dynamical system in which the kinds of possible behavior are known perfectly
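The classification of long-run behavior by conditions can be illustrated with a parameter sweep. The logistic map and the parameter values below are illustrative choices, not from the text: for r = 2.8 the orbit settles to a fixed point, while for r = 3.2 it settles into a period-2 cycle.

```python
# Classify long-run behavior of the logistic map x -> r * x * (1 - x)
# by iterating past a transient and inspecting the states that remain.
def settle(r, x=0.3, transient=1000, keep=4):
    """Discard the transient, then return the next few states (rounded)."""
    for _ in range(transient):
        x = r * x * (1 - x)
    out = []
    for _ in range(keep):
        x = r * x * (1 - x)
        out.append(round(x, 6))
    return out

print(settle(2.8))  # one repeated value: a stable fixed point
print(settle(3.2))  # two alternating values: a stable 2-cycle
```

Counting the distinct values that survive the transient is a crude but effective way to label each parameter value with a behavior type, which is the spirit of the mathematical classification described above.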