Introduction to Complexity MOOC

Unit 2: Dynamics and Chaos

11 thoughts
last posted Oct. 29, 2013, 11:22 a.m.

Dynamics is the study of how systems change over time.

Examples:

  • Planetary dynamics - planetary orbits
  • Fluid dynamics - weather, clouds, airflow
  • Electrical dynamics - electricity flow through circuits
  • Climate dynamics
  • Crowd dynamics
  • Population dynamics - population variation over time
  • Financial dynamics
  • Group dynamics - animal or social groups forming and accomplishing tasks
  • Social dynamics - conflicts and cooperation over time

Dynamics develops quantitative descriptions of changing systems.

Dynamical systems theory is the vocabulary and branch of mathematics used to describe systems that change over time.


Some historical figures and advances in dynamics:

Aristotle (4th century BC) - first described laws of motion (separate laws for the heavens and for Earth)

Copernicus (16th century) - stationary sun with planets in orbit

Galileo (17th century) - disproved Aristotle's laws experimentally

Newton (17th century) - showed that gravity is consistent throughout the universe, and invented the calculus (concurrently with Leibniz)

Laplace (18th century) - took Newtonian reductionism to its extreme: complete causal determinism

Poincaré (19th century) - planted the seeds of doubt in reductionism that led to chaos theory: small changes in initial conditions can lead to large differences in final outcomes


Chaos

Chaos is "sensitive dependence on initial conditions".

"Butterfly effect" - a butterfly flapping its wings in Japan could cause changes in weather that eventually lead to a hurricane

Chaos shows up in:

  • orbits
  • brain activity
  • heart activity
  • financial data
  • weather
  • computer networks
  • population change

Question we'll be exploring: How is chaos different from randomness?


Iteration is doing something over and over again.

In population growth, reproduction is iterated.

Simple Population Growth Model (NetLogo)

n is the population
n_0 is the initial population (1)
n_1 is the population at year 1 (2)
n_2 is the population at year 2 (4)
n_t is the population at year t



birthrate = number of offspring produced per year by one rabbit (here, 2)

n_1 = birthrate * n_0
n_2 = birthrate * n_1
n_(t+1) = birthrate * n_t

Exponential population growth: year 0: 1, year 1: 2, year 2: 4, year 3: 8, ..., year t: 2^t.

Exponential growth is unrealistic in the long term.

Population vs. time is an exponential function.

Plotting this year's population against last year's gives a linear function, a straight line:
n_(t+1) = birthrate * n_t
y = slope * x

When the parts of a system act independently of one another, growth is linear.
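A minimal sketch of this iteration in Python (not from the course; the birthrate of 2 and initial population of 1 follow the rabbit example above):

```python
# Exponential growth: n_(t+1) = birthrate * n_t
birthrate = 2  # offspring per rabbit per year, from the example
n = 1          # initial population n_0

for t in range(8):
    print(f"year {t}: {n} rabbits")
    n = birthrate * n  # every individual reproduces independently
```

The printed populations run 1, 2, 4, 8, ... = 2^t, doubling without bound.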


In a linear system the whole is the sum of the parts: if 1 produces 5, then 5 produce 25 and 100 produce 500.

But if you add a death rate due to overcrowding (the carrying capacity is the maximum population that can be supported), you get a nonlinear system:

n_(t+1) = birthrate * (n_t - deaths)
where deaths = n_t^2 / max_population

If you extend it slightly you get the "logistic model" of population growth, developed in the 1830s by Verhulst:

n_(t+1) = birthrate * [n_t - (n_t^2 / max_population)]

Logistic Population Growth Model (NetLogo)

This model produces a "logistic function": growth starts out fast, then slows, and ultimately flattens.

With a certain logistic model, 1 individual produces 7 after 3 steps, but 5 individuals don't produce 35; they produce just 21. So the whole is not just the sum of the parts.
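A sketch of the logistic model in Python (the specific birthrate, max_population, and starting population here are hypothetical, chosen just to make the flattening visible):

```python
# Logistic growth: n_(t+1) = birthrate * (n_t - n_t^2 / max_population)
birthrate = 2
max_population = 1000.0  # carrying capacity (hypothetical value)
n = 10.0                 # initial population

for t in range(12):
    print(f"year {t:2d}: {n:7.1f}")
    n = birthrate * (n - n**2 / max_population)
```

Growth starts out nearly exponential, then slows and levels off (here near 500, half the carrying capacity for a birthrate of 2) instead of doubling forever.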


The logistic model:

n_(t+1) = (birthrate - deathrate) * [n_t - (n_t^2 / max_population)]

The logistic map is a famous algorithm in chaos theory:

R = birthrate - deathrate
K = max_population (carrying capacity)
x_t = n_t / K

x_(t+1) = R * (x_t - x_t^2)

Robert May and Mitchell Feigenbaum studied the logistic map.

x is always between 0 and 1.

An example:

R = 2
x_0 = 0.2 (20% of the carrying capacity)
x_1 = 2 * (0.2 - 0.2^2) = 0.32 (32% of the carrying capacity)
x_2 = 2 * (0.32 - 0.32^2) = 0.4352
...
The value slowly approaches 0.5, reaches it at x_7, and stays at 0.5 for all following generations. 0.5 is called an attractor; in this case, a fixed-point attractor.
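The same computation as a short Python sketch (nothing beyond the example above):

```python
# Logistic map: x_(t+1) = R * (x_t - x_t^2)
R = 2
x = 0.2  # x_0: 20% of the carrying capacity

for t in range(9):
    print(f"x_{t} = {x:.6f}")
    x = R * (x - x**2)
```

The printed values run 0.200000, 0.320000, 0.435200, ... and settle onto 0.5, the fixed-point attractor.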




A graph of x_(t+1) versus x_t for the logistic map forms a parabola.

The logistic map is a model, a simplified representation of reality.

LogisticMap.nlogo


In a fixed-point attractor system, varying R (birthrate minus deathrate) changes the value of the fixed-point attractor: the fraction of the max population (carrying capacity) that the system settles to after enough generations. This long-term behavior is referred to as the ultimate dynamics of the system.
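As a check on that claim (a standard derivation, not from the course notes): at a fixed point the population stops changing, so x* = R * (x* - x*^2). For x* ≠ 0, dividing through by x* gives 1 = R * (1 - x*), so x* = 1 - 1/R. For R = 2 this is 0.5, matching the earlier example, and raising R moves the fixed point closer to the carrying capacity. (This holds while the fixed point is stable, i.e. for R between 1 and 3; what happens beyond that is described next.)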


A periodic attractor repeats itself, oscillating among a set of values from generation to generation. A periodic attractor with period 2 oscillates between 2 values.

The state of the system at any moment is the attractor value the system is currently at.

A periodic attractor with period 4 oscillates among 4 values. As you increase the growth rate R, the period of the system doubles. At a high enough growth rate (such as R = 4) there is no longer a fixed-point or periodic attractor, and the system is chaotic: it displays sensitive dependence on initial conditions.

At a growth rate R of 4, a tiny change in the initial population fraction has almost no effect in the early generations, but the trajectories diverge and follow radically different chaotic paths in later generations.

SensitiveDependence.nlogo
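A minimal sketch of the same experiment in Python (the perturbation size of 1e-10 is an arbitrary choice, not from the course):

```python
# Two logistic-map trajectories at R = 4, differing only in the
# 10th decimal place of the initial condition.
R = 4
x = 0.2
y = 0.2 + 1e-10

for t in range(51):
    if t % 10 == 0:
        print(f"t={t:2d}  x={x:.6f}  y={y:.6f}  |x-y|={abs(x - y):.1e}")
    x = R * (x - x**2)
    y = R * (y - y**2)
```

The gap roughly doubles each generation, so the two runs agree closely at first and are completely unrelated a few dozen generations later.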

"Prediction becomes impossible." - Poincaré.

In 1976, Robert May argued that the logistic map shows that even for a simple model with all parameters specified exactly, prediction is still impossible.

With a bifurcation diagram, which maps R to the attractor values of x, we can see that at R = 3 the period becomes 2, and as R approaches 4 we reach the onset of chaos at R ≈ 3.569946, where there is no longer a finite period. At this point the system reaches a chaotic attractor, also called a strange attractor.

The logistic map displays the period-doubling route to chaos.
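A sketch that draws the bifurcation diagram (assuming numpy and matplotlib are available; the transient and sample counts are arbitrary choices):

```python
import numpy as np
import matplotlib.pyplot as plt

def logistic(x, R):
    # x_(t+1) = R * (x_t - x_t^2)
    return R * (x - x**2)

Rs = np.linspace(2.5, 4.0, 2000)  # one column of the diagram per R value
x = np.full_like(Rs, 0.2)

for _ in range(500):              # discard transients; keep only the attractor
    x = logistic(x, Rs)

for _ in range(200):              # plot the attractor values reached at each R
    x = logistic(x, Rs)
    plt.plot(Rs, x, ",k")         # pixel markers

plt.xlabel("R")
plt.ylabel("attractor values of x")
plt.show()
```

The single branch splits into two at R = 3, the splits accumulate faster and faster, and past about 3.569946 the attractor smears out into the chaotic (strange) attractor.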


Chaos is seemingly random behavior with sensitive dependence on initial conditions. The logistic map displays chaos for higher values of R (growth rates). This is deterministic chaos: chaos arising from a completely deterministic system or equation. Perfect prediction is impossible in deterministic chaos, since initial conditions can never be known exactly.

While perfect prediction is impossible, the study of chaos has led to the discovery of universality in chaos: highly predictable universal properties shared by a wide variety of chaotic systems.

All systems that display the period-doubling route to chaos have a unimodal ("one-humped") parabola-like graph of this generation's population versus the next generation's population.

One such system is the sine map, a pure mathematical function:

x_(t+1) = (R / 4) * sin(pi * x_t)

SineMap
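As a Python sketch, swapping the sine map into the iterations above is a one-line change (assuming the same convention of R running from 0 to 4):

```python
import math

def sine_map(x, R):
    # x_(t+1) = (R / 4) * sin(pi * x_t); unimodal on [0, 1] like the logistic map
    return (R / 4) * math.sin(math.pi * x)
```

Feeding this into the bifurcation sketch above (using np.sin and np.pi in place of math.sin and math.pi) produces the same period-doubling structure, though the bifurcations fall at different R values.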

Another universal property is the set of R values at which the period bifurcations (doublings) occur. The bifurcations come faster and faster as R increases toward the onset of chaos: the ratio between the widths of successive bifurcation intervals approaches ~4.6692016..., known as Feigenbaum's constant, a universal constant for chaotic systems with unimodal (one-humped) maps.
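A quick numerical illustration (R_1 = 3 and R_2 = 1 + sqrt(6) ≈ 3.449490 are exact; the later values are standard published approximations, not from the course):

```python
# Successive period-doubling points R_k of the logistic map.
Rk = [3.0, 3.449490, 3.544090, 3.564407, 3.568759]

# Ratio of the widths of successive bifurcation intervals.
for i in range(len(Rk) - 2):
    print(f"{(Rk[i+1] - Rk[i]) / (Rk[i+2] - Rk[i+1]):.4f}")
```

The ratios print roughly 4.75, 4.66, 4.67, closing in on Feigenbaum's 4.6692016...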

The Feigenbaum constant has been experimentally confirmed in chaotic systems such as fluid flow and electric circuits.

These universal properties are the order in the chaos.


Liz Bradley - Computer Scientist

Modern computer systems are so complex that they are nonlinear dynamical systems, and so they can display chaos, that is, sensitive dependence on initial conditions: the initial state of the computer (the contents of all its registers and memory) can produce chaotic variation in performance or memory-access patterns across repeated runs of the same program.

The "R" is the computer program that's being run so there is no way to change it smoothly or understand it simply like in the logistic map.

Lagrangian coherent structures are dynamically distinct regions of a time-varying system, such as groups of passengers moving through a transit system, or clouds.

The Morning Glory clouds in Australia exhibit dramatic Lagrangian coherent structures: Morning Glory Cloud

Lagrangian coherent structures are interesting to complexity theory because they display emergence, and because, from an information-theory perspective, they represent an information loss in the system: it is possible to describe the structures rather than the individual elements forming them, and thereby describe the system with less information than would otherwise be required.