
Will Machine Learning Make Us Rethink Chaos Theory?

October 04, 2018

“Long-term climate forecasting is impossible.” In 2007, this was the conclusion of mathematician and meteorologist Edward Lorenz, who had overthrown the idea that the universe is as predictable as a ticking clock with his groundbreaking chaos theory. Today, Lorenz would be astounded by the technological progress we’ve made, which may yet disprove his pronouncement about long-term forecasting.


Edward Lorenz – American mathematician, meteorologist, and a pioneer of chaos theory.

Many scientists now rank chaos theory, alongside relativity and quantum theory, among the greatest scientific revolutions of the 20th century. The theory essentially postulates that a small change in the initial conditions of a complex system results in large changes in later conditions.

Lorenz discovered the effect when he entered some numbers into a computer program to simulate weather patterns and then left his office to get a cup of coffee. When he returned, he noticed a result that would change the course of science.

The computer model was based on 12 variables, such as temperature and wind speed, whose values could be depicted on graphs as lines rising and falling over time. On this day, Lorenz was repeating a simulation he’d run earlier – but he had rounded off one variable from .506127 to .506. To his surprise, that tiny alteration drastically transformed the whole pattern produced by his program.
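The core of Lorenz’s accident is easy to reproduce with far simpler chaotic systems. The sketch below uses the logistic map as a stand-in for his 12-variable weather model (which is not reproduced here) and shows how rounding an initial value from .506127 to .506, just as Lorenz did, makes two runs diverge completely:

```python
# Sensitivity to initial conditions, illustrated with the chaotic
# logistic map x -> r*x*(1 - x). This is a stand-in for Lorenz's
# 12-variable model, not a reconstruction of it.

def logistic_trajectory(x0, r=3.9, steps=60):
    """Iterate the logistic map from x0 and return the whole trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

full = logistic_trajectory(0.506127)   # full-precision initial condition
rounded = logistic_trajectory(0.506)   # rounded, as in Lorenz's rerun

# The two runs start almost identically but soon bear no resemblance.
for step in (0, 10, 30, 50):
    print(step, round(full[step], 4), round(rounded[step], 4))
```

The trajectories agree at first, then stop resembling each other within a few dozen steps, even though the initial values differ by barely one part in ten thousand.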

Lorenz coined the term “the butterfly effect” as a metaphor for how the timing, formation and path of a tornado could be influenced by something as minor as the flapping of a distant butterfly’s wings several weeks earlier. His insight became the founding principle of chaos theory, which expanded rapidly during the 1970s and 1980s, affecting fields as diverse as meteorology, geology and biology.

Chaos Theory and Improved Weather Forecasting

The butterfly effect is most familiar in terms of weather because it can easily be demonstrated in standard weather prediction models, which are sensitive to initial conditions.

In making forecasts, meteorologists must take into account many phenomena, each of which is governed by multiple variables and factors: how the sun will heat the Earth's surface, how air pressure differences will form winds, or how water changing phase (from ice to liquid, or liquid to vapor) will affect the flow of energy. They even have to account for the effects of the planet's rotation. Small changes in any one variable in these complex scenarios can profoundly affect the weather.

Lorenz’s work led to improvements in weather forecasting, which he credited to three things: wider data collection, better modeling and “the recognition of chaos” in weather. This recognition led to what’s known as ensemble forecasting. In this technique, forecasters acknowledge that measurements are imperfect and thus run many simulations with slightly different initial conditions, forming a more reliable “consensus” forecast.
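The idea behind ensemble forecasting can be illustrated with a toy model. In the sketch below, the logistic map again stands in for a real atmospheric model, and the ensemble is built by perturbing a single imperfect initial measurement:

```python
# A toy illustration of ensemble forecasting: instead of trusting one
# imperfect measurement, run many simulations from slightly perturbed
# initial conditions and look at the spread of outcomes. The logistic
# map is a stand-in for a real atmospheric model.
import random

def step(x, r=3.9):
    """One time step of the toy 'atmosphere'."""
    return r * x * (1 - x)

def forecast(x0, steps=20):
    """Run the model forward from an initial state."""
    for _ in range(steps):
        x0 = step(x0)
    return x0

random.seed(0)
measurement = 0.506          # the (imperfect) observed initial state
ensemble = [forecast(measurement + random.uniform(-0.001, 0.001))
            for _ in range(100)]

mean = sum(ensemble) / len(ensemble)
spread = max(ensemble) - min(ensemble)
print(f"consensus forecast: {mean:.3f}, ensemble spread: {spread:.3f}")
```

A small spread means the ensemble members agree and the consensus can be trusted further ahead; a large spread signals an inherently unpredictable situation.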

In a 2007 interview, Dr. Lorenz said chaos theory proved that weather and climate cannot be predicted beyond the very short term and that, even with today's state-of-the-art observing systems and models, weather still cannot be predicted even two weeks in advance. However, a major technological advancement has entered the picture since then: machine learning.

Making Forecasts Years into the Future

Machine learning uses statistical techniques to improve a computer system’s performance on a task as the amount of data it analyzes grows. The technique is behind recent successes in artificial intelligence.

Machine learning might be able to work out how initial conditions at a very early stage affect the development of a weather system or weather pattern. I feel optimistic about the potential for longer-range forecasting: several weeks, months or possibly even years into the future.

In many cases, the equations underlying chaotic systems aren’t known, which cripples scientists’ efforts to model and predict them. Machine learning algorithms, on the other hand, learn from data describing a system as it evolves and do not need the equations at all. This means we may one day be able to predict the weather with machine learning algorithms rather than sophisticated atmospheric models.
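Here is a minimal illustration of that data-driven idea: a model is fitted to observed (state, next state) pairs from a chaotic system and learns to predict the next state without ever being shown the underlying equation. The logistic map and the degree-2 polynomial fit are illustrative choices, not a real forecasting setup:

```python
# A minimal sketch of learning chaotic dynamics from data alone: we fit
# a model to observed (state, next_state) pairs from the logistic map
# without ever giving it the underlying equation.
import numpy as np

rng = np.random.default_rng(42)

def true_system(x, r=3.9):
    """The 'real' dynamics, hidden from the learner."""
    return r * x * (1 - x)

# "Observed" data: pairs (x_t, x_{t+1}) sampled from the true system.
xs = rng.uniform(0.0, 1.0, 500)
ys = true_system(xs)

# Fit a degree-2 polynomial -- a stand-in for a learned model that
# only ever saw data, never equations.
coeffs = np.polyfit(xs, ys, deg=2)
model = np.poly1d(coeffs)

# The learned model now predicts the next state from the current one.
x = 0.3
print("true next state:   ", true_system(x))
print("model's prediction:", model(x))
```

Because the hidden dynamics happen to be quadratic, the fit recovers them almost exactly; real atmospheric dynamics would of course demand far richer models and data.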

The key to making a longer range forecast is assimilating extremely detailed data over an extensive time period. The question is, do we currently have that type of data resolution and if not, can we get it in the near future?

The notion of data assimilation (combining observations and prior forecasts) has been an integral part of weather forecasting for several decades. While most climatologists are still using conventional methods to analyze their data, this is changing. The marriage of machine-learning techniques with climate science has only just begun and may even achieve what Lorenz thought impossible.