Originally published in TEPA's Plugged In, March Edition. Written by Aaron Cook.
Growing up in the 80s I was, by every definition, your typical Dallas suburban kid. I spent my summers exploring undeveloped fields behind my house, catching locusts, playing pin dodge in my front yard and riding my BMX bike everywhere my wiry legs would carry me — which was mostly to play Pac-Man at the arcade or buy a Slurpee from 7-Eleven.
But some of my best childhood memories were forged in the shadows of Texas thunderstorms. I’d sit on the street curb and watch as the wind shifted from warm to cool and the sunlight dimmed under the oncoming storm fronts. I’d count the seconds between lightning and thunder, then scurry into the living room to watch the local weather on TV. I remember that Troy, our local weather guy, always wore bow-ties—which somehow made him all the more credible. And as the storms raged outside, he was the voice of calm inside.
In those days of AOL dial-up and dot-matrix printers, technology was still in its infancy. Suffice it to say, weather forecasts were often…“off.”
Fast-forward 35 years: The weather seems to be acting out with even more ferocity than in my childhood. From severe droughts that invite wildfires, to hurricanes that give birth to 100-year floods, to sub-zero polar vortexes that grind cities to a halt — the weather continues to affect us in huge ways and, thus, captivate us completely.
Fortunately, the technology to more accurately predict weather patterns has evolved, giving us a considerable leg up on preparing for Mother Nature’s epic fits of rage.
I recently spoke with Fred Schmude and Ken Carrier at StormGeo, a global provider of weather intelligence, to learn how weather forecasting has become better and more accurate – and why it’s so important to those of us in energy.
Let me first introduce the protagonists of our story. Fred and Ken are a couple of regular guys who have acquired superpowers through emerging tech and gobs of weather data. I like to think of them as masters of the maelstroms.
Fred and Ken are both natives of the East Coast — Fred from Maryland and Ken from Massachusetts. Ironically, Fred didn’t see his first snow until after he moved to Houston in the late 1960s. Houston had three significant snowfalls in 1973, which triggered his fascination (and lifelong pursuit) to understand the weather. He’d eventually earn a degree in Geology from Texas Tech and a Meteorology degree from Texas A&M. He now serves as StormGeo’s Senior Scientist in Long-Range Weather Forecasting.
In his prior professional life, Ken was a principal partner at an environmental solutions company in Boston. After moving to Houston and starting a new career with ImpactWeather (now StormGeo), he earned his Weather Forecasting Certification from Penn State. He now traverses the country as Industry Manager creating custom weather solutions for StormGeo clients.
Though based in Houston, Fred and Ken are part of a much larger crew of weather scientists, meteorologists and data scientists around the globe. StormGeo’s headquarters is in The Land of the Midnight Sun (Norway). Serendipity, right? Where else would you expect a company to be based that’s developing technology that’s turning the weather world upside down and inside out?
In 2016, something mind-blowing happened…something that most of us in the States didn’t even notice. An artificial intelligence system called AlphaGo defeated one of the world’s top Go players in Seoul. Go was invented by the Chinese more than 2,500 years ago and is believed to be the oldest board game continuously played to the present day.
The AI, a machine-learning system developed by Alphabet Inc.’s Google DeepMind in London, used terabytes of data to recognize patterns and make its own strategic choices. This was complex, strategic thinking, folks; far beyond the checkers and chess games computers had been winning years prior.
The magic of it all is that machine learning isn’t just being applied to the gaming world. Companies like StormGeo are harnessing the power of machine learning to make weather forecasting more accurate.
“The concept of modern-day weather forecast was born in the early 1800s,” noted Fred. “That’s when army hospitals and regimental surgeons began recording weather data in their diaries. Those entries included basic observations of atmospheric conditions that connected to specific patterns. The furthest a forecast would extend back then was 24 hours. That forecasting model continued up until about 30 years ago, when we began incorporating more sophisticated levels of observations (e.g., wind gusts and water temperatures) and combined them with computer modeling. Short-term weather forecasts became more accurate, and now long-term is following suit.”
The difference today is that sophisticated computers are now analyzing historical weather data and observations dating back 30 years to churn out powerful numerical weather prediction models. A critical input to modern forecasting models arrived when we began systematically launching weather satellites into Earth’s orbit in the late 1950s. Of course, satellites have become more sophisticated over the last few decades, leading to vastly larger caches of accurate data being captured at mind-blowing depths and resolutions.
Meanwhile, back on Earth, advancing technology is also making data harvesting much easier and more accurate. Observations from land, as well as offshore observations from ships and buoys, are added to the data collected from space.
Between the data collected from satellites looking down at Earth and the data collected from scientists looking up from Earth, there’s a much more holistic and accurate 360-degree view of what’s driving weather patterns—short- and long-range (i.e., more than five days)—around the globe.
“Computing power over the last decade has become immensely more affordable and powerful. In addition to our team of meteorologists, scientists and PhDs scattered across the globe, StormGeo has created a research sandbox with four Titan 10 servers, which combined have 14,336 cores that process 44 trillion computations per second. That enables StormGeo to identify complex relationships in several petabytes of data, ultimately churning out predictive machine-learning algorithms. They not only improve weather forecasting accuracy, but also our understanding of how complex systems like electrical grids will react to the forecasted conditions,” added Ken.
These advancements in machine-learning weather forecasting are especially important to power utilities faced with severe thunderstorm events. Florida, for example, experiences 60-70 severe thunderstorms a year, while Texas sees somewhere between 34 and 45.
“We work closely with electric distribution companies that are constantly faced with the threat of severe weather events,” said Ken. “They’re trying to make the best decision ahead of a storm and that’s a tough situation to be in. If they choose not to act, and a storm causes tremendous amounts of outages, there will be a public backlash. If they overreact and nothing happens, they’re left with stranded operation costs and no way to justify it.”
Fortunately, the scientists at StormGeo are making those decisions far easier for the approximately 80 utility companies they work with across the country, especially during extreme weather months.
According to Ken, most weather companies just push out weather data streams to utilities, like temperature forecasts for load forecasting and energy trading. Because accuracy is measured every hour and averaged over time, the rare extreme events (where the big-money decisions need to be made) are often buried in the data. That’s true even with a root mean square error (RMSE) verification calculation, which helps to slightly amplify extreme events.
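To see why RMSE gives extreme misses only slightly more weight than a plain average error would, here’s a minimal sketch. The hourly temperatures are made-up illustrative numbers, not StormGeo verification data: 23 near-perfect forecast hours hide one badly missed extreme-event hour.

```python
import math

def mae(forecast, actual):
    """Mean absolute error: every degree of miss counts the same."""
    return sum(abs(f - a) for f, a in zip(forecast, actual)) / len(actual)

def rmse(forecast, actual):
    """Root mean square error: squaring amplifies the big misses."""
    return math.sqrt(sum((f - a) ** 2 for f, a in zip(forecast, actual)) / len(actual))

# 24 hourly temperature forecasts that are near-perfect...
forecast = [70.0] * 24
actual = [70.5] * 23 + [85.0]  # ...except one extreme hour missed by 15 degrees

print(round(mae(forecast, actual), 2))   # prints 1.1  — the big miss nearly vanishes
print(round(rmse(forecast, actual), 2))  # prints 3.1  — squaring makes it stand out more
```

Even with squaring, a 15-degree bust averaged across a day of good hours barely moves the score, which is Ken’s point: hourly accuracy metrics bury the rare events that drive the big-money decisions.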
When you look at wind speeds, for example, most forecast providers will offer up a deterministic view, which hides all but one potential outcome in a range of possibilities for a particular day or event. Using these types of generic forecasts, a client is likely to make their decision without any awareness of whether there is a significant chance of a major event occurring. This can lead to the very problematic outcome where what occurs is significantly different from what was forecast.
Why does this happen? Because the client was using the wrong type of forecast.
A deterministic forecast tells you the single most likely outcome, while a probabilistic forecast tells you the chances of all of the various possibilities. This is equivalent to being told “you will not be hit by a vehicle if you cross the road at 5:10 PM” (deterministic outlook), versus “you have a 35% chance of being hit by a vehicle if you cross the road at 5:10 PM” (probabilistic outlook).
Which forecast would you like to use to determine the best time to cross the road? If you don’t start with a thorough understanding of the nature of the decision, then it’s likely that the implemented solution will not fully support optimal decision making.
“We reverse engineer our weather solutions,” added Ken. “Rather than just issue standard weather reports, we determine what kind of decisions a utility needs to make. We may, for example, develop high precision models for wind gusts in a specific region of West Texas with the probability they’ll actually happen. Then, we’ll go one step further and identify when a weather event has a potential outcome of, let’s say, 60 mph wind gusts with a 35% probability those speeds will be exceeded. We can use that information as the input into a risk-response decision model to help utilities make objective, consistent, repeatable, and defensible decisions ahead of significant events. Knowing that wind gusts have a 35% chance of exceeding 60 mph—resulting in more outages—is found gold for a utility. That additional information can affect better decision-making.”
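To make the deterministic-versus-probabilistic distinction concrete, here’s a minimal sketch built on a hypothetical 20-member forecast ensemble. The gust values are invented for illustration (chosen so the exceedance chance lands at the 35% in Ken’s example); they are not real StormGeo output.

```python
# A made-up ensemble: 20 equally plausible wind-gust outcomes (mph)
# for the same place and time, from slightly different model runs.
ensemble_gusts_mph = [42, 48, 51, 55, 58, 61, 63, 64, 66, 67,
                      44, 50, 53, 56, 59, 62, 47, 52, 57, 68]

# Deterministic view: collapse the whole range into one "most likely" number.
deterministic = sum(ensemble_gusts_mph) / len(ensemble_gusts_mph)

# Probabilistic view: what fraction of outcomes exceed the damage threshold?
threshold = 60
p_exceed = sum(g > threshold for g in ensemble_gusts_mph) / len(ensemble_gusts_mph)

print(f"Deterministic: {deterministic:.0f} mph expected")          # 56 mph — sounds safe
print(f"Probabilistic: {p_exceed:.0%} chance gusts exceed {threshold} mph")  # 35% — not safe
```

The deterministic number alone suggests a quiet day; the probabilistic view reveals a one-in-three chance of outage-level gusts, which is exactly the kind of input a risk-response decision model can act on.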
Severe weather events aren’t the only time accurate weather forecasts are needed. ERCOT, the Independent System Operator for most of Texas, relies heavily on probability forecasting to determine generation dispatching requirements to keep the grid stable. Given the significant percentage of variable wind power, and the small but growing solar power in the state, using probabilistic forecast techniques allows ERCOT to reduce the required spinning reserves, thereby reducing fuel consumption at the power plants.
“Utilities also have to predict the availability of solar resources ahead of time,” added Fred. “You can have enormous variations in power generation if a cloud moves over solar panels. So you have to plan 5 minutes, 15 minutes, an hour ahead so you know the level of power you’re going to generate from the solar panels. We’ve developed a sophisticated prediction model for solar forecasting. It’s an algorithm that considers different slices of the atmosphere to track cloud coverage and has proven to be hyperaccurate within a 6-hour window.”
Like solar, long-range wind forecasts are also benefiting from machine learning. Beyond El Niño and La Niña, which show up every few years and shape weather across the US, there are many other key climate patterns around the globe that drive the weather.
When the primary patterns are in a neutral phase, the secondary patterns drive wind power. Machine learning has become instrumental in identifying these patterns and forecasting wind power generation 6 to 9 months into the future. According to Ken, StormGeo also provides Wall Street with power-generation guidance for the coming quarter.
While AI is giving weather forecasting a digital makeover, it hasn’t supplanted the human touch.
Every technological system depends on a whole cadre of tools, from software and data processing to the teams of scientists and analysts who curate the output and analyze the results. Charts and white papers can only capture so much. There’s something to be said for a gut feeling. Facts still need feelings, and weathermen with bow-ties tend to be more convincing.
That’s where Fred’s passion for weather analysis comes into the equation. He occupies the space where man and machine form the cognitive advantage. It’s a relatively new way of weather forecasting. And, it’s working.
With a background in meteorology and geology, Fred and his team study multiple computer models every day. The meshing of all the computer data and human interpretation creates a more accurate depiction of weather patterns—especially long-term predictions.
“There are several key factors we look at in depicting long-term weather patterns,” he added. “Among the primary ones is the ocean’s temperature. Oceans are critical because water has a higher heat capacity than air and changes temperature at a slower rate. When the ocean becomes colder or warmer it alters the air pressure, which alters the jet stream, which changes land temperatures, which affects precipitation, which changes wind patterns.”
The challenge is that water temperature patterns change—a lot. Staying on top of them is literally a daily task. As noted earlier, El Niño and La Niña are patterns that overwhelm the system and dominate the weather when they’re in play. But, they’re not always in play. They’re cyclical. So, when the secondary patterns emerge, they’re the ones driving weather around the globe.
“Think of it like this,” added Fred. “Tom Brady is El Niño. When Brady’s in the game, you pretty much know how the game is going to be played. Take Brady out—and while you still have a good team—now you’re playing the game with a second string quarterback. The game will change. When looking at weather forecasts, it’s important to understand who the players are in long-term weather forecasts.”
This is especially true for retail energy providers who look to scientists like Fred for accurate long-term weather forecasts so they can plan their energy supplies ahead of severe weather events. “REPs want to know what kind of power demands to expect in the future. If cold weather is going to extend past the normal months, it’s important to know for how long. Same for summer heat. If Texas is going to have another record number of 100+ degree temperatures, the grid needs to prepare months in advance.”
What about this year’s brutal polar vortex? Did Fred’s team see that coming? You bet. Largely because of the warming ocean temperatures.
“In mid-December we saw the potential for a Sudden Stratospheric Warming (SSW) forming with quickly warming water over the Indian Ocean,” Fred noted. “SSWs take about 30 days to work down from the stratosphere into the troposphere, where they adversely affect the pressure and flow pattern, shifting the polar vortex farther south than normal into the Lower 48. The result is bitterly cold weather. The SSW event this year, which started in late December, plunged the Midwest and Northern Plains into another polarizing freeze by late January.”
Over the last few decades, Fred has seen countless SSWs happen—including the notorious ones of the late 70s that caused extreme cold weather events. While SSWs are still being studied, the good news is that professionals like Fred can let us know when one is likely to happen—and better prepare us for it.
It’s clear that we’ve crossed the threshold of a new era of accurate weather prediction—with weather data being processed by supercomputers and analyzed by super-smart specialists. The combination of the two has us teed up for much more accurate weather forecasts, both short- and long-term. That’s especially good news for all of the players in the energy market who lean on forecasts of future weather to correctly prepare the grid, energy supplies and customer contracts.
It’s exciting that StormGeo’s brain trust—equal parts human and computer—is building one of the most accurate decision-making models in weather forecasting. While weather forecasting is more accurate and reliable than it’s ever been, the pairing of human insight and machine learning promises to have us peering even deeper into Mother Nature’s secrets. Buckle up. I’m sure we’re going to see things you can’t even imagine.