My wife said to me the other day, “The Heat broke AI.” At first, I had to think about what she meant. Since the start of the NBA playoffs, many experts, analytics models, and betting outlets had the Boston Celtics as the favorite to win not only the Eastern Conference but also the NBA Championship outright. My spouse made her astute observation after many of those predictions held firm, even as the Celtics went down 0-2 to the Miami Heat. There are lessons in all of this about weather forecasting and the occasional forecast “bust.”
As I have written many times, public perception of the accuracy of weather prediction often does not match reality. As meteorologists, we often hear that “weather forecasts are always wrong” or that “it must be nice to be in a profession in which you are wrong 50% of the time.” Honestly, these statements often say more about the person making them than about the actual facts. Studies show that many people struggle with concepts of probability and uncertainty, yet we use them in weather forecasting all the time. I routinely see people misinterpret things like a “20% chance of rain” or the “hurricane cone of uncertainty.”
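A well-calibrated “20% chance of rain” means that, across many days carrying that same forecast, rain falls on roughly one day in five; it does not mean rain over 20% of the area or for 20% of the day. A minimal Python sketch of that frequency interpretation (purely illustrative, with a made-up simulation):

```python
import random

random.seed(42)

def simulate_pop_days(pop, n_days):
    """Count how often rain actually falls across many days that all
    carry the same probability-of-precipitation (PoP) forecast."""
    rainy = sum(1 for _ in range(n_days) if random.random() < pop)
    return rainy / n_days

# Over many "20% chance of rain" days, a calibrated forecast verifies
# with rain on roughly 1 day in 5.
freq = simulate_pop_days(0.20, 100_000)
print(f"Observed rain frequency: {freq:.3f}")  # close to 0.200
```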
According to the National Oceanic and Atmospheric Administration website, “A seven-day forecast can accurately predict the weather about 80 percent of the time and a five-day forecast can accurately predict the weather approximately 90 percent of the time. However, a 10-day—or longer—forecast is only right about half the time.” It is human nature to remember the occasional “bust” that affected your daughter’s outdoor wedding or the big game.
Now let’s get back to the Celtics and Heat, but first, an overview of key steps in how weather forecasts are made:
- Collecting observations of the current state of the atmosphere and boundary conditions (weather balloons, satellites, aircraft observations, automated weather stations, buoys, etc.).
- Quality control of those observations.
- Plotting data (weather maps, soundings, meteograms).
- Initialization of computer models with observations using the underlying physics and math understanding of the atmosphere and how it is expected to evolve.
- Assimilation of new information (via computer algorithms or human interpretation).
- Analysis and interpretation based on historical information, analogues, and model output.
- Presentation and verification of the forecasts.
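The steps above can be sketched as a toy pipeline. This is a deliberately trivial stand-in with hypothetical function names and made-up numbers, not how a real numerical weather prediction system works:

```python
# Toy forecast pipeline: observe -> quality control -> initialize ->
# forecast -> verify. Real NWP systems are vastly more complex.

def quality_control(obs, lo=-80.0, hi=130.0):
    """Drop physically implausible temperature readings (deg F)."""
    return [t for t in obs if lo <= t <= hi]

def initialize(obs):
    """Reduce observations to a single 'analysis' value, a crude
    stand-in for gridded model initial conditions."""
    return sum(obs) / len(obs)

def forecast(state, trend_per_day, days):
    """Evolve the state forward with a trivial linear 'model'."""
    return state + trend_per_day * days

def verify(predicted, observed, tolerance=3.0):
    """Did the forecast verify within a tolerance?"""
    return abs(predicted - observed) <= tolerance

raw_obs = [71.2, 69.8, 400.0, 70.5]   # 400 F is a bad sensor reading
obs = quality_control(raw_obs)        # bad reading removed
analysis = initialize(obs)            # initial condition from obs
prediction = forecast(analysis, trend_per_day=1.5, days=2)
print(verify(prediction, observed=72.9))
```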
Down 0-2 in the series, some analytics tools were still giving the Celtics a 65% chance of winning the series. Heck, some of those tools are still giving the Celtics a better than 30% chance of coming back from 0-3. Like weather forecasting techniques, I am sure these analytics are based on data collection, analysis and historical tendencies. However, they are not verifying. Here is where it gets tricky. Even with the best weather models, there will always be some uncertainty and the potential for “misses.”
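For a sense of how such a series probability can be computed, here is a minimal sketch that treats each game as an independent coin flip with an assumed per-game win probability for the favorite. Real analytics models account for far more (home court, injuries, matchups), but even this toy model shows that a 65% series probability from 0-2 down implies winning roughly three of every four remaining games:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def series_win_prob(p, wins_needed, opp_needed):
    """Probability the favorite (per-game win probability p) collects
    `wins_needed` more wins before the opponent collects `opp_needed`."""
    if wins_needed == 0:
        return 1.0
    if opp_needed == 0:
        return 0.0
    return (p * series_win_prob(p, wins_needed - 1, opp_needed)
            + (1 - p) * series_win_prob(p, wins_needed, opp_needed - 1))

# Down 0-2 in a best-of-7: the favorite needs 4 wins before 2 losses.
# Even at 70% per game, the series chance is only about 53%.
print(series_win_prob(0.70, 4, 2))

# Down 0-3: four straight wins are required, so the chance is p**4.
print(series_win_prob(0.70, 4, 1))
```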
Weather models cannot represent every detail of the atmosphere due to computational limitations. For example, a model with 10-mile grid spacing will miss a smaller thunderstorm, or the fine-scale processes that may be driving it. To mitigate these blind spots, “parameterization” is used to broadly represent processes the model cannot “see.” In the sports analytics models crunching out projections for the Heat-Celtics series, there are clearly some processes not being resolved or picked up by the numbers. I am not sure they properly “parameterized” the tenacity and will of Jimmy Butler or the contributions of a roster featuring several undrafted players.
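As a toy illustration of the sub-grid problem (not a real parameterization scheme, and with made-up numbers), consider a coarse grid cell that only “sees” the average of the fine-scale rainfall inside it:

```python
import statistics

# Rainfall (inches) along ten 1-mile segments inside one 10-mile grid
# cell. A narrow 2-mile-wide storm sits in the middle.
fine_rainfall = [0.0, 0.0, 0.0, 0.0, 2.0, 0.0, 0.0, 0.0, 0.0, 0.0]

# The coarse model only sees the cell average: the storm is smeared out.
grid_cell_value = statistics.mean(fine_rainfall)
print(grid_cell_value)  # 0.2 -- the 2-inch storm has effectively vanished

# A parameterization statistically re-introduces the sub-grid effect,
# e.g., "if the cell mean exceeds a threshold, assume convection."
# The threshold and boost here are arbitrary illustrative values.
def convective_parameterization(cell_mean, threshold=0.15, boost=10.0):
    return cell_mean * boost if cell_mean > threshold else cell_mean

print(convective_parameterization(grid_cell_value))
```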
Another aspect of weather forecasting is data assimilation. The European Centre for Medium-Range Weather Forecasts (ECMWF) uses a method called four-dimensional variational data assimilation (4D-Var), which, according to ECMWF’s website, “iteratively adjusts the initial conditions of a short-range forecast to bring it into closer agreement with meteorological observations in space and time. The adjustments are made in a manner which respects the physical laws.”
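The variational idea can be sketched in one dimension: choose the initial condition that minimizes a cost function balancing distance from the prior “background” estimate against the misfit between a short forecast and observations spread over time. This is a toy stand-in with made-up numbers and a trivial linear model, not ECMWF’s actual system:

```python
def model(x0, t, trend=1.0):
    """Trivial forecast model: the state drifts by `trend` per step."""
    return x0 + trend * t

def cost(x0, background, obs, b_var=4.0, r_var=1.0):
    """Variational-style cost: background term plus observation misfit,
    each weighted by its assumed error variance."""
    j_b = (x0 - background) ** 2 / b_var
    j_o = sum((model(x0, t) - y) ** 2 / r_var for t, y in obs)
    return j_b + j_o

background = 10.0                        # prior guess of initial state
obs = [(1, 12.4), (2, 13.1), (3, 14.2)]  # (time, observed value) pairs

# Crude minimization by scanning candidate initial conditions; real
# systems use iterative gradient-based minimization instead.
candidates = [background + i * 0.01 for i in range(-300, 301)]
analysis = min(candidates, key=lambda x0: cost(x0, background, obs))
print(round(analysis, 2))  # nudged above the background toward the obs
```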
Advances in assimilation techniques have dramatically improved weather forecasts. For example, scholarly studies have shown that forecast skill for Hurricane Sandy (2012) depended on parameterization as well as on the quality of the satellite data used in initialization and assimilation. If you are watching the 2023 NBA playoffs, it makes you wonder how much updated information about the Heat and Celtics is being “assimilated” into the analytics to adjust their initial conditions.
While most of my analogy so far has been technical, there is something else that I see in weather, too. It is called “wishcasting.” Oftentimes, the weather forecast is very clear about what will happen, but people “wishcast” because they want a certain outcome. They may love snow, hope their cookout is not canceled, or want to get in that round of golf. I often wonder if there is a bit of “wishcasting” at play with the current Celtics-Heat matchup as well.
As I write this piece, Game 4 of the series is set to be played tonight. The Los Angeles Lakers were just swept by the Denver Nuggets. Will we see another sweep tonight? With the NBA Finals not slated to start until June, it is clear to me that, based on projections and “NBA climatology,” the NBA and television planners anticipated longer series. That’s a forecast bust.