By Michael Roberts
Judging the talent of a forecaster isn't easy. According to the National Weather Service's Eric Thaler, it would require about a year's worth of data to scientifically rate a predictor's accuracy. As such, tests of broadcast meteorologists are usually conducted by amateurs. Channel 9's Mike Nelson says the viewers who most frequently grade him are elementary school students working on class assignments.
Just over a decade ago, on April 15, 1992, Westword followed in this proud fourth-grade tradition with "Storm Warnings," an article that offered a snapshot of the skills exhibited by the weather personalities at the city's three network affiliates: Channel 4, Channel 7 and Channel 9. From March 17 until April 2 of that year, their forecasts were tracked using temperatures as a guideline. If the extremes on a given day were within a cumulative ten degrees (for instance, a high off by seven degrees and a low wrong by just three), the prediction was judged correct -- or at least correct enough. Anything further awry received a thumbs-down rating.
Back then, all three stations did fairly well one or two days in advance: Channel 4 passed 73 percent of the time; Channel 7 scored with 79 percent; and Channel 9 trailed the field with a still respectable 64 percent. But things went south more often in the three- to five-day range, with Channel 4 ringing the bell for just 38 percent of its guesses and channels 7 and 9 tying at 48 percent. Simply put, forecasts more than two days in advance were considerably wide of the mark over half the time.
In the decade since then, the tools used by weather pros have reportedly improved tremendously. But has this gear translated to better forecasts? To find out, the "Storm Warnings" experiment was repeated over a very similar span, from March 19 to April 3 this year. This time, though, all five major Denver stations were critiqued under two separate criteria. The first duplicates the cumulative ten-degree benchmark used in the original article, and the second employs a slightly different, and more difficult, yardstick. Channel 7's Marty Coniglio says the human body can't feel temperature variations unless the swing is over five degrees. So Westword counted the number of times the high or the low was off by more than that, rendering a negative verdict if either or both failed to meet the standard. (Using this procedure, the example above of a predicted high off-base by seven degrees and a low incorrect by three degrees would be wrong, because the high was askew enough to feel the difference.) Unsurprisingly, this tally wound up lower than the other measure in every case.
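Those two yardsticks amount to simple arithmetic on the predicted and observed extremes. A minimal sketch of how a day's forecast could be scored under each rule (the function names and sample temperatures are illustrative, not Westword's actual bookkeeping):

```python
def within_cumulative(pred_high, pred_low, actual_high, actual_low, limit=10):
    """Original 1992 rule: pass if the high and low errors
    together total no more than `limit` degrees."""
    return abs(pred_high - actual_high) + abs(pred_low - actual_low) <= limit

def within_per_reading(pred_high, pred_low, actual_high, actual_low, limit=5):
    """Stricter rule: fail if either the high or the low misses
    by more than `limit` degrees -- Coniglio's threshold for a
    swing the body can actually feel."""
    return (abs(pred_high - actual_high) <= limit
            and abs(pred_low - actual_low) <= limit)

# The article's example: a high off by seven degrees, a low off by three.
print(within_cumulative(72, 45, 65, 42))   # True: 7 + 3 = 10, just inside
print(within_per_reading(72, 45, 65, 42))  # False: the 7-degree miss is felt
```

As the example shows, a forecast can squeak past the cumulative benchmark while flunking the five-degree test, which is why the second tally came in lower across the board.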
Overall, the picture today is very much like the one taken in 1992, despite the fact that the sixteen days analyzed below were fairly uneventful for springtime in Colorado. The outlets were reasonably good at anticipating how things would turn out 48 hours or less in advance, but they proved to be considerably weaker with each succeeding day.
Dave Fraser and Casey Curry helmed Channel 2's weather segments during the assessment period, and they had a bit of a rough go. Predictions two days or less in advance were within ten degrees 69 percent of the time, and the three- to five-day forecasts wound up at 38 percent. Things got even uglier when the station was judged by the five-degree rule. The one- and two-day forecasts landed within range just 41 percent of the time, and only 17 percent of the three- to five-day hypotheses kept both the high and the low in that ballpark.
Still, Channel 2, which wasn't rated a decade ago (to the profound displeasure of then-chief meteorologist Al Fogelman), had relatively few catastrophic busts, usually missing by less than twenty degrees, especially as the actual dates grew closer. (Its worst moment was April 2, when a prediction two days earlier stumbled by a collective 31 degrees.)
Fraser, who's predicted the weather at assorted stations since the 1980s but has been in the Denver market for less than a year, feels he's improving each day. "Everything comes with experience," he says.
Larry Green is one of just three weather forecasters who have managed to survive in Denver TV long enough to have been part of Westword's initial trial; the others are Channel 7's Coniglio, who worked weekends at Channel 4 in 1992, and Channel 9's Nelson. But this time around, Green was on vacation for thirteen of the sixteen test-period days, leaving the job of monitoring the skies mainly to Jennifer Zeppelin and Dave Aguilera. Their efforts led to a mixed result.
One- to two-day predictions were within ten degrees on 70 percent of their tries, as opposed to 73 percent in 1992, but three- to five-day assumptions scored 48 percent, a ten-point upgrade. Channel 4 also scored well in the five-degree challenge, predicting the high and low on that scale 62 percent of the time one or two days ahead, and registering 36 percent three to five days in advance. The latter may not seem terrific, but none of its competitors even came close.
Perhaps one of the reasons for Channel 4's triumph is its decision to make individual predictions for Denver and DIA -- the only local station to routinely do so. In Green's view, this approach makes the predictions more useful to viewers. "It's the old George Carlin joke: 'Why are they telling us what temperature it is at the airport when nobody lives there?'" Green says, adding, "We want to be where the people are, and we try to be as site-specific as we possibly can."