Credit where it's due
In message, Robb C. Overfield writes:
On a tatty piece of sub-ether, Dave Cornwell said...
wrote in message...
The early warning of torrential rain for the southeast prompted me to
look at the European radar. Hardly anything significant at that time
so clearly the experts expected a development rather than movement of
existing storms.
And weren't they spot on with the area and intensity! Brilliantly
done.
Jack
---------------------
Absolutely NOT, I'm afraid. I am right on the edge of the 60% at-risk area
and we got 1mm! The rain was much further east than predicted and there was
virtually none in London and the eastern suburbs. Upminster 1mm, City
Airport - trace.
Dave, Laindon, S.Essex
Never mind Dave, you can be the one to predict these risk areas next. Then
we can say YOU got it wrong. Trying to predict these areas 12-24 hours in
advance cannot be very easy, even given the information on offer.
One of the definitions of a forecast is "to gauge or estimate (weather,
statistics, etc) in advance". That's all a weather forecast is, an
estimate of what is going to happen, given the data before the forecasters
at any one time. As that data changes so the forecast will change...
Personally, I find people complaining about weather forecasts being wrong get
right up my rather ample nasal passages, a bit like a bad smell - and
there are enough of those about this city as it is.
I have to disagree with you Rob. If the forecast changes because the
input data changes, then the initial forecast was a waste of time and
money, and the user would have been better off without it.
A weather forecast has no value in itself. It gains value by influencing
the decisions of the users of the forecast. If these decisions turn out
to be bad decisions because the weather forecast was inaccurate then
what was the point in having the weather forecast in the first place? In
many cases it's not possible to undo decisions once they have been
taken.
It's important to be aware of the difference between precision and
accuracy. If, for example, the forecast maximum temperature for tomorrow
is predicted as being between 10C and 20C and the actual temperature
turns out to be 17C then the forecast verifies as being accurate, if
rather imprecise. On the other hand if the forecast had been for a max
between 14C and 16C and the actual was 17C then the forecast would have
been inaccurate even though it was more precise. It depends on the
requirements of the user as to which forecast would have been the more
useful. To obtain higher accuracy the forecasts have to be less precise,
and there is a lower level of precision below which the forecasts
become of no value, even if accurate.
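
To put numbers on that distinction, here's a minimal sketch (purely
illustrative - not any real verification scheme) that scores the two
example temperature forecasts above for accuracy and precision:

    def verify_interval(low_c, high_c, observed_c):
        """Score an interval forecast: accurate if the outcome fell
        inside it, precise in proportion to how narrow the interval is."""
        accurate = low_c <= observed_c <= high_c
        width = high_c - low_c
        return accurate, width

    observed = 17.0
    print(verify_interval(10.0, 20.0, observed))  # (True, 10.0)  accurate but imprecise
    print(verify_interval(14.0, 16.0, observed))  # (False, 2.0)  precise but inaccurate

Which of the two is the more useful still comes down to the decision the
user is hanging on the forecast.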
The Met Office claims something like 85 percent accuracy in its 24-hour
forecasting. You can do all sorts of clever things with statistics but
when it comes down to situations in which it really matters, such as the
extent of the heavy rain this morning, my gut feel is that the true
accuracy is a lot less than that. Where they fall down, time and time
again, is in trying to be much too precise.
As someone who has spent a lot of his career in weather forecasting I
remain unconvinced that forecasts are really useful in anything other
than the very broadest terms and, even then, over only relatively short
forecast periods. I am not at all convinced that all the money that goes
into weather forecasting today is really money well spent. I spend a lot
of my time these days hindcasting past weather events. Frequently I find
that the weather forecasts that were relevant to the event were very
inaccurate and, arguably, the user might have been better off without
having the forecasts.
On the other hand, I think nowcasting is a very valuable exercise but we
don't do that in this country.
I know there are many who disagree with my views but that's life :-)
Norman
(delete "thisbit" twice to e-mail)
--
Norman Lynagh Weather Consultancy
Chalfont St Giles 85m a.s.l.
England