Created: 29 October 1998. Most recent update: 18 September 2013 ... fixed an error apparently resulting from my faulty memory.
I've been involved with issues tied to forecaster training for most of my career. Perhaps the saddest part of that lifelong commitment is that I've seen very little substantive change. Harold Brooks and I have said that there are two absolutely essential implications of caring about the quality and value of forecasts: (1) a meaningful forecast verification program, and (2) a substantive forecaster training program. The absence of these in any forecasting service gives unmistakable evidence of the absence of a commitment to forecast quality ... there can be no compromise on this point. It is abundantly clear that operational forecasting in the National Weather Service [NWS] and in many private forecasting enterprises is characterized by the absence of these two elements ... the conclusion is inescapable!
This essay is not the forum for addressing forecast verification. I've talked about it elsewhere. Here, I want to put down my ideas about forecaster training, and indicate where they are today. I have written (and co-written) about forecaster training several times (see, for example, here and here and here), but all of these have had to be toned down to pass muster for publication. In this document, I intend to call 'em as I see 'em.
The only room for discussion in the preceding revolves around the adjective "substantive", so the rest of this essay is going to describe what it takes to be substantive and explain why none of the existing programs in the NWS are substantive. Further, I'm going to propose an alternative program that I believe is what should be provided.
I am constantly being told by various naysayers to whom I've explained these ideas that I'm unrealistic. There is "no way" that the NWS or any other organization can afford to do what I am proposing. Well, my response is basically that there is no way they can afford not to do something like what I am proposing if they have a commitment to forecast quality! I think the willingness to invest in hardware should be balanced by something representing a decent commitment to the people who are going to use the hardware. Let's say that we expect a commitment to training that is 10 percent of the investment in hardware, up front. Anything less is, in my opinion, unrealistic! Good, successful private corporations often say that training should be at least 15 percent of the cost of new hardware. If the NWS has just invested several billion (with a "b") in new hardware, they should be investing several hundred million in training programs!! The actual investment is about 1 percent of that. Draw your own conclusions, folks. If NWS management can't find a way to fund substantive training, it's not because it's truly impossible -- merely that they don't want to do it.
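The arithmetic behind this claim is easy to check. A minimal sketch follows; the dollar figures are just the essay's round numbers ("several billion" in hardware, a 10-15 percent training commitment, an actual investment of about 1 percent of the 10 percent floor), not actual NWS budget data.

```python
# Back-of-the-envelope check of the training-investment argument above.
# All figures are illustrative, taken from the essay's round numbers,
# not from any actual budget document.

HARDWARE_INVESTMENT = 3e9  # "several billion" (with a "b") in new hardware


def training_budget(hardware_cost: float, fraction: float) -> float:
    """Training budget implied by committing a given fraction of hardware cost."""
    return hardware_cost * fraction


proposed_floor = training_budget(HARDWARE_INVESTMENT, 0.10)   # essay's 10% floor
corporate_norm = training_budget(HARDWARE_INVESTMENT, 0.15)   # "at least 15%" norm
actual_estimate = proposed_floor * 0.01                       # "about 1 percent of that"

print(f"10% floor:         ${proposed_floor / 1e6:,.0f} million")
print(f"15% corporate norm: ${corporate_norm / 1e6:,.0f} million")
print(f"~actual investment: ${actual_estimate / 1e6:,.0f} million")
```

On these assumptions the 10 percent floor works out to several hundred million dollars, while the actual investment is on the order of a few million, which is the gap the paragraph is pointing at.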
As a final introductory thought, it was clear from listening to the pilots of American warplanes during the Gulf War that training was the key to their success. Over and over again, during their interviews they would say something like "Well, I was pretty scared, but then the training kicked in. It happened just the way we trained!" We need to take advantage of that same level of meaningful training in developing forecasters.
The NWS training program is a sad story of underfunding, lack of commitment, and faulty assumptions. For all practical purposes, the NWS assumes that obtaining a B.S. degree at Goshwotta University provides the forecaster with all the meteorology he or she will ever need to be a successful forecaster. Of course, many forecasters go on to obtain M.S. degrees and a few even get doctorates. Nevertheless, the idea is that 30 semester hours of meteorology is sufficient education for a forecaster. It's a fascinating topic of conversation to discuss the relationship between education and forecasting ... I've put some thoughts down on this one, too. From where I sit, I find this implicit assumption almost ridiculous. How could anyone think a B.S. is enough education for the most challenging task ever undertaken by a meteorologist anywhere (i.e., weather forecasting)? Without belaboring the point, I believe obtaining an education in meteorology leading to one or more degrees is an important part of becoming a good forecaster, but it doesn't guarantee anything. Why not?
Essentially, universities are not in the business of training ... they're in the business of providing an education. The idea is that education forms a sound basis upon which a training program can be built. Learn basic scientific principles in school and then use those principles to develop methods for practical forecasting applications. Of course, those same principles can be used to develop other applications of meteorology besides forecasting. Hence, the universities properly claim they are not even trying to train their students how to be forecasters.
Again, without going off on another tangent, university programs in meteorology tend to focus a lot on developing theory. Class time is dominated by the derivation of equations. Many students who have their hearts set on becoming forecasters gripe about the apparent irrelevance of this theory to their job goals ... I already have expressed myself on this one. Nevertheless, that's what they get. In my own experience, it became clear to me as an undergraduate that what I saw forecasters doing in a forecast office was not clearly and obviously related to what I experienced in the classroom. This was of some considerable concern to me, but I was not then experienced enough to understand what was going on. Now I see that universities focus on education because not all their graduates will go on to apply their education in the same venue ... we can argue about how well the universities accomplish their educational goals, but it's clear they are not training institutions! I will say that the staff of the typical meteorology program doesn't include anyone who is really qualified to train students in operational forecasting, even if they wanted to do so. How many university professors have ever been weather forecasters?
In the NWS, at present there are three geographically (and philosophically) distinct "training" programs: the NWS Training Center in Kansas City, MO, the Cooperative Meteorological Education and Training [COMET] program in Boulder, CO, and the Operations Training Branch [OTB] of the WSR-88D Operational Support Facility [OSF] in Norman, OK. Let's consider these in turn.
The NWSTC has a relatively long and checkered history. Its course offerings are dominated by training programs designed to provide specific information about technical aspects of NWS operations, mainly for electronics technicians. However, there have been a few programs aimed at meteorologists. After the Big Thompson flash flood in 1976, Bob Maddox and Charlie Chappell (then of ERL's Atmospheric Physics and Chemistry Lab in Boulder ... an organization that no longer exists) were invited to participate in a Flash Flood Forecasting Course (FFFC) that was developed by what was then known as the NWSTTC -- the extra "T" stood for Technical Training, which reflected the center's emphasis on training for technicians. The development of the FFFC was at least part of the eventual impetus for the name change, since the course was aimed mostly at NWS meteorologists (and hydrologists). After I moved to Boulder to work with Bob Maddox in 1982, he induced me to help out with his part of the FFFC and so began my lengthy connection to the NWSTC. I taught the 2-day segment (that Bob and Charlie developed for the FFFC) anywhere from one to three times per year for roughly ten years. Some interesting results from that experience can be found here.
At some point [I don't even know the year], the NWSTC began the "Forecaster Development" Course (FDC). This was supposed to be entry-level training for NWS forecasters. I've never taken the course, and never even sat in on it. However, I've been told by most of the attendees that it basically is a silly waste of their time. Essentially everyone passes; the course is taught in a very sterile classroom setting, with very little in the way of challenging new information. From the reports I've received, it's probably better than no training at all, but only trivially so.
A source [who wishes to remain anonymous] adds:
... the inherent problem with the old FDC course (other than the fact that the course was very watered down like you mentioned) was that the interns who took the course (like I did in 19__ [date censored to prevent possible recriminations!]) came back and did met. tech. duties at their offices (taking radar obs, u/a obs. NOAA WX radio, etc.). They didn't do any forecasting duties to reinforce any operational education or skills they may have learned at FDC. Thus, the course was virtually a waste of time (except for getting out of the office for 2 weeks). In fact the whole intern program had very little professional forecaster development activities other than some very lame paper modules which you were required to take to get your GS rating from a 5 to a 9.
Being in Kansas City, the NWSTC has only tenuous connections to any other educational resources, since there is no university meteorology program in the Kansas City metropolitan area.
There has been virtually no follow-up to any of the training programs offered by the NWSTC, in order to measure their success. Sure, some "feel good" surveys have been done among the "graduates" of NWSTC courses, but no one has ever attempted to measure (a) the extent to which behavior on the job has been changed by the training, or (b) assuming operational behavior was changed, whether the results were detectably in the direction of forecast improvement.
It's illustrative of the situation to consider events that occurred after my participation in the FFFC ended. I was contacted by someone [who will remain anonymous] at COMET (see below) who requested my participation in developing a training module that would "duplicate the training outcome" from the 2-day program that Bob Maddox had developed and which I helped teach for 10 years. My response was that I would be delighted to attempt to duplicate that training outcome ... if anyone could tell me what that outcome was! How could anyone duplicate an unknown outcome? Well, in one sense, it's easy -- if the outcome is unknown, then the module can't help but be "successful"! Since no one, including me, could duplicate an unknown outcome, this pre-condition created an uncomfortable situation. Bob Maddox is a much nicer guy than I am, and COMET worked with Bob (rather than me) to develop their module. This incident illustrates: (1) NWSTC's lack of any real follow-up on the FFFC to determine the training outcome, and (2) COMET's approach to training module development.
Correction (18 September 2013) - It appears my memory of this development is faulty, somehow. Bob Maddox informs me he never contributed to the development of a COMET training module on flash flood forecasting, for reasons similar to mine. I apologize for the error, and feel a little befuddled by my evident memory flaws.
This organization is the product of an interaction between NCAR and the NWS. So far as I can tell, its main reason for being has been to funnel resources from the NWS into NCAR. From where I sit, NCAR is hardly an organization with a long history of meaningful involvement with operational meteorology. At one point in NCAR's history, the very notion of applied research was considered about as welcome as Hanta virus-laden rodent feces. However, it appears that some NWS bureaucrats had their eyes on retiring in Boulder, so a bit of nest-feathering looked like a good thing. Everyone knows that scientists can be bought. Presto! COMET was born. Perhaps this might be regarded as hyperbolically cynical. I plead guilty to cynicism in this regard. I simply can't imagine how anyone in the NWS hierarchy ever could have conceived of NCAR as a plausible partner in developing NWS training programs!! It seems strange to me that the NWS overlooked potential partnerships within NOAA in favor of the organization most known for a negative view of applied meteorology, to say nothing of operational applied meteorology.
Owing to my involvement with the NWSTC, I was asked to participate in COMET. I have steadfastly refused to do so. Why? The short answer is that I simply did not want to be associated with this chimera. I did not want to seem to be endorsing its programs by my participation and I didn't want my name connected to it. So why this extremely negative reaction?
Well, consider the fact that NCAR is not exactly bulging with operational forecasting experience. Ahhh, but they bring in guest lecturers! Where do the guest lecturers tend to come from? The universities!! If forecasters were upset because the education at the university they received was inadequate training for operational forecasting, is some miraculous transformation going to occur when those same professors arrive at COMET? Are they suddenly going to become operationally-oriented and able to tune their presentations to NWS operational circumstances? Sorry, I find this rather hard to believe.
What the COMET resident students get is more of whatever they got at the university, not training! This may be useful, but they get this education in little segments, at rare intervals. Moreover, it's essentially up to them to learn how to apply this education. The fact that COMET provides forecasters with some continuing education is probably at least better than nothing. What COMET looks to me to be, however, is something that remains pitifully inadequate in terms of real training, but which allows the NWS to say "We're doing something about training! See this great program with universities we've developed!" Sorry, I didn't (and still don't) want to be connected with yet another "settle-for" NWS training program.
Of course, another thing that COMET has been famous for is creating "distance learning" modules, using various media. There is nothing inherently wrong with distance learning, but the NWS is using it as a panacea for the absence of meaningful training. It's another "settle for" answer ... I'm told in no uncertain terms that there isn't enough funding to support residence courses for the forecasters and the staffing is so critical that forecasters can't be spared to go to residence courses. The "answer" then must be distance learning! I could write a book about what's wrong with this philosophy, but I'll leave that for another rant, sometime.
Suffice it to say that this "answer" is pitiful -- and forecasters still must find time to do the modules among their other responsibilities. Whatever minimal value might be associated with these "M&M-sized" bits of meteorology is reduced even further when forecasters have to work through them a bite or two at a time, rather than getting dedicated time to concentrate on the material. COMET's willingness to crank out these modules is, from where I sit, another reason why I don't want to work with them. Good intentions simply don't make up for the lack of a meaningful training program.
In any case, COMET has been and continues to be a program that I simply can't endorse by participation. It absorbs a substantial chunk of the NWS allocation for training, and gives the forecasters relatively little in return for that investment.
Erik Rasmussen and I have had an e-mail exchange about my position regarding COMET. A lightly-edited version follows, with our respective contributions labeled:
Erik: Re: COMET... I can't argue with most of your criticisms. But I do take issue with you for not getting involved to fix COMET. Why do I say this? In my 15 or so years in professional meteorology, I have come to the conclusion that PROGRAMS are never the answer to problems, no matter how well designed, intended, or funded. ...deletia... I believe you are mistaken to hold the personal position you hold regarding COMET.
Me: One of the goals of my essay, apparently unsuccessfully done, was to let folks know precisely WHY I have taken the position that you are disputing. After years of working with (for?) the folks at the NWSTC, it dawned on me that all I was doing was supporting the NWS bureaucrats who provided only this minimal, "settle-for" training. My participation, whatever impact it had on the forecasters who were in my class (and no one knows what that impact was, for reasons I've explained), was underwriting a program in which I didn't believe. It reminded me of some of my unpleasant experiences with PROFS ... I didn't want to appear to be validating what the NWS is doing with respect to training. I still don't.
Erik: When it comes to training people to do a better job in the warning process, there are two people who shine... you and Al. Do you shine because of your organization? Hardly. Because of the level of commitment by the NWS to your efforts? (Laugh; I thought I'd die!) You are able to do what you do (however minimal and insufficient it is) because of your special talents, abilities, and experiences. You have great breadth of knowledge of the topic and you have "been there". Don't take this as a compliment!
Me: I am quite unperturbed by this ... it certainly is true that the combination of talents and experiences I've had is unique ... the same can be said of anyone, so I'm not about to get a swelled head over your saying so. And I quite agree that it is the people, not the organization, that matter. See below.
Erik: ... If the NWS were to dedicate $200M to CPTO (Chuck's Perfect Training Organization), do you think it could be effective at severe weather and flash flood training _without you_? Do you suppose they could take another NWS forecaster and turn him into you for training purposes? I very strongly doubt it. It's your unique abilities in this area that would make for successful training with or without the CPTO bureaucracy, and with or without the $200M.
Me: You make a very good point. Even if the CPTO/FPS is created, it still must be staffed with the right people or it will fail, inevitably. You didn't raise the point specifically, but a problem that the FPS would have is FINDING the right people, and getting them to join the FPS. I don't know if they exist in sufficient numbers even to provide the staff of this one organization, but if they do, they may not want to leave whatever their current situation is to become part of the FPS. My operating assumption is that whereas no organization can succeed without the right people, the right people can be made ineffectual without a proper program. No framework exists right now that can be made to work, right people or no!
However, the problem, as I see it, is NOT the lack of the perfect organization (an obviously silly goal), but (a) the lack of a high-level commitment to training and (b) the absurdly low resources allocated to the training task, allocated among 3 disparate organizations within the NWS. Until they DO something to change those characteristics of their approach to training, then I simply find it unacceptable to put my name into an organization that could never accomplish what needs to be done. I guess I got tired of seeing what needs to be done and then being told repeatedly that I'm "unrealistic" in my expectations of a real training program. Perhaps I can be criticized legitimately for giving up on the NWS ... but my capacity to overcome depression was finite. Ten years of it was quite enough.
Erik: Thus my conclusion is that you diss COMET for the wrong reasons. COMET, or any other training organization, with any training program, can't succeed the way you would like them to succeed without rounding up the few, very particularly qualified people that are needed to do the job. And if those people decline to participate because they don't like the organization (or its host organization) or its funding level, then failure is assured.
Me: Agreed, but (a) in effect, I have no interest in seeing COMET succeed! ... although I don't expect an organization to be perfect, this one is so flawed from its inception that I don't think it can possibly be made to succeed. Try to change your organization's agenda as an employee from within and see how far YOU get. My vision of a proper training organization is based on the Canadian model (that no longer exists) ... (b) COMET itself is not interested in rounding up the people to do the job that I want it to do. Those folks have their own concept of what they want to accomplish and without support from the highest levels in the organization, all I could do is snipe at the margins of their agenda. Their agenda may have worthwhile components but it certainly doesn't coincide with mine. Fighting them from within simply isn't a good use of my time.
Erik: Or maybe all of the foregoing is just an expression of my cynicism towards organizations and their ugly effect on scientific progress. The best organization to belong to is the one that doesn't even know you're there.
Me: Well ... I think the BEST organization is one that really supports what you are trying to do. I've never been part of such an organization, although I HAVE had the benefit of benign neglect from time to time.
Erik: I think I see exactly where you're coming from... I guess I even saw it yesterday but didn't realize it.... if the ORGANIZATION gets in the way, then even the RIGHT PEOPLE can't accomplish their goals. So the thing to lobby for is an organization that, at a minimum, doesn't get in the way; and better yet... an organization that knows enough to promote the activities of the most qualified people.
The OSF's Operations Training Branch has a short history. It was created in response to the need for training focused on the new "NEXRAD" radars ... a need that apparently was even recognized by the NWS! I was involved, to my regret, with the development of some "Pre-NEXRAD" training modules, largely because of the request for my involvement from some friends of mine in the field offices. Reluctantly, then, I had to work with NWS management to help develop these modules. Again, I could write a book about this experience, but will defer most of that to another time [if ever].
Eventually, the NWS cycled a lot of its staff through the actual NEXRAD residence training programs. Amidst the focus on "knobology" some of the more dedicated OSF-OTB folks even managed to sneak in some respectable meteorological training material, and I was pleased to be involved in some small parts of that. There have been some pretty good folks working for the OSF-OTB over the years, and I enjoyed working with them when they asked. Eventually, however, that residence training ran out of folks to train and has been eliminated, apparently to be replaced by "distance learning" or whatever. Reports I got from forecasters said the residence course was good but they threw a LOT of material at the students in a short time, and a lot of it was focused on knobology rather than meteorology.
In the time since the NEXRAD residence training was terminated, the OSF-OTB has come increasingly under pressure. Talk currently is circulating that whatever staff is left is going to be merged with the NWSTC in Kansas City when the latter group moves into its new building. Morale is sinking as it is clear that the OSF-OTB organization may not survive as such much longer, a likely victim during one of the seemingly endless "restructuring" exercises that NWS and NOAA bureaucrats seem to love so much. In spite of my empathy for many of the OSF-OTB folks, in the long run, if NWS training is to have any hope for substance, the disparate organizations need to be merged. They also need other things, most of which probably will not be forthcoming any time soon.
My impression from all of this is that the NWS isn't really serious about training ... period. Its training is scattered among three relatively small organizations, with virtually no coordination among the disparate training programs. If there were a training "czar" at NWS headquarters, at a level comparable to the Office of Meteorology, say an Office of Training, then NWS management would be demonstrating a substantive commitment to training. Clearly, if there ever were to be such a high-level training bureaucrat, a lot is tied up in the selection of this individual ... pick the right person, and the system might well flourish ... pick the wrong person, and things might well get worse! In any case, training needs an advocate at the highest levels in the NWS. In the absence of that high-level commitment, the disorganized chaos of the existing programs simply represents a set of showpieces that allow the NWS to say it's doing something, whenever the subject comes up, without actually doing much at all.
Another issue of mine is the "certification" of forecasters. I am constantly told that the certification of forecaster competence at the job is impossible, especially because of the NWSEO (the dreaded union). I won't deny that the NWSEO is a hiding place for incompetent bozos more concerned with getting a perpetual paycheck for doing essentially nothing -- that doesn't characterize everyone in the NWSEO, I hasten to add. The fact that management is somehow so afraid of the union that they can't do anything about certification strikes me as literally incredible. After all, what could the union do? Federal employees are forbidden to strike and, as the air traffic controllers demonstrated so clearly a few years back, if they did, the government is quite capable of responding. In fact, it's appropriate to refer to the air traffic controllers in this very context. The FAA's air traffic controllers (ATCs) are required to attend a rigorous entry-level training program -- I call it rigorous because lots of those who enter the program wash out before it's done. Those who wash out are simply gone. Every controller has survived a really tough entry-level training evolution. If the FAA can do it, why can't the NWS? I've been asking that very question for 20+ years and have never heard anything even resembling an answer!! I'll have more to say about certification later.
There clearly is no high-level NWS commitment of a reasonable level of resources aimed at providing rigorous training for NWS people, matched to the investment in new hardware. The message is clear: people don't matter. In fact, the goal clearly has been, and continues to be, a reduction in staffing. How can the NWS afford its hardware investment? Obviously, the pact with the politicians has been to reduce the payroll in exchange for that hardware ... I think the gamble that NWS management has made consistently is that staff reductions will be fought by local politicians seeking to keep the same number of Federal jobs in their districts, thus allowing the NWS to have its cake and eat it, too. Bargain away the staffing, and then hope that politicians will fight among themselves to keep Federal staffing level in spite of the bargain. I believe this to be a dangerous game that the NWS is playing, and one that has morale in the field at really low levels ... but I digress.
Essentially, there is no training program, apart from the pathetic "Forecaster Development" course at the NWSTC. The Science and Operations Officers (SOOs) in the local offices get a COMET course as they enter into their position. Various distance learning modules constitute the continuing education within the system. That's about it. No rigor. No serious effort to get qualified instructors. Trivial resources, scattered among three disparate facilities. No certification. Trivial continuing education/training. Very limited professional growth opportunities for field staff. No training advocacy at the highest levels in the NWS. Not much of a program.
From this background, what I believe to be substantive should be pretty obvious by now. Let me develop this in bullet form first and then I'll expand briefly on each item.
A plausible entry-level training program should begin with a battery of tests, designed to assess the entrant's meteorological knowledge level and to weed out those candidates who clearly are unsuited to become weather forecasters. Regarding the latter point, in order to do this, we must determine what it takes to be a good weather forecaster. What skills and inherent abilities must a good forecaster have? No one really knows, at the moment. Let me return to my military metaphor. To be a jet fighter pilot, everyone agrees that they must have outstanding vision. It's not sufficient, but it is necessary. Anyone with impaired vision is clearly unqualified to be a jet fighter pilot, no matter how well-qualified they may be otherwise. Everyone I know in the forecaster community agrees that we have the weather forecaster-equivalent of legally blind jet fighter pilots working on the desk right now! Somehow, we must determine how to test for the suitability of an individual to be a weather forecaster and keep those unsuited for the task out of the system from the outset. The task of learning what it takes to be a good weather forecaster certainly will involve outside expertise, say cognitive psychologists, to work with us ... it's not a project that can be done easily, but if we care about the quality of forecasts from our human forecasters, it's something that simply must be done. Those whose meteorological education is shown to be inadequate via the entry tests would be enrolled in a remedial meteorological education course. Failure in that course, which would be tough and fast-paced (8-h days, 5 days per week, for 2-3 months), means the candidate washes out -- gone!
Presumably, there would have to be some considerable discussion about the details of the exam and the course of remedial education. I know that the exam should not be just a test of how well the candidate has memorized a textbook, or handles mathematical derivations. Rather, the test should be keyed around the ability to do meteorological reasoning. And the remedial coursework should use classical course material but in a new way, to emphasize the application of meteorological concepts and physical understanding in the task of forecasting, preparing the student for the exam (but not just keyed to passing the exam).
Assuming successful completion of the testing, the candidate moves into the training program. In my view, this program must have several attributes:
Again, considerable attention would have to be applied to the task of developing the training plan. I make no claim to have figured all this out in detail, yet. Certain topics would have to be covered, others would be optional (perhaps depending on the progress of the class, or what actually happened in the weather). The idea is not to teach a fixed set of subjects ... this is not an academic exercise! Rather, the main focus is to use sound meteorological principles as they apply to the task of forecasting. Flexibility, not rigid application of "rules" is the key. The important thing is the process of using meteorology in a realistic forecast setting, not getting certain subjects "covered" or having the practice forecasts turn out to be correct.
My views of this have been driven by my own experience. It isn't easy to get students to understand the relationship between theory and practice. They need to see that relationship in the live weather, where the instructor doesn't know the "answer" beforehand. They need to see it many different times, under many different circumstances. They need to see its limitations in practice, as well as its strengths. They need lots of repetitions to convince themselves that principles of meteorological diagnosis actually relate to field forecasting. If live data are going to provide the backbone of the training content, then a program has to last long enough for examples of important phenomena to arise, perhaps several times. For all of these reasons, the entry-level course can't be less than 6 months in duration! This is no doubt a long program, relative to those to which NWS forecasters are accustomed, but I have my doubts they'd be bored with it. The Canadian program was of this order, and they were not bored! If we choose the 6-month option, then the course should be turned around at the solstices, so that the live weather includes both warm season and cool season events, for illustrative purposes.
The value of using operational equipment should be obvious ... it's important to be able to evaluate the student's progress in a way that is comparable to what they will do in the field. A student's shortcomings in performance will be clear; those unable (for whatever reason) to improve on their shortcomings during the training course will wash out, irrespective of their academic accomplishments ... gone. I believe firmly that about 30% of the candidates should be washing out, on average, for the program to have serious credibility ... comparable to the FAA's ATC training program.
With the successful completion of the training, the program is wrapped up in a comprehensive final examination that doesn't simply test the candidate's ability to memorize, but focuses on performance in forecasting situations. Note: "grading" for the course is essentially pass/fail ... this should not be treated like an academic exercise. The student is considered either qualified or not qualified. The grading wouldn't be focused on getting the "right" answer so much as the methods used to obtain an answer. Successful completion of the final exam would be rewarded with the certification of the candidate. Failing the final exam would mean a review of the reasons for the failure and a decision made about the candidate's being allowed to take the exam again ... I'd vote for "two strikes and you're out" but with the proviso that after the first failure, some remedial training might be required before taking the exam a second time. No one would graduate without demonstrating all of the characteristics needed to be a successful weather forecaster. This leads naturally to:
If everyone who becomes a forecaster has survived a really tough program, this tends to build a real esprit de corps among the graduates. Each class builds lifelong friendships within it that can be valuable in the real world. From what I observed about the now-vanished AES training program, most of those who were "on course" together tended to be permanently connected by the experience, and would contact each other even when separated by great distance, because they knew and trusted each other.
In the past I've drawn analogies with the medical communities, but now I want to return once again to a military analogy: the elite military units (like the Navy SEALs, or the Marines). Being a certified forecaster would become a thing of pride. No longer would forecasters be considered the washouts of the academic programs. Being a forecaster would become something to which young people would aspire, not something to settle for if you couldn't cut it academically. No longer would forecasters feel like second-class meteorologists with respect to their academic colleagues. The academic types would be forced to look upon forecasters with a new respect. Perhaps most important of all, the quality of forecasts would almost certainly improve dramatically! The staff within an office would simply not tolerate anyone who wasn't as committed to the task as the rest of the team. All of the good things associated with the professionalism of elite military units would apply to forecasters.
Clearly, if certification is associated with successful completion of a very tough program, it is not enough simply to let the process end there. Certification should not be granted for life! Thus, it seems plausible to me that forecasters should be recertified at regular intervals (say, on the order of every 5 years) by having to demonstrate continuing professional growth and forecast competence. I don't know the details of how other professions deal with recertification, but if certification isn't simply a matter of showing up, then recertification shouldn't be, either. Some ideas:
Failure to be recertified would be associated with a period of probation and required additional training and/or education, followed by a second review. Again, I'd recommend the "two strikes" policy -- two failures would result in mandatory termination of employment as a forecaster. Recertification has to be a tough policy, or it has no meaning. A lax policy sends the message that performance isn't important.
4c. Real connections
A continuing frustration to me with meteorological education, as well as meteorological training, has been its fragmentation. Students learn their material in little "boxes" [courses] where little or no effort is made to connect the content of that particular course with content in other courses. It's mostly up to the students themselves to provide the connections, to develop a synthesis of the field's components. This isn't entirely bad -- for the intensely motivated few, the exercise can be quite worthwhile, fostering self-reliance. However, most students aren't up to the task and this gap often shows up in the inability to put those pieces together in the forecasting arena.
It's especially bad when training is given in this compartmentalized fashion. At no time in the current education and training process are all the component parts of meteorology re-assembled for the prospective forecaster. Radar training is given in a module. Training about convective storms in another. Extratropical cyclones in still another. The modules may be separated by years of no training at all. I assert [without proof, since it's never been tried] that a substantive training program must put considerable emphasis on tying together all the disparate pieces. A coherent process, such as I've suggested in section 4a, above, would include enough flexibility to spend time filling in gaps about various topics during a consideration of the meteorological situation as a whole. The map discussions and lectures would provide information in context and, properly done, would make a point of emphasizing connections among the topics all along.
No matter how good a module is, if it is taught in isolation from the other connected topics, then it's woefully inadequate for training. My familiarity is with severe convection ... it ties intimately to a vast range of topics [virtually all of modern meteorology, in fact]. Teaching about how storms respond to their mesoscale and synoptic-scale environment is a perfect time to discuss how those environments evolve, and what role convective storms play within them. It's also a perfect time to talk about integration of information from diverse sensors to come up with the best possible assessment of those environments. I could go on and on, but the basic idea is that isolated blocks of training are vastly inferior to a lengthy program that includes an emphasis on connections among topics. It takes time and many repetitions to see meteorology as a coherent structure emerging from its component parts. Isolated modules, irrespective of their individual quality, cannot provide this essential piece of meteorological understanding.
4d. Professional growth
In order to require professional growth experiences for recertification, the NWS would logically have to provide opportunities for professional growth. My suggestion is that this take the form of a forecaster's "postgraduate school" along the lines I am going to develop in the following section.
This is an idea that, to the best of my knowledge, should be attributed to Bob Maddox. It's a dream that Bob had, shared with me, and now I continue to believe in that dream. It's quite probable, however, that I won't live long enough ever to see it become a reality. To a considerable extent, I've borrowed concepts from my experience to graft onto the basic matrix Bob shared with me. For example, I was privileged to see the marvelous entry-level training program for forecasters that the Canadians once had in the Atmospheric Environment Service (AES). Sadly, the budget guillotine fell heavily on the Canadian training program -- it was gutted and the terrific training system that we used to be able to point to is now just a fond memory, the people scattered to the four winds and the idea stomped into the dust.
Nevertheless, the model lives on in our minds, if nowhere else. What I now envision has several components:
Clearly, all of the training/education programs of the NWS would reside in the FPS. Done properly, it could be accredited such that its courses would be acceptable in universities for graduate-level credit. It might even become a degree-granting institution in its own right, roughly like ETH in Switzerland, although this certainly isn't necessary for what I have in mind. In effect, this becomes the NWS training and educational facility. NWS forecaster training would be part of our investment in NWS people, but outsiders could be invited, say from the private sector, if their participation was paid for -- it might even be possible to run such a training facility as some sort of COMET-like collaboration between the NWS and someone else ... but not at COMET, please!!!
For staffing, I think there should be a mixture of permanent and temporary staffing. I think educators comparable to university professors (certainly with a requirement for doctorate-level education, themselves) would necessarily be permanent staff. These would serve as the educators in the FPS, and clearly their selection would have to be done pretty carefully ... it would be nice to be able to attract some really good folks for these positions, as their importance to the program would be very large. If you could find professors with operational experience, that might be best, but lacking that, I'd want to pick folks who have a track record of doing operationally relevant work. Having exceptional educators with an operational slant would be a major asset for the FPS ... presumably, attracting these folks would require some commitment of substantial resources, not just for salaries but for facilities at the FPS.
The temporary staffing would be the instructors for the entry-level training course and the other residence training courses. These should be field forecasters who demonstrate the highest levels of competence at weather forecasting -- the instruction by real, live weather forecasters with a successful track record would be invaluable to the candidates, giving that instruction a credibility no university program could ever have. I'd suggest that the period of residence should be three years, with the option of renewal for at most one more year. Following their residence at the FPS, the instructors would return to the office from which they came. Thus, being an instructor at the FPS would not be a dead-end job in the NWS ... they'd return to the field and continue forecasting.
A big reason for a forecaster wanting to go be an instructor at the FPS would be the perception of the experience as an exceptional one, reserved for the creme de la creme, the "top guns" in the field, in terms of professional growth. It's widely accepted that teaching forces a conscientious teacher to know the material as thoroughly as possible. Moreover, a substantial program of having visiting meteorologists spend time at the FPS would be a chance for "training the trainers". Rather than having guest instructors come in for a day or two to teach in classes, the visitors would spend their time imparting their knowledge to the FPS instructional staff. Such visits could be short, say on the order of a week, or they could last as long as a year. In fact, it would be ideal for university professors and other research meteorologists to be able to come to the FPS on sabbatical leave to interact with the staff and do forecast-related research. The best field forecasters would seek to avail themselves of this opportunity simply because of its exceptional chance for professional growth, and the local office management would be eager for those "top guns" to return to their office at the end of their FPS stay ... both for their expertise in forecasting and for sharing what they've learned with the staff. By being instructors at the FPS, those "top guns" would have the chance to get some real training in how to be a trainer! Thus, the SOO's responsibility for local training could be reduced (some of the veterans of the FPS might eventually transfer to SOO positions, of course).
If there was a real commitment to distance learning as a tool in a diverse and substantive training program, the skills necessary to produce high-quality modules would not necessarily be skills grafted onto a meteorologist. Hopefully, some folks who really have the education and qualifications to produce training materials would be on the staff, working along with the meteorologists to create effective programs. I am not philosophically opposed to distance learning, but it should not be the sole basis for continuing education. Rather, it's viable as an adjunct to other more conventional training programs.
It would be useful to have the FPS send out a "road show" at regular intervals, to put on short courses in the field -- bring the courses to the field rather than only having the field forecasters come to the FPS. This gives the staff the opportunity to visit other offices and see for themselves what is really going on, as well as to share their knowledge. It would also be cheaper than having forecasters come to in-residence FPS courses, but the latter should still be offered and budgeted for as a routine part of NWS training programs!
Finally, a critical part of the program is a substantial commitment to assessing the impacts and value of the training. This means a systematic, objectively-measured look at forecaster behavior before and after any training evolution. If verification of forecasts is necessary, then so is evaluation of the effectiveness of training. If the training and educational programs of the FPS can't be shown to affect forecaster performance in a positive way, then those programs can't be justified! Putting this kind of burden on the staff is tough, but it's necessary. The only valid reason for having training and education, after all, is improved performance by the forecasters. If you can't show that this "bottom line" is being met, then the program isn't worth having. Techniques for doing this are not even well-established, and I don't mean to imply that I know the answers to questions about how best to do this. As with the problem of identifying the attributes associated with being a good forecaster, we need to involve other disciplines to help us design a meaningful way to measure the performance of our training programs in ways that actually help to improve that training.
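To make concrete the sort of objective before/after comparison I have in mind, here's a minimal sketch using the Brier score, a standard verification measure for probability forecasts. Everything here is a hypothetical illustration -- the forecast numbers are invented and the relative-improvement measure is just one of many possible choices, not a prescription for how the FPS should do it:

```python
import statistics

def brier_score(forecasts, outcomes):
    """Mean squared error of probability forecasts against 0/1 outcomes.
    Lower is better; a perfect forecaster scores 0."""
    return statistics.fmean((f - o) ** 2 for f, o in zip(forecasts, outcomes))

# Hypothetical probability-of-precipitation forecasts from one forecaster,
# verified against observed outcomes (1 = event occurred, 0 = it did not).
before_fcsts = [0.7, 0.2, 0.9, 0.4, 0.1]
before_obs   = [1,   0,   0,   1,   0]
after_fcsts  = [0.8, 0.1, 0.6, 0.7, 0.05]
after_obs    = [1,   0,   0,   1,   0]

bs_before = brier_score(before_fcsts, before_obs)
bs_after = brier_score(after_fcsts, after_obs)

# One way to express the training effect: improvement relative to the
# pre-training score (analogous to a skill score).
improvement = (bs_before - bs_after) / bs_before
print(f"Brier before: {bs_before:.3f}, after: {bs_after:.3f}, "
      f"relative improvement: {improvement:.0%}")
```

In practice, of course, the samples would have to be large enough (and the weather regimes comparable enough) that any measured change could be attributed to the training rather than to an easy stretch of weather -- which is exactly where the other disciplines come in.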
Clearly, to cycle the existing staff through a year (or half-year) long program would mean incredible dislocations and a huge cost to operations. This, I agree, is probably something that can't be done. Assuming that 30 percent of the existing staff would fail the training course, it's not obvious what to do with Federal career employees who are no longer viewed as qualified to do their job. And of course there's the union.
I believe the NWS would have to "grandfather" the existing staff -- or, perhaps, run them through some sort of "distance-learning" exercise to get them certified. They might still be held accountable for recertification five years from the first, short-cut version of certification for them, however it might be done. But the FPS (or its equivalent) has to be in place before any of this can go forward, so that the existing staff has a chance to take advantage of the newly-developed, substantive training programs. At the very least, this means that more funds for all sorts of training programs (including in-residence training) have to be allocated, once the certification program begins.
If an entry-level program ever gets started, as the new graduates disperse to the forecast offices, in 20 years or so the NWS will be transformed. Moreover, I believe that the new graduates will begin to have an observable impact much sooner than that! Unfortunately, this requires that the FPS be funded and the programs developed ... it's difficult to see a way to move gradually toward the FPS. The FPS is a process that depends on having all its parts working before it can work at all.
To me, the key to the whole problem is developing a framework in which substantive training can occur. If the NWS keeps telling itself that it can't be done, it certainly won't, and they'll remain in the position of not having a process in place to make the transformation I believe needs to happen. "Business as usual" will prevail. The mere existence of the process is no guarantee (as Erik has noted, the process needs to be associated with the right people to be successful), but I can guarantee that the absence of that process means that no substantive progress will ever be made! If training is ever to be given its proper emphasis, there is no escaping what needs to be done. All that's needed is the will to make it happen.
I've focused on the notion that the NWS would be the organization in which the FPS would develop. In this view, the private sector folks who would like to have the training would have to pay tuition to support their enrollment. Such a program, if it existed, might help underwrite the development of the FPS. However, it could be that the NWS would choose to not be a leader in forecaster training ... after all, their history is dominated by a lack of interest in the subject.
That makes another option possible: the FPS could be created outside the NWS! Perhaps some enterprising university or a private company would create a substantive forecaster training program, as a sort of enterprise venture. Of course, they might have the same desire that colleges/universities have to avoid washing anyone out of the program ... cuts into their tuition-based profits. A private-sector FPS might have a similar lack of motivation to be rigorous. That's not to say that rigor is something automatically excluded ... sometimes high quality can be an important selling point for the output from a company/university. It would be interesting to see how this void in the training is filled; that is, to see if someone decides to try to use the failings of NWS training as a spur to create a private sector program. There probably would have to be some sort of accreditation process to certify the quality of the certifiers. At this point, I can't imagine how this might happen, although presumably some enterprising university could form the core of such a program. If the program was good enough to produce forecasters notably superior to those produced in the absence of such a program, perhaps the NWS would be forced to compete by sending their forecasters to such a non-NWS program, and they'd end up being the ones to fork over tuition money. Lacking the will to create such an organization, they'd have to ante up to keep up anyway!
In such a private-sector FPS, there could be a graded sort of certification; several levels of certification with increasingly difficult qualifications at each level. Perhaps there is a market for different levels of forecasting skills in the private sector; I have no idea, of course. Graded levels of certification might permit entry into the forecasting field at a relatively low level, and allow some folks to work their way up the forecasting "ladder" created by graded certification. All sorts of possibilities exist.
I would have liked to have provided more feedback from my readers, notably from my friends and critics in the NWS. I'd love to present point and counterpoint in this context. However, my attempts to "publish" their discussions on this page have resulted in considerable back-pedaling and expressions of "discomfort" over being seen on this page. It seems that those few willing to provide feedback to me are afraid for themselves and their friends -- speaking out is apparently asking for some sort of disastrous consequences [unnamed, but real to those who fear them]. I'm sure the NWS bureaucrats will deny any efforts to browbeat their staff into quiet submission. They've recently done so in public with respect to an incident I can't even mention for fear of recriminations on the individual involved -- in spite of clear evidence that it is happening. Some of my readers were afraid that training gains somehow would be forfeited if they even were merely referenced on my page. The message is "Be afraid. Be very afraid!"
I wish I had the same power to create positive change that's attributed to me for negative change! I'm told that my words can be used against those trying to accomplish positive ends in training, and elsewhere. If I were to withhold my words in the cases where those words might be misinterpreted or misapplied, then I could never say anything! I can't possibly say something that someone won't be able to misinterpret or misuse. Sorry, folks -- I have to be who I am, and if that means someone turns my words against you, I'll be in the vanguard of those decrying such an error. I'll not censor myself, even if you've been so cowed into submission that the bureaucracy no longer needs to censor you overtly ... you censor yourselves!
Although the situation has not changed substantially within the NWS bureaucracy during my career, it seems the climate of fear that always has permeated the NWS has been heightened by recent events, perpetrated by new folks at the top of the chain of command. A new hostility to the staff is being perceived -- even if it's an error in perception (and I don't think so), that perception remains. At a time when input from the staff would be most valuable, its public expression is being suppressed, apparently so that it can't be misused by the politicians. Even the fear-creators are afraid, it seems. I can only say that I'm appalled at the heavy-handed suppression of any public discussion of the important issues confronting the NWS. It speaks ill of their democratic values and commitment to the quality of their productive efforts.
I challenge anyone in the NWS bureaucracy above the local office level to present your side of the issues I've discussed -- on this page! Your unwillingness to discuss the issues or even to allow the issues to be discussed by anyone at the staff working level in the NWS is shameful cowardice and blatant political gamesmanship at a time when real leadership is so desperately needed, when respect for, and interest in, the opinions of your staff is so important in working toward a really modern NWS! The climate of fear that is so pervasive at the working level should be an embarrassment to the entire organization!