Two misconceptions threaten future of radiology training


What is required in radiology now is the bravery to question orthodox modern training and its inherent misconceptions. Be warned, however, that the medicopolitical climate is against deregulation of any form. A degree of regulation is sensible and prudent, but we must recognize and fight professionally demeaning regulation. First, though, we need to get our own house in order and address two incorrect assumptions.

Misconception 1: More structure in education is better

Although we understand much about how adults learn, we are all unique learners. There are a multitude of educational opportunities and media now available for radiologists to learn or stay up to date. Each of us has preferences in how we actually go about learning. Similarly, the factors that motivate us to learn are complex, tacit, and often deeply embedded in unique social and personal contexts.

We have different pre-existing knowledge and skills. Some of us learn certain notions or techniques more rapidly than others. Furthermore, we are adaptive learners; we will adopt whatever learning style is necessary to achieve knowledge or competency in the fastest possible time.

We also know that good postgraduate education should aim to enrich, stimulate, and enthuse. It should be flexible enough to encompass disparate motivations, personalities, learning styles, and rates of learning.

However, the more structure there is in education, the less flexible it becomes. Hence, few people find that a heavily guided or prescribed course suits them; this applies to trainees and trainers alike. Thus, educational rigidity promotes mediocrity, as it is difficult to excel when shoe-horned into an inflexible, "one-size-fits-all" structure.

I am not advocating abandoning high educational standards. Quite the reverse. But I am saying that the end product, a competent or expert practitioner, is what is crucial. We should be very clear about the standards we expect to be reached or maintained, and we should assess trainees very carefully to see that they have met them. I am also saying that the method used to arrive at such competencies should not be mandated. Sure, suggest a framework for how one might become competent, but recognize that the route taken to get there varies hugely between individuals.

Like the sound of this? Well, like it or not, these are the principles of competency-based training. Radiologists should embrace them to take radiology training into the next century.

Misconception 2: Counting cases is valid

Another assumption is that competency comes with repetition. For a task that relies largely on motor skills, there is some validity in this: perform a physical task a number of times and you get slicker. For interventional techniques consisting largely of psychomotor tasks, such repetition is important. But complex nonmotor tasks, such as the visual perceptive skills of a radiologist, don't necessarily follow this pattern.

There is also the concept of "getting your film miles." This is the ill-explained, nonfoveal visual expertise, built from having seen tens of thousands of chest radiographs, that enables you to spot a left lower lobe collapse in 0.01 seconds. It also includes the million and one radiographic factors, normal variants, and common abnormalities that become second nature.

So, yes, experience counts for a lot. Without it, book-based knowledge remains just that. But experience must be varied, broad-ranging, of varying difficulty and complexity, and reinforced by feedback from the opinions of other experts, clinicopathological correlation, or follow-up. This well-established cycle of experience, learning, and reflection ensures that the trainee moves linearly and safely from conscious incompetence to conscious competence.

Thus, simple repetition alone is potentially harmful. It induces confidence, but basic errors may be perpetuated and there is no guarantee of expertise developing. Conscious incompetence may become unconscious incompetence. Not knowing the boundaries of your expertise is a route to medicolegal hell and damnation.

Much of radiology training focuses on producing a logbook detailing sufficiently high numbers of films reported, procedures performed, and so on. It ignores, and hence sidelines, other sources of learning such as consultant-led teaching, film and Web-based archives, textbooks and journals, and courses and conferences.

Why does training focus on numbers so much? I would like to introduce you to the McNamara fallacy. This can be described as follows:

The first step is to measure whatever can easily be measured. This is OK as far as it goes. The second step is to disregard that which can't be easily measured or to give it an arbitrary quantitative value. This is artificial and misleading. The third step is to presume that what can't be measured easily really isn't important. This is blindness. The fourth step is to say that what can't be easily measured really doesn't exist. This is suicide.

So, counting numbers is easy but crude, as it captures only one aspect of radiology training. It lacks validity: it ignores the quality of the experience, doesn't assure competence, and sidelines other methods of learning. Some have said that such "educational bean-counting" demeans professional training.

However, I am much, much more worried about the Law of Educational Cause and Effect. This states: "For every evaluative action, there is an equal (or greater, and sometimes opposite) educational reaction." Sometimes called consequential validity, it explains the occasionally bizarre and/or maladaptive behavior that trainees exhibit in the run-up to some form of assessment. Phrased metaphorically, "the assessment tail wags the curriculum dog"; or, put more crudely, "grab the students by the tests and their hearts and minds will follow."

Radiology training becomes a primary quest for achieving logbook numbers. The totality of experience is short-circuited by this lust for padding out the columns. Less scrupulous trainees "guestimate" their numbers, leading to some pretty inaccurate figures. Some frankly dishonest trainees may "join the dots," implying they have done a lot more than they really have. Conversely, scrupulous trainees who have recorded everything in meticulous detail may have logbooks that look a bit patchy. Yet independently verifying logbook numbers is virtually impossible. So, as an assessment method, it is not only crude and educationally invalid, but also easily fooled, and therefore neither sensitive nor specific.

The supremacy of logbook numbers has had its day. Witness the dawn of the era of competency-based training; let us welcome it into our hearts.

Dr. Paul McCoubrie is a consultant radiologist at Southmead Hospital in Bristol, U.K.

The comments and observations expressed herein do not necessarily reflect the opinions of AuntMinnieEurope.com, nor should they be construed as an endorsement or admonishment of any particular vendor, analyst, industry consultant, or consulting group.
