The Right Stuff
As the Airman Certification Standards (ACS) project has evolved over the past five years, I have occasionally heard long-time flight instructors (CFIs) or designated pilot examiners (DPEs) tell me how much they like the ACS as a guide for teaching and training. But … then comes the comment that, well, the Practical Test Standards (PTS) — and now the ACS — is supposed to be a TESTING standard, not a TRAINING standard. Next comes the concern that the ACS will promote “teaching to the test.”
Two Sides of the Same Coin
Such comments have always puzzled me. Call me simplistic, but aren’t teaching and testing supposed to match up? At the most fundamental level, testing and training are opposite sides of the same proverbial coin. Testing is intended to measure whether, as well as to what extent, the test-taker has mastered the knowledge or skills presented through teaching and training. Since testing almost inevitably involves sampling, the content of a test also provides an indication of what the test-giver considers to be the most important things a person should know or be able to do.
Those factors alone make testing a complex endeavor with many moving parts and balances to strike. It is not good practice to test learners with questions or maneuvers that are identical to what they have seen in the teaching and training environment. Nor does it make sense to test people on topics outside that scope, either in fundamental content or in degree of complexity. So, as anyone who has ever tried can attest, it takes a lot of thought and work to develop tests that really measure the learner’s mastery of the “right stuff.”
Tying It All Together
Good learning starts with defining the “right stuff” to be taught and tested. And that’s why the ACS is such an important improvement to both training and testing for airman certification.
As you know, the regulations list required areas of aeronautical knowledge and flight proficiency for each airman certificate and rating. The FAA developed the PTS to define the skill performance metrics — that is, the “right stuff” — for each Area of Operation and Task. Until the advent of the ACS, though, there was no corresponding set of defined knowledge test standards (KTS) metrics to define the “right stuff” for aeronautical knowledge.
The lack of a KTS to define and standardize aeronautical knowledge resulted in test questions that were disconnected from the knowledge and skills needed for safe operation in today’s National Airspace System (NAS). If you have previously taken a knowledge test, taught ground school, or tested applicants for a certificate or rating, you probably encountered questions that were out of date, such as the many questions on “fixed card ADF” navigation. Others were overly complicated, requiring multiple interpolations to calculate impossibly (and unnecessarily) precise performance values. And some were simply irrelevant.
Because of questions like these, training for the knowledge test involved too little guidance on things that really are useful and important for a pilot to know, and too much rote memorization of things you would never use.
That brings me to the issue of “teaching to the test.” If the training program is focused on what truly constitutes the “right stuff,” and the test is both properly aligned with the material being trained and constructed to measure mastery of concepts rather than recall of exact words and phrases, then teaching to the test is not such a bad thing. Teaching to the test becomes a problem only when the testing process is so disconnected from the “right stuff” and from training that rote memorization is the only way to pass.
Thanks to years of careful thought and hard work by a number of dedicated people in the aviation industry and the FAA, the ACS defines the “right stuff” standards for both the knowledge test and the practical test. It demonstrates how knowledge, risk management, and skill are connected, and it provides the means to align training with testing. And that, as they say, is A Very Good Thing. (FAA Safety Briefing – July/August 2016)