What went wrong with MOS Testing?
Instead, try proficiency testing by pros, reorder career maps, and slow down promotions
Soldiers from across the 25th Infantry Division and U.S. Army Hawaii test their proficiency in basic infantry and Soldier tasks in the hopes of earning the Expert Infantryman Badge or the Expert Soldier Badge in April 2021. (U.S. Army photo by Spc. Jessica Scott)
In 1982 I was stationed with 2nd Bn, 30th Inf in Schweinfurt, Germany, when late in the year it was our unit’s turn to take the relatively new MOS proficiency test: a written exam evaluating our maintenance MOS personnel on their knowledge of what were considered critical tasks for our MOS. The Marne Division at the time was being fielded the “Big 3” combat systems, the Apache, the Abrams, and the Bradley; we were at the front of the line for Army fielding. Already, the tankers at Conn Barracks had been issued their M1s, and "Fonzi" was all over the front pages, shown in the TC hatch of a tank while the driver tore up the TA during a PAO event. The mechanized infantry battalions were eagerly awaiting our Bradleys, and my recent AIT graduate-buddies and I were among some of the first Bradley mechanics in the division, if not the Army. Our unit had been drawing all the latest gear in a pre-rapid fielding initiative world, in a theater just coming off extreme underfunding, so the spigot was just being turned on to the 3ID for resources. So here were a bunch of us trying out a test for a new MOS.
Created in 1977, the Skill Qualification Test (SQT) was a test of your grasp and memory of critical tasks in a given MOS or career field. One of the problems for many branches and MOSes was that the TTPs, doctrine, or equipment being tested were outdated. Our mechanical MOS was so new that there really was not an exam, so instead we were evaluated on the previous non-Bradley MOSes common in mainstream Army units, 63C and 63F, from which most of my peers had been involuntarily reclassified. So we were evaluated on the M48 tank, the GOER, the gasoline-powered 5-ton cargo trucks, and more outdated equipment — or at least none that I was ever trained on, or that was in use in the battalion. But at least I had seen a Gama Goat before I had to describe how to adjust the brakes on one. The scores across the battalion on MOS testing were quite abysmal, or so went the scuttlebutt. I didn't hear about the SQT again until I got to the 24th Division, then at Fort Stewart, which had recently been designated the Army test division as it readied to be next up for Big 3 fielding.
For me and soldiers in my career field, MOS testing was always a flop, one that seemed unorganized, not useful, and measured the wrong skills. And unfortunately for the Army, it took a lot of resources to develop each test, proctor, administer, score, and return the results, all of which really had no effect on my career. That is not to say it was that way for everyone; for a few years it was a factor in evals and promotions. But it seemed for so long that it and its replacement were under review, were in trials, or “wouldn’t count” this year or the next. If you think back to how the ACFT was rolled out, you can see some of the parallels.
From my time in 3rd ID up to at least 2007 I was almost always involved in unit testing of individual skills and competencies, whether common task testing (CTT), SQT, or the last dying gasp, skills development testing. I was either the one taking those tests, a tester, or a part of the unit NCO support channel planning and executing testing. And later, after they had mostly faded away as operations tempo increased, I was selected as part of a select group of senior Army NCOs assisting the research community in studying the concept of reinstating testing. And let me tell you, my point of view from then to now has not softened: the Army was not able to pull off testing then, and on its own, probably couldn’t do it now. I am not one to throw cold water on conversations about bringing back MOS testing, but as one who has lived it and studied the concept from all angles, I say before we rush to failure it is important to first look back at the past to see why it failed, remove those barriers, and then factor in modern challenges before a serious attempt is made. And most importantly, decide why you need it, and when! Then, once you get it right, tie it to advancement. But first, are there other options?
Well, if there remains no MOS testing, how else could we objectively score technical competence in our NCOs? I suggest we should not be treating that as an issue after we have already selected and promoted a soldier to the next grade! To me, it makes more sense that an NCO demonstrate competency at the next grade BEFORE we promote them. It’s time to scrap promotions based on “potential” and shift to promotion upon demonstrated competency. Yeah, that concept will throw our NCO development systems on their ear, but today the Army is in a holding pattern for a relook at the structured NCO development continuum anyway, so this is the perfect time to consider innovative ideas. During this pause, let's relook potential-based promotions and instead investigate promotion on skill mastery. And we can use testing to demand mastery at each grade before we advance people too quickly, or before they are ready.
Our promotion system and methodology appear backwards, and it may be time to slow down promotions to allow NCOs to grasp certain tasks before they upgrade. Instead of selecting, training, educating, and promoting, I suggest we consider first training in operational assignments and then annually testing soldiers for MOS proficiency, externally. When an established level of proficiency is attained, then promote, and send for basic, and later advanced, professional military education. Sprinkle in certifications and higher education that match their career field, and no need to add a cutesy label — just call it THE NCO development program. It is not simply reordering the recipe; this time we add in the missing ingredient. In this case, to overcome earlier problems, we add in external exams and testing.
A problem I have seen with testing in the unit is the Army itself, meaning TRADOC in this case, could never keep up with the field. The institutional side of the Army is almost always underfunded and getting the left-over chicken; TRADOC seems to get the remaining dregs for DOTMLPF. The best people, the newest equipment, the most money, the best buildings all go to the warfighters. How can MOS proponents write tests on gear, TTPs, and standardized procedures they have not seen, used, or had access to?
And do not forget, the NCO and pre-NCO cohort, E-4 to E-8, who probably most need this type of system, is pushing 50 percent of enlisted strength. And that spans almost every MOS, all in constant transformation. The scale of managing technical competency exams for a force that size is an undertaking not fully understood by many. By design, outside of, say, West Point or the civilian cohort, TRADOC doesn't have professional educators but instead operates on TDA-assigned borrowed military manpower. Back in my day, the problem was that amateurs were creating the tests.
To get to a new mindset and achieve a more robust plan, we must first reframe it not as an in-service exam like the old SQT or SDT, but as a gateway to the next grade — test before you get in position, not after. So first we must fix the promotion system. But after that, it's about facing reality: no matter how much we wish for TRADOC to do this task for the Army, it is time to hand off this important work to professionals.
I suggest the Army award a “no compete” direct-award contract to a professional testing organization such as ACT, Inc., the same outfit that delivers the ACT college entrance exam, to develop and administer promotion exams in collaboration with MOS and branch proponents. Or the Educational Testing Service (ETS), which develops the GRE and seems to manage exams like CLEP without letting the test answer key get out. And whoa boy, those old MOS SQT answer keys were rumored to be for sale or obtainable back in the day. Professionals like ACT and ETS protect their tests, their processes, and their reputation like their business model depends on it. Because it does.
And once you get a year’s worth of data from the promotion tests, TRADOC and the research community have a new and extremely valuable commodity: raw data that only needs to be analyzed and converted into actionable knowledge. The Army can use test score data to determine what subjects to teach at next-generation PME, which would continue to be created and administered by NCOLCoE and other proponents. Need a critical task selection board? Just look at raw results from the prior year's promotion tests — proponents will know in a damn hurry where deficiencies and shortcomings are. No more pet projects or “just in case training” being forced into POIs, as long as leaders hold the line on what gets allowed into course management plans and restrict insertions to test-result priorities.
When the Army studied returning to testing prior to 9/11, some of the biggest concerns we voiced were that the institution could not keep up with the changing dynamics within the force to keep the testing relevant and secure. That was the same in 1982, and unless you think history does not repeat itself, you can bet that without intervention it will happen again in 2025. I have held out hope that technology and the way we manage knowledge would bring about improvements to our testing capability. But the plain facts are that military service and the tools of our trade keep getting more complicated, and after continued cuts, our personnel and education systems — never properly resourced in the first place — unfairly never seem to catch up with the changes in the operational force.
We need competency testing, but it's going to take a major change to how we select, promote, and educate NCOs, and it may need an entirely new model compared to what we have been using for generations. Is it time to let go of the centralized and semi-centralized promotion systems, finally break through resistance to promotion testing, and then pick a route for testing that uses industry leaders instead of trying to “roll our own” on the cheap? Only when we consider the size and scale of the entire Army NCO corps — active, Guard, and Reserve — and the resources it takes to manage testing for a cohort of that size, do we begin to see how truly underfunded NCOPD has been. Let us test, but make it right from the get-go: as a barrier to promotion.
Dan
/Topsarge
Any thoughts on the Army Aviation maintenance training program? TC 3-04.7, the AMTP: built-in progressions, no-notice and annual evaluations, commander evaluations to designate skill levels. The AMTP is not perfect, but it is a good start toward what you are talking about.