Dave Eubanks, a longtime critic of assessment's indifference to data quality, has an excellent guest post today on John Warner's "Just Visiting" blog at IHE.
In it he explains his path from assessment believer to skeptic. Very few people in the assessment world have actual training in statistics, math, or data science; Eubanks is a mathematician by training.
It became clear to him that the peer-review version of assessment is almost perfectly designed to fail as a data-generating exercise. If you sat down to "backwards design" a plan to waste enormous amounts of time and money, you could hardly do better than the assessment report-writing machine we have now.
He also points to new guidance from the Department of Education that directs accreditors and assessment offices to stop wasting everyone’s time and money.
The Department of Education recently weighed in on this topic. It rewrote the handbook for accreditors to encourage more freedom in how institutions meet student achievement standards (pg. 9):
These standards may include quantitative or qualitative measures, or both. They may rely on surveys or other methods for evaluating qualitative outcomes, including but not limited to student satisfaction surveys, alumni satisfaction surveys, or employer satisfaction surveys.
This new language is important because it challenges two of the report-writing machine's rules, namely: that course grades don't measure learning, and that survey data ("indirect assessment") isn't useful on its own.
More explicit language telling peer reviewers to back off can be found in the proposed rules for accreditors:
Assessment models that employ the use of complicated rubrics and expensive tracking and reporting software further add to the cost of accreditation. The Department does not maintain that assessment regimes should be so highly prescriptive or technical that institutions or programs should feel required to hire outside consultants to maintain accreditation. Rather than a “one-size-fits-all” method for review, the Department maintains that peer reviewers should be more open to evaluating the materials an institution or program presents and considering them in the context of the institution’s mission, students served, and resources available. (Section 602.17b, pg. 104)
In other words, scrap the machine.
My guess is that the assessment machine will rumble on for a long time. Too many people make upper-middle-class incomes from it for it to just go away, and too many senior administrators have been pretending to believe in its efficacy for too long to reverse course without eating a lot of crow.