Three Interesting Articles

Lip Service or Actionable Insights? Linking Student Experiences to Institutional Assessment and Data-Driven Decision Making in Higher Education

From the abstract:

However, the institutional adoption of policies related to the collection of assessment data or the application of data-driven decision making appears to have no relationship with student experiences or outcomes in the first year of college. Thus, findings from the current study are consistent with the small, but growing, body of literature questioning the effectiveness of accountability and assessment policies in higher education.

And an older study:

“Improving Ratings”: Audit in the British University System

And this:

Rigorous Large-Scale Educational RCTs Are Often Uninformative: Should We Be Concerned?

Abstract:

There are a growing number of large-scale educational Randomized Controlled Trials (RCTs). Considering their expense, it is important to reflect on the effectiveness of this approach. We assessed the magnitude and precision of effects found in those large-scale RCTs commissioned by the EEF (UK) and the NCEE (US) which evaluated interventions aimed at improving academic achievement in K-12 (141 RCTs; 1,222,024 students). The mean effect size was 0.06 standard deviations (SDs). These sat within relatively large confidence intervals (mean width 0.30 SDs) which meant that the results were often uninformative (the median Bayes factor was 0.56). We argue that our field needs, as a priority, to understand why educational RCTs often find small and uninformative effects.
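To make concrete why results like these count as "uninformative," here is a back-of-the-envelope sketch of my own (not the authors' analysis): the 0.06 SD mean effect and the 0.30 SD mean confidence-interval width come straight from the abstract, but treating them as a single typical trial with a normal sampling distribution is a simplifying assumption.

```python
# Rough sketch (not the authors' analysis): what a 0.06 SD effect inside a
# 0.30 SD-wide 95% confidence interval implies for a "typical" trial.
from scipy import stats

effect = 0.06      # mean effect size from the abstract, in SD units
ci_width = 0.30    # mean width of the 95% confidence intervals, in SD units

se = ci_width / (2 * 1.96)             # implied standard error, ~0.077 SDs
z = effect / se                        # effect sits ~0.8 standard errors from zero
p_value = 2 * (1 - stats.norm.cdf(z))  # two-sided p-value under normality

print(f"implied SE: {se:.3f} SDs")
print(f"z = {z:.2f}, p = {p_value:.2f}")  # roughly z = 0.78, p = 0.43
```

On those assumptions the typical trial cannot distinguish its estimated effect from zero, which is the same picture the paper's Bayes-factor analysis (median 0.56) formalizes.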

Sullivan and McConnell, the VALUE Rubrics and the Future of your Assignments

I recently published an article (paywalled) in the Chronicle that uses data generated by an article in Change: The Magazine of Higher Learning by Dan Sullivan and Kate McConnell.  The original draft I sent to the Chronicle included a critique of that article and another article in the next month’s issue of the same journal.  My editor asked me to cut that part of the article.  Below is some of the material that was edited out:

Something of value may have just emerged from the VALUE project. A research project using the VALUE rubrics has produced results that call into question the utility of rubrics in general and challenge the foundational assumptions on which the learning outcomes industrial complex is built.

VALUE stands for “Valid Assessment of Learning in Undergraduate Education,” and it’s a project of the Association of American Colleges and Universities (AAC&U). The name itself suggests a tacit admission that what passes for assessment at most institutions, the every-program-every-year-mandatory-loop-closing-based-on-fragmentary-data-derived-from-small-samples, is not really valid. The VALUE project looks like an honest attempt to address that shortcoming. The central and best-known feature of the VALUE project has been the creation of sixteen rubrics meant to address outcomes that the architects of the project believe “all students need for success in work, citizenship, and life.” These outcomes are things like critical thinking, ethical reasoning, quantitative literacy, problem solving, and so on. The rubrics are, in the language of their creators, “transdisciplinary” in that the critical thinking rubric should be able to assess that quality in a student’s work whether that work was done in an English, philosophy, or physics class.


California increases oversight of for-profit colleges with input from Bob Shireman

An article in Inside Higher Ed looks at moves in California to increase oversight of the for-profit sector of higher education. Bob Shireman and the Century Foundation have been instrumental in convincing lawmakers in California to step up state-level oversight as the federal Department of Education has reduced its regulation of the sector.

From the article:

Some of the proposals, which Robert Shireman and the Century Foundation helped craft, appear to go after specific institutions. Shireman, a former deputy undersecretary of education in the Obama administration and a senior fellow at the foundation, said states watched as DeVos dropped Obama-era student aid regulations that largely targeted the for-profit sector. And he said some state lawmakers subsequently approached him and other groups for guidance.

“It has become much clearer that the federal government is not interested in actually policing the federal loan program and has actually reversed a lot of the guardrails that were set up by the Obama administration,” said Shireman. “Much of the legislation we’re seeing is a combination of addressing some of the problems we’re seeing right now, but also preventing future Corinthians, future Ashfords, future Argosys.”

It says something about the nature of accreditation (which of course is heavily assessment focused) that all these schools were able to get accredited. Legitimate colleges spend huge amounts of time and money on accreditation, but somehow the worst of the for-profits still get accredited too. How meaningful can that process be if these academically weak and often predatory schools are able to meet the same accreditation standards as real colleges?

Dave Eubanks on Assessment and Data Quality

“Closing the loop” refers to making meaningless curricular changes based on bogus data, and it is at the heart of the assessment project. I tried to make that case a few years ago in Inside Higher Ed, but I don’t know much about stats. Dave Eubanks is a mathematician, and he does know a thing or two about the use and abuse of statistics. So it was nice to see that the AAC&U was willing to give him some space in the latest issue of Peer Review to offer a critique of the way that stats are abused in assessment.

Here is a sample. The full article is here.

Assessment practice also fails at empiricism. What is typically accepted in assessment reviews has little to do with statistics and measurement. Nor could it be otherwise. The 2016 IPEDS data on four-year degrees granted shows that half of the academic programs (using CIP codes) had eight or fewer graduates that year. Such small samples are not suitable for measuring a program’s quality, given the many sources of variation in student performance. By my calculations, fewer than 5 percent of four-year programs had enough graduates to make a serious measurement attempt worthwhile. It’s safe to conclude that most of the 80,000+ bachelor’s degree programs in the United States are not producing trustable statistics about student learning, regardless of the nominal value of their assessment reports.
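To see what an eight-student sample buys you, here is a quick sketch of my own (not Eubanks’s calculation): the eight-graduate figure is from the quote above, while the rubric standard deviation of about one point is an assumption made for illustration.

```python
# Illustrative sketch (mine, not Eubanks's): uncertainty in a program-level
# mean when only eight graduates are scored on a rubric.
import math

n = 8        # graduates scored (half of all programs had this many or fewer)
sd = 1.0     # assumed standard deviation of rubric scores, in rubric points
z = 1.96     # multiplier for a 95% confidence interval

margin = z * sd / math.sqrt(n)
print(f"95% margin of error on the program mean: +/- {margin:.2f} points")
# roughly +/- 0.69 points on, say, a 4-point rubric -- far larger than the
# year-to-year changes assessment reports typically claim to detect.
```

With a margin of error that wide, a program mean that “improves” from 2.8 to 3.1 after a curricular tweak is indistinguishable from noise.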

AAUP on the Nunez case

Here is the email that the AAUP sent about the Nunez case. Note that there will be a Facebook Live report tomorrow afternoon.

Dear Robert,

According to an AAUP investigative report released today, the most plausible explanation for the dismissal of a faculty member from Nunez Community College was that it occurred as a retaliatory measure, violating his academic freedom. Professor Richard Schmitt, a nontenured associate professor of English with twenty-two years of service at the institution, had disagreed with the administration over the accuracy of an accreditation report.

Schmitt was informed during a conference call that his appointment was not to be renewed. In blatant disregard of commonly accepted standards in higher education, he was given no due process for contesting his termination, no dismissal hearing, and no reason for the decision not to renew his contract.

A little about the work of the investigating committee, on which I served as chair: AAUP investigating committees are appointed in a few select cases annually in which severe departures from widely accepted principles and standards of academic freedom, tenure, or governance have been alleged and persist despite efforts to resolve them. Investigating committees are composed of AAUP members from other institutions with no previous involvement in the matter; Professor James Klein of Del Mar College served on the Nunez investigating committee with me.

To learn more about the case, join me for a brief Facebook Live on Thursday, February 14, at 2 p.m. ET, where I’ll discuss the investigation and its implications. RSVP here.

A recorded version will be available on the AAUP’s One Faculty, One Resistance site and Facebook page after the conclusion of the broadcast.

Nunez Community College, located in Chalmette, Louisiana, does not have a formal tenure system, and appoints all of its instructors on contracts of one year or less, in violation of the widely accepted academic standards codified by the 1940 Statement of Principles on Academic Freedom and Tenure. That statement, jointly formulated by the AAUP and the American Association of Colleges and Universities, has been endorsed by more than 250 scholarly and educational groups. Because he had served well past an acceptable probationary period, AAUP standards recognize Schmitt’s appointment to be with de facto continuous tenure. Accordingly, he should be dismissed only for cause or as a result of institutional financial exigency or program closures for educational reasons.

The administration’s abrupt nonrenewal of Schmitt’s appointment, without stated cause, after more than two decades of service, constitutes a gross violation of the protections of academic due process, and in the absence of any stated cause for the administration’s actions and on the basis of the available information, must be deemed a retaliatory measure that violated his academic freedom.

You can read the full report here.

At its June meeting, the AAUP’s Committee A on Academic Freedom and Tenure will consider whether to recommend to the AAUP’s annual meeting that censure be imposed on the Nunez Community College administration for substantial noncompliance with AAUP-supported standards of academic freedom and tenure.

Nicholas Fleisher
Chair of the AAUP Investigating Committee
Associate Professor, University of Wisconsin–Milwaukee

P.S. You can support the work the AAUP does fighting for academic freedom. If you’re not already a member, join today.



Did a Louisiana Community College punish a professor who asked too many questions about accreditation?

An article in Inside Higher Ed today describes an AAUP report about Nunez Community College’s apparent retribution against a faculty member (he was fired) who asked that his name be removed from a report to SACS that may have contained “reconstructed” data.

I hope to have more on this soon, but off the top of my head this raises some interesting questions:

  1. Will SACS investigate the use of what may be fake data?
  2. Will Nunez respond that virtually all assessment data is similarly fake, or at least equally meaningless?
  3. Academic freedom is part of the accreditation process.  Will SACS revisit its accreditation of Nunez in light of its apparent contempt for academic freedom? Or will it conclude that enforcing faculty compliance with assessment is more of a priority than academic freedom?

California Governor seeks to track student performance from kindergarten to the workforce

John Warner of Inside Higher Ed has written a blog post that urges Gavin Newsom, the new governor of California, not to spend $10 million creating a computer surveillance system that will track students as they move through the education system and into the workforce. Warner argues that doing this effectively will be way more complicated than Newsom thinks, will cost way more than $10 million, and won’t work anyway. To get at that third point, he has a nicely annotated reading list for Governor Newsom. Some of it will be familiar to Bad Assessment readers. Some is new to me and thus might be new to you.

This is why I’ve compiled a reading list for Governor Newsom to consider as he makes his final decision.

For background on the limits of data and algorithms I would like Governor Newsom to read Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy by Cathy O’Neil, and The Tyranny of Metrics by Jerry Z. Muller.

Brisk and readable by the layperson, both books make a case for how human performance cannot be reduced to quantifiable measurements.

Next, I would like Gavin Newsom to read three books more specifically dealing with education:

The Testing Charade: Pretending to Make Schools Better by Daniel Koretz. Harvard education professor Koretz shows how our thirty-year obsession with standardization and assessment has not only led to no appreciable gains in student achievement, but how perverse incentives to improve scores have driven out subjects like art, physical education, music, and recess, while resulting in cheating and short-term prep that has no lasting impact on learning.

Beyond Test Scores: A Better Way to Measure School Quality by Jack Schneider. In this book Schneider, an Assistant Professor of Education at UMass Lowell, reveals the shortcomings of the kinds of measurements we tend to use when we judge schools. How we think of a particular school is rooted in value judgments about what’s important to the individual. A tracking system will inevitably crowd out this nuance.

Troublemakers: Lessons in Freedom from Young Children in School by Carla Shalaby. In this portrait of students who are deemed “troublemakers” Shalaby demonstrates how subjecting students to a system which seeks standardization and quantification is damaging even to those who toe the line, and disastrous to those who exist at the margins.

There is more in his recommended reading list that you can see by clicking the link above.

If there is one thing Newsom’s proposal brings home to me, it’s that assessment in K-12 and assessment in higher ed are increasingly related issues. It seems that states and accreditors are anxious to replicate the “success” of No Child Left Behind and Race to the Top at the college level. It’s sure to work this time…

The Long Reach of Accountability Culture Meets Poetry

Jerry Muller’s work reminds us that learning outcomes assessment is but one facet of a broader effort to track, quantify, and surveil production and performance. His work looks at everything from police record keeping to the performance of hospitals to course assessment.

So it should come as no surprise that universities are embracing the culture of accountability not just with respect to learning but to faculty productivity too. The best response to this I have seen so far is this new poem by Susan Harlan. This now officially makes her my favorite living poet.

A POEM ABOUT YOUR UNIVERSITY’S NEW AND TOTALLY NOT TIME-WASTING REVIEW PROCESS FOR TENURE AND PROMOTION

I am in the wrong business

I was flying back from a visit to Washington, DC yesterday and was seated at the front of the economy section. The flight crew did not pull the curtain across the aisle like they usually do in what I have to assume is an effort to tamp down resentment in the back of the plane. Because of this oversight, I could see into the first-class section. In the aisle seat across and just forward of me was a woman working on a PowerPoint about assessment. I half expected her to be carrying a big bag of cash marked “Lumina Foundation.” Clearly, advocating for assessment pays a lot better than criticizing it.