There’s an old saying: “may you live in interesting times.” Its attribution is apocryphal, though it’s often incorrectly presented as an ancient Chinese curse. And, frankly, it’s understandable why that myth persists; “interesting times” are rarely all they’re cracked up to be for those living through them. (Trust me on this: I’m a historian.) For those of us who live and work in higher education, the interesting times are in full swing. Uncertain economic prospects, the vicissitudes of often-misguided education “reform,” discourses that pit research against teaching as if the two were mutually exclusive. And that was just a regular Tuesday in the Chronicle of Higher Education!
Keeping track of our higher-ed world, and the ways in which it shapes how we and our students engage with each other and with our society, can be a full-time job. There’s a dizzying amount of information, data, and reflection available to us. Unfortunately, it’s also available to polemicists who acquaint themselves only superficially with it before they write an op-ed in the New York Times or Inside Higher Ed drawing sweeping, dramatic conclusions about how we’re all Doing It Wrong. It’s not easy to keep a positive outlook, or a sufficiently wide perspective, when we’re inundated with pronouncements intoning nothing but doom and gloom for the academic enterprise.
This week’s links are intended to help us get a better handle on the state of our field. Several important studies have recently been published, offering us some actual data-driven conclusions about our changing academic environment. But all data has a story, and making sure we tell that story both correctly and well is an important responsibility. Also offered here, then, are some of the more compelling perspectives that address how data can be (and often is) misused in treatments of the larger issues with which we’re wrestling.
This week, Ithaka S+R, one of the most active and important consulting firms in higher education, released the findings of its 2015 Faculty Survey, which “provides the higher education community with a regularly updated snapshot of its faculty members at a moment in time, as well as trend analysis of changes.” The full report is worth a look, as it contains new findings in several areas, including information technology, research habits, and data management.
The Pew Research Center’s recent study of technology and lifelong learning was also released this spring, and it draws some interesting conclusions about the ways in which technology changes both the act and the duration of learning. But not so fast, argues Audrey Watters; the report doesn’t engage with the very real inequalities that shape how learners interact (or, more pertinently, don’t interact) with technology. Watters’s critique is a model of how to effectively contextualize data to inform our practice.
Late last fall, the most recent findings of the National Survey of Student Engagement (NSSE, or “Nessie”) were released, and there are a number of interesting (and some troubling) trends revolving around “rigor,” which have significant implications for the work we do. Grand View participates in NSSE, and we’ll have a session at this year’s Summer Institute in which we’ll present our institutional results and have the opportunity to talk about what next steps they may indicate for us.
But as we look at this data, we need to bear in mind not only the answers we’ve received, but also the questions that were asked. Many of these studies present their results in quantitative terms. This helps us synthesize large amounts of information in an accessible and concise manner, but some degree of caution is in order. Can we improve something just by measuring it regularly? Sociologists use the concept of the “McNamara Fallacy” (named after Defense Secretary Robert McNamara’s obsession with quantitative measurements of US “victories” in Vietnam) to describe how quantitative data can give us the illusion that anything not measurable is not worth studying. It’s only a few short steps from there to the conclusion that anything which cannot be easily quantified does not exist. And we then lose sight of the important array of qualitative information that can guide us, from the classroom to the institutional level.
Finally, as we see all of these sweeping declarations about what higher education is or is not, or whether the research university or the liberal arts college is the more effective agent of social change, we should remember that this is not a new debate, and the positions people have staked out aren’t novel ones, either. Norman Jones reminds us that, in many ways, we’ve been down this road before. Just as the model of collegiate education we practice at GV has persevered in the past, it will likely do so in the future. But, as the theorist of critical pedagogy Henry Giroux warns, that persistence can only spring from the dedicated efforts of faculty working with and among students. Giroux posits that the antidote to coldhearted reliance on disembodied metrics, and to assessments divorced from pedagogical context, is a pedagogy that enables students to critique structural inequities and all of us to speak truth to power.
That’s a lot of heavy lifting for the week. But it’s incumbent on all of us to be aware of the larger environment in which we operate, as well as how that environment shapes what we do and how we do it. As large and abstract as these issues can appear, the reality is that they are fundamentally about people: us and our students. What we do matters, in the most essential and meaningful of ways.
Looking for resources? Ideas? Help?
And, finally, may we all be as happy as this dog, who has made several new friends.