Evidence for Learning

“So you’ve built a better mousetrap,” said the mathematician over cocktails. “But are you catching any mice?”

The question caught me off-balance. Like most professors, I was used to talking about teaching and learning in ways that are strikingly different from how we talk about other forms of scholarship. The standards of evidence are lower; our credulity far higher. So when colleagues would ask how my new survey course was coming along, I felt little need to bother with evidence. “I think students are getting a lot out of it,” I would say. How did I know? The usual ways: Random student comments. A few anecdotes. Wishful thinking. And more than anything else, meticulous facial reconnaissance.

So when the mathematician asked me how I knew whether students in my course were learning anything worthwhile, I parried the question with the truth: “By the intelligent looks on their faces at the end of the term,” I said. In that moment I realized I would need better sources of evidence than that.

Which is why I had come to Palo Alto, California, to the offices of the Carnegie Foundation for the Advancement of Teaching, to meet the mathematician and twenty-six other colleagues in the Carnegie Scholars Program in the first place. Our charge: to collaboratively investigate and document significant issues and challenges in the teaching of our fields. I wanted feedback on my ideas for the history survey and guidance on how to investigate whether “uncoverage” was a viable, transferable, and effective concept. Beginning with the mixer where I was grilled by the mathematician, my experiences in the Carnegie Scholars Program led to the research program outlined here.

If the standards in academia for authenticating learning are as a rule rather low, it is also true that the bar can be set too high. Teaching and learning are fundamentally elusive activities. This does not mean that they cannot be studied, but it does mean that knowledge about them will generally be incomplete, open to doubt, and subject to interpretation. In other words, investigating learning is a lot like investigating the past. Both activities call for a process of reasoning more capacious than narrowly drawn scientific demonstration. Conclusions will be persuasive to the degree that they demonstrate a plausibility resting on multiple strands of evidence.

My intent here is not to make large claims for the effectiveness of my “uncoverage” survey. Rather, I simply intend to report some of the data I have gathered, offer my interpretations of the evidence, and invite others to draw their own conclusions. But this much I am willing to hazard, based on the evidence I have studied: At the very least, I can say with complete confidence that my survey does undergraduates no harm. The significance of this statement can be measured against the claim that traditional “coverage” surveys willy-nilly teach very little about history, except for the most regrettable and even dangerous misconceptions, thereby depriving students of certain cognitive tools everyone needs to live well.

The Plan: How did I investigate student learning?