administratosphere: December 2005 Archives

D-fence! D-fence!


Two of our extended ABD family returned this week to complete the final step in the process of moving them from our program's student page to our alumni page. Which is to say, somewhat euphemistically, that we had two students return from their positions elsewhere to defend their dissertations, and both did so successfully. Congratulations to both of them.

The dissertation season tends to be driven by graduation deadlines, and so we typically look at the end of July and the end of November as our peak times in that regard. And as these events have rolled around, my attention is more easily grabbed by references to dissertation work than it might be otherwise. The latest issue of Academe, for example, has an article misleadingly titled "How to Grade a Dissertation." I say misleadingly because the article is really less about "how to grade" one than it is about the results of a study that attempts to make more explicit the standards by which dissertations have been graded.

Attempts, and largely fails. While there's a mildly interesting chart or two at the back of the essay promising "criteria" by which dissertations are graded, these criteria are entirely predictable, and even a little insulting in their predictability. For example, would it surprise you to learn that "outstanding" dissertations

  • are original and significant, ambitious, brilliant, clear, clever, coherent, compelling, concise, creative, elegant, engaging, exciting, interesting, insightful, persuasive, sophisticated, surprising, and thoughtful;
  • are very well written and organized;
  • are synthetic and interdisciplinary;
  • connect components in a seamless way; and
  • exhibit mature, independent thinking?

Probably not so much. Heck, we all sit down with most of the above as our goals when we write. The subtitle of the article asserts that professors "owe it to their students to make those standards explicit," which is only half true. The standards that we use are only half the story, because for the most part, they are the standards that our students themselves use to evaluate writing, whether their own or others'. A more accurate claim, I think, would be: we owe it to our students to teach them to be able to achieve these standards. And I'm not sure that we are any more explicit about how to achieve these standards than we are about the standards themselves.

Part of the difficulty with a study like this is that professors' self-reporting is going to be no more accurate than any other self-reporting--by and large, we are going to offer up what we believe to be the appropriate criteria rather than the ones we actually use.

And the article bears this out: "The focus groups indicated that most of the dissertations they see are “very good,” which is the level of quality the faculty members said they expect of most graduate students. Consequently, they had less to say about very good dissertations than about the other quality levels." Ahh, how nice. Most of the dissertations fall into this category, the one about which the faculty studied have the least to say. (It is also the category for which the fewest criteria are offered.) "Very good dissertations are solid and well written, but they are distinguished by being “less”—less original, less significant, less ambitious, less exciting, and less interesting than outstanding dissertations." If I were still a graduate student reading this, I'd be both a little depressed and a little angry--these are the majority of the dissertations these faculty read, and all they can say about them is, "well, they're not quite outstanding"?!?!

I'm pretty sure I have a finger for that.

It'd be a lot more useful if the "criteria" offered here didn't basically parallel the categories themselves so completely:

  • Outstanding: Is very well written and organized
  • Very Good: Is well written and organized
  • Acceptable: Is workmanlike
  • Unacceptable: Is poorly written

But in order to gather any sort of meaningful data about dissertations, these criteria would have to be triangulated with the dissertations themselves, and they'd have to be studied by people who don't already have a vested interest in the answers being sought. And I suspect that the results would have to be separated out by discipline a bit--the study above surveys faculty in "four science disciplines (biology, electrical and computer engineering, physics or physics and astronomy, and mathematics); three social science disciplines (economics, psychology, and sociology); and three humanities disciplines (English, history, and philosophy)." I'm pretty sure that an electrical engineering dissertation looks a little different, say, from one in philosophy, and that the corresponding faculty mean very different things even when they're using the same language.

There are a few interesting tidbits in this article, but they come as asides rather than taking center stage. My colleagues in writing studies will find fascinating, I'm sure, the heavy emphasis placed on rubrics, both as a teaching tool and as a way of archiving dissertations. I don't disagree that making expectations explicit is worthwhile--far from it, in fact--but it's curious to watch yet another instance of current-traditional writing pedagogy being offered up, in a nationally circulated publication, no less.

Because, you know, I gave this article a 5 for clarity, a 5 for coherence, a 3 for compelling, and a 4 for concise. How that information would help improve the article or provide a record of anything other than my own opinions I do not know. Ah well. I'm being snottier about this than I'd originally intended. I think that there are good intentions behind a project like this, but an unrealistic estimation of what an aggregation of self-reported, unverified criteria can accomplish.

That is all.
