(I actually wrote this back in May and then for some reason never posted it. A conversation I had last week when I was at Northwestern reminded me of it, and I'm putting it up now, especially as I'm presently too preoccupied with certain other matters to exhibit any original blog-oomph.)
So, consider the question: What single, not-that-costly thing would do the most to improve the lot of the average sociology graduate student nationwide?
My answer: Conduct a Survey of Sociology Graduate Student Satisfaction every year. This would be a very short online survey, with a link sent to all sociology graduate students, and the results would then be posted publicly.* As things are now, sociology faculty have very little knowledge about the extent to which their students are relatively happy or unhappy compared to students in other departments, which leads to the easy conclusion that whatever malcontented students are in a department are just the regular allotment that is inevitable for any graduate program. To be sure, I've heard people offer first-, second-, or nth-hand characterizations of the relative happiness of graduate students in various programs, but these always seem to me to reflect, very strongly, the dispositions of those making the characterizations.
Departments that ranked highly in this survey would be able to tout this fact when competing for graduate students with departments that did less well. And those departments that did not rank highly might reflect upon what they could do to change this.
(That said, one thing a department could do would be to select students who seemed less likely to provide gloomy satisfaction ratings later. Indeed, an interesting question would be whether some areas of sociology attract disproportionately more dispositionally dissatisfied people than others. Then again, if one asked a question about area on the survey, one could adjust for this.)
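(To make that last point concrete, here is a minimal sketch of the kind of area adjustment I have in mind, in Python. The file name, column names, and 1-5 satisfaction scale are all hypothetical, and regression adjustment is just one way to do it.)

```python
# Sketch: compare raw department satisfaction means with area-adjusted ones.
# All names here (file, columns) are hypothetical illustrations.
import pandas as pd
import statsmodels.formula.api as smf

# One row per respondent: department, area, satisfaction (say, a 1-5 rating).
df = pd.read_csv("ssgss_responses.csv")

# Naive ranking: raw mean satisfaction by department.
raw = df.groupby("department")["satisfaction"].mean().sort_values(ascending=False)
print(raw)

# Adjusted ranking: regress satisfaction on department and area dummies, so a
# department isn't penalized merely for enrolling many students from areas
# whose members tend to report lower satisfaction everywhere.
fit = smf.ols("satisfaction ~ C(department) + C(area)", data=df).fit()
print(fit.params.filter(like="department"))  # area-adjusted department contrasts
```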
My second answer would involve an online resource containing information about how well departments place their students, along with rates of early and late attrition, median years to completion, or whatever else.** Again, my guess is that if systematic information on these things were out in the open--and especially, on the web--departments would do more to attend to them.
* I like the acronym SSGSS because it is a palindrome. Online posting of results would also need to include response rates, which themselves probably would say something about the engagement of students with a department.
** Graduate students will often talk about attrition rates as if they are, themselves, indicative of the quality of student life in a program. Making too much of attrition rates per se just gives departments an incentive to favor late attrition over early attrition, which is exactly the opposite of what is in prospective students' interest. As far as I can tell, early graduate student attrition isn't an especially negative outcome at all from the student's perspective, and departmental structures that hasten students' concluding they would be happier doing something else are, I think, desirable if the student would in fact be happier doing something else.
21 comments:
This would really shake up the current rankings process. Consequently, it won't happen at an institutional level.
So, ahem, some enterprising professor -- perhaps a blog-savvy one with at least a passing interest in Sarah Vowell -- should undertake this project.
Sociology graduate students everywhere would thank you.
We'd be happy to pay in Girl Scout cookies or Weight Watchers subscriptions, depending on whether you're on or off the wagon. (I'd take the Thin Mints, personally.)
cool. but if i launch such a survey, i'm sure to find that minn grads are happiest -- and i'll be hammered as self-serving. that's why you've gotta do this survey, jeremy.
on a related point, why don't we poll random samples of our undergrads at tenure time, rather than soliciting letters from our 3 or 4 favorite students?
Systematic info about placement would be awesome. Then again, some prospectives have absolutely no idea what to make of this info. I remember talking to one about this during my time at Princeton. At that point, our department had placed people in the following departments (in the preceding few years), just to name a few: Arizona, UVA (2), Harvard, UCLA (2), Indiana, Minnesota, etc. After I listed some of those schools to her she said: "yeah, but what about top programs, don't people from here get jobs at those places?" Huh?! I was speechless (and felt really sorry for her, thinking that she had absolutely no idea what she was getting into).
The thing about first job placement: is it an indicator of the quality of training or the quality of the social networks of the faculty? Yes yes, both. Yes yes, social networks are a big part of the discipline. I'm just saying. Might not be a great quality-of-training indicator. It seems to me that the first hiring is kind of a hopeful shot in the dark. So networks play a key informational role. And given that everyone has an interest in placing their students (famous people included) and some folks have better networks and are more trusted in this information problem, you might find a stronger network effect than quality effect. You all are going to trust Jeremy Freese over Shamus Khan. Never heard of Shamus Khan? That's me! And that's my point.
And the problem with the "second" job is: why should it count for the graduate institution and not for the first job/training department? Useful for young scholars would be info on how well departments train and aid assistant professors. If I were doing the survey, I'd also want info on that: where do their junior people end up? Some places would get hit relatively hard by that info, I suspect. But for good reason (or for what Shamus thinks is a good reason).
this is an interesting idea. It would be very useful to have, especially with the attrition rates. Here at the K-State Stat. dept., we lose maybe 15% of our first-year grads, but those who stay past that first year almost always finish.
Seems like a good indicator would be students' publications, and how early in the program they publish.
Also, I think that while satisfaction is definitely something for students applying to departments to consider, and so this information should definitely be publicly available, it should be interpreted with caution. Students have different degrees of ambition and expectation, and departments with more demanding students might have a higher degree of dissatisfaction. Also, students are sometimes clueless about what kinds of skills and resources will be important for them as professionals until very late in the program.
I second L.'s opinion. this is a typical subjective vs. objective measurement problem. suppose i'm a grad in a 3rd-tier department. i'll be a lot happier than those guys in top programs, such as michigan, berkeley, princeton, harvard, chicago, northwestern, wisconsin, columbia, among a bunch of others. for them, not publishing anything is a sort of suicidal thing; publishing in an ok journal is just the minimum; publishing in a top journal is a big thing. for me, i never worry about publishing and will be happy if i can land a job at a community college. who's happier? who knows? probably i will have better mental health than they do. which program is better?
The argument is not that the department with the most satisfied students is anything other than the department with the most satisfied students. When I talk to prospective graduate students, whether they will be happy does indeed seem like something they actively wonder about. Right now, there is no information about that. Moreover, faculty have no idea how the subjective experience of their students compares with subjective experience elsewhere.
I think Shamus overestimates the effects of advisor's social networks on first job placement. By far the biggest factor is the applicant's CV, productivity, and perceived quality of job talk. Social networks matter, but not to every hire and less than might be assumed. I would say advisor's reputation matters more than advisor's social networks.
i agree with eszter. almost every program is likely to put its best placements on its website, but won't provide honest statistics. they will say, "look at our placement! two guys went to princeton, one girl went to harvard"--yes, that was true. but it happened ten years ago, and the proportion is 3 out of 40.
A certain department I could name had a brochure that listed selected places students had interviewed, rather than actual placements, allowing them to use the success of three students to justify naming like 8-9 top departments.
I'm a grad student at a top department. For a top department, I'm only mildly successful, in mild ways. And you know? I don't really care. And I swear, I'm not alone. Anecdotal evidence, sure. All I'm saying: just because you're at a top department doesn't mean your ambition is through the roof.
so, a Placement is a Placement?
Fabio wrote:
"Admit year   Admits   Placements by '07
1997         23       6
1998         8        0
1999         14       3
That says it all."
I dunno that this says it all, but it certainly gives a clue to Fabio's alma mater (not that he hides it). Who else but a Chicago PhD would come up with an example in which only 20% of admits (9/45) had a job after an average elapsed time of 9.2 years? :)
Why, a Berkeley person, of course.
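(For anyone who wants to check the arithmetic above: taking the quoted figures at face value, both numbers fall out directly. A quick sketch, nothing more than the quoted table restated in Python:)

```python
# The quoted figures: admit year -> (admits, placements by 2007).
cohorts = {1997: (23, 6), 1998: (8, 0), 1999: (14, 3)}

admits = sum(a for a, _ in cohorts.values())   # 45
placed = sum(p for _, p in cohorts.values())   # 9
print(placed / admits)                         # 0.20, i.e., 20% of admits placed

# "Average elapsed time": admit-weighted years from admission through 2007.
avg_years = sum(a * (2007 - yr) for yr, (a, _) in cohorts.items()) / admits
print(avg_years)                               # 9.2
```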
is the chicago statistic true, or just hearsay?
While it's probably an exaggeration to say that "a placement is a placement," in some senses it's true, depending on students' expectations going into a program.
Nonetheless, some type of comprehensive listing of who gets what jobs each year and where they graduated from would be invaluable for students who are grad-school shopping, especially if combined with satisfaction ratings and publication info.
Brian Leiter attempts to gather placement information every year for philosophy, but not in a very systematic way.
The periodical Lingua Franca used to publish placement and tenure/promotion data. Not sure it's still in print, but if so, it would be a good (though not necessarily complete) source of info.
Alas, Lingua Franca is no longer in print.
anon @ 10:38: although Fabio can tell us for sure, I think it's safe to assume that he was just making these figures up. My semi-informed sense is that Chicago's time to completion is a bit longer than other departments, but not that much longer; and Chicago PhDs do tend to get jobs. Kim.
I encountered your weblog while searching for PhD programs in sociology. For what it's worth, I would deeply appreciate "student life-satisfaction" surveys, because I have no interest in attending a school where the candidates are looking at death as a better-looking option every day.