Monday, May 7, 2012

A few more unkind words about Martha Nussbaum's Not for Profit since I forgot to include them last time: one of her chapters is devoted to the psychology of children.  She doesn't take much time to integrate the ideas about childhood psychology into either her views on democracy or those on educational reforms, but it's a topic she's previously published on, so there are plenty of opportunities to refer to her own work.  Even more startling than the chapter's irrelevance is that her ideas don't seem to have evolved in several decades.  Once she has referred to the work of Solomon Asch, it's no surprise that on the next page she discusses Stanley Milgram.  Turn the page again, and voila, there's Philip Zimbardo.  Now when Prof. Nussbaum and I were in graduate school, Asch, Milgram, and Zimbardo were the Hart, Schaffner, and Marx of psychology: where there was one, the others were sure to follow.  But that was over forty years ago, and all of their studies have long been criticized, modified, or discredited.  Surely, if we're to take Nussbaum's chapter seriously, there needs to be some awareness on her part that Messrs. Asch, Milgram, and Zimbardo are hardly the last word even on their one narrow interest, which is itself only tenuously related to Nussbaum's argument.

I thought of this because a current equivalent, this time in the realm of moral philosophy, seems to be the "tunnel" problem.  Twice in the last week I've seen a Harvard philosophy professor on TV using the example, and I keep encountering it in my reading as well.  In brief, the supposedly exemplary situation is this: you are in a tunnel, driving a train that is out of control and headed toward five people on the track.  If the train hits them, they will die.  Suddenly you notice that there is a siding onto which you can direct the train, but there is one person on that track.  What is the moral thing to do?  And if we conclude that we should divert the train to spare the five, but kill the one, what general consequences follow from this decision?  And then the scenario can be complicated endlessly.  What if the five are members of Al Qaeda and the one is Nelson Mandela?  What if we only suspect that the five are members of Al Qaeda?

The antagonist in all the discussions that use the tunnel problem is "moral relativism," that is, the philosophers are searching for a universal basis for morality.  My two-bit argument would be that moral relativism (which doesn't equal subjectivism) is a straw man, because it's the only possible stance we can hold.  Unless we're an omniscient and omnivoyant god, our perspective is always limited, always situated, and thus always relative.  Nor, on the other hand, does John Rawls' popular idea of a "veil of ignorance" get us anywhere, because there is no possibility that we can ever make moral choices blind to everything but general principles.  We're neither all-seeing nor blind, and we never can be in either position.  We're never not situated, and we all make ad hoc choices depending on our (literal) situation.

Here's my version of the tunnel problem, which I'll call the professor's dilemma.  Scenario #1:  I'm teaching my last class ever, and I'm tired of grading.  As someone who's left-handed, I believe that we southpaws are generally discriminated against in classes.  There are never enough left-handed desks in classrooms, for example.  Righties who end up in one of the rare left-handed desks whine.  We lefties just suck it up and use the numerous right-handed desks.  I decide to give all the left-handed students A's and all the right-handed students C's.  A few students will be very happy; many more will be unhappy, but they'll still pass the course.  I won't have to grade any papers, and I'll have a satisfying sense of justice achieved.  Unfair?  Agreed.  What if I give all the right-handed students D's?  Even more unfair, since now they will have to retake the class?  Yep.  What if I reverse my solution and give the numerous right-handed students A's and the lefties C's?  Less unfair (as in the tunnel problem)?  Are there unchanging principles that we can derive from this example?

Scenario #2:  I realize the error of my ways and decide to introduce a more democratic solution (that still does not entail my grading papers).  This time I put it to a vote: if the class agrees, all the right-handed students get A's, all the lefties get C's.  Left-handed students get to vote; we include them in our democracy.  (Would this be "less fair" if they were excluded?)  Is there a difference between a right-handed student who votes yes and a left-handed student who votes no?  Each is motivated by self-interest after all.  Has a right-handed student who votes no made a more ethical decision than one who votes yes?

Are these scenarios sillier than the tunnel problem?  Do any of them lead us to immutable, generalizable moral stances?

Just asking.
