Wednesday, November 8, 2017

The trolley car problem

[Spoken by a professor to a class]

So we'll start out by considering an old standard.

There's a train.  There are people on the train.  Say a hundred, maybe a bit more, maybe a bit less.  Could be a hundred plus the square root of pi, could be ten.  The point is, train: people on it.

This train is about to go over a bridge on one of two tracks.  Track A.  The clouds part or light glints off the water in just the right way or . . . whatever, and you suddenly notice absolute and undeniable proof that Track A is borked.  Really borked.  Inescapably borked.  It is so borking borked that it might as well be Richard Nixon's first Supreme Court nominee of 1987.

If the train goes on Track A there will --not might, will-- be a catastrophic bridge failure killing any people on the train.  And remember there are people on the train.  About a hundred, or maybe it was ten, or eight point five.

Fortunately there's Track B which is supported by an entirely different truss.

And you, as contrived happenstance would have it, are in the ideal position to switch the train to Track B, thus saving everyone on the train from completely certain death.

Unfortunately there are people standing on Track B.  The whole reason that the train is taking Track A is because Track B is closed for scheduled maintenance.  The very sort of maintenance that could have prevented this whole mess if it had been done to Track A.

At the moment Track B is fully functional, and structural tests have passed, so it can totally be used to safely transport the train.  However, since everyone knows that the train is taking Track A, the workers have been ignoring the telltale signs (notably sounds and vibrations) of an oncoming train.  The first thing they did when arriving was check that the rail was switched so the train would go on Track A and they did another check when they first noticed the train.

If you switch the track they will not be prepared and some of them will die.  How many?  However many are on the track.  One is good if there's ten people on the train, so by comparison ten is good if there are a hundred, but one works just as well.  Ten is not good if there are eight point five people on the train.  There are most definitely fewer people on Track B than there are on the train.

Notice, please, that all ideas about responsibility and accountability and such are shoved to the side.  The bridge inspector is not in this hypothetical, neither is whoever put you at the right place to switch the track without consultation.  Are you here because of coincidence, providence, because it's your job to decide who lives and who dies in such unlikely scenarios?  Don't know.  Don't care.

The point is that you have a choice to make.  You didn't notice until it was too late to get anyone's attention, and even if you could, the workers wouldn't have time to get out of the way and the train definitely wouldn't have time to stop.  Switch the track or don't switch the track.  That is the whole of your universe because other options, numerous though they would intuitively seem to be, do not exist.

The question, therefore, is this:

In the next ten minutes, how many problems can you spot in this bullshit scenario?  How many holes can you poke in the set up?  How many ways can you take down the entire concept of bringing this up in a scathing screed?

Get into small groups, no more than five, talk amongst yourselves, and see what develops.  Feel free to be creative; some variations on this hypothetical involve shooting people with a train, so it's not like liberties haven't been taken before.

This is your very first assignment, welcome to Ethical Philosophy 101, do not confuse it with The Philosophy of Ethics 101 which is down the hall and to the left.  Get to work.


  1. Many likes.

    In my universe, people who try to present real-world problems as trolley problems get a special kind of hell.

  2. Richard Nixon's first Supreme Court nominee of 1987.

Did u do this on purpose? Is it a Watchmen ref?

    1. Not even close.

      I should probably claim otherwise, but: nope.

      It should be Reagan but when I think "Bork" (the name, not the verb) I think "Nixon" since Bork was the Nixon toady who finally said, "Yes, I have no soul; I will fire that guy for you," in the "Saturday Night Massacre".

      I'm way more familiar with Watergate than Reagan's Supreme Court nominations, so "Bork-->Nixon" apparently just took over in my mind.

  3. Also yeah, this BS is tiresome, and it obscures real vaguely similar choices people do face and live with the consequences of every day.

    1. A friend of mine shared this article with me (definitely worth a look, especially the "If you pay me..." example of a better problem) and something that occurred to me is that while the problem itself will tell you nothing useful about morality, using the problem as an example of how not to think about things could be useful.

      Tearing the problem apart could be instructive as to all the ways you should think about things if you want to be a moral individual.

  4. Honestly, now that I've looked it up on Wikipedia, the problem is that the Trolley Problem™ was never meant to bear the great moral weight that it is commonly treated with. It was never meant to be a clickbait-y Do You Have The Moral Clarity™ And Intellectual Discipline® To Choose™® Correctlyℵ©¢ - in the original invention, it was one of a half-dozen throwaway hypotheticals illustrating the difference between "do a bad thing intentionally toward a good end" and "do a good thing knowing that there will be bad consequences you can't avoid".

    Taken out of context? And embroidered with a hundred "no, you can't avoid our Tragic Dilemma that way, play the game I'm demanding you play" ornaments? It's meaningless.

And the reason why it gets treated like it has meaning is the same reason why John W. Campbell forced Tom Godwin to rewrite the short story "The Cold Equations" three times because "Godwin kept coming up with ingenious ways to save the girl!" It's an excuse to feel like one is doing a moral thing by killing somebody. And we shouldn't be looking for those.

    1. There is no universe in which "Cold Equations" makes sense. The extra mass already changed everything at launch. It's too late to put things back to how they were meant to be. Either utter doom is inevitable, so let the stowaway stay on board so she and the pilot don't have to die alone, or it's not, in which case fight it.


      All of these things are about figuring out what it takes to make someone think that human sacrifice via murder is a good thing. It's not. It never will be. Why are people trying to convince other people that it is?


      And we know this. The distribution of finite resources will leave some people dead in some situations, and yet even in a triage situation you don't fucking kill people.

      And when there are (extremely rare) situations where everyone will die unless some subset is sacrificed, human beings actually have a traditional solution set:
      a) Hold out for a miracle (sometimes one comes), making it an all-or-nothing gamble.
      b) Some subset of the right size volunteers to sacrifice themselves.
      c) A larger-than-necessary subset volunteers. They draw lots.


      In other news, I love the aleph in the clickbait title.

    2. Also, the argument over what the right solution is happens to be intellectually dangerous.

      The problem as contrived is one where there can only be wrong solutions. "Terrible, but hopefully not as terrible as the alternative" may be the best you can manage, but that doesn't make it right. Think of it as the right answer and you're training yourself to forget what "right" and "wrong" mean.

      I'm not suggesting that people who get stuck in shitty no-win situations be wracked with guilt for the rest of their lives. Just that it's important to remember that "less bad" is not identical to "good". (There's overlap, for sure, but there's also plenty of "less bad" that's still "fucking bad".)

    3. Indeed.

      (I was pretty pleased to sneak the aleph in there, too. ^_^)

  5. Yeah, a lot of SF of that period is about coming up with clever solutions to "impossible" problems. And indeed, the trajectory should have been noticed at lift-off.

    But I think the popularity of it, and of trolley problems, is related to the popularity of horror: it's a frightening thing, in this case on a moral level, that you can turn over in your mind. We wouldn't be talking about TCE today if it had been another "engineer saves the day" story.

    The inherent contradiction in the trolley problem (is it worse to kill by doing something than to kill more by doing nothing?) is one that deserves some exploration, but by stripping it down to what appear to be its essentials you lose the importance. Consider vaccination for major diseases, where making it compulsory might save millions and kill tens; there are apparently some people who will feel that the action makes them culpable for those deaths and therefore they shouldn't do it.

    They probably need an ethical philosophy course. Or to get a different job.

  6. My version of this problem:

    Track B is empty, except that someone tied Jesus to the rails.

    Would you kill Jesus to save the people on the train?


    1. Jesus is like the ONE PERSON for whom being run over by a train is not necessarily an intractable problem.

    2. If you meet the Buddha tied to the tracks…

    3. Ok, that reminds me: my parents are watching a show about traveling the Mekong River, and in one episode there was a baby (endangered?) bear that had been named Jesus because he almost died multiple times but came back.

  7. My day is better for having read this and the responses. You folks rock.