PROFESSOR: So there are two things that we need to do in today's lecture. The first is to finish up our discussion of deontology, which was necessarily quite rushed. We're trying to do Kant in roughly a lecture. But I do want to get to the end of that discussion. And the second, which will allow you to use your clickers and express your opinions, is to talk through the structure of Judy Thomson's trolley problem paper.

So you recall from last lecture that our goal in understanding the very brief selection from Kant's Groundwork of the Metaphysics of Morals that we read was to try to make sense of the three claims that he makes in the first book, first chapter, of that volume. And those claims, to which I've now added some underlining, are the following.

The first is the claim that in order to have moral worth, an action needs to be done from duty. And the distinction that Kant is making there is the distinction between doing something in keeping with duty, that is, something that conforms to what morality demands of you, and doing something not merely in keeping with, but also from, duty. And Kant's picture is that the moral worth of an action is determined not merely by its being in keeping with duty. That's a necessary but not a sufficient condition. What determines the moral worth of an action is that it be done in keeping with duty for the sake of being in keeping with duty, that is, that it be done from duty.

The second thing that Kant says, the second proposition which he seeks to defend in the Groundwork, is the claim that an action done from duty -- that's the thing we were talking about in the first claim -- has its moral worth not in the purpose that is to be attained by it, not in what the Greeks would call its telos, its goal, its aim, but rather in the maxim according to which the action is determined. That is, what determines the morality of the action on the Kantian picture is the description under which the action is performed.

Now, a number of you came to office hours yesterday, and we had a rather lively discussion of how it is that one goes about determining what things count as maxims. And I encourage those of you who are interested in that question to take an ethics course, where you can work through Kant's writings on this question in more detail.
For the purpose of our class, all we need to hold on to is the simple idea that what Kant is interested in here are acts under a description, and that that description is going to have to satisfy a certain sort of test, as we'll find out in a moment. So that's Kant's second claim.

Kant's third claim is that duty, which is the central notion of deontology -- deon, duty, is at the core of deontology -- is "the necessity of an action done out of respect for the law." And the idea here is that not only do you need to act with the goal of conforming to the law in mind, not only do you need to do so in a way that you articulate your actions as falling under that norm, but you do so because you take the moral law to be morally binding upon you, because you recognize that it is what rationality demands of you. The moral law turns out to be the law governing your behavior that you set for yourself as a rational being. It is the only aspect of your behavior, on the Kantian picture, that isn't determined by the contingent forces of the world around you. It's determined by your recognition of your role as somebody capable of binding themselves to a law that they themselves set.

So as I said, the reason we were interested in these three principles was to get to Kant's famous formulation of the categorical imperative. And we closed lecture last time by meeting one member of the categorical imperative family in response to Kant's question: "What sort of law can it be, the thought of which must determine the will without reference to any expected effects so that the will can be called absolutely good without qualification?" Kant sets himself, as I said, this cliffhanger of a question, and answers it with the first member of the categorical imperative family. It's the will's "universal conformity of its actions to law as such." Only thereby, says Kant, can one act autonomously and not heteronomously.

What does it mean to act autonomously as opposed to heteronomously? Let's look at the words: autonomous, heteronomous. You'll notice that they both have here the word nomos, that is, law. And they distinguish the law to which you're subjected by saying that in one case, it's an auto-nomos, and in the other case, it's a hetero-nomos. Auto-nomos. What could that be?
Let's think of other words where we have this prefix auto-. How about automobile? What's an automobile? An automobile is something that is self-propelled. It is propelled by its own strength. To act autonomously, on Kant's picture, is to act on the basis of a law that you yourself have imposed. You are auto-nomos, subject to a law that comes from within.

By contrast, what is it to be hetero-nomos? Well, what is it to be heterosexual? To be heterosexual is to be attracted sexually to individuals that have a different gender than you do. So what is it to be hetero-nomos? It's to have law given unto you that comes from outside of you, that comes from something different from you.

So Kant's picture is that autonomy, self-lawgiving, is possible only when the law to which you conform your behavior comes not from the contingencies of the world, but comes from within. In this way, Kant is concerned with the same sorts of questions that Epictetus and Boethius are. Both of them are profoundly concerned with how it is that human freedom is possible, and Kant's picture is that human freedom becomes possible when you govern your actions on the basis of what you yourself decide to be norms that you want to conform to.

In particular, when you conform your actions to the categorical imperative, which says, in the formulation that we see at the beginning of this section, that you should "never act in such a way that you cannot also will that your maxim" -- there's your maxim again -- "should become a universal law." When you act in such a way that you don't take the contingencies of your situation into consideration -- but rather think of yourself as one among any number of beings who in your situation would do exactly what you do -- only then do you become free of the contingencies of circumstance.

So the picture is that in some ways, by stepping beyond the bounds of the contingent features of your experience, by stepping beyond the bounds of yourself, you thereby gain freedom from the contingencies of the world around you. And Kant suggests that if you take this on as a picture, you will come to see that it conforms with the rules of rationality.

So, he says, suppose you're confronted with a very particular case of an act that you want to perform under a particular maxim.
The maxim you set for yourself is, "When I make a promise and it's going to be a pain in the neck for me to keep that promise, I'll break that promise." That is, the maxim is: it's OK for me to make lying promises. And Kant asks: suppose that you made, under that description, a promise that you didn't intend to keep. Can that maxim be universalized? Well, he says, suppose that it were. Suppose everybody, when they made promises, did so only with the thought that they would keep them when convenient and not when they were inconvenient. Were that to happen, says Kant, there would be no such thing as reliable promising.

Why? Well, because promises are like balconies. We don't step out on a balcony if there's a good chance that the balcony will break when we step out on it. And we don't step out on promises, on commitments that others make to us, unless we are close to certain that that promise will be kept. So, says Kant, since the practice of promising would break down if everybody who found it convenient made lying promises, it is not in keeping with what the moral law demands of us that we make a lying promise.

And, Kant suggests, this framework can be extended to all the kinds of duties that there are. There are, suggests Kant, two categories of obligation: those we have towards ourselves and those we have towards others. We have duties to ourselves and duties to other people. And, in addition, says Kant, we have perfect duties, things that we need always to do, and imperfect duties, things that we need sometimes to do. In all four of these cases, says Kant in the reading that we did, we can see that the categorical imperative gives us guidance as to whether an action under a maxim is permitted. The action under the maxim is going to be permitted if it can be universalized, and it's going to be prohibited if it can't.

So if we ask this question -- is it all right to make lying promises? -- and we say to ourselves, suppose everybody made lying promises, we discover that the act of making lying promises is prohibited by the categorical imperative, because it can't be universalized. Suppose we ask the question, "Is it OK to commit suicide when feeling frustrated with the world?" And Kant says, suppose everybody did that. The practice -- so runs the admittedly convoluted argument -- would break down, because there would be nobody left to kill themselves.
So goes the argument. Suppose we ask ourselves whether we have a duty to cultivate our talents. Kant says: suppose nobody cultivated their talents. The world in which we live would be one in which nobody would want to live. And consequently, we have a moral obligation to do so.

Finally, he asks, "Do we have an obligation to give money to those in need?" And he asks again, what would happen if it were a universal law that nobody gave money to those in need? And again we discover a breakdown of an ordered world in which we want to survive.

Now, there's room for questioning -- in fact, there's room for questioning all four of these derivations, though it's generally accepted that the lying-promise derivation is the most effective of them. But let's look instead at what Kant says about them, if it were to turn out that these derivations worked. Kant says, "these are some of the many actual duties whose derivation from the single principle above is clear." "It's clear" is a bit of a stretch, but we can see how that derivation would go.

What does this tell us about morality? Kant says it tells us that when we take an act and try to determine whether it's moral, we need to check to see whether we're making an exception for ourselves. When we act, we need to be able to will that a maxim of our actions become a universal law. When we transgress, says Kant, we don't will that our maxim should become a universal law, but rather that the opposite of this maxim should remain a law universally.

So suppose you like to sit in the last two rows of this classroom, even though you don't arrive late to class. Can you will that this become a universal law, or is this something that works for you only if others are willing to sit further in so that there's room for people on the stairs? Kant would say that the moral law demands of you that you move inward, because your sitting in those last two rows, despite your not arriving late, depends on other people doing something different. Stealing depends on other people respecting the laws of property. Your not paying the toll on the subway depends on other people paying the toll so that there's enough money to keep up the subway. When you make an exception for yourself, says Kant, you violate the moral law.
And we'll return to this at the opening of our discussion of political philosophy, when we talk about the prisoner's dilemma.

Now, I mentioned in passing that you had met one member of the categorical imperative family, that you had met what's sometimes called the formula of universal law: that one should "act only in accordance with that maxim through which you can, at the same time, will that it become a universal law." Kant put the categorical imperative four different ways for a number of reasons, one of which, he says, is that in certain cases, it's easier to see how to apply the categorical imperative if we frame it in a slightly different way. And I want to introduce you to the other three, largely because the second of these is going to play a central role in the second half of today's lecture.

So Kant claims that it is equivalent to saying that "you should act only in such a way that you can will your maxim to be universal" to say that "you should act so as to treat humanity, whether in your own person or that of any other, in every case as an end, and never merely as a means only." Do not use yourself as a means to an end, and do not use others in your interactions with them merely as means. Treat humanity, in yourself and in all others, as an end in itself.

Equivalent to that, says Kant, is the formula of autonomy, which we've already talked about recently: "Act so that through your maxims, you could be a legislator of universal laws." Act in such a way that you are a self-lawgiver with respect to rules that reason endorses. And finally, a rather complicated notion, sometimes called the kingdom of ends formulation: "act in accordance with the maxims of a member giving universal laws for a possible kingdom of ends" -- a harmonious society in which everybody exists according to the laws that you give.

As I said, we have only about an hour's worth of Kant, so we won't focus on the third and the fourth. But I do want to call your attention to the formula of humanity, because as I said, it's going to play a central role in Judy Thomson's ultimate diagnosis of what may be going on in our intuitions about trolley cases.

So let me close the discussion of Kant by trying to connect it back to the mini-unit of which it is a part. You'll recall that we began this section of the course on the seventeenth, that is, last Thursday, by thinking about consequentialism as a moral theory.
And the question that I want to ask is: what is there that is common to the two concrete moral theories that we've taken a look at in the beginning of this unit? We've looked so far at some of the differences between consequentialism on the one hand and deontology on the other. But I think it's important, in moving on to some of their practical applications, to think about what they have in common.

And what they have in common is that both teleology -- consequentialism, utilitarianism in the particular form in which we found it -- and deontology prohibit first-person exceptionalism. Kant says: my desires may serve as bases for willed actions only if I can, at the same time, coherently will that others in similar circumstances would act in the way that I am choosing to act. I'm only allowed to do things that I'm going to assume other people are also allowed to do. And Bentham, quoted in Mill -- Bentham, the great-grandfather of utilitarianism -- says, "everyone is to count for one, no one for more than one." Mill, in his greatest happiness principle, speaks of the happiness of all, not happiness from the subjective perspective.

So the challenge of morality is that of viewing the world not from the stance of your own needs as the most central set of needs in the world, but rather from the perspective of your own needs as one set of needs among those of six billion equally sentient beings.

Now, the problem for morality is that the tendency towards first-person exceptionalism, the tendency to take one's own needs as more important than the needs of anybody else, is perhaps the most widespread and pervasive psychological bias. And when we get to the unit on political philosophy after March break, we'll talk about ways in which social structures are put into place to help deal with this sort of tension.

Even in the passages from Mill that we read for last Thursday, Mill talks about what sorts of attitudes it's important to cultivate in individuals so that they begin to view the world from a moral perspective. In the selections that we read from book ten of Aristotle's Nicomachean Ethics, Aristotle asks: in what way should society be structured to make it easy for people to act morally?
And in some ways, the question with which we're going to close the course -- how does rational versus nonrational persuasion work, what are the roles of literary as opposed to argumentative representations of the good life -- is a version of this dilemma, operationalized. How is it possible, given an inevitable human tendency to take one's own needs as more important than others', to structure society in such a way that the needs of all are met?

So that's what I want to say about Kant. And in the remainder of lecture, I want to ask you to take out your clickers and enjoy the ride.

So as you know, the paper which we read for today is a great and intricate paper by the philosopher Judith Jarvis Thomson, written in the mid-1980s in response to an earlier paper by another mid-century woman philosopher, Philippa Foot. And what Philippa Foot and Judy Thomson are interested in in these papers is a systematic exploration of a number of cases which seem to evoke, in most subjects, pretty powerful intuitions about what the right thing to do is, but which seem to elicit intuitions the explanation for which is hard to systematize.

So as you know well, the first case, which we'll call trolley driver -- there's the driver -- is this. There's a trolley hurtling down a track in such a way that it's going to kill five people. But it turns out that there is a second track onto which the trolley could be diverted, where there is only one person. And the question is this. The trolley driver is driving the trolley. It's heading towards the five people in such a way that he is going to kill the five. Should he, is he morally required to, morally prohibited from, or perhaps neither prohibited nor required but rather just permitted to, turn the trolley onto the track where there is only the one?

Now notice that though this is framed as an idealized problem, the diversion of threat is something that decision-makers face all the time. Suppose there is an airplane that has become incapacitated, that's falling in such a way that it's going to land on a large city, and it would be possible to divert the plane so that it falls on a less populated area instead.
Suppose that there is an illness which is taking the lives of many people, but if one quarantines those who are ill, causing that number to die, the rest of the population will be spared. In case after case, we face dilemmas with roughly the structure of trolley.

So though these cases are idealized, in the sense that we're granting ourselves that we know with certainty what the outcomes will be, it is, I think, not a useless exercise, even if our concern is real-world morality, to think through what the right thing to do is in these cases.

So let's start with the case of the trolley driver, driver of the trolley. The trolley is heading down the track in such a way that the driver will, with the trolley, kill five people if he doesn't turn it. He faces the choice of turning the trolley onto the track where there is the one. Question. Is it morally mandatory for him to turn the trolley from the five to the one? Is it morally permitted, but not morally mandatory, for him to turn the trolley from the five to the one? Or is it morally prohibited for him to turn the trolley from the five to the one?

OK. So let's see how the numbers come out. I'm going to write these down, because we're going to need them for later. So 7% of you, a very, very small number, think that it's morally prohibited for him to turn the trolley. The vast majority of you, close to two thirds, think that it's morally permitted, but not morally mandatory, for him to turn the trolley. And about 30% of you, roughly a third, think that he is morally required to make that turn.

Case number two. Transplant. You're running a hospital. Five people show up at the hospital, all of them destined to die, because one needs a lung, and one needs a heart, and one needs a kidney, and one needs a liver, and one needs a brain. So they're all going to die, and you are the doctor. And into the emergency room walks a perfectly healthy young man who has a heart and a lung and a liver and a kidney and a really good, active brain. And if you were to chop up that man and give his parts to the five suffering individuals, you could save the five at the cost of the one. Question.
For the doctor, is it, A, morally mandatory to chop up the healthy man to save the five; B, morally permitted but not morally mandatory to chop up the healthy man; or C, morally prohibited to chop up the healthy man?

So let's see how the numbers come out. So. 85% of you think it is morally prohibited to cut up the one to save the five. 9% of you think it is morally permitted, but not morally mandatory. And 6% of you -- off to med school -- think that it is morally mandatory to chop up the healthy man.

So what's going on here? Philippa Foot, who was the person who initially presented this juxtaposition, has a hypothesis. And her hypothesis is this: that in the trolley driver case, the choice that the driver faces is between killing one and killing five, whereas in the transplant case, the choice that the doctor faces is between killing one and letting five die. And if we were to graph these on what I'll call the bad-o-meter, which tells us how bad things are, we would discover that letting five die is bad, but killing one is worse, and killing five is even worse.

And so this seems to give us the answer: since killing five is worse than killing one, then in the trolley driver case, it's OK for him to turn the trolley; but since killing one is worse than letting five die, then in the doctor case, it's not OK to chop up the one man. Because in the doctor case, you have to kill one to save five, whereas in the trolley case, the driver has to kill one in order not to kill five.

And that seems to accord pretty well with your intuitions. 93% of you think it's permitted, in the trolley case, to turn the trolley, whereas only 14% of you think it's permitted in the doctor's case to kill the one. So it looks like this bad-o-meter is pretty well capturing the intuitions of those of you in this classroom. And in fact, empirical studies that have been done on thousands and thousands of people throughout the world suggest that your intuitions are pretty much in line with the intuitions of most.

But there's a problem. Case number three: trolley bystander. Here's Jim. Poor Jim. Really bad luck. First he shows up in this Latin American town, and he's supposed to shoot some Indians.
Now here he is, next to a trolley which is hurtling down a track, about to kill five people. But here there is, in the middle of the track, a switch that, if Jim turns it, will cause the trolley to kill the one. Question. For Jim, the bystander, is it morally mandatory for him to turn the trolley so that instead of the trolley hitting the five, it hits the one? Is it morally permitted, but not morally mandatory, for him to turn the trolley? That's answer two. Or is it morally prohibited for him to turn the trolley? Let's see how those numbers come out.

OK. And here are your numbers: 15, 70, 15. These are very, very similar to the distribution of answers that you gave in the driver case. In the driver case, 63% of you thought it was morally permitted, whereas here 70% of you think it's morally permitted. In the driver case, 30% of you thought it was morally mandatory. Here slightly fewer of you think it's morally mandatory, 15%. And in the driver case, 7% of you thought it was prohibited. Here 15% of you think it's morally prohibited. So there's a little change, but not a lot of change.

Here's the problem. Remember that in Foot's analysis of the case, we knew that letting five die was a little bit bad, that killing one was worse, and that killing five was worse than that. Trolley driver faced the choice of killing one versus killing five. In transplant, you face the choice of killing one versus letting five die. But what's going on in the bystander case?

Well, in the bystander case, Jim, Jim of the bad luck, faces a choice between killing one -- diverting the trolley onto the track in such a way that Jim kills that guy -- or letting five die, letting the trolley hit the five that it's going to hit inevitably. But in contrast to the doctor case, where 85% of you thought it was prohibited to kill the one in order to save the five who would otherwise die, in this case, 85% of you think it's at least permitted to kill the one rather than let the five die. Let me do that again for you. 85% of you -- watch the bad-o-meter -- think that it goes the other way. Now what's going on? We thought we had a solution to the problem.
The solution to the problem that differentiated transplant from trolley driver was the distinction between killing and letting die. And all of a sudden, there's a whole bunch of you who seem to be saying about bystander that letting five die is worse than killing one. You must think that, or you wouldn't think that it's morally, at least, permitted for him to turn the trolley.

Moreover, stuff gets even worse. Suppose that the hospital case comes about as follows. Five healthy individuals show up at the hospital, and a doctor -- either because he's tired or because he wants to get the insurance benefits of which he is a beneficiary if there are a lot of sick patients in his hospital -- poisons the five who show up, in such a way that one of them needs a liver, one of them needs a kidney, one of them needs lungs, one of them needs a heart, and one of them needs a brain. And so as a result of what that man has done, these five individuals will die.

And it's a few hours later, and he thinks, "Oh! I forgot about the categorical imperative! Shoot! What am I going to do?" And up shows a healthy individual, and he thinks, "Oh, God, I've got a solution here. I can chop him up. Heart, lung, kidneys, liver, brain, and I can save the five. OK."

Question. For the doctor who has poisoned the five individuals who earlier showed up at the hospital, who now faces the option of saving their lives by killing the one, is it, A, morally mandatory to chop up the healthy man; B, morally permitted, but not morally mandatory, to chop up the healthy man; or C, morally prohibited to chop up the healthy man? And let's see how the numbers come out.

All right. So. 82%, 11%, 7%. So your numbers here are almost identical to what they were in the original doctor case. There it was 6, 9, 85; here it's 7, 11, 82. Almost no difference.

But let's go back to our bad-o-meter. We are going to set aside the killing and letting die question, and just look at kill one versus kill five. So we know from trolley driver that killing one is pretty bad. But according to most of you, according to 93% of you, killing five is worse than killing one. OK. Poison doctor. So here's the choice the doctor faces.
He can kill five -- right? He's poisoned them, and now they're going to die. Or he can kill just one -- that one healthy guy who just showed up -- and then the other five won't die. 82% of you told me that it was better for him to kill five than to kill one. Let me show you this again, on the bad-o-meter. 85% of you thought that it was better for him to kill five than to kill one.

So we have these two super-duper, excellent principles that seemed to explain what was going on in the trolley case. On the one hand, that killing one was worse than letting five die -- and then all of a sudden, bystander made us think, oh no, we don't have that intuition. And then we had the intuition that at least killing one was better than killing five, and the poison case made us rethink that as well.

Now, there's an obvious issue that may be making the moral difference here. There's a temporal difference between when the killing of the one and the killing of the five happened. And perhaps, says Thomson, perhaps that's what explains our intuition in the doctor case. Perhaps it is because the killing of the five has become a letting die, as the result of time, that it's misleading to describe this as a kill-one-versus-kill-five case.

But the temporal difference is not going to help us with the transplant versus bystander case. Those seem pretty clearly to be both cases where one faces the choice between killing one and letting five die. And whereas it seemed pretty clear to most of you in transplant that killing one was worse than letting five die, it seems pretty clear that for most of you in bystander, it's the other way around.

So the trolley problem is the problem raised by these dancing arrows. How is it that we systematize our intuitions about killing and letting die, given that they appear to come apart in these cases?

So Thomson suggests that whereas the transplant case gives us a choice between killing one and letting five die, as does bystander, there is a potentially relevant difference between them. And that is that in the case of transplant, you are using the one as a means. You're using the one as a way to achieve the outcome of saving the other five.
Whereas in bystander, when you're diverting the trolley onto the track where that one individual is, you're not using that individual as a means, and -- oh my goodness, I told you we'd get back to Kant, and we have! What did Kant's formula of humanity say? Kant's formula of humanity said, "so act as to treat humanity, whether in your own person or that of any other, in every case as an end, and never merely as a means only."

So maybe that's our solution. Maybe what's going on in bystander is that since you're not treating him as a means, it's OK to kill the one. Whereas in transplant, since you are treating him as a means, it's not OK to kill the one, and consequently, you're morally obliged to let the five die.

Well, says Thomson, that can't be quite right. Suppose that you're Jim, standing next to the trolley. The trolley's on its usual path to kill the five. But here, instead of the straight track on which the one sits, there's a looped track, and the one is in the middle of the track in such a way that if you divert the trolley, it will hit him, thereby saving the five.

Question. In the case that Thomson calls "loop," is it, A, morally mandatory to turn the trolley -- that's one; B, morally permitted, but not morally mandatory; or C, morally prohibited? So remember, the trolley's heading down the track towards the five. You're Jim. The trolley's going to hit the five, or you can divert the trolley onto the track with the one, and because the one is there, you will cause the trolley to stop. OK? So let's see how the numbers come out on this.

OK. So 68, 18, 14. Not a lot of difference. You've answered loop in almost the same way that you've answered all our other trolley cases. Generally the distribution has been 15, 70, 15. That was bystander. Here it's 14, 68, 18. But notice that in this case, you were using the guy on the loop track as a means! You're using him as a means to your end. You're trying to stop the trolley by using his body. Kant didn't help us enough!

So let's take stock again, says Thomson. Perhaps some of the work is being done by some notion of rights.
Perhaps what's going on in the transplant case -- the one where you guys are not going to let the doctor chop up the healthy man -- is that you would be violating that man's rights. And perhaps it's true that "rights trump utility." That is, when somebody has a right to bodily integrity, that takes precedence over the needs of the many, the utilities of the five that are going to be saved.

And we'll close this lecture with the final example, one that's meant to test that hypothesis. And we'll begin next lecture by talking about some of the reasons that people tend to give this response.

So suppose now that instead of the looping track, there's a bridge. And suppose that on that bridge is our fairly large gentleman. And you are now faced with the following dilemma. The trolley's heading down the track. It's about to kill the five. And here's how Jim the bystander could stop it. He could push the fat man off the bridge, and thereby cause the trolley to be stopped in its tracks by his weight.

Question. For the bystander in fat man, is it morally mandatory to push the fat man, morally permitted but not morally mandatory, or morally prohibited to push the fat man? And let's take responses, and I'll leave you with those numbers and a remark about them as our close.

So let's see whether we get any shift in the fat man case. My goodness! That looks awfully different. What is going on? So remember, our classic distribution is that we have roughly 70% here, and no more than 10% in the prohibited. All of a sudden, 78% of you think it's prohibited. What's going on? Cliffhanger! We'll talk about it on Tuesday.