PROFESSOR: So we left ourselves at the end of the last lecture in a somewhat perplexing situation. We had thought through the particular scenarios that Judy Thomson presents us with in her trolley paper, and we had discovered the following apparently perplexing feature of the class's responses.

In what's called the classic Bystander case -- the case where a bystander is standing next to a trolley that's hurtling down a track about to hit five people, and the bystander could, if he chose, turn the trolley onto a track where it will hit only one person -- your responses were as follows. Roughly 15% of you thought he was morally required to turn the trolley from the five to the one. 70% of you thought he was morally permitted to do so. And only 15% of you thought it a morally prohibited act for him to turn the trolley from the five to the one.

By contrast, we ended class with Thomson's famous Fat Man case. This is a case where our bystander is standing next to the trolley as before, the trolley is hurtling down the track about to kill the five, and the bystander has available to him a means for stopping the trolley. In this case, rather than turning it onto a different track, the means he has available is to push a fat man off a bridge, thereby stopping the trolley in its tracks. And your responses in this case exhibited a markedly different distribution than they did in the first case. Whereas in the first case, only 15% of you thought it was prohibited to stop the trolley from hitting the five by causing it to kill the one, in the Fat Man case, 78% of you -- roughly 4/5 of the class -- thought that the act of stopping the trolley by putting another person in its way was morally prohibited.

Now the puzzle that this raises, as you know from the end of last class, is this. In the Bystander case, it seems clear to most people that killing one person is bad, but that letting five die is worse. Whereas in the Fat Man case, it seems to be just the inverse.
So what Thomson asks us at the end of that paper, having run through a number of cases, including some that I didn't go over in this summary, is: what could possibly explain the difference in our reactions to the Bystander case and the Fat Man case? And what she suggests is that whereas utility prohibits letting the five die -- that is, it would be better, counting the number of lives saved, to save five than to save one -- the notion of a right is what prohibits killing the one in the Fat Man case. What has to happen in the Fat Man case, says Thomson, is that you interfere with his right not to have his person used as a means to the end of saving another. Whereas in the case of Bystander, there's no right that is infringed upon. And, suggests Thomson, rights trump utilities. So in the Fat Man case, what utility mandates is precisely what the right prohibits.

So that's where we were at the end of class last time. And the solution that Thomson proposed there is what we might call a classic solution to trolley-type dilemmas. It's a solution that assumes that the Fat Man case and the Bystander case carry different moral mandates, and that the reason they carry those different mandates is a deep moral difference that those cases encode. So the difference between our response to Fat Man and our response to Bystander, says Thomson in that 1985 article, is one that we should respect. And the reason we should respect that difference, she contends, is that it is tracking a profound moral difference between the cases, namely that in Fat Man, but not in Bystander, the rights of an individual are violated.

What I want to do in class today is go through with you three non-classic responses to the trolley case. And I'll be giving you the chance to use your clickers in the first and third of these. So if you get your clickers out, we'll be prepared for what's going to happen in a few minutes. So what are the three non-classic responses? Remember, in a classic response, the claim is that Fat Man and Bystander carry different moral mandates, and that that difference can be traced to a deeper, morally relevant difference between them. Two of the responses that we'll consider today suggest that Fat Man and Bystander in fact don't carry different moral mandates.
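Before we look at those, one way to make the "rights trump utilities" idea from a moment ago concrete is to treat rights as a lexical filter on options rather than as one more value to be weighed against lives saved. Here is a minimal Python sketch of that reading; the Option fields and the two scenarios are illustrative assumptions, not Thomson's own formalism.

```python
from dataclasses import dataclass

# Toy formalization of "rights trump utilities": options that violate a
# right are ruled out before any utility comparison is made, rather than
# being traded off against the number of lives saved.

@dataclass
class Option:
    name: str
    lives_saved: int
    violates_right: bool  # e.g., uses a person's body as a mere means

def choose(options):
    permissible = [o for o in options if not o.violates_right]
    if not permissible:
        return None  # every option violates a right; the sketch stays silent
    # Only among permissible options does utility decide.
    return max(permissible, key=lambda o: o.lives_saved)

bystander = [Option("let the trolley hit the five", 1, False),
             Option("turn it onto the one", 5, False)]
fat_man = [Option("let the trolley hit the five", 1, False),
           Option("push the man off the bridge", 5, True)]

print(choose(bystander).name)  # turn it onto the one
print(choose(fat_man).name)    # let the trolley hit the five
```

On this sketch, no number of lives saved can outweigh a rights violation; the filter-then-maximize ordering is what "trumping" amounts to.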
So the first example that I'm going to run through with you is Judy Thomson's rethinking of trolley cases in a 2008 paper, in which she ends up assimilating the Bystander case to the Fat Man case, and suggesting that in neither of the cases is it permissible to kill the one to save the five.

The second view that we'll consider is Josh Greene's view that the right thing to do in the Fat Man case is the same as the right thing to do in the Bystander case, namely that in both cases, the right thing to do is to stop the trolley from hitting the five and cause it instead to kill the one.

And finally -- I'm shoe-horning this a bit, because in truth, Sunstein is a little closer to Greene than he is to Thomson -- we might use his thinking to maintain the position that though our responses to the cases differ, the cases are in some more fundamental sense the same. And what Sunstein is going to suggest we need to do is to push the fat man.

So what we have are three views here. Thomson's saying the cases come together, and they come together in telling us never to kill the one to save the five. Greene's saying the cases come together, and they come together in telling us always to kill the one to save the five. And then, perhaps, Sunstein's view telling us that the cases come apart.

But these three non-classic responses are interesting not just for the difference in their content. I think they're interesting for the purposes of this class because each of them makes use of a slightly different kind of argumentative methodology. And there's no reason that the methodologies and the answers had to line up in the way that they did. So one of the things that I want you to think about as we go through today's lecture is what use might be made of each of these methodologies to make one of the alternative arguments.

So Thomson's contention that in the Bystander case we shouldn't turn the trolley is one that she makes on the basis of inviting you -- as I will do in a moment -- to consider additional hypothetical cases, and then asking you to be consistent in your responses to cases that fail to differ in morally relevant ways. Thomson's methodology is the same as it was in her 1985 paper; there's just a new case that she's thought about.
Sunstein's methodology is to canvass a large array of literature in the heuristics and biases tradition, and to suggest that moral reasoning is no different from any other sort of reasoning. And Josh Greene's method is, of course, to make use of neuroimaging results, and on that basis to argue in favor of his view that what is morally mandated of us is a certain kind of utilitarian stance.

So let's start -- and here you'll need your clickers -- with the additional hypothetical cases that convinced Judy Thomson, and may convince you, that it's not OK to turn the trolley in Bystander. The case that Thomson presents us with is one that we'll call Bystander's Three Options. Here's poor Jim, deeply regretting that he ever enrolled in this class, standing by the trolley in the usual Bystander dilemma, where the trolley is about to hit the five and Jim has the possibility of deflecting it to hit the one. But because Jim lives his life in Judy Thomson's thought experiment, she has, in rather dastardly fashion, introduced a third track, at the end of which, rather unfortunately for Jim, Jim is standing.

Now here's Jim's three-way dilemma. Option one: allow the trolley to continue on its original path, killing the five. Option two: deflect the trolley so that it hits the other guy. Option three: deflect the trolley from the five to the one -- oh, except the one is Jim.

Question. In three-way Bystander, suppose Jim decides to turn the trolley -- so we're ignoring the case where he lets it hit the five; he's made the decision to turn the trolley. The question is the following. Is it morally required for him to turn the trolley onto the track where it hits the other guy instead of himself? Is it morally permitted, but not morally required, for him to turn the trolley onto the track where it hits the other guy instead of himself? Or is it morally prohibited for him to turn the trolley onto that track instead of onto himself?

So we're assuming that Jim has made the decision to turn the trolley from the five. After all, it's a straight Bystander case: if he doesn't turn the trolley, it's going to hit the five. And 78% of you have previously told me that what one ought to do, or at least what one is permitted to do in this case, is to turn the trolley.
How come no responses are coming in, guys?

STUDENTS: [INTERPOSING VOICES]

PROFESSOR: It's not working? Oh, my goodness. All right. So why is it not open for you? Let's try. Is it open now?

STUDENT: No.

PROFESSOR: Tragic. This is really, really, very, very horrible. That did not work. OK, the whole lecture today depends upon these working. So let's try this again. And tell me now whether this works. Is it working? OK. Is it working now? No? Still no? No? All right. Hm. We're going to have to run -- I think there's nothing I can do. I'm going to try resetting once more and see if that works. And I'm going to try removing and then returning this receiver. And then, if not, we're going to do the old-fashioned show of hands, and all my beautifully constructed slides will turn out not to be useful. But that's all right; worse things have happened in the world. All right. Try it again. Yay! Awesome. I have no idea what I changed.

OK. So, answering this question. Wow, there's 64 of you. There's 71 of you. We'll do the countdown: 10, 9, 8 -- so let's see how the numbers come out in -- 4, 3, 2, 1 seconds. Oh, and it's so exciting. Especially because we had to suffer first. The contrast. OK.

So in this case, 6% of you think it's morally required for Jim to turn the trolley onto the other man. Perhaps you're the same 6% who continue to be outliers, or perhaps you're different people. But let's look at what's going on. 61% of you think it's morally permitted for him to turn the trolley onto the other man. And 32% of you think it's morally prohibited for him to turn the trolley onto the other man.

Now interestingly, Judy Thomson expects that more of you will fall into that last, prohibited category. So it's an interesting question for us to think about as a class why she would find the response you actually gave rather surprising. But in any case, let's move to a second contrast case and see how this goes.
OK. Suppose now that we have only a two-way case. In the two-way case, bystander Jim has only two options: either the trolley is going to hit the five, or he can deflect the trolley in such a way that it hits him.

I want to go back for a second and just get the numbers from the last slide, because I forgot to record those for myself, thrown off as I was by our situation. So let me just record these: 6%, 61%, 32%. OK.

So, moving on to the new case. It's a two-way trolley, and the question is this. In Bystander's Two Options, is it morally required for him to let the trolley hit the five instead of himself, is it morally permitted for him to let the trolley hit the five instead of himself, or is it morally prohibited for him to let the trolley hit the five instead of himself? OK? So let's think through that case. Remember, it's a two-way case. The trolley's heading down towards the five. And the question is: is it required, permitted, or prohibited for him to turn the trolley from the five to hit himself?

OK. And let's see how the numbers come out here. We've got roughly 10 seconds to find out whether your distribution is going to be similar or different here.

OK, so here's how the numbers come out: 8%, 70%, 22%.

Now, the case with which we want to contrast this is the classic Bystander case. In the classic Bystander case, more of you thought he was morally required to turn the trolley than think so in this particular case. Interestingly, you had roughly the same view about whether it was morally permitted. And more of you think it's morally prohibited for him to let the trolley hit the five instead of himself. So the interesting difference is this one here: you took a different attitude with respect to whether it's morally required for him to turn the trolley when the person it's going to hit is himself than when the person it's going to hit is another person.

So let's go back and do just a classic Bystander case, and see whether, as a result of having thought through this case, there's any change in your intuitions. This is just the standard Bystander case that you've seen before.
In the classic two-way Bystander case, do you think it's morally mandatory, morally permitted, or morally prohibited for Jim to turn the trolley?

So we're at 3, 2, 1. And let's see how the numbers come out: 20%, 65%, 15%.

So as a result of having thought about the first-person analogue, some of you -- though many fewer than I would have thought -- changed your view. Whereas originally 15% of you thought it was morally mandatory to turn the trolley -- oh, you've changed your view in exactly the opposite direction from the one I would have predicted. So here's a mystery. Here's a little bit of experimental philosophy done in our classroom.

What Judy Thomson was predicting -- and we can talk in sections about why this didn't happen -- is that you would react as follows: if it's not morally mandatory for me to turn the trolley onto myself, then it's not morally mandatory, indeed not morally permitted, for me to turn the trolley onto another person. If I'm not willing to take the hit myself in that case, I shouldn't be deciding on behalf of another person that he take that hit.

So I want you to think about what it is in Thomson's thinking about this case that made it feel to her so obvious that, as the result of considering the first-person case, people would be inclined to rethink the third-person case. And I have to say, I myself, in reading Thomson's 2008 paper, am very easily brought into the mindset she describes there. So I find it surprising and extremely interesting to see that that isn't what happened in this context.

Let's assume, however, that at least for some of you, the intuition that you came to have as the result of considering this case was something like Thomson's intuition. On the old view, in the standard Bystander case, you thought the right thing to do was to kill the one rather than to let the five die -- and this is in fact what most of you still think. What Thomson says is that in thinking through the first-person case, you ought to realize that Bystander is a lot more like Fat Man than you initially thought.
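Before moving on, it may help to collect the clicker numbers quoted so far in one place. A small Python tabulation follows; the percentages are the rounded ones stated in lecture (so rows needn't sum to exactly 100), and note that the act being judged differs from poll to poll.

```python
# Clicker results as quoted in lecture. "req/perm/proh" abbreviate
# morally required / permitted / prohibited. The judged act varies:
# turning onto the one, turning onto the other man rather than oneself,
# or letting the trolley hit the five rather than oneself.

polls = [
    ("Classic Bystander (first pass)", 15, 70, 15),
    ("Bystander's Three Options",       6, 61, 32),
    ("Bystander's Two Options",         8, 70, 22),
    ("Classic Bystander (re-poll)",    20, 65, 15),
]

print(f"{'case':33} {'req':>4} {'perm':>5} {'proh':>5}")
for case, req, perm, proh in polls:
    print(f"{case:33} {req:>3}% {perm:>4}% {proh:>4}%")
```

The lecture's point of comparison sits in the first column: the "required" numbers move around across variants in ways Thomson's argument tries to exploit.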
To the extent that you reject that intuition of Thomson's, you're in a position to disagree with her.

So let's move to the view with which, I take it, most of you are going to end up agreeing, since it is exactly the opposite of Thomson's: namely, Greene's argument that the assimilation ought to go the other way. Just to remind you where we are in the picture: the puzzle with which we began is that people were giving a different response in Bystander than in Fat Man, and Thomson tried to get rid of the problem by getting you to assimilate Bystander to Fat Man. I was unable, using Thomson's cases, to get you to shift your intuitions in that direction. So we're stuck with a residual difference between your responses. Most of you think it's OK to turn the trolley in Bystander, regardless of whether you would turn it onto yourself, but that it's not OK to push the man off the bridge in Fat Man.

So Greene's going to give us a second way of thinking about how we might bring those responses together. And his argument runs as follows. In general, we're not in a very good position to determine what really underlies our reasoning. There's an entire tradition in social psychology, which I talked about in one of the early lectures, that aims to show that a lot of what people engage in when they make decisions is post hoc rationalization of intuitive responses -- responses that weren't in fact tracking what they would say are the relevant features of the situation.

So, famously, people are more likely to choose an object that lies on the left-hand side of a visual array than an object that lies on the right-hand side of that array. But in making the choice, they don't give the location of the object as their reason; they give some other feature of the object as their reason. And when we looked, in the second lecture, at the confabulation results -- in which subjects who had undergone commissurotomy, that is, whose corpus callosum had been severed so that the right and left hemispheres of their brains weren't in communication -- we discovered that when they performed an act based on stimulation of the right hemisphere, the left hemisphere, which is the linguistic one, came up with an explanation for what they were doing that was obviously not the real source of their behavior.
So there are many cases, Greene points out, where our motivations are opaque to us -- where we think we're responding to one thing, but in fact we're responding to something else. One of those cases, says Greene, is the difference in our response to the Fat Man case and to the Bystander case. What happens in the Bystander case -- where we're trying to decide whether to shift the trolley from the five to the one -- is that our rational processing system gets activated. Whereas what happens in the Fat Man case, hypothesizes Greene -- and we'll see some evidence in a minute -- is that our emotional processing system gets activated. And, says Greene, given the choice between our rational system and our emotional system, the rational system is the one whose outputs we ought to trust. So, says Greene, the morally right thing to do in this case is to push the fat man.

Notice that this is a multi-step argument, some of whose premises are a good deal more controversial than others. The premise that our motivations are often opaque to us is essentially undisputed. There's no question that we often aren't aware of what's causing us to respond in a particular way. I may be particularly irritable because my feet are wet, and unaware that the reason I'm responding to you in a short-tempered way is not that you are particularly irritating, but that my feet are uncomfortable. This phenomenon is undeniable.

What actually explains our different responses in these two cases, by contrast, is an interesting empirical question. And over the last decade or so, some pretty interesting neuroimaging data have been collected suggesting that there are systematic activation differences between what goes on when people give utilitarian responses to cases and what goes on when people give responses that seem to involve the sorts of notions to which deontologists appeal -- notions like rights.
And there is a certain amount of additional evidence, coming from other research, that the areas differentially activated in those two kinds of cases correspond, on the one hand, to what is often thought of as a rational processing system -- a calculative processing system -- and, on the other, to areas of the brain that have been independently implicated in emotional processing.

So the first premise is uncontroversial. The second premise is reasonably well-supported: there's controversy about the data, but there is scientific evidence for which a good argument can be made that what it shows is roughly what's written here. The controversial question is whether, even if the first two premises are true, the third, normative premise is true. If our responses to Fat Man are triggered by emotion, whereas our responses to Bystander are triggered by the rational system, is it the case that we ought to go with the rational system? That is a normative claim, not an empirical one. And even if the arguments that we're going to consider in a minute successfully establish the truth of the second premise, the truth of the third premise is not thereby established.

So let's talk about the evidence that Greene has found in favor of the premise that what goes on in cases like Fat Man is an emotional response, whereas what goes on in cases like classic Bystander is a rational response. For the last decade or so, Greene has put people into fMRI machines -- scanners that track where blood is flowing in the brain -- and presented them, in the scanner, with three kinds of dilemmas.

The first kind are what he calls moral/personal dilemmas. These are dilemmas like Fat Man, where you're being asked whether you want to push the fat man off the bridge; dilemmas like the doctor case, which I presented, where we're considering whether to cut up a healthy patient to save the lives of others; dilemmas like a lifeboat case, where there's not enough food and water to go around on the lifeboat, and you're considering whether to throw off one of the people on the lifeboat so as to leave enough food and water for those remaining. So that's the first class of cases that he has subjects consider in the scanner.
The second class of cases that he has people consider in the scanner are what he calls moral/impersonal cases. These are cases like Bystander at the switch, where you're facing a moral dilemma, but not one where you are imagining, in an up-close and personal way, causing particular harm to a particular individual in your proximity. Cases like one where you've found a lost wallet and need to decide whether to return it. Cases where you're voting on a policy that will have certain kinds of effects on people, but where those effects are relatively remote from you.

And finally, he presents people with what he calls non-moral dilemmas. Questions like: if I'm trying to get from Cleveland to Chicago, should I take the bus or the train or a plane? Or: if I'm trying to decide which coupon to use on the internet to save on shipping, should I do this or that? These cases involve the same kinds of objects, right? Fat Man involves trains; bus-versus-train involves trains. We might have a coupon case where you're using the coupon to buy a boat; Lifeboat involves a boat.

So he has the subjects in the scanner, and they're presented with these sorts of cases. And you'll notice that I've put a little color-coded box here of black, grey, and white. What Greene discovered in the 2001 paper -- and let me say that some of these data have since been re-analyzed, so some of the details haven't held up, though many have -- is this. If one believes, as many do, that the brain areas listed here -- areas like the medial frontal gyrus, the angular gyrus, and the posterior cingulate gyrus -- are areas associated with emotion, then we have good evidence that in the moral/personal cases, the areas of the brain associated with emotion are activated, whereas in the moral/impersonal and non-moral cases that doesn't occur. By contrast, it looks like a group of areas traditionally associated with working memory -- the parietal lobe, the middle frontal gyrus -- are more active in the impersonal and non-moral cases than they are in the personal case.
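For reference, the design just described can be condensed into a small data structure. This is a reading-notes summary of the lecture's slide, assuming only the categories and areas mentioned here, not the full tables of Greene's paper.

```python
# Greene et al. (2001), as summarized in lecture: three stimulus
# categories, with the brain areas reported as differentially active.
greene_2001 = {
    "moral/personal": {
        "examples": ["Fat Man", "transplant doctor", "lifeboat"],
        # areas often associated with emotional processing
        "more_active": ["medial frontal gyrus", "angular gyrus",
                        "posterior cingulate gyrus"],
    },
    "moral/impersonal": {
        "examples": ["Bystander at the switch", "lost wallet",
                     "remote policy vote"],
        # areas often associated with working memory
        "more_active": ["parietal lobe", "middle frontal gyrus"],
    },
    "non-moral": {
        "examples": ["bus vs. train to Chicago", "which coupon to use"],
        "more_active": ["parietal lobe", "middle frontal gyrus"],
    },
}

for category, info in greene_2001.items():
    print(category, "->", ", ".join(info["more_active"]))
```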
And here's the famous image from Greene's 2001 paper, reproduced in many papers since, showing the brain areas that exhibit a differential response in the moral/personal cases as contrasted with the other cases. So it looks like there is some, perhaps decisive, evidence in favor of Greene's second premise: the premise that what goes on in moral/personal cases is an activation of the part of the brain associated with emotion, whereas what goes on in cases like Bystander is an activation of the parts of the brain associated with reasoning and other, more controlled processes.

Moreover, says Greene, there's a great deal of behavioral evidence supporting the hypothesis that one of the things that goes on when we respond to hypothetical moral dilemmas is that we track features of the case that are not morally relevant. So, for example, there's a study from the early 2000s by the behavioral economists Small and Loewenstein which points out that, in a very profound sense, identifiable victims produce in us more powerful emotional responses than non-identifiable victims. And this isn't just the difference between a picture of the child to whom your Oxfam donations will go versus a description of the child to whom your Oxfam donations will go. There is in fact a strikingly large difference between people's willingness to give some of their winnings in a laboratory game to "person number four" -- where they have drawn a number from a hat and it says person number four -- and their willingness in cases where they're told: decide how much money you want to give to the person whose number you're about to draw from the hat.

In neither of these instances do they know who person number four is going to be. But in the first case, they draw the number from the hat, see that it says person number four, and then decide how much of their proceeds to give to that person; whereas in the second case, they decide what amount of proceeds they want to give to the person whose number they are about to draw.
The fact that that produces consistently different responses in subjects suggests to Greene, and perhaps to some of you, that using our intuitions about these sorts of cases to build our moral theories may not be the best way to proceed -- since presumably few of you think there is a morally relevant difference between knowing the number of the person to whom you're going to be giving the gift and being about to find out the number of the person to whom you're going to be giving the gift.

Here's something else that appears to affect our moral responses to cases. This is work done by Jon Haidt, author of The Happiness Hypothesis, with various collaborators. Suppose you're deciding how much punishment to give somebody -- deciding how wrong an act was. If you have been induced to feel disgust, either by sitting at a dirty table or by having been trained, through hypnotic suggestion, to associate certain terms with disgust, you will be harsher in punishing people for their misdeeds. Now, I take it that most of you don't think people deserve harsher punishment when you are feeling disgust because the table in front of you is dirty. I take it you think that how bad somebody else's act was is independent of your feelings of disgust. But it looks like one of the things that condemnation tracks is that feeling. And in a minute, I'll talk about how that connects to Sunstein's more general discussion of heuristics.

Finally, some work by David Pizarro, a Yale PhD, suggests that in specific trolley cases, we can get people's intuitions to move around in cases like Fat Man just by varying what most people would say are morally irrelevant features of the situation. In particular, Pizarro presents subjects with two different versions of the Fat Man case. In the first, you're asked whether it is morally permitted, required, or prohibited to push a man named Tyrone Peyton off the bridge in order to save 100 members of the New York Philharmonic. And in the second, you're asked whether it's morally acceptable to push a man named Chip Ellsworth III off the bridge to save 100 members of the Harlem Jazz Orchestra. So the question is whether pushing a black man off the bridge to save 100 people of European descent, or pushing a white man off the bridge to save 100 people of African descent, should produce different responses.
And interestingly -- perhaps as the result of a certain kind of self-correction -- liberals say it is less morally acceptable to push Tyrone Peyton off the bridge than to push Chip Ellsworth III. Regardless of which direction the numbers come out, what's interesting is that they come out differently, tracking a feature which most of us would think isn't a morally relevant one.

So it looks like there's pretty good reason to think that at least some of our responses to these cases are tracking features which we wouldn't reflectively endorse -- and this strengthens Greene's second premise, an argument he makes in more detail in a paper called "The Secret Joke of Kant's Soul," from which we'll read excerpts after the break. And Greene thinks that in Fat Man in particular, our reluctance to push the fat man off the bridge is tracking one of those morally irrelevant features. Deontological judgments, says Greene -- those where we're unwilling to make the utilitarian move -- are driven by emotional responses; consequentialist judgments are driven by cognitive ones. And the deontological responses, he says, lack moral significance. In fact, deontology itself is a kind of moral confabulation.

I'm going to give Kant the last word in this lecture. So those of you who are crying out for the sage of Königsberg, know that he will get the very last word today, complete with a beautiful image of his face. But before I do that, I want to spend the final 10 substantive minutes of the lecture talking you through the third article we are considering for today, namely Cass Sunstein's.

So Sunstein, in a somewhat similar vein to Greene, though drawing on a slightly different literature, argues that a good portion of our moral reasoning operates in exactly the same way that our ordinary reasoning does: namely, by making use of heuristics, which we know about from the January 20 lecture on dual processing. Heuristics are fast and frugal tools for dealing with the complexity of the world when we're faced with time-sensitive decision-making tasks. And the way heuristics work is really smart: they work by means of something called attribute substitution.
We're interested in a target attribute -- something that's relatively hard to find out about the world -- and we focus our attention instead on a heuristic attribute -- something that's relatively easy to find out about the world.

Some of you may make use of this when you're trying to distinguish your telephone from other people's telephones. The target attribute -- the thing you're really interested in -- is: is this my phone? That's something you could determine only by turning on the phone and checking, say, whether the numbers in it are the numbers you've placed into it. But you might make your life easy by putting a cover on your phone, or a sticker, or some other surface feature that will let you find your phone quickly and reliably. Right? So you're going to make use of an easy-to-find attribute rather than a difficult-to-determine attribute.

In general, this is an extraordinarily good way to navigate the world. Target and heuristic attributes generally coincide -- that's how the heuristic attributes came to be the ones you use as markers of the target. And it takes much less effort to process surface features of the world than to work through the details of each of the things you want to make sense of.

I observed myself making use of a heuristic attribute this morning on my way into school. I was stopped at a stoplight, and I noticed out of the corner of my eye that the car next to me had started to move. Now obviously, the attribute I was interested in was whether the light had turned green. But because I couldn't quite see the light from where I was sitting, I was able to use the motion of the car next to me as an indicator of the thing I was concerned with. Of course, the heuristic could have misfired in this case. It could have been that that car was moving even though the light was still red. It could have been that that car was in the left lane and had a special light that I didn't. But for the most part, we make use of heuristics all the time, and they help us.
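Attribute substitution lends itself to a tiny sketch. In the spirit of the phone example, here's an assumed-for-illustration Python version: the target check is expensive, the heuristic check is cheap, and the two usually, but not always, agree.

```python
# Attribute substitution, sketched with made-up phone data: substitute a
# cheap heuristic attribute (my sticker) for an expensive target
# attribute (whether the stored contacts are mine).

MY_CONTACTS = {"home", "office"}

def target_is_my_phone(phone):
    # Expensive check: unlock the phone and inspect its contents.
    return phone["contacts"] == MY_CONTACTS

def heuristic_is_my_phone(phone):
    # Cheap proxy: just look for my sticker on the cover.
    return phone["has_sticker"]

phones = [
    {"contacts": {"home", "office"}, "has_sticker": True},   # mine
    {"contacts": {"gym"},            "has_sticker": False},  # not mine
    {"contacts": {"gym"},            "has_sticker": True},   # misfire
]

for p in phones:
    print(heuristic_is_my_phone(p), target_is_my_phone(p))
# Agrees on the first two phones; misfires on the third, like the car
# that starts moving while the light is still red.
```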
Now, Sunstein's argument is that in non-moral cases, people often use heuristics; that though these are useful, they may also lead to errors; and that in moral cases, people often use heuristics as well. But just as heuristics may lead to errors in the non-moral cases, so too may they lead to errors in the moral cases. And in particular, he thinks they do in a number of cases that he goes on to discuss.

And I realize we're going to have to close soon -- I said Kant would get the last word, but Kant's going to get the last word on Thursday. We're going to go through Sunstein and one of his examples, and then we'll get to Kant.

So Sunstein points out, for example, that there's a heuristic called the availability heuristic. That's a heuristic that says: I'm trying to figure out how likely something is to happen, and here's a good way to determine it -- I think about how easy it is for me to call to mind cases where it did happen. So whenever I'm worried that my children are going to be kidnapped, I ask myself: how many friends do I know whose children were kidnapped? How many people do I know whose children were kidnapped? And when I discover that the answer is none, I relax.

This kind of heuristic is often correct, but it can lead us astray. Suppose, for example, you're asked whether there are more words in the English language that end in I-N-G, or more words whose second-to-last letter is N. It's much easier to think of words that end with I-N-G, and so people tend to say that there are more words ending in I-N-G than words whose second-to-last letter is N. But of course, every word that ends with I-N-G is a word whose second-to-last letter is N. You've been bamboozled by the availability heuristic.
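That judgment is provably an error, because one set of words is contained in the other. A minimal Python check over a stand-in word list (any corpus would do) makes the subset relation explicit.

```python
# Every word ending in "ing" has "n" as its second-to-last letter, so
# the -ing words are a subset of the penultimate-n words: their count,
# and hence their probability, cannot be larger. In general, for events
# A and B with A a subset of B, P(A) <= P(B).

words = ["running", "thing", "wing", "band", "find", "mint", "once"]

ing_words     = {w for w in words if w.endswith("ing")}
penultimate_n = {w for w in words if len(w) >= 2 and w[-2] == "n"}

assert ing_words <= penultimate_n  # the subset relation holds by spelling
print(len(ing_words), len(penultimate_n))  # 3 6
```

The farmer-with-a-tractor case, coming next, has exactly the same structure: the conjunction ("farmer who owns a tractor") is a subset of the single category ("farmer").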
Or suppose you make use of what's sometimes called the representativeness heuristic: the assumption that the probability of something occurring tracks its degree of typicality. This too is often correct -- what it is to be a typical instance is to be one of the instances that occurs more frequently. But as you know from the Linda-the-bank-teller case, or the farmer-with-a-tractor case, if I ask you about a random resident of Iowa whether it's more likely that that resident is a farmer or a farmer with a tractor, the representativeness heuristic is going to draw you towards saying that it's more likely that the person is a farmer with a tractor. But obviously, every farmer with a tractor is also a farmer.

Now remember that Sunstein's arguments for the first two claims -- that people often use heuristics in non-moral cases, and that these heuristics may lead to errors -- are easy to make, because we have an independent way of determining whether somebody has made an error in those cases. We can see what went wrong with the availability heuristic and with the representativeness heuristic, because we can see that it is in fact more likely that the second-to-last letter of a word is N than that the last three letters of the word are I-N-G, and we can see that it's more likely that somebody's a farmer than that somebody's a farmer with a tractor. In both of those cases, one category is a special instance of the other.

Sunstein's argument about moral heuristics is going to take more steps. It's not enough for him to show what we will establish -- that in moral cases, people often use heuristics -- he's also going to need to show that in so doing, they're making mistakes. And the question of how we get an independent handle on what it is to make a mistake in the moral case is a rather complicated one.

But let's first think about his argument in favor of the claim that in moral cases, people often use heuristics. I'm going to close today's lecture with two examples that he gives, and then we'll begin on Thursday by running through some particular cases where I'll ask you to respond.

One of the examples he provides is, again, some work by Jonathan Haidt, on a phenomenon known as moral dumbfounding. As you know from reading Sunstein's paper, people often respond to the question "Is it morally acceptable for a brother and sister to engage in consensual, harm-free incestuous relations?" by saying that it is morally unacceptable. But when asked to provide reasons for that answer, subjects find it difficult to do so. Likewise, many people are inclined to think there's something morally problematic about wiping the floor of your bathroom with a flag, or about eating your dog if he's been hit by a car, but they find it difficult to articulate their reasons for those responses. Sunstein suggests that the reason is an overextension of heuristics.
Likewise, he points out that in moral framing cases -- and we'll start with this next lecture -- cases like the Asian disease case that I presented in our January 20 lecture, if you present a moral dilemma as involving lives saved or, by contrast, lives lost -- even when those are just complementary descriptions of the same outcome -- people are likely to have different responses. And Sunstein concludes on that basis that people make use of heuristics in moral reasoning just as they do in non-moral reasoning.

We'll begin on Thursday with Sunstein's discussion of those cases, and then we'll let Kant and Mill get the last words in the trolley debate. I'll see you then.