1
00:00:18,200 --> 00:00:26,190
PATRICK WINSTON: Welcome
to 6.034.

2
00:00:26,190 --> 00:00:28,230
I don't know if I can deal
with this microphone.

3
00:00:28,230 --> 00:00:29,405
We'll see what happens.

4
00:00:29,405 --> 00:00:30,380
It's going to be a good year.

5
00:00:30,380 --> 00:00:33,530
We've got [INAUDIBLE] a bunch
of interesting people.

6
00:00:33,530 --> 00:00:35,790
It's always interesting to see
what people named their

7
00:00:35,790 --> 00:00:39,820
children two decades ago.

8
00:00:39,820 --> 00:00:42,510
And I find they were overwhelmed
with Emilys.

9
00:00:42,510 --> 00:00:45,420
And there are not too many
Peters, Pauls, and Marys, but

10
00:00:45,420 --> 00:00:50,540
enough to call forth a suitable
song at some point.

11
00:00:50,540 --> 00:00:55,630
We have lots of Jesses
of both genders.

12
00:00:55,630 --> 00:00:57,220
We have a [INAUDIBLE]

13
00:00:57,220 --> 00:00:59,890
of both genders.

14
00:00:59,890 --> 00:01:02,810
And we have a Duncan,
where's Duncan?

15
00:01:02,810 --> 00:01:04,080
There you are, Duncan.

16
00:01:04,080 --> 00:01:06,900
You've changed your hairstyle.

17
00:01:06,900 --> 00:01:10,130
I want to assure you that the
Thane of Cawdor is not taking

18
00:01:10,130 --> 00:01:12,390
the course this semester.

19
00:01:12,390 --> 00:01:14,890
What I'm going to do is tell
you about artificial

20
00:01:14,890 --> 00:01:18,680
intelligence today, and what
this subject is about.

21
00:01:18,680 --> 00:01:24,100
There's been about a 10%
turnover in the roster

22
00:01:24,100 --> 00:01:25,320
in the last 24 hours.

23
00:01:25,320 --> 00:01:29,680
I expect another 10% turnover
in the next 24 hours, too.

24
00:01:29,680 --> 00:01:32,600
So I know many of you are
sightseers, wanting to know if

25
00:01:32,600 --> 00:01:33,720
this is something
you want to do.

26
00:01:33,720 --> 00:01:36,140
So I'm going to tell you about
what we're going to do this

27
00:01:36,140 --> 00:01:40,229
semester, and what you'll know
when you get out of here.

28
00:01:40,229 --> 00:01:43,620
I'm going to walk you through
this outline.

29
00:01:43,620 --> 00:01:45,539
I'm going to start by talking
about what artificial

30
00:01:45,539 --> 00:01:48,630
intelligence is, and
why we do it.

31
00:01:48,630 --> 00:01:50,160
And then I'll give you a little
bit of the history of

32
00:01:50,160 --> 00:01:54,000
artificial intelligence, and
conclude with some of the

33
00:01:54,000 --> 00:01:56,570
covenants by which we
run the course.

34
00:01:56,570 --> 00:01:58,140
One of which is no
laptops, please.

35
00:02:00,900 --> 00:02:05,620
I'll explain why we have these
covenants at the end.

36
00:02:05,620 --> 00:02:07,360
So what is it?

37
00:02:07,360 --> 00:02:12,220
Well, it must have something
to do with thinking.

38
00:02:12,220 --> 00:02:15,280
So let's start up here, a
definition of artificial

39
00:02:15,280 --> 00:02:21,820
intelligence, by saying that
it's about thinking,

40
00:02:21,820 --> 00:02:23,850
whatever that is.

41
00:02:23,850 --> 00:02:26,100
My definition of artificial
intelligence has

42
00:02:26,100 --> 00:02:28,560
to be rather broad.

43
00:02:28,560 --> 00:02:31,710
So we're going to say it's
not only about thinking.

44
00:02:31,710 --> 00:02:39,110
It's also about perception,
and it's about action.

45
00:02:42,610 --> 00:02:46,060
And if this were a philosophy
class, then I'd stop right

46
00:02:46,060 --> 00:02:49,120
there and just say, in this
subject we're going to talk

47
00:02:49,120 --> 00:02:52,370
about problems involving
thinking,

48
00:02:52,370 --> 00:02:54,860
perception, and action.

49
00:02:54,860 --> 00:02:57,890
But this is not a philosophy
class.

50
00:02:57,890 --> 00:02:59,050
This is a Course 6 class.

51
00:02:59,050 --> 00:03:00,350
It's an engineering
school class.

52
00:03:00,350 --> 00:03:02,000
It's an MIT class.

53
00:03:02,000 --> 00:03:03,940
So we need more than that.

54
00:03:03,940 --> 00:03:12,380
And therefore we're going to
talk about models that are

55
00:03:12,380 --> 00:03:20,329
targeted at thinking,
perception, and action.

56
00:03:20,329 --> 00:03:23,800
And this should not be strange
to you, because model making

57
00:03:23,800 --> 00:03:26,900
is what MIT is about.

58
00:03:26,900 --> 00:03:29,540
You run into someone at a bar,
or a relative asks you what you

59
00:03:29,540 --> 00:03:33,200
do at MIT, the right knee jerk
reaction is to say, we learned

60
00:03:33,200 --> 00:03:35,130
how to build models.

61
00:03:35,130 --> 00:03:36,990
That's what we do at MIT.

62
00:03:36,990 --> 00:03:40,570
We build the models using
differential equations.

63
00:03:40,570 --> 00:03:43,450
We build models using
probabilities.

64
00:03:43,450 --> 00:03:47,829
We build models using physical
and computational simulations.

65
00:03:47,829 --> 00:03:50,300
Whatever we do, we
build models.

66
00:03:50,300 --> 00:03:55,710
Even in a humanities class, the MIT
approach is to make models

67
00:03:55,710 --> 00:04:00,290
that we can use to explain the
past, predict the future,

68
00:04:00,290 --> 00:04:03,050
understand the subject,
and control the world.

69
00:04:03,050 --> 00:04:04,290
That's what MIT is about.

70
00:04:04,290 --> 00:04:06,400
And that's what this subject
is about, too.

71
00:04:06,400 --> 00:04:10,650
And now, our models are
models of thinking.

72
00:04:10,650 --> 00:04:12,060
So you might say,
if I take this

73
00:04:12,060 --> 00:04:13,840
class, will I get smarter?

74
00:04:13,840 --> 00:04:14,760
And the answer is yes.

75
00:04:14,760 --> 00:04:15,500
You will get smarter.

76
00:04:15,500 --> 00:04:17,980
Because you'll have better
models of your own thinking,

77
00:04:17,980 --> 00:04:21,120
not just the subject matter of
the subject, but better models

78
00:04:21,120 --> 00:04:23,840
of your own thinking.

79
00:04:23,840 --> 00:04:25,370
So models targeted at thinking,

80
00:04:25,370 --> 00:04:27,670
perception, and action.

81
00:04:27,670 --> 00:04:29,460
We know that's not quite enough,
because in order to

82
00:04:29,460 --> 00:04:33,790
have a model, you have to
have representation.

83
00:04:33,790 --> 00:04:37,040
So let's say that artificial
intelligence is about

84
00:04:37,040 --> 00:04:52,470
representations that support
the making of models to

85
00:04:52,470 --> 00:04:55,040
facilitate an understanding
of thinking,

86
00:04:55,040 --> 00:04:57,500
perception, and action.

87
00:04:57,500 --> 00:04:59,409
Now you might say to me, well
what's a representation?

88
00:04:59,409 --> 00:05:00,170
And what good can it do?

89
00:05:00,170 --> 00:05:02,780
So I'd like to take a brief
moment to tell you about

90
00:05:02,780 --> 00:05:03,540
gyroscopes.

91
00:05:03,540 --> 00:05:06,380
Many of you have friends in
mechanical engineering.

92
00:05:06,380 --> 00:05:10,980
One of the best ways to embarrass
them is to say here's a

93
00:05:10,980 --> 00:05:12,210
bicycle wheel.

94
00:05:12,210 --> 00:05:16,720
And if I spin it, and blow hard
on it right here, on the

95
00:05:16,720 --> 00:05:19,110
edge of the wheel, is it
going to turn over

96
00:05:19,110 --> 00:05:23,160
this way or this way?

97
00:05:23,160 --> 00:05:26,250
I guarantee that what they will
do is they'll put their

98
00:05:26,250 --> 00:05:29,400
hand in an arthritic posture
called the right hand screw

99
00:05:29,400 --> 00:05:35,870
rule, aptly named because people
who use it tend to get

100
00:05:35,870 --> 00:05:41,460
the right answer about
50% of the time.

101
00:05:41,460 --> 00:05:44,560
But we're never going to make
that mistake again.

102
00:05:44,560 --> 00:05:45,780
Because we're electrical
engineers,

103
00:05:45,780 --> 00:05:46,820
not mechanical engineers.

104
00:05:46,820 --> 00:05:49,300
And we know about
representation.

105
00:05:49,300 --> 00:05:51,290
What we're going to do is we're
going to think about it

106
00:05:51,290 --> 00:05:52,486
a little bit.

107
00:05:52,486 --> 00:05:54,780
And we're going to use some
duct tape to help us think

108
00:05:54,780 --> 00:05:57,730
about just one piece
of the wheel.

109
00:05:57,730 --> 00:06:01,270
So I want you to just think
about that piece of the wheel

110
00:06:01,270 --> 00:06:03,050
as the wheel comes flying
over the top, and I

111
00:06:03,050 --> 00:06:04,810
blow on it like that.

112
00:06:04,810 --> 00:06:08,090
What's going to happen
to that one piece?

113
00:06:08,090 --> 00:06:10,520
It's going to go off
that way, right?

114
00:06:10,520 --> 00:06:13,400
And the next piece is going
to go off that way too.

115
00:06:13,400 --> 00:06:17,610
So when it comes over, it
has to go that way.

116
00:06:17,610 --> 00:06:19,440
Let me do some ground truth
here just to be sure.

117
00:06:22,200 --> 00:06:23,450
It's a very powerful feeling.

118
00:06:26,760 --> 00:06:28,730
Try it.

119
00:06:28,730 --> 00:06:29,540
We need a demonstration.

120
00:06:29,540 --> 00:06:33,010
I don't want anybody to think that
I'm cheating here.

121
00:06:33,010 --> 00:06:39,840
So let's just twist it
one way or the other.

122
00:06:39,840 --> 00:06:42,100
So that's a powerful
pull, isn't it?

123
00:06:42,100 --> 00:06:46,820
Alex is now never going to get
the gyroscope wrong, because

124
00:06:46,820 --> 00:06:49,870
he's got the right
representation.

125
00:06:49,870 --> 00:06:52,130
So much of what you're going to
accumulate in this subject

126
00:06:52,130 --> 00:06:54,560
is a suite of representations
that will help you to build

127
00:06:54,560 --> 00:06:56,990
programs that are intelligent.

128
00:06:56,990 --> 00:07:00,330
But I want to give you a second
example, one a little

129
00:07:00,330 --> 00:07:01,300
bit more computational.

130
00:07:01,300 --> 00:07:03,990
But one of which was very
familiar to you by the time

131
00:07:03,990 --> 00:07:06,690
you went to first grade,
in most cases.

132
00:07:06,690 --> 00:07:09,610
It's the problem of the farmer,
the fox, the goose,

133
00:07:09,610 --> 00:07:11,020
and the grain.

134
00:07:11,020 --> 00:07:14,070
There's a river, a leaky rowboat
that can only carry

135
00:07:14,070 --> 00:07:17,340
the farmer, and one of
his three possessions.

136
00:07:17,340 --> 00:07:18,080
So what's the right

137
00:07:18,080 --> 00:07:21,220
representation for this problem?

138
00:07:21,220 --> 00:07:24,370
It might be a picture
of the farmer.

139
00:07:24,370 --> 00:07:29,570
It might be a poem about the
situation, perhaps a haiku.

140
00:07:29,570 --> 00:07:33,240
We know that those are not
the right representation.

141
00:07:33,240 --> 00:07:38,340
Somehow, we get the sense that
the right representation must

142
00:07:38,340 --> 00:07:42,440
involve something about the
location of the participants

143
00:07:42,440 --> 00:07:44,270
in this scenario.

144
00:07:44,270 --> 00:07:48,480
So we might draw a picture
that looks like this.

145
00:07:48,480 --> 00:07:54,150
There's the scenario, and
here in glorious green,

146
00:07:54,150 --> 00:07:58,450
representing our algae infested
rivers is the river.

147
00:07:58,450 --> 00:08:05,306
And here's the farmer, the fox,
the goose, and the grain.

148
00:08:05,306 --> 00:08:07,900
An initial situation.

149
00:08:07,900 --> 00:08:10,743
Now there are other situations
like this one, for example.

150
00:08:13,360 --> 00:08:19,200
We have the river, and
the farmer, and the

151
00:08:19,200 --> 00:08:21,590
goose is on that side.

152
00:08:21,590 --> 00:08:26,160
And the fox and the grain
are on that side.

153
00:08:26,160 --> 00:08:31,520
And we know that the farmer can
execute a movement from

154
00:08:31,520 --> 00:08:34,929
one situation to another.

155
00:08:34,929 --> 00:08:36,980
So now we're getting somewhere
with the problem.

156
00:08:36,980 --> 00:08:39,120
This is the MIT approach
to the farmer, fox,

157
00:08:39,120 --> 00:08:40,010
goose, and grain problem.

158
00:08:40,010 --> 00:08:41,669
It might have stumped you when
you were a little kid.

159
00:08:45,370 --> 00:08:46,870
How many such situations
are there?

160
00:08:50,180 --> 00:08:52,632
What do you think, Tanya?

161
00:08:52,632 --> 00:08:56,280
It looks to me like all four of the
individuals can be on one

162
00:08:56,280 --> 00:08:58,740
side or the other.

163
00:08:58,740 --> 00:09:01,480
So for every position the farmer
can be, each of the

164
00:09:01,480 --> 00:09:03,530
other things can be on either
side of the river.

165
00:09:06,060 --> 00:09:09,500
So it would be two to the fourth,
she says aggressively

166
00:09:09,500 --> 00:09:12,060
and without hesitation.

167
00:09:12,060 --> 00:09:14,020
Yes, two to the fourth,
16 possibilities.

168
00:09:14,020 --> 00:09:17,590
So we could actually draw
out the entire graph.

169
00:09:17,590 --> 00:09:19,770
It's small enough.

170
00:09:19,770 --> 00:09:24,620
There's another position over
here with the farmer, fox,

171
00:09:24,620 --> 00:09:25,832
goose, and grain.

172
00:09:25,832 --> 00:09:28,560
And in fact that's
the one we want.

173
00:09:28,560 --> 00:09:32,660
And if we draw out the entire
graph, it looks like this.

174
00:09:35,910 --> 00:09:40,250
This is a graph of the
situations and the allowed

175
00:09:40,250 --> 00:09:42,960
connections between them.

176
00:09:42,960 --> 00:09:47,280
Why are there not 16?

177
00:09:47,280 --> 00:09:48,420
Because the other--

178
00:09:48,420 --> 00:09:49,170
how many have I got?

179
00:09:49,170 --> 00:09:49,625
Four?

180
00:09:49,625 --> 00:09:51,720
10?

181
00:09:51,720 --> 00:09:55,600
The others are situations in
which somebody gets eaten.

182
00:09:55,600 --> 00:09:58,490
So we don't want to go to
any of those places.

183
00:09:58,490 --> 00:10:00,850
So having got the
representation, something

184
00:10:00,850 --> 00:10:02,750
magical has happened.

185
00:10:02,750 --> 00:10:04,030
We've got our constraints
exposed.

186
00:10:18,420 --> 00:10:19,860
And that's why we build
representations.
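
To make that concrete, here is a minimal Python sketch (not code from the course) of the representation just drawn: a situation records which bank each of the four participants is on, the exposed constraints filter out the situations where somebody gets eaten, and a simple search over the remaining graph finds a sequence of allowed crossings. The helper names are illustrative. Running it confirms the counts from the board: 10 safe situations out of 16, and a seven-crossing solution.

```python
from itertools import product
from collections import deque

# 0 and 1 are the two banks; a situation is (farmer, fox, goose, grain).

def safe(state):
    """The constraints the representation exposes: the fox eats the goose,
    and the goose eats the grain, whenever the farmer is on the other bank."""
    farmer, fox, goose, grain = state
    if fox == goose != farmer:
        return False
    if goose == grain != farmer:
        return False
    return True

def moves(state):
    """The farmer crosses alone or with one possession from his own bank."""
    farmer = state[0]
    for i in (None, 1, 2, 3):              # None means the farmer crosses alone
        if i is not None and state[i] != farmer:
            continue
        nxt = list(state)
        nxt[0] = 1 - farmer
        if i is not None:
            nxt[i] = 1 - state[i]
        nxt = tuple(nxt)
        if safe(nxt):
            yield nxt

# 2**4 = 16 situations in all; only the safe ones stay in the graph.
print(sum(safe(s) for s in product((0, 1), repeat=4)))     # -> 10

# Breadth-first search from everyone on one bank to everyone on the other.
start, goal = (0, 0, 0, 0), (1, 1, 1, 1)
parent, frontier = {start: None}, deque([start])
while frontier:
    s = frontier.popleft()
    for n in moves(s):
        if n not in parent:
            parent[n] = s
            frontier.append(n)

path, s = [], goal
while s is not None:
    path.append(s)
    s = parent[s]
print(path[::-1])        # one of the two seven-crossing solutions
```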

187
00:10:19,860 --> 00:10:23,080
That's why you learned algebra in high
school, because algebraic

188
00:10:23,080 --> 00:10:25,410
notation exposes the constraints
that make it

189
00:10:25,410 --> 00:10:29,630
possible to actually figure out
how many customers you get

190
00:10:29,630 --> 00:10:31,510
for the number of advertisements
you place in

191
00:10:31,510 --> 00:10:32,760
the newspaper.

192
00:10:34,420 --> 00:10:36,870
So artificial intelligence is
about constraints exposed by

193
00:10:36,870 --> 00:10:39,760
representations that support
models targeted to thinking--

194
00:10:39,760 --> 00:10:41,450
actually there's one
more thing, too.

195
00:10:41,450 --> 00:10:42,800
Not quite done.

196
00:10:42,800 --> 00:10:48,500
Because after all, in the end,
we have to build programs.

197
00:10:48,500 --> 00:11:04,820
So it's about algorithms enabled
by constraints exposed

198
00:11:04,820 --> 00:11:09,510
by representations that support
models targeted at thinking, perception,

199
00:11:09,510 --> 00:11:11,000
and action.

200
00:11:11,000 --> 00:11:15,520
So these algorithms, or we might
call them just as well

201
00:11:15,520 --> 00:11:18,620
procedures, or we might call
them just as well methods,

202
00:11:18,620 --> 00:11:20,030
whatever you like.

203
00:11:20,030 --> 00:11:21,980
These are the stuff of what
artificial intelligence is

204
00:11:21,980 --> 00:11:24,860
about-- methods, algorithms,
representations.

205
00:11:24,860 --> 00:11:27,470
I'd like to give you
one more example.

206
00:11:27,470 --> 00:11:29,570
It's something we call, in
artificial intelligence,

207
00:11:29,570 --> 00:11:31,770
generate and test.

208
00:11:31,770 --> 00:11:33,640
And it's such a simple idea,
you'll never hear it again in

209
00:11:33,640 --> 00:11:34,840
this subject.

210
00:11:34,840 --> 00:11:37,130
But it's an idea you need to
add to your repertoire of

211
00:11:37,130 --> 00:11:40,780
problem solving methods,
techniques, procedures, and

212
00:11:40,780 --> 00:11:42,900
algorithms.

213
00:11:42,900 --> 00:11:44,150
So here's how it works.

214
00:11:48,380 --> 00:11:52,370
Maybe I can explain it best by
starting off with an example.

215
00:11:52,370 --> 00:11:55,590
Here's a tree leaf I picked
off a tree on

216
00:11:55,590 --> 00:11:56,760
the way over to class.

217
00:11:56,760 --> 00:11:59,950
I hope it's not the last
of the species.

218
00:11:59,950 --> 00:12:01,540
What is it, what kind of tree?

219
00:12:05,680 --> 00:12:06,720
I don't know.

220
00:12:06,720 --> 00:12:09,870
I never did learn my trees,
or my colors, or my

221
00:12:09,870 --> 00:12:11,910
multiplication tables.

222
00:12:11,910 --> 00:12:17,060
So I have to go back to this
book, the Audubon Society

223
00:12:17,060 --> 00:12:18,870
Field Guide to North
American Trees.

224
00:12:18,870 --> 00:12:21,160
And how would I solve
the problem?

225
00:12:21,160 --> 00:12:21,790
It's pretty simple.

226
00:12:21,790 --> 00:12:26,490
I just turn the pages one at a
time, until I find something

227
00:12:26,490 --> 00:12:28,700
that looks like this leaf.

228
00:12:28,700 --> 00:12:33,190
And then I discover it's a
sycamore, or something.

229
00:12:33,190 --> 00:12:35,930
MIT's full of them.

230
00:12:35,930 --> 00:12:39,660
So when I do that, I do
something very intuitive, very

231
00:12:39,660 --> 00:12:41,070
natural, something you
do all the time.

232
00:12:41,070 --> 00:12:42,550
But we're going to
give it a name.

233
00:12:42,550 --> 00:12:43,850
We're going to call it
generate and test.

234
00:12:55,510 --> 00:12:59,790
The generate and test method
consists of generating some

235
00:12:59,790 --> 00:13:04,260
possible solutions, feeding
them into a box that tests

236
00:13:04,260 --> 00:13:10,760
them, and then out the other
side comes mostly failures.

237
00:13:10,760 --> 00:13:14,890
But every once in a while we
get something that succeeds

238
00:13:14,890 --> 00:13:17,240
and pleases us.

239
00:13:17,240 --> 00:13:18,960
That's what I did
with the leaf.

240
00:13:18,960 --> 00:13:21,340
But now you have
a name for it.

241
00:13:21,340 --> 00:13:25,490
Once you have a name
for something, you

242
00:13:25,490 --> 00:13:26,860
get power over it.

243
00:13:26,860 --> 00:13:28,970
You can start to
talk about it.

244
00:13:28,970 --> 00:13:32,180
So I can say, if you're doing
a generate and test approach

245
00:13:32,180 --> 00:13:37,280
to a problem, you better build
a generator with certain

246
00:13:37,280 --> 00:13:40,210
properties that make
generators good.

247
00:13:40,210 --> 00:13:42,940
For example, they should
not be redundant.

248
00:13:42,940 --> 00:13:46,210
They shouldn't give you the
same solution twice.

249
00:13:46,210 --> 00:13:49,680
They should be informable.

250
00:13:49,680 --> 00:13:52,620
They should be able to absorb
information such as, this is a

251
00:13:52,620 --> 00:13:53,650
deciduous tree.

252
00:13:53,650 --> 00:13:56,260
Don't bother looking
at the conifers.
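
A minimal Python sketch of the generate-and-test loop just described, with those two generator properties built in (the field-guide data and helper names are made up for illustration):

```python
def generate(pages, skip=None):
    """A good generator is not redundant (each candidate comes out once)
    and is informable (it can be told to skip whole groups, e.g. conifers)."""
    seen = set()
    for page in pages:
        if skip and skip(page):
            continue
        if page["name"] in seen:
            continue
        seen.add(page["name"])
        yield page

def test(candidate, leaf):
    """The test box: most candidates fail; occasionally one succeeds."""
    return candidate["shape"] == leaf["shape"]

# Hypothetical field-guide entries, just to walk the loop.
guide = [
    {"name": "white pine", "shape": "needle", "conifer": True},
    {"name": "sycamore",   "shape": "lobed"},
]
leaf = {"shape": "lobed"}

for candidate in generate(guide, skip=lambda p: p.get("conifer")):
    if test(candidate, leaf):
        print("Looks like a", candidate["name"])     # -> sycamore
        break
```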

253
00:13:56,260 --> 00:13:58,130
So once you have a name for
something, you can start

254
00:13:58,130 --> 00:13:59,370
talking about it.

255
00:13:59,370 --> 00:14:02,580
And that vocabulary
gives you power.

256
00:14:02,580 --> 00:14:06,610
So we call this the
Rumpelstiltskin Principle,

257
00:14:06,610 --> 00:14:10,530
perhaps the first of our
powerful ideas for the day.

258
00:14:10,530 --> 00:14:12,180
This subject is full
of powerful ideas.

259
00:14:12,180 --> 00:14:14,300
There will be some
in every class.

260
00:14:14,300 --> 00:14:16,480
Rumpelstiltskin Principle says
that once you can name

261
00:14:16,480 --> 00:14:18,250
something, you get
power over it.

262
00:14:18,250 --> 00:14:19,220
You know what that little
thing is on

263
00:14:19,220 --> 00:14:20,470
the end of your shoelace?

264
00:14:23,650 --> 00:14:24,280
It's interesting.

265
00:14:24,280 --> 00:14:26,480
She's gesturing like mad.

266
00:14:26,480 --> 00:14:28,920
That's something we'll talk
about later, too--

267
00:14:28,920 --> 00:14:32,070
motor stuff, and how
it helps us think.

268
00:14:32,070 --> 00:14:33,310
What is it?

269
00:14:33,310 --> 00:14:35,500
No one knows?

270
00:14:35,500 --> 00:14:36,780
It's an ag something, right?

271
00:14:36,780 --> 00:14:38,910
It's an aglet, very good.

272
00:14:38,910 --> 00:14:41,460
So once you have the name, you
can start to talk about it.

273
00:14:41,460 --> 00:14:44,320
You can say the purpose of an
aglet is pretty much like the

274
00:14:44,320 --> 00:14:45,590
whipping on the end of a rope.

275
00:14:45,590 --> 00:14:48,140
It keeps the thing
from unwinding.

276
00:14:48,140 --> 00:14:51,140
Now you have a place to
hang that knowledge.

277
00:14:51,140 --> 00:14:54,730
So we'll be talking about this
frequently from now into the

278
00:14:54,730 --> 00:14:56,680
rest of the semester,
the power of being

279
00:14:56,680 --> 00:14:58,010
able to name things.

280
00:14:58,010 --> 00:15:01,850
Symbolic labels give us
power over concepts.

281
00:15:01,850 --> 00:15:04,290
While we're here I should also
say that this is a very simple

282
00:15:04,290 --> 00:15:06,570
idea, generate and test.

283
00:15:06,570 --> 00:15:09,770
And you might be tempted to
say to someone, we learned

284
00:15:09,770 --> 00:15:11,470
about generate and test today.

285
00:15:11,470 --> 00:15:14,290
But it's a trivial idea.

286
00:15:14,290 --> 00:15:17,030
The word trivial is a word I
would like you to purge from

287
00:15:17,030 --> 00:15:22,340
your vocabulary, because it's
a very dangerous label.

288
00:15:22,340 --> 00:15:25,970
The reason it's dangerous is
because there's a difference

289
00:15:25,970 --> 00:15:27,360
between trivial and simple.

290
00:15:27,360 --> 00:15:30,310
What is it?

291
00:15:30,310 --> 00:15:33,320
What's the difference between
labeling something as trivial

292
00:15:33,320 --> 00:15:34,570
and calling it simple?

293
00:15:34,570 --> 00:15:35,820
Yes?

294
00:15:39,440 --> 00:15:40,450
Exactly so.

295
00:15:40,450 --> 00:15:44,950
He says that simple can be
powerful, and trivial makes it

296
00:15:44,950 --> 00:15:48,830
sound like it's not only simple,
but of little worth.

297
00:15:48,830 --> 00:15:53,060
So many MIT people miss
opportunities, because they

298
00:15:53,060 --> 00:15:55,270
have a tendency to think that
ideas aren't important unless

299
00:15:55,270 --> 00:15:57,660
they're complicated.

300
00:15:57,660 --> 00:15:59,965
But the most simple ideas in
artificial intelligence are

301
00:15:59,965 --> 00:16:02,120
often the most powerful.

302
00:16:02,120 --> 00:16:04,130
We could teach an artificial
intelligence course to you

303
00:16:04,130 --> 00:16:06,180
that would be so full of
mathematics it would make a

304
00:16:06,180 --> 00:16:09,150
Course 18 professor gag.

305
00:16:09,150 --> 00:16:11,900
But those ideas would be
merely gratuitously

306
00:16:11,900 --> 00:16:15,590
complicated, and gratuitously
mathematical, and gratuitously

307
00:16:15,590 --> 00:16:16,940
not simple.

308
00:16:16,940 --> 00:16:18,790
Simple ideas are often
the most powerful.

309
00:16:22,460 --> 00:16:24,570
So where are we so far?

310
00:16:24,570 --> 00:16:27,165
We talked about the
definition.

311
00:16:27,165 --> 00:16:29,660
We talked about an example
of a method.

312
00:16:29,660 --> 00:16:32,560
Showed you a representation, and
perhaps also talked about

313
00:16:32,560 --> 00:16:34,220
the first idea, too.

314
00:16:34,220 --> 00:16:36,690
If you've got the representation
right, you're

315
00:16:36,690 --> 00:16:38,440
often almost done.

316
00:16:38,440 --> 00:16:41,030
Because with this
representation, you can

317
00:16:41,030 --> 00:16:44,580
immediately see that there are
just two solutions to this

318
00:16:44,580 --> 00:16:48,930
problem, something that wouldn't
have occurred to us

319
00:16:48,930 --> 00:16:50,630
when we were little kids, and
didn't think to draw the

320
00:16:50,630 --> 00:16:51,880
[? state ?] diagram.

321
00:16:56,640 --> 00:16:57,890
There's still one more thing.

322
00:17:01,410 --> 00:17:04,740
In the past, and in other
places, artificial

323
00:17:04,740 --> 00:17:09,108
intelligence is often taught
as purely about reasoning.

324
00:17:09,108 --> 00:17:12,515
But we solve problems with
our eyes, as well as

325
00:17:12,515 --> 00:17:15,420
our symbolic apparatus.

326
00:17:15,420 --> 00:17:18,858
And you solved that problem
with your eyes.

327
00:17:18,858 --> 00:17:22,839
So I like to reinforce that by
giving you a little puzzle.

328
00:17:22,839 --> 00:17:24,630
Let's see, who's here?

329
00:17:30,370 --> 00:17:32,420
I don't see [? Kambe, ?] but
I'll bet he's from Africa.

330
00:17:32,420 --> 00:17:34,610
Is anyone from Africa?

331
00:17:34,610 --> 00:17:37,180
No one's from Africa?

332
00:17:37,180 --> 00:17:38,960
No?

333
00:17:38,960 --> 00:17:40,210
Well so much the better--

334
00:17:46,090 --> 00:17:47,760
because they would know the
answer to the puzzle.

335
00:17:47,760 --> 00:17:49,230
Here's the puzzle.

336
00:17:49,230 --> 00:17:53,270
How many countries in Africa
does the Equator cross?

337
00:17:56,690 --> 00:17:58,290
Would anybody be willing
to stake their

338
00:17:58,290 --> 00:17:59,750
life on their answer?

339
00:18:02,520 --> 00:18:03,770
Probably not.

340
00:18:06,910 --> 00:18:16,550
Well, now let me repeat
the question.

341
00:18:16,550 --> 00:18:19,475
How many countries in Africa
does the Equator cross?

342
00:18:21,995 --> 00:18:23,040
Yeah, six.

343
00:18:23,040 --> 00:18:25,970
What happened is a miracle.

344
00:18:25,970 --> 00:18:29,770
The miracle is that I have
communicated with you through

345
00:18:29,770 --> 00:18:34,660
language, and your language
system commanded your visual

346
00:18:34,660 --> 00:18:38,590
system to execute a program that
involves scanning across

347
00:18:38,590 --> 00:18:41,390
that line, counting as you go.

348
00:18:41,390 --> 00:18:43,770
And then your vision system
came back to your language

349
00:18:43,770 --> 00:18:45,950
system and said, six.

350
00:18:45,950 --> 00:18:47,575
And that is a miracle.

351
00:18:47,575 --> 00:18:50,600
And without understanding that
miracle, we'll never have a

352
00:18:50,600 --> 00:18:53,330
full understanding of the
nature of intelligence.

353
00:18:53,330 --> 00:18:55,720
But that kind of problem solving
is the kind of problem

354
00:18:55,720 --> 00:18:57,950
solving I wish we could teach
you a lot about.

355
00:18:57,950 --> 00:19:00,420
But we can't teach you about
stuff we don't understand.

356
00:19:00,420 --> 00:19:01,500
We [INAUDIBLE]

357
00:19:01,500 --> 00:19:02,750
for that.

358
00:19:05,540 --> 00:19:08,930
That's a little bit about the
definition and some examples.

359
00:19:08,930 --> 00:19:10,460
What's it for?

360
00:19:10,460 --> 00:19:12,530
We can deal with that
very quickly.

361
00:19:12,530 --> 00:19:16,440
If we're engineers, it's for
building smarter programs.

362
00:19:16,440 --> 00:19:20,690
It's about building a tool kit
of representations and methods

363
00:19:20,690 --> 00:19:22,660
that make it possible to
build smarter programs.

364
00:19:22,660 --> 00:19:25,660
And you will find, these days,
that you can't build a big

365
00:19:25,660 --> 00:19:28,520
system without having embedded
in it somewhere the ideas that

366
00:19:28,520 --> 00:19:31,230
we talk about in the subject.

367
00:19:31,230 --> 00:19:33,420
If you're a scientist, there's
a somewhat different

368
00:19:33,420 --> 00:19:33,970
motivation.

369
00:19:33,970 --> 00:19:36,420
But it amounts to studying
the same sorts of things.

370
00:19:36,420 --> 00:19:40,800
If you're a scientist, you're
interested in what it is that

371
00:19:40,800 --> 00:19:43,800
enables us to build a
computational account of

372
00:19:43,800 --> 00:19:45,110
intelligence.

373
00:19:45,110 --> 00:19:46,910
That's the part that I do.

374
00:19:46,910 --> 00:19:49,620
But most of this subject is going
to be about the other part,

375
00:19:49,620 --> 00:19:52,440
the part that makes it possible
for you to build

376
00:19:52,440 --> 00:19:53,510
smarter programs.

377
00:19:53,510 --> 00:19:56,990
And some of it will be about
what it is that makes us

378
00:19:56,990 --> 00:20:02,900
different from the chimpanzees
with whom we share an enormous

379
00:20:02,900 --> 00:20:05,130
fraction of our DNA.

380
00:20:05,130 --> 00:20:08,010
It used to be thought that we
share 95% of our DNA with

381
00:20:08,010 --> 00:20:09,840
chimpanzees.

382
00:20:09,840 --> 00:20:13,660
Then it went up to 98.

383
00:20:13,660 --> 00:20:15,470
Thank God it stopped
about there.

384
00:20:15,470 --> 00:20:16,770
Then it actually went
back a little bit.

385
00:20:16,770 --> 00:20:23,350
I think we're back down to 94.

386
00:20:23,350 --> 00:20:25,190
How about if we talk a little
bit now about the history of

387
00:20:25,190 --> 00:20:28,940
AI, so we can see how we got
to where we are today?

388
00:20:28,940 --> 00:20:31,040
This will also be a history of
AI that tells you a little bit

389
00:20:31,040 --> 00:20:32,420
about what you'll learn
in this course.

390
00:20:36,610 --> 00:20:42,730
It all started with Lady
Lovelace, the world's first

391
00:20:42,730 --> 00:20:47,750
programmer, who wrote programs
about 100 years before there

392
00:20:47,750 --> 00:20:50,500
were computers to run them.

393
00:20:50,500 --> 00:20:54,140
But it's interesting that even
in 1842, people were hassling

394
00:20:54,140 --> 00:20:57,200
her about whether computers
could get really smart.

395
00:20:57,200 --> 00:21:03,370
And she said, "The analytical
engine has no pretensions to

396
00:21:03,370 --> 00:21:05,500
originate anything.

397
00:21:05,500 --> 00:21:10,080
It can do whatever we know how
to order it to perform."

398
00:21:10,080 --> 00:21:13,600
A screwball idea that persists
to this day.

399
00:21:13,600 --> 00:21:15,730
Nevertheless, that was
the origin of it all.

400
00:21:15,730 --> 00:21:17,330
That was the beginning
of the discussions.

401
00:21:17,330 --> 00:21:22,050
And then nothing much happened
until about 1950, when Alan

402
00:21:22,050 --> 00:21:24,110
Turing wrote his famous
paper, which

403
00:21:24,110 --> 00:21:27,090
introduced the Turing test.

404
00:21:27,090 --> 00:21:29,340
Of course, Alan Turing had
previously won the Second

405
00:21:29,340 --> 00:21:34,090
World War by breaking the German
code, the Ultra Code,

406
00:21:34,090 --> 00:21:36,240
for which the British government
rewarded him by

407
00:21:36,240 --> 00:21:38,180
driving him to suicide, because
he happened to be

408
00:21:38,180 --> 00:21:41,040
homosexual.

409
00:21:41,040 --> 00:21:44,260
But Turing wrote his paper in
1950, and that was the first

410
00:21:44,260 --> 00:21:48,720
milestone after Lady Lovelace's
comment in 1842.

411
00:21:48,720 --> 00:21:53,140
And then the modern era really
began with a paper written by

412
00:21:53,140 --> 00:21:56,120
Marvin Minsky in 1960, titled
"Steps Toward Artificial

413
00:21:56,120 --> 00:21:59,530
Intelligence." And it wasn't
long after that Jim

414
00:21:59,530 --> 00:22:01,490
[? Slagle, ?]

415
00:22:01,490 --> 00:22:03,830
a nearly blind graduate student,
wrote a program that

416
00:22:03,830 --> 00:22:07,270
did symbolic integration.

417
00:22:07,270 --> 00:22:09,850
Not adding up area under a
curve, but doing symbolic

418
00:22:09,850 --> 00:22:11,870
integration just like you learn
to do in high school

419
00:22:11,870 --> 00:22:14,470
when you're a freshman.

420
00:22:14,470 --> 00:22:16,560
Now on Monday, we're going to
talk about this program.

421
00:22:16,560 --> 00:22:18,730
And you're going to understand
exactly how it works.

422
00:22:18,730 --> 00:22:21,570
And you can write
one yourself.

423
00:22:21,570 --> 00:22:23,730
And we're going to reach way
back in time to look at that

424
00:22:23,730 --> 00:22:27,400
program because, in one day
discussing it, talking about

425
00:22:27,400 --> 00:22:29,500
it, will be in itself a
miniature artificial

426
00:22:29,500 --> 00:22:30,360
intelligence course.

427
00:22:30,360 --> 00:22:35,250
Because it's so rich with
important ideas.

428
00:22:35,250 --> 00:22:39,310
So that's the dawn age,
early dawn age.

429
00:22:39,310 --> 00:22:49,810
This was the age of speculation,
and this was the

430
00:22:49,810 --> 00:22:51,060
dawn age in here.

431
00:22:53,820 --> 00:22:57,600
So in that early dawn age, the
integration program took

432
00:22:57,600 --> 00:22:58,390
the world by storm.

433
00:22:58,390 --> 00:23:00,280
Because not everybody knows
how to do integration.

434
00:23:00,280 --> 00:23:03,580
And everyone thought that if
we can do integration

435
00:23:03,580 --> 00:23:05,420
today, the rest of intelligence
will be figured

436
00:23:05,420 --> 00:23:06,882
out tomorrow.

437
00:23:06,882 --> 00:23:10,580
Too bad for our side it didn't
work out that way.

438
00:23:10,580 --> 00:23:14,200
Here's another dawn age
program, the Eliza

439
00:23:14,200 --> 00:23:15,220
[? thing ?].

440
00:23:15,220 --> 00:23:17,590
But I imagine you'd prefer
a demonstration to

441
00:23:17,590 --> 00:23:19,455
just reading it, right?

442
00:23:19,455 --> 00:23:22,870
Do you prefer a demonstration?

443
00:23:22,870 --> 00:23:24,120
Let's see if we can
demonstrate it.

444
00:24:08,250 --> 00:24:11,400
This is left over from a
hamentashen debate of a couple

445
00:24:11,400 --> 00:24:13,411
of years ago.

446
00:24:13,411 --> 00:24:15,050
How do you spell hamentashen,
anybody know?

447
00:24:19,450 --> 00:24:20,600
I sure hope that's right.

448
00:24:20,600 --> 00:24:21,390
It doesn't matter.

449
00:24:21,390 --> 00:24:22,640
Something interesting
will come.

450
00:24:25,920 --> 00:24:28,580
OK, your choice.

451
00:24:28,580 --> 00:24:31,310
Teal?

452
00:24:31,310 --> 00:24:32,560
Burton House?

453
00:24:35,890 --> 00:24:37,140
Teal.

454
00:24:47,810 --> 00:24:49,710
So that's dawn age AI.

455
00:24:49,710 --> 00:24:53,700
And no one ever took that stuff
seriously, except that

456
00:24:53,700 --> 00:24:54,900
it was a fun [INAUDIBLE]

457
00:24:54,900 --> 00:24:57,980
project level thing to work
out some matching

458
00:24:57,980 --> 00:24:59,810
programs, and so on.

459
00:24:59,810 --> 00:25:01,310
The integration program
was serious.

460
00:25:01,310 --> 00:25:03,240
This one wasn't.

461
00:25:03,240 --> 00:25:06,940
This was serious, programs that
do geometric analogy,

462
00:25:06,940 --> 00:25:09,020
problems of the kind you find
on intelligence tests.

463
00:25:09,020 --> 00:25:09,990
Do you have the answer
to this?

464
00:25:09,990 --> 00:25:11,340
A is to B as C is to what?

465
00:25:14,260 --> 00:25:15,610
That would be 2, I guess.

466
00:25:18,270 --> 00:25:19,520
What's the second best answer?

467
00:25:22,070 --> 00:25:24,750
And the theories of the program
that solve these

468
00:25:24,750 --> 00:25:28,990
problems are pretty much
identical to what you just

469
00:25:28,990 --> 00:25:30,460
figured out.

470
00:25:30,460 --> 00:25:35,970
In the first case you deleted
the inside figure.

471
00:25:35,970 --> 00:25:38,280
And the second case is, the
reason you got four is because

472
00:25:38,280 --> 00:25:44,340
you deleted the outside part
and grew the inside part.

473
00:25:44,340 --> 00:25:46,670
There's another one.

474
00:25:46,670 --> 00:25:49,510
I think this was the hardest one
it got, or the easiest one

475
00:25:49,510 --> 00:25:50,110
it didn't get.

476
00:25:50,110 --> 00:25:51,300
I've forgotten.

477
00:25:51,300 --> 00:25:54,170
A is to B as C is to 3.

478
00:25:57,490 --> 00:26:00,180
In the late dawn age, we began
to turn our attention from

479
00:26:00,180 --> 00:26:03,010
purely symbolic reasoning to
thinking a little bit about

480
00:26:03,010 --> 00:26:04,600
perceptual apparatus.

481
00:26:04,600 --> 00:26:07,410
And programs were written that
could figure out the nature of

482
00:26:07,410 --> 00:26:11,800
shapes and forms,
such as that.

483
00:26:11,800 --> 00:26:14,200
And it's interesting that those
programs had the same

484
00:26:14,200 --> 00:26:19,920
kind of difficulty with
this that you do.

485
00:26:19,920 --> 00:26:22,410
Because now, having deleted
all the edges, everything

486
00:26:22,410 --> 00:26:23,840
becomes ambiguous.

487
00:26:23,840 --> 00:26:26,790
And it may be a series of
platforms, or it may be a

488
00:26:26,790 --> 00:26:27,615
series of--

489
00:26:27,615 --> 00:26:30,520
can you see the saw blade
sticking up if you go through

490
00:26:30,520 --> 00:26:31,770
the reversal?

491
00:26:33,790 --> 00:26:35,750
Programs were written that
could learn from a small

492
00:26:35,750 --> 00:26:37,600
number of examples.

493
00:26:37,600 --> 00:26:41,490
Many people think of computer
learning as involving beating

494
00:26:41,490 --> 00:26:46,030
some neural net into submission
with thousands of trials.

495
00:26:46,030 --> 00:26:48,280
Programs were written in the
early dawn age that learned

496
00:26:48,280 --> 00:26:51,980
that an arch is something that
has to have the flat part on

497
00:26:51,980 --> 00:26:55,180
top, and the two sides can't
touch, and the top may or may

498
00:26:55,180 --> 00:26:56,430
not be a wedge.

499
00:26:59,630 --> 00:27:01,880
In the late dawn age, though,
the most important thing,

500
00:27:01,880 --> 00:27:06,302
perhaps, was what you look at
with me on Wednesday next.

501
00:27:06,302 --> 00:27:09,120
It's rule-based
expert systems.

502
00:27:09,120 --> 00:27:14,740
And a program was written at
Stanford that did diagnosis of

503
00:27:14,740 --> 00:27:16,520
bacterial infections
of the blood.

504
00:27:16,520 --> 00:27:21,040
It turned out to do it better
than most doctors, most

505
00:27:21,040 --> 00:27:22,860
general practitioners.

506
00:27:22,860 --> 00:27:26,990
It was never used,
curiously enough.

507
00:27:26,990 --> 00:27:31,360
Because nobody cares what your
problem actually is.

508
00:27:31,360 --> 00:27:33,420
They just give you a broad
spectrum antibiotic that'll

509
00:27:33,420 --> 00:27:34,670
kill everything.

510
00:27:36,700 --> 00:27:39,700
But this late dawn age system,
the so-called [INAUDIBLE]

511
00:27:39,700 --> 00:27:46,440
system, was the system that
launched a thousand companies,

512
00:27:46,440 --> 00:27:48,960
because people started building
expert systems built

513
00:27:48,960 --> 00:27:50,040
on that technology.

514
00:27:50,040 --> 00:27:52,130
Here's one that you don't know
you used, or that was

515
00:27:52,130 --> 00:27:53,430
used on your behalf.

516
00:27:53,430 --> 00:27:56,570
If you go through, for example,
the Atlanta airport,

517
00:27:56,570 --> 00:27:59,380
your airplane is parked by a
rule-based expert system that

518
00:27:59,380 --> 00:28:03,430
knows how to park aircraft
effectively.

519
00:28:03,430 --> 00:28:09,841
It saves Delta Airlines about
$0.5 million a day of jet

520
00:28:09,841 --> 00:28:13,780
fuel by being smarter
about how to park them.

521
00:28:13,780 --> 00:28:15,760
So that's an example of an
expert system that does a

522
00:28:15,760 --> 00:28:17,160
little bit of good for
a lot of people.

523
00:28:19,680 --> 00:28:20,600
There's Deep Blue.

524
00:28:20,600 --> 00:28:27,930
That takes us to the next stage
beyond the age of expert

525
00:28:27,930 --> 00:28:30,410
systems, and the business age.

526
00:28:30,410 --> 00:28:33,250
It takes us into this age
here, which I call the

527
00:28:33,250 --> 00:28:43,260
bulldozer age, because this is
the time when people began to

528
00:28:43,260 --> 00:28:46,390
see that we had at our disposal

529
00:28:46,390 --> 00:28:48,700
unlimited amounts of computing.

530
00:28:48,700 --> 00:28:50,880
And frequently you can
substitute computing for

531
00:28:50,880 --> 00:28:52,930
intelligence.

532
00:28:52,930 --> 00:28:56,420
So no one would say that Deep
Blue does anything like what a

533
00:28:56,420 --> 00:28:59,230
human chess master does.

534
00:28:59,230 --> 00:29:04,180
But nevertheless, Deep Blue,
by processing data like a

535
00:29:04,180 --> 00:29:06,700
bulldozer processes gravel,
was able to

536
00:29:06,700 --> 00:29:07,950
beat the world champion.

537
00:29:11,910 --> 00:29:13,100
So what's the right way?

538
00:29:13,100 --> 00:29:15,390
That's the age we're
in right now.

539
00:29:15,390 --> 00:29:18,420
I will of course be introducing
programs from those ages as we

540
00:29:18,420 --> 00:29:19,940
go through the subject.

541
00:29:19,940 --> 00:29:22,950
There is a question of what
age we're in right now.

542
00:29:22,950 --> 00:29:24,840
And it's always dangerous
to name an age when

543
00:29:24,840 --> 00:29:26,640
you're in it, I guess.

544
00:29:26,640 --> 00:29:30,660
I like to call it the age
of the right way.

545
00:29:30,660 --> 00:29:33,270
And this is an age when we begin
to realize that that

546
00:29:33,270 --> 00:29:36,750
definition up there is actually
a little incomplete,

547
00:29:36,750 --> 00:29:41,630
because much of our intelligence
has to do not

548
00:29:41,630 --> 00:29:47,920
with thinking, perception, and
action acting separately, but

549
00:29:47,920 --> 00:29:55,510
with loops that tie all
those together.

550
00:29:55,510 --> 00:29:57,490
We had one example
with Africa.

551
00:29:57,490 --> 00:30:02,270
Here's another example drawn
from a program that has been

552
00:30:02,270 --> 00:30:04,960
under development, and continues
to be, in my

553
00:30:04,960 --> 00:30:06,210
laboratory.

554
00:30:11,490 --> 00:30:13,500
We're going to ask the system
to imagine something.

555
00:30:18,670 --> 00:30:19,610
SYSTEM: OK.

556
00:30:19,610 --> 00:30:22,406
I will imagine that a ball
falls into a bowl.

557
00:30:33,098 --> 00:30:33,584
OK.

558
00:30:33,584 --> 00:30:38,930
I will imagine that a man
runs into a woman.

559
00:30:38,930 --> 00:30:40,240
PATRICK WINSTON: You see, it
does the best that it can if

560
00:30:40,240 --> 00:30:43,440
it doesn't have a good memory
of what these situations

561
00:30:43,440 --> 00:30:45,005
actually involve.

562
00:30:45,005 --> 00:30:48,130
But having imagined the
scene it can then--

563
00:30:48,130 --> 00:30:48,620
SYSTEM: Yes.

564
00:30:48,620 --> 00:30:52,110
I have learned from experience
that contact between a man and

565
00:30:52,110 --> 00:30:56,240
a woman appeared because a
man runs into a woman.

566
00:30:56,240 --> 00:30:58,110
PATRICK WINSTON: Having imagined
the scene, it can

567
00:30:58,110 --> 00:31:01,740
then read the answers using its
visual apparatus on the

568
00:31:01,740 --> 00:31:03,580
scene that it imagined.

569
00:31:03,580 --> 00:31:06,530
So just like what you did with
Africa, only now it's working

570
00:31:06,530 --> 00:31:08,790
with its own visual memory,
using visual programs.

571
00:31:12,026 --> 00:31:12,507
SYSTEM: OK.

572
00:31:12,507 --> 00:31:17,330
I will imagine that a man
gives a ball to a man.

573
00:31:17,330 --> 00:31:21,050
PATRICK WINSTON: I know this
looks like slugs, but they're

574
00:31:21,050 --> 00:31:22,300
actually distinguished
professors.

575
00:31:26,500 --> 00:31:27,750
It always does the
best it can.

576
00:31:30,465 --> 00:31:31,451
SYSTEM: OK.

577
00:31:31,451 --> 00:31:33,600
I will imagine that
a man flies.

578
00:31:39,560 --> 00:31:41,480
PATRICK WINSTON: It's the
best that it can do.

579
00:31:47,870 --> 00:31:51,920
So that concludes our discussion
of the history.

580
00:31:51,920 --> 00:31:54,240
And I've provided you with a
little bit of a glimpse of

581
00:31:54,240 --> 00:31:57,880
what we're going to look at
as the semester unfolds.

582
00:31:57,880 --> 00:32:00,410
Yes, Chris?

583
00:32:00,410 --> 00:32:03,880
CHRIS: Is it actually a
demonstration of something?

584
00:32:03,880 --> 00:32:06,200
Does it have a large
database of videos?

585
00:32:06,200 --> 00:32:08,676
PATRICK WINSTON: No, it has
a small database of videos.

586
00:32:08,676 --> 00:32:13,480
CHRIS: But it's intelligently
picking among them based on--

587
00:32:13,480 --> 00:32:16,140
PATRICK WINSTON: Based
on their content.

588
00:32:16,140 --> 00:32:20,570
So if you say imagine that a
student gave a ball to another

589
00:32:20,570 --> 00:32:23,170
student, it imagines that.

590
00:32:23,170 --> 00:32:26,430
You say, now does the other
student have the ball?

591
00:32:26,430 --> 00:32:27,590
Does the other student
take the ball?

592
00:32:27,590 --> 00:32:30,030
It can answer those questions
because it can review the same

593
00:32:30,030 --> 00:32:33,250
video and see the take as well
as the give in the same video.

594
00:32:35,760 --> 00:32:42,840
So now we have to think about
why we ought to be optimistic

595
00:32:42,840 --> 00:32:44,460
about the future.

596
00:32:44,460 --> 00:32:47,210
Because we've had a long history
here, and we haven't

597
00:32:47,210 --> 00:32:48,830
solved the problem.

598
00:32:48,830 --> 00:32:51,350
But one reason why we can feel
optimistic about the future is

599
00:32:51,350 --> 00:32:54,580
because all of our friends
have been on the march.

600
00:32:54,580 --> 00:32:57,950
And our friends include the
cognitive psychologists, the

601
00:32:57,950 --> 00:32:58,930
[? developmental ?]

602
00:32:58,930 --> 00:33:02,200
psychologists, the linguists,
sometimes the philosophers,

603
00:33:02,200 --> 00:33:03,450
and especially the
paleoanthropologists.

604
00:33:06,040 --> 00:33:09,430
Because it is becoming
increasingly clear why we're

605
00:33:09,430 --> 00:33:12,780
actually different from the
chimpanzees, and how we got to

606
00:33:12,780 --> 00:33:14,690
be that way.

607
00:33:14,690 --> 00:33:19,910
The high school idea is that
we evolved through slow,

608
00:33:19,910 --> 00:33:23,540
gradual, and continuous
improvement.

609
00:33:23,540 --> 00:33:25,730
But that doesn't seem to
be the way it happened.

610
00:33:25,730 --> 00:33:30,210
There are some characteristics
of our species that are

611
00:33:30,210 --> 00:33:34,010
informative when it comes to
guiding the activities of

612
00:33:34,010 --> 00:33:36,120
people like me.

613
00:33:36,120 --> 00:33:37,950
And here's what the
story seems to be

614
00:33:37,950 --> 00:33:40,600
from the fossil record.

615
00:33:40,600 --> 00:33:45,330
First of all, we humans have
been around for maybe 200,000

616
00:33:45,330 --> 00:33:49,210
years in our present
anatomical form.

617
00:33:49,210 --> 00:33:53,140
If someone walked through the
door right now from 200,000

618
00:33:53,140 --> 00:34:00,990
years ago, I imagine they
would be dirty,

619
00:34:00,990 --> 00:34:03,750
but other than that--

620
00:34:03,750 --> 00:34:05,100
probably naked, too--

621
00:34:05,100 --> 00:34:08,070
other than that, you wouldn't
be able to tell the

622
00:34:08,070 --> 00:34:09,320
difference, especially at MIT.

623
00:34:15,840 --> 00:34:21,900
And so the ensuing 150,000 years
was a period in which we

624
00:34:21,900 --> 00:34:25,159
humans didn't actually
amount to much.

625
00:34:25,159 --> 00:34:30,070
But somehow, shortly before
50,000 years ago, some small

626
00:34:30,070 --> 00:34:34,500
group of us developed a
capability that separated us

627
00:34:34,500 --> 00:34:37,520
from all other species.

628
00:34:37,520 --> 00:34:40,409
It was an accident
of evolution.

629
00:34:40,409 --> 00:34:43,310
And these accidents may or may
not happen, but it happened to

630
00:34:43,310 --> 00:34:45,199
produce us.

631
00:34:45,199 --> 00:34:47,920
It's also the case that we
probably necked down as a

632
00:34:47,920 --> 00:34:50,600
species to a few thousand, or
maybe even a few hundred

633
00:34:50,600 --> 00:34:55,219
individuals, something which
made these accidental changes,

634
00:34:55,219 --> 00:34:57,210
accidental evolutionary
products,

635
00:34:57,210 --> 00:35:00,500
more capable of sticking.

636
00:35:00,500 --> 00:35:02,820
This leads us to speculate on
what it was that happened

637
00:35:02,820 --> 00:35:04,750
50,000 years ago.

638
00:35:04,750 --> 00:35:10,520
And paleoanthropologists, Noam
Chomsky, a lot of people

639
00:35:10,520 --> 00:35:12,580
reached similar conclusions.

640
00:35:12,580 --> 00:35:15,050
And that conclusion is--

641
00:35:15,050 --> 00:35:16,700
I'll quote Chomsky.

642
00:35:16,700 --> 00:35:18,990
He's the voice of authority.

643
00:35:18,990 --> 00:35:22,020
"It seems that shortly before
50,000 years ago, some small

644
00:35:22,020 --> 00:35:26,040
group of us acquired the ability
to take two concepts,

645
00:35:26,040 --> 00:35:29,020
and combine them to make a
third concept, without

646
00:35:29,020 --> 00:35:33,540
disturbing the original two
concepts, without limit." And

647
00:35:33,540 --> 00:35:37,110
from a perspective of an AI
person like me, what Chomsky

648
00:35:37,110 --> 00:35:39,490
seems to be saying is, we
learned how to begin to

649
00:35:39,490 --> 00:35:42,270
describe things, in a way
that was intimately

650
00:35:42,270 --> 00:35:43,790
connected with language.

651
00:35:43,790 --> 00:35:46,360
And that, in the end, is what
separates us from the

652
00:35:46,360 --> 00:35:49,150
chimpanzees.

653
00:35:49,150 --> 00:35:50,790
So you might say, well let's
just study language.

654
00:35:50,790 --> 00:35:53,580
No, you can't do that, because
we think with our eyes.

655
00:35:53,580 --> 00:35:55,510
So language does two things.

656
00:35:55,510 --> 00:35:57,790
Number one, it enables us
to make descriptions.

657
00:35:57,790 --> 00:36:00,040
Descriptions enable us
to tell stories.

658
00:36:00,040 --> 00:36:02,400
And storytelling and story
understanding is what all of

659
00:36:02,400 --> 00:36:03,730
education is about.

660
00:36:03,730 --> 00:36:04,810
That's going up.

661
00:36:04,810 --> 00:36:09,370
And going down enables us to
marshal the resources of our

662
00:36:09,370 --> 00:36:12,950
perceptual systems, and even
command our perceptual systems

663
00:36:12,950 --> 00:36:17,450
to imagine things we've
never seen.

664
00:36:17,450 --> 00:36:20,120
So here's an example.

665
00:36:20,120 --> 00:36:22,420
Imagine running down
the street with a

666
00:36:22,420 --> 00:36:24,640
full bucket of water.

667
00:36:24,640 --> 00:36:25,890
What happens?

668
00:36:27,810 --> 00:36:29,100
Your leg gets wet.

669
00:36:29,100 --> 00:36:30,780
The water sloshes out.

670
00:36:30,780 --> 00:36:34,600
You'll never find that fact
anywhere on the web.

671
00:36:34,600 --> 00:36:37,470
You've probably never been told
that that's what happens

672
00:36:37,470 --> 00:36:39,910
when you run down the street
with a full bucket of water.

673
00:36:39,910 --> 00:36:43,400
But you easily imagine this
scenario, and you know what's

674
00:36:43,400 --> 00:36:43,990
going to happen.

675
00:36:43,990 --> 00:36:47,150
There was internal imagination
simulation.

676
00:36:47,150 --> 00:36:49,030
We're never going to understand
human intelligence

677
00:36:49,030 --> 00:36:51,670
until we can understand that.

678
00:36:51,670 --> 00:36:52,540
Here's another example.

679
00:36:52,540 --> 00:36:53,670
Imagine running down
the street with a

680
00:36:53,670 --> 00:36:55,620
full bucket of nickels.

681
00:36:55,620 --> 00:36:56,870
What happens?

682
00:36:59,460 --> 00:37:00,415
Nickels weigh a lot.

683
00:37:00,415 --> 00:37:01,395
You're going to be bent over.

684
00:37:01,395 --> 00:37:03,850
You're going to stagger.

685
00:37:03,850 --> 00:37:05,690
But nobody ever told you that.

686
00:37:05,690 --> 00:37:08,080
You won't find it anywhere
on the web.

687
00:37:08,080 --> 00:37:10,720
So language is at the center of
things because it enables

688
00:37:10,720 --> 00:37:14,380
storytelling going up, and
marshalling the resources of

689
00:37:14,380 --> 00:37:18,240
the perceptual apparatus,
going down.

690
00:37:18,240 --> 00:37:20,950
And that's where we're going
to finish the subject this

691
00:37:20,950 --> 00:37:23,320
semester, by trying to
understand more about that

692
00:37:23,320 --> 00:37:25,950
phenomenon.

693
00:37:25,950 --> 00:37:28,150
So that concludes everything
I wanted to say about the

694
00:37:28,150 --> 00:37:30,520
material and the subject.

695
00:37:30,520 --> 00:37:32,290
Now I want to turn my attention
a little bit to how

696
00:37:32,290 --> 00:37:34,090
we are going to operate
the subject.

697
00:37:34,090 --> 00:37:36,915
Because there are many
characteristics of the subject

698
00:37:36,915 --> 00:37:38,715
that are confusing.

699
00:37:43,610 --> 00:37:46,840
First of all, we have
four kinds of

700
00:37:46,840 --> 00:37:48,180
activities in the course.

701
00:38:17,930 --> 00:38:22,870
And each of these has
a different purpose.

702
00:38:39,580 --> 00:38:40,850
So I do the lectures.

703
00:38:40,850 --> 00:38:43,910
And the lectures are supposed
to be an hour about

704
00:38:43,910 --> 00:38:46,740
introducing the material
and the big picture.

705
00:38:46,740 --> 00:38:48,880
They're about powerful ideas.

706
00:38:52,138 --> 00:38:56,060
They're about the experience
side of the course.

707
00:38:56,060 --> 00:38:59,130
Let me step aside and
make a remark.

708
00:38:59,130 --> 00:39:00,390
MIT is about two things.

709
00:39:00,390 --> 00:39:06,200
It's about skill building,
and it's about big ideas.

710
00:39:06,200 --> 00:39:10,120
So you can build a skill at
home, or at Dartmouth, or at

711
00:39:10,120 --> 00:39:13,830
Harvard, or Princeton, or all
those kinds of places.

712
00:39:13,830 --> 00:39:16,280
But the experience you
can only get at MIT.

713
00:39:16,280 --> 00:39:18,000
I know everybody there is
to know in artificial

714
00:39:18,000 --> 00:39:18,940
intelligence.

715
00:39:18,940 --> 00:39:20,190
I can tell you about
how they think.

716
00:39:20,190 --> 00:39:21,250
I can tell you about
how I think.

717
00:39:21,250 --> 00:39:22,620
And that's something
you're not going to

718
00:39:22,620 --> 00:39:25,120
get any other place.

719
00:39:25,120 --> 00:39:29,640
So that's my role, as I see it,
in giving these lectures.

720
00:39:29,640 --> 00:39:34,160
Recitations are for buttressing
and expanding on

721
00:39:34,160 --> 00:39:37,430
the material, and providing a
venue that's small enough for

722
00:39:37,430 --> 00:39:38,680
discussion.

723
00:39:40,540 --> 00:39:42,200
Mega recitations are
[? a usual ?]

724
00:39:42,200 --> 00:39:43,380
components of the course.

725
00:39:43,380 --> 00:39:46,230
They're taught at the same
hour on Fridays.

726
00:39:46,230 --> 00:39:48,100
Mark Seifter, my graduate
student,

727
00:39:48,100 --> 00:39:49,510
will be teaching those.

728
00:39:49,510 --> 00:39:53,140
And those are wrapped around
past quiz problems.

729
00:39:53,140 --> 00:39:55,700
And Mark will show you
how to work them.

730
00:39:55,700 --> 00:40:00,050
It's a very important component
of the subject.

731
00:40:00,050 --> 00:40:02,400
And finally the tutorials
are about helping

732
00:40:02,400 --> 00:40:04,890
you with the homework.

733
00:40:04,890 --> 00:40:09,160
So you might say to me,
well, do I really

734
00:40:09,160 --> 00:40:11,590
need to go to class?

735
00:40:11,590 --> 00:40:15,170
I like to say that the answer
is, only if you like to pass

736
00:40:15,170 --> 00:40:17,870
the subject.

737
00:40:17,870 --> 00:40:19,920
But you are MIT students.

738
00:40:19,920 --> 00:40:22,190
And MIT people always like
to look at the data.

739
00:40:26,150 --> 00:40:29,640
So this is a scattergram we
made after the subject was

740
00:40:29,640 --> 00:40:33,010
taught last fall, which shows
the relationship between

741
00:40:33,010 --> 00:40:38,680
attendance at lectures and the
grades awarded in the course.

742
00:40:38,680 --> 00:40:40,820
And if you're not sure what
that all means, here's the

743
00:40:40,820 --> 00:40:42,070
regression line.

744
00:40:46,160 --> 00:40:50,400
So that information is a
little suspect for two

745
00:40:50,400 --> 00:40:55,520
reasons, one of which is we
asked people to self report on

746
00:40:55,520 --> 00:40:58,330
how many lectures they thought
they attended.

747
00:40:58,330 --> 00:41:02,340
And our mechanism for assigning
these numerical

748
00:41:02,340 --> 00:41:03,590
grades is a little weird.

749
00:41:06,560 --> 00:41:08,470
And there's a third thing, too,
and that is, one must

750
00:41:08,470 --> 00:41:12,860
never confuse correlation
with cause.

751
00:41:12,860 --> 00:41:15,150
You can think of other
explanations for why that

752
00:41:15,150 --> 00:41:21,230
trend line goes up, different
from whether it has something

753
00:41:21,230 --> 00:41:25,840
to do with lectures producing
good grades.

754
00:41:25,840 --> 00:41:28,070
You might ask how I feel about
the people up there in the

755
00:41:28,070 --> 00:41:30,580
upper left-hand corner.

756
00:41:30,580 --> 00:41:32,970
There are one or two people who
were near the top of the

757
00:41:32,970 --> 00:41:37,190
subject who didn't go
to class at all.

758
00:41:37,190 --> 00:41:40,890
And I have mixed feelings
about that.

759
00:41:40,890 --> 00:41:41,550
You're adults.

760
00:41:41,550 --> 00:41:43,460
It's your call.

761
00:41:43,460 --> 00:41:46,490
On the other hand, I wish that
if that's what you do

762
00:41:46,490 --> 00:41:49,780
habitually in all the subjects
you take at MIT, that you

763
00:41:49,780 --> 00:41:52,450
would resign and go somewhere
else, and let somebody else

764
00:41:52,450 --> 00:41:53,780
take your slot.

765
00:41:53,780 --> 00:41:57,050
Because you're not benefiting
from the powerful ideas, and

766
00:41:57,050 --> 00:41:58,930
the other kinds of things
that involve

767
00:41:58,930 --> 00:42:01,150
interaction with faculty.

768
00:42:01,150 --> 00:42:02,350
So it can be done.

769
00:42:02,350 --> 00:42:03,310
But I don't recommend it.

770
00:42:03,310 --> 00:42:06,970
By the way, all of the four
activities that we have here

771
00:42:06,970 --> 00:42:08,440
show similar regression lines.

772
00:42:11,280 --> 00:42:12,930
But what about that
five point scale?

773
00:42:12,930 --> 00:42:14,940
Let me explain how that
works to you.

774
00:42:14,940 --> 00:42:18,060
We love to have people
ask us what the class

775
00:42:18,060 --> 00:42:20,430
average is on a quiz.

776
00:42:20,430 --> 00:42:23,480
Because that's when we get
to use our blank stare.

777
00:42:23,480 --> 00:42:26,180
Because we have no idea
what the class average

778
00:42:26,180 --> 00:42:28,740
ever is on any quiz.

779
00:42:28,740 --> 00:42:29,990
Here's what we do.

780
00:42:46,110 --> 00:42:48,660
Like everybody else, we
start off with a score

781
00:42:48,660 --> 00:42:50,680
from zero to 100.

782
00:42:50,680 --> 00:42:54,750
But then we say to ourselves,
what score would you get if

783
00:42:54,750 --> 00:42:57,260
you had a thorough understanding
of the material?

784
00:42:57,260 --> 00:43:00,400
And we say, well, for this
particular exam, it's this

785
00:43:00,400 --> 00:43:02,480
number right here.

786
00:43:02,480 --> 00:43:04,630
And what score would you
get if you had a good

787
00:43:04,630 --> 00:43:05,500
understanding of the material?

788
00:43:05,500 --> 00:43:07,340
That's that score.

789
00:43:07,340 --> 00:43:10,040
And what happens if you're
down here is that you're

790
00:43:10,040 --> 00:43:12,670
falling off the edge of the
range in which we think you

791
00:43:12,670 --> 00:43:14,940
need to do more work.

792
00:43:14,940 --> 00:43:18,830
So what we do is, we say that if
you're in this range here--

793
00:43:18,830 --> 00:43:21,520
following MIT convention
with GPAs and stuff,

794
00:43:21,520 --> 00:43:23,440
that gets you a five.

795
00:43:23,440 --> 00:43:25,300
If you're in this range
down here, there's a

796
00:43:25,300 --> 00:43:28,170
sharp drop off to four.

797
00:43:28,170 --> 00:43:29,710
If you're in this range
down here, there's a

798
00:43:29,710 --> 00:43:33,070
sharp fall off to three.

799
00:43:33,070 --> 00:43:34,950
So that means if you're in the
middle of one of those

800
00:43:34,950 --> 00:43:37,790
plateaus there's no point
in arguing with this.

801
00:43:37,790 --> 00:43:40,550
Because it's not going
to do you any good.

802
00:43:40,550 --> 00:43:43,020
We have these boundaries where
we think performance break

803
00:43:43,020 --> 00:43:45,430
points are.
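
To make that mapping concrete -- and the cutoffs below are made-up placeholders, since we set the real break points exam by exam after reading the papers -- the thresholding amounts to something like this in Python:

def five_point_grade(raw_score, thorough=85, good=70, more_work=50):
    """Map a 0-100 raw score onto the 5-point scale.

    The cutoffs (85, 70, 50) are hypothetical; the staff picks the
    actual break points for each exam.
    """
    if raw_score >= thorough:   # thorough understanding -> 5
        return 5
    if raw_score >= good:       # good understanding -> 4
        return 4
    if raw_score >= more_work:  # needs more work -> 3
        return 3
    return 2                    # below the lowest break point (the lecture
                                # only discusses 5, 4, and 3)

print(five_point_grade(92))  # -> 5
print(five_point_grade(72))  # -> 4 (middle of a plateau; arguing won't move it)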

804
00:43:45,430 --> 00:43:47,870
So you say, well that seems
a little harsh.

805
00:43:47,870 --> 00:43:49,850
Blah, blah, blah, blah, blah,
and start arguing.

806
00:43:49,850 --> 00:43:53,560
But then we will come back with
a second major innovation

807
00:43:53,560 --> 00:43:54,860
we have in the course.

808
00:43:54,860 --> 00:44:00,300
That is, your grade is
calculated in several parts.

809
00:44:00,300 --> 00:44:06,970
Part one is the max of
your grade on Q1, and

810
00:44:06,970 --> 00:44:09,830
part one of the final.

811
00:44:09,830 --> 00:44:12,780
So in other words, you get
two shots at everything.

812
00:44:12,780 --> 00:44:18,080
So if you have a complete, glorious,
undeniable, horrible F

813
00:44:18,080 --> 00:44:21,540
on the first quiz, it gets
erased on the final if you do

814
00:44:21,540 --> 00:44:23,910
well on that part
of the final.

815
00:44:23,910 --> 00:44:25,590
So each quiz has
a corresponding

816
00:44:25,590 --> 00:44:27,080
mirror on the final.

817
00:44:27,080 --> 00:44:32,680
You get the max of the score you
got on those two pieces.

818
00:44:32,680 --> 00:44:36,540
And now you say to me,
I'm an MIT student.

819
00:44:36,540 --> 00:44:38,066
I have a lot of guts.

820
00:44:38,066 --> 00:44:39,740
I'm only going to
take the final.

821
00:44:44,020 --> 00:44:46,060
It has been done.

822
00:44:46,060 --> 00:44:49,400
We don't recommend it.

823
00:44:49,400 --> 00:44:51,970
And the reason we don't
recommend it is that we don't

824
00:44:51,970 --> 00:44:53,730
expect everybody to do
all of the final.

825
00:44:53,730 --> 00:44:55,740
So there would be a lot of time
pressure if you had to do

826
00:44:55,740 --> 00:45:00,200
all of the final, all five
parts of the final.

827
00:45:00,200 --> 00:45:02,890
So we have four quizzes.

828
00:45:02,890 --> 00:45:05,590
And the final has a fifth part
because there's some material

829
00:45:05,590 --> 00:45:08,710
that we teach you after the last
date on which we can give

830
00:45:08,710 --> 00:45:11,010
you a quiz by Institute
rules.

831
00:45:11,010 --> 00:45:12,050
But that's roughly
how it works.

832
00:45:12,050 --> 00:45:16,690
And you can read about more of
the details in the FAQ on the

833
00:45:16,690 --> 00:45:17,940
subject homepage.
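
Just to make the bookkeeping of that max rule concrete -- these scores are made up, and I'm assuming every part sits on the same five-point scale -- it amounts to something like this in Python:

def part_score(quiz_score, final_part_score):
    """Two shots at each topic: take the max of the quiz and the
    matching part of the final."""
    return max(quiz_score, final_part_score)

# Hypothetical numbers: four quizzes, each mirrored by a part of the final,
# plus a fifth final part covering end-of-term material with no quiz.
quizzes     = [5, 2, 4, 3]       # scores on Q1..Q4
final_parts = [4, 5, 4, 3, 5]    # parts 1..5 of the final

parts = [part_score(q, f) for q, f in zip(quizzes, final_parts)]
parts.append(final_parts[4])     # part 5 comes only from the final
print(parts)                     # -> [5, 5, 4, 3, 5]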

834
00:45:19,910 --> 00:45:21,050
So now we're almost done.

835
00:45:21,050 --> 00:45:24,190
I just want to talk a little bit
about how we're going to

836
00:45:24,190 --> 00:45:27,840
communicate with you in the
next few days, while we're

837
00:45:27,840 --> 00:45:29,710
getting ourselves organized.

838
00:45:29,710 --> 00:45:32,560
So, number one-- if I could ask
the TAs to help me pass

839
00:45:32,560 --> 00:45:34,570
these out--

840
00:45:34,570 --> 00:45:37,890
we need to schedule you
into tutorials.

841
00:45:37,890 --> 00:45:39,970
So we're going to ask you to
fill out this form, and give

842
00:45:39,970 --> 00:45:41,220
it to us before you leave.

843
00:45:44,250 --> 00:45:48,910
So you'll be hearing from
us once we do the sort.

844
00:45:48,910 --> 00:45:58,100
There's the issue of whether
we're going to have ordinary

845
00:45:58,100 --> 00:46:00,500
recitation and a mega recitation
this week.

846
00:46:00,500 --> 00:46:01,460
So pay attention.

847
00:46:01,460 --> 00:46:03,630
Otherwise, you're going to be
stranded in a classroom with

848
00:46:03,630 --> 00:46:04,990
nothing to do.

849
00:46:04,990 --> 00:46:07,450
We're not going to have any
regular recitations this week.

850
00:46:10,010 --> 00:46:11,410
Are we having regular recitation
this week,

851
00:46:11,410 --> 00:46:11,720
[INAUDIBLE]?

852
00:46:11,720 --> 00:46:14,270
No.

853
00:46:14,270 --> 00:46:19,890
We may, and probably will, have
a mega recitation this

854
00:46:19,890 --> 00:46:24,480
week that's devoted to
a Python review.

855
00:46:24,480 --> 00:46:27,810
Now we know that there are
many of you who are

856
00:46:27,810 --> 00:46:31,250
celebrating a religious holiday
on Friday, and so we

857
00:46:31,250 --> 00:46:34,680
will be putting a lot of
resources online so you can

858
00:46:34,680 --> 00:46:38,060
get that review in
another way.

859
00:46:38,060 --> 00:46:42,740
We probably will have a Python
review on Friday.

860
00:46:42,740 --> 00:46:45,640
And we ask that you look at
our home page for further

861
00:46:45,640 --> 00:46:49,520
information about that as
the week progresses.

862
00:46:49,520 --> 00:46:50,640
So that's all, folks.

863
00:46:50,640 --> 00:46:52,470
That concludes what we're
going to do today.

864
00:46:52,470 --> 00:46:56,750
And as soon as you give us
your form, we're through.