[0.000 --> 22.200]  Okay, now I don't want to alarm anybody in this room, but it's just come to my attention
[22.200 --> 26.600]  that the person to your right is a liar.
[26.600 --> 30.440]  So the person to your left is a liar.
[30.440 --> 33.640]  Also the person sitting in your very seats is a liar.
[33.640 --> 35.440]  We're all liars.
[35.440 --> 39.080]  What I'm going to do today is I'm going to show you what the research says about why we're
[39.080 --> 44.320]  all liars, how you can become a lie spotter, and why you might want to go the extra mile
[44.320 --> 47.720]  and go from lie spotting to truth seeking.
[47.720 --> 50.520]  And ultimately to trust building.
[50.520 --> 56.560]  Now speaking of trust, ever since I wrote this book, Liespotting, no one wants to meet
[56.560 --> 57.560]  me in person anymore.
[57.560 --> 58.560]  No, no, no, no.
[58.560 --> 60.560]  They say, "It's okay.
[60.560 --> 63.320]  We'll email you."
[63.320 --> 67.680]  I can't even get a coffee date at Starbucks.
[67.680 --> 69.680]  My husband's like, "Honey, deception?
[69.680 --> 71.280]  Maybe you could have focused on cooking.
[71.280 --> 72.960]  How about French cooking?"
[72.960 --> 78.200]  So before we get started, what I'm going to do is I'm going to clarify my goal for you,
[78.200 --> 80.320]  which is not to teach a game of Gotcha.
[80.320 --> 83.600]  Lie spotters aren't those nit picky kids, those kids in the back of the room that are
[83.600 --> 88.600]  shouting, "Gotcha! Gotcha! Your eyebrow twitched. You flared your nostrils. I watched that
[88.600 --> 91.200]  TV show Lie to Me. I know you're lying."
[91.200 --> 92.200]  No.
[92.200 --> 96.360]  Lie spotters are armed with scientific knowledge of how to spot deception.
[96.360 --> 100.560]  They use it to get to the truth, and they do what mature leaders do every day.
[100.560 --> 105.600]  They have difficult conversations with difficult people, sometimes during very difficult times.
[105.600 --> 109.200]  And they start up that path by accepting a core proposition.
[109.200 --> 111.440]  And that proposition is the following.
[111.440 --> 114.600]  Lying is a cooperative act.
[114.600 --> 115.600]  Think about it.
[115.600 --> 118.120]  A lie has no power whatsoever by its mere utterance.
[118.120 --> 121.960]  Its power emerges when someone else agrees to believe the lie.
[121.960 --> 128.320]  So I know it may sound like tough love, but look, if at some point you got lied to, it's
[128.320 --> 130.200]  because you agreed to get lied to.
[130.200 --> 131.720]  Truth number one about lying.
[131.720 --> 132.760]  Lying is a cooperative act.
[132.760 --> 135.840]  Now, not all lies are harmful.
[135.840 --> 141.360]  Sometimes we're willing participants in deception for the sake of social dignity,
[141.360 --> 144.160]  maybe to keep a secret that should be kept secret, secret.
[144.160 --> 146.160]  We say, "Nice song."
[146.160 --> 147.760]  "Honey, you don't look fat in that."
[147.760 --> 148.760]  No.
[148.760 --> 151.200]  Or we say, favorite of the digerati,
[151.200 --> 154.280]  "You know, I just fished that email out of my spam folder.
[154.280 --> 156.720]  I'm so sorry."
[156.720 --> 160.360]  But there are times when we are unwilling participants in deception.
[160.360 --> 162.800]  And that can have dramatic costs for us.
[162.800 --> 170.160]  Last year saw $997 billion in corporate fraud alone in the United States.
[170.160 --> 172.200]  That's an eyelash under $1 trillion.
[172.200 --> 174.480]  That's 7% of revenues.
[174.480 --> 175.560]  Deception can cost billions.
[175.560 --> 181.560]  Think Enron, Madoff, the mortgage crisis, or in the case of double agents and traitors
[181.560 --> 185.960]  like Robert Hanssen or Aldrich Ames, lies can betray our country.
[185.960 --> 188.080]  They can compromise our security.
[188.080 --> 189.360]  They can undermine democracy.
[189.360 --> 191.960]  They can cause the deaths of those that defend us.
[191.960 --> 194.720]  Deception is actually serious business.
[194.720 --> 199.640]  This con man, Henry Oberlander, he was such an effective con man.
[199.640 --> 203.360]  The British authorities say he could have undermined the entire banking system of the Western
[203.360 --> 204.360]  world.
[204.360 --> 205.360]  And you can't find this guy on Google.
[205.360 --> 206.360]  You can't find him anywhere.
[206.360 --> 207.360]  He was interviewed once.
[207.360 --> 208.360]  And he said the following.
[208.360 --> 210.520]  He said, look, I've got one rule.
[210.520 --> 211.680]  And this was Henry's rule.
[211.680 --> 215.600]  He said, look, everyone is willing to give you something.
[215.600 --> 219.480]  They're ready to give you something for whatever it is they're hungry for.
[219.480 --> 220.480]  And that's the crux of it.
[220.480 --> 225.360]  If you don't want to be deceived, you have to know what is it that you're hungry for.
[225.360 --> 227.120]  And we all kind of hate to admit it.
[227.120 --> 233.000]  You know, we kind of wish we were better husbands, better wives, smarter, more powerful,
[233.000 --> 234.560]  taller, richer.
[234.560 --> 237.000]  The list goes on.
[237.000 --> 241.640]  Lying is an attempt to bridge that gap, to connect our wishes and our fantasies about
[241.640 --> 245.720]  who we wish we were, how we wish we could be, with what we're really like.
[245.720 --> 249.200]  And, boy, we're willing to fill in those gaps in our lives with lies.
[249.200 --> 254.680]  On a given day, studies show that you may be lied to anywhere from 10 to 200 times.
[254.680 --> 257.960]  And now granted, many of those are white lies.
[257.960 --> 263.600]  But in another study, it showed that strangers lied three times within the first 10 minutes
[263.600 --> 265.040]  of meeting each other.
[265.040 --> 269.320]  Now, when we first hear this data, we recoil.
[269.320 --> 270.760]  We can't believe how prevalent lying is.
[270.760 --> 272.560]  We're essentially against lying.
[272.560 --> 276.720]  But if you look more closely, the plot actually thickens.
[276.720 --> 280.640]  We lie more to strangers than we lie to co-workers.
[280.640 --> 284.400]  Extroverts lie more than introverts.
[284.400 --> 289.480]  Men lie eight times more about themselves than they do other people.
[289.480 --> 292.600]  Women lie more to protect other people.
[292.600 --> 297.200]  If you're an average married couple, you're going to lie to your spouse in one out of
[297.200 --> 298.600]  every 10 interactions.
[298.600 --> 300.400]  Now, you may think that's bad.
[300.400 --> 303.760]  If you're unmarried, that number drops to three.
[303.760 --> 305.240]  Lying's complex.
[305.240 --> 308.880]  It's woven into the fabric of our daily and our business lives. We're deeply ambivalent
[308.880 --> 310.120]  about the truth.
[310.120 --> 314.240]  We parse it out on an as-needed basis, sometimes for very, very good reasons, and other times
[314.640 --> 317.720]  just because we don't understand the gaps in our lives.
[317.720 --> 319.360]  That's truth number two about lying.
[319.360 --> 322.680]  We're against lying, but we're covertly for it,
[322.680 --> 326.720]  in ways that our society has sanctioned for centuries and centuries and centuries.
[326.720 --> 328.680]  It's as old as breathing.
[328.680 --> 329.680]  It's part of our culture.
[329.680 --> 331.400]  It's part of our history.
[331.400 --> 338.480]  Think Dante, Shakespeare, the Bible, News of the World.
[338.480 --> 341.480]  Lying has evolutionary value to us as a species.
[341.480 --> 346.520]  Researchers have long known that the more intelligent the species, the larger the neocortex,
[346.520 --> 349.200]  the more likely it is to be deceptive.
[349.200 --> 350.680]  Now you might remember Koko.
[350.680 --> 354.000]  Does anybody here remember Koko the gorilla, who was taught sign language?
[354.000 --> 357.200]  Koko was taught to communicate via sign language.
[357.200 --> 358.200]  Here's Koko with her kitten.
[358.200 --> 361.800]  It's her cute little fluffy pet kitten.
[361.800 --> 368.000]  Koko once blamed her pet kitten for ripping a sink out of the wall.
[368.000 --> 370.200]  We're hardwired to become leaders of the pack.
[370.200 --> 372.240]  It starts really, really early.
[372.240 --> 373.240]  How early?
[373.240 --> 378.480]  Well, babies will fake a cry, pause, wait to see who's coming, and then go right back to
[378.480 --> 380.560]  crying.
[380.560 --> 383.760]  One-year-olds learn concealment.
[383.760 --> 386.400]  Two-year-olds bluff.
[386.400 --> 387.760]  Five-year-olds lie outright.
[387.760 --> 390.200]  They manipulate via flattery.
[390.200 --> 392.720]  Nine-year-olds are masters of the cover-up.
[392.720 --> 395.680]  By the time you enter college, you're going to lie to your mom in one out of every five
[395.680 --> 397.880]  interactions.
[397.880 --> 401.880]  By the time we enter this work world and we're breadwinners, we enter a world that is
[401.880 --> 408.960]  just cluttered with spam, fake digital friends, partisan media, ingenious identity thieves,
[408.960 --> 413.120]  world-class Ponzi schemers, a deception epidemic.
[413.120 --> 417.960]  In short, what one author calls a post-truth society.
[417.960 --> 424.200]  It's been very confusing for a long time now.
[424.200 --> 426.440]  What do you do?
[426.440 --> 430.600]  Well, there are steps we can take to navigate our way through the morass.
[430.600 --> 433.120]  Trained lie spotters get to the truth 90% of the time.
[433.120 --> 435.920]  The rest of us are only 54% accurate.
[435.920 --> 437.520]  Why is it so easy to learn?
[437.520 --> 439.520]  Well, there are good liars and there are bad liars.
[439.520 --> 440.920]  There are no real original liars.
[440.920 --> 442.520]  We all make the same mistakes.
[442.520 --> 444.520]  We all use the same techniques.
[444.520 --> 448.400]  So what I'm going to do is I'm going to show you two patterns of deception and then we're
[448.400 --> 450.600]  going to look at the hot spots and see if we can find them ourselves.
[450.600 --> 454.600]  We're going to start with speech.
[454.600 --> 455.760]  I want you to listen to me.
[455.760 --> 458.000]  I'm going to say this again.
[458.000 --> 464.520]  I did not have sexual relations with that woman, Miss Lewinsky.
[464.520 --> 469.680]  I never told anybody to lie, not a single time, never.
[469.680 --> 475.160]  These allegations are false and I need to go back to work for the American people.
[475.160 --> 477.160]  Thank you.
[477.160 --> 479.880]  Okay.
[479.880 --> 481.400]  What were the telltale signs?
[481.400 --> 486.280]  Well, first we heard what's known as a non-contracted denial.
[486.280 --> 490.920]  Studies show that people who are over-determined in their denial will resort to formal rather than
[490.920 --> 492.360]  informal language.
[492.360 --> 495.000]  We also heard distancing language that woman.
[495.000 --> 499.920]  We know that liars will unconsciously distance themselves from their subject, using language
[499.920 --> 500.920]  as their tool.
[500.920 --> 505.080]  Now, if Bill Clinton had said, "Well, to tell you the truth,"
[505.080 --> 507.480]  or Richard Nixon's favorite, "In all candor,"
[507.480 --> 511.160]  he would have been a dead giveaway for any lie spotter that knows that qualifying language,
[511.240 --> 513.560]  as it's called. Qualifying language like that
[513.560 --> 516.240]  further discredits the subject.
[516.240 --> 521.000]  Now if he had repeated the question in its entirety or if he had peppered his account with a little
[521.000 --> 525.880]  too much detail and we're all really glad he didn't do that, he would have further discredited
[525.880 --> 527.640]  himself.
[527.640 --> 528.640]  Freud had it right.
[528.640 --> 532.520]  Freud said, look, there's much more to it than speech.
[532.520 --> 534.920]  No mortal can keep a secret.
[534.920 --> 538.960]  If his lips are silent, he chatters with his fingertips, and we all do it.
[538.960 --> 541.120]  No matter how powerful you are, we all do it.
[541.120 --> 543.000]  We all chatter with our fingertips.
[543.000 --> 550.400]  I'm going to show you Dominique Strauss-Kahn with Obama, who's chattering with his fingertips.
[550.400 --> 558.280]  Now this brings us to our next pattern, which is body language.
[558.280 --> 560.640]  With body language, here's what you've got to do.
[560.640 --> 565.120]  You really got to just throw your assumptions out the door, let the science temper your
[565.120 --> 568.840]  knowledge a little bit because we think liars fidget all the time.
[568.840 --> 572.240]  Well guess what, they're known to freeze their upper bodies when they're lying.
[572.240 --> 575.040]  We think liars won't look in the eyes.
[575.040 --> 578.600]  Well guess what, they look you in the eyes a little too much just to compensate for that
[578.600 --> 579.600]  myth.
[579.600 --> 585.760]  We think warmth and smiles convey honesty, sincerity, but a trained lie spotter can spot a fake
[585.760 --> 587.400]  smile a mile away.
[587.400 --> 591.160]  Can you all spot the fake smile here?
[591.160 --> 596.960]  You can consciously contract the muscles in your cheeks, but the real smile's in the
[596.960 --> 597.960]  eyes.
[597.960 --> 602.460]  The crow's feet of the eyes, they cannot be consciously contracted, especially if you overdo the
[602.460 --> 603.460]  Botox.
[603.460 --> 604.460]  Don't overdo the Botox.
[604.460 --> 606.760]  Nobody will think you're honest.
[606.760 --> 608.600]  Now we're going to look at the hot spots.
[608.600 --> 610.400]  Can you tell what's happening in a conversation?
[610.400 --> 615.280]  Can you start to find the hot spots to see the discrepancies between someone's words and
[615.280 --> 616.440]  someone's actions?
[616.440 --> 620.880]  Now I know it seems really obvious, but when you're having a conversation with someone
[620.880 --> 625.920]  you suspect of deception, attitude is by far the most overlooked but telling of
[625.920 --> 627.160]  indicators.
[627.160 --> 631.040]  An honest person is going to be cooperative. They're going to show they're on your side, they're
[631.040 --> 634.200]  going to be enthusiastic, they're going to be willing and helpful in getting you to
[634.200 --> 635.200]  the truth.
[635.200 --> 639.480]  They're going to be willing to brainstorm, name suspects, provide details.
[639.480 --> 644.920]  They're going to say, hey, maybe it was those guys in payroll that forged those checks.
[644.920 --> 648.600]  They're going to be infuriated if they sense they're wrongly accused throughout the entire course
[648.600 --> 652.040]  of the interview, not just in flashes, they'll be infuriated throughout the entire course
[652.040 --> 653.040]  of the interview.
[653.560 --> 658.200]  If you ask someone honest, what should happen to whoever did forge those checks?
[658.200 --> 663.280]  An honest person is much more likely to recommend strict, rather than lenient punishment.
[663.280 --> 668.440]  Now, let's say you're having that exact same conversation with someone deceptive.
[668.440 --> 674.680]  That person may be withdrawn, look down, lower their voice, pause, be kind of herky-jerky.
[674.680 --> 678.000]  Ask a deceptive person to tell their story, they're going to pepper it with way too much
[678.000 --> 681.840]  detail in all kinds of irrelevant places.
[681.840 --> 684.640]  And then they're going to tell their story in strict, chronological order.
[684.640 --> 689.520]  And what a trained interrogator does is they come in, and in very subtle ways, over the
[689.520 --> 694.280]  course of several hours, they will ask that person to tell their story backwards.
[694.280 --> 698.600]  And then they'll watch them squirm and track which questions produce the highest volume
[698.600 --> 699.600]  of deceptive tells.
[699.600 --> 700.600]  Why do they do that?
[700.600 --> 702.360]  Well, we all do the same thing.
[702.360 --> 705.720]  We rehearse our words, but we rarely rehearse our gestures.
[705.720 --> 707.920]  We say, yes, we shake our heads, no.
[707.920 --> 709.280]  We tell very convincing stories.
[709.280 --> 711.160]  We slightly shrug our shoulders.
[711.160 --> 715.840]  We commit terrible crimes, and we smile at the delight in getting away with it.
[715.840 --> 718.920]  Now that smile is known in the trade as duping delight.
[718.920 --> 722.680]  And we're going to see that in several videos moving forward, but we're going to start
[722.680 --> 724.080]  for those of you that don't know him.
[724.080 --> 728.840]  This is presidential candidate John Edwards, who shocked America by fathering a child out
[728.840 --> 729.840]  of wedlock.
[729.840 --> 733.080]  We're going to see him talk about getting a paternity test.
[733.080 --> 738.080]  See now if you can spot him saying "yes" while shaking his head "no," slightly shrugging
[738.080 --> 739.080]  his shoulders.
[739.080 --> 741.320]  I'm happy to participate in one.
[741.320 --> 746.760]  I know that it's not possible that this child could be mine because of the timing of events.
[746.760 --> 748.480]  So I know it's not possible.
[748.480 --> 751.520]  Happy to take a paternity test, and we'd love to see it happen.
[751.520 --> 752.920]  Are you going to do that soon?
[752.920 --> 754.920]  Is there somebody...
[754.920 --> 755.920]  Well, I'm only one side.
[755.920 --> 760.760]  I'm only one side of the test, but I'm happy to participate in one.
[760.760 --> 765.640]  Okay, those head shakes are much easier to spot once you know to look for them.
[765.640 --> 770.320]  Now there are going to be times when someone makes one expression while masking another that
[770.320 --> 773.000]  just kind of leaks through in a flash.
[773.000 --> 775.680]  Murderers are known to leak sadness.
[775.680 --> 778.640]  Your new joint venture partner might shake your hand, celebrate, go out to dinner with
[778.640 --> 781.720]  you, and then leak an expression of anger.
[781.720 --> 785.400]  And we're not all going to become facial expression experts overnight here, but there's one
[785.400 --> 789.120]  I can teach you that's very dangerous and that's easy to learn, and that's the expression
[789.120 --> 790.120]  of contempt.
[790.120 --> 794.160]  Now with anger, you've got two people on an even playing field.
[794.160 --> 798.480]  It's still somewhat of a healthy relationship, but when anger turns to contempt, you've
[798.480 --> 800.080]  been dismissed.
[800.080 --> 804.800]  It's associated with moral superiority, and for that reason, it's very, very hard to
[804.800 --> 805.800]  recover from.
[805.800 --> 807.320]  Here's what it looks like.
[807.320 --> 811.360]  It's marked by one lip corner pulled up and in.
[811.360 --> 814.200]  It's the only asymmetrical expression.
[814.200 --> 818.640]  And in the presence of contempt, whether or not deception follows (and it doesn't always
[818.640 --> 820.120]  follow),
[820.120 --> 824.080]  look the other way, go the other direction, reconsider the deal, say, "No, thank you,
[824.080 --> 826.680]  I'm not coming up for just one more nightcap.
[826.680 --> 828.320]  Thank you."
[828.320 --> 831.840]  Science has surfaced many, many more indicators.
[831.840 --> 836.680]  We know, for example, that liars will shift their blink rate, point their feet towards
[836.680 --> 838.440]  an exit.
[838.440 --> 842.760]  They will take barrier objects and put them between themselves and the person that's interviewing
[842.760 --> 843.760]  them.
[843.760 --> 847.680]  They'll alter their vocal tone, often making their vocal tone much lower.
[847.680 --> 850.080]  Now, here's the deal.
[850.080 --> 853.480]  These behaviors are just behaviors.
[853.480 --> 856.040]  They're not proof of deception.
[856.040 --> 857.040]  They're red flags.
[857.040 --> 858.040]  We're human beings.
[858.040 --> 861.520]  We make deceptive flailing gestures all over the place all day long.
[861.520 --> 863.920]  They don't mean anything in and of themselves.
[863.920 --> 867.000]  But when you see clusters of them, that's your signal.
[867.000 --> 868.760]  Look, listen, probe.
[868.760 --> 870.080]  Ask some hard questions.
[870.080 --> 872.960]  Get out of that very comfortable mode of knowing.
[872.960 --> 874.600]  Walk into curiosity mode.
[874.600 --> 876.480]  Ask more questions.
[876.480 --> 877.480]  Have a little dignity.
[877.480 --> 878.960]  Treat the person you're talking to with rapport.
[878.960 --> 882.320]  Don't try to be like those folks on Law & Order and those other TV shows that pummel
[882.320 --> 884.080]  their subjects into submission.
[884.080 --> 885.080]  Don't be too aggressive.
[885.080 --> 887.040]  It doesn't work.
[887.040 --> 890.720]  Now, we've talked a little bit about how to talk to someone who's lying.
[890.720 --> 892.040]  I mean, how to spot a lie.
[892.040 --> 895.160]  And as I promise, we're now going to look at what the truth looks like.
[895.160 --> 897.920]  And I'm going to show you two videos.
[897.920 --> 899.280]  Two mothers.
[899.280 --> 900.280]  One is lying.
[900.280 --> 901.280]  One is telling the truth.
[901.280 --> 904.600]  And these were surfaced by researcher David Matsumoto in California.
[904.600 --> 908.680]  And I think they're an excellent example of what the truth looks like.
[908.680 --> 913.640]  This mother, Diane Downs, shot her kids at close range,
[913.640 --> 917.040]  drove them to the hospital while they bled all over the car,
[917.040 --> 919.520]  claimed a scraggy-haired stranger did it.
[919.520 --> 923.280]  And you'll see when you see the video, she can't even pretend to be an agonizing mother.
[923.280 --> 928.340]  What you want to look for here is an incredible discrepancy between horrific events that she
[928.340 --> 931.400]  describes and her very, very cool demeanor.
[931.400 --> 934.320]  And if you look closely, you'll see duping delight throughout this video.
[934.320 --> 937.880]  But at night, when I close my eyes, I can see Christie reaching her hand out to me while
[937.880 --> 939.480]  I'm driving.
[939.480 --> 942.000]  And the blood just keeps coming out of her mouth.
[942.000 --> 944.720]  And that, maybe it'll fade too with time.
[944.720 --> 946.440]  But I don't think so.
[946.440 --> 956.000]  That haunts me the most.
[956.000 --> 960.680]  Now I'm going to show you a video of an actual grieving mother, Erin Runnion, confronting
[960.680 --> 963.720]  her daughter's murderer and torturer in court.
[963.720 --> 965.680]  Here you're going to see no false emotion.
[965.680 --> 968.880]  The authentic expression of a mother's agony.
[968.880 --> 973.640]  I wrote this statement on the third anniversary of the night you took my baby,
[973.640 --> 974.640]  and you hurt her,
[974.640 --> 976.640]  and you crushed her,
[976.640 --> 978.640]  you terrified her,
[978.640 --> 981.120]  until her heart stopped.
[981.120 --> 982.640]  And she fought.
[982.640 --> 984.640]  And I know she fought you.
[984.640 --> 987.840]  But I know she looked at you with those amazing brown eyes.
[987.840 --> 991.600]  And you still wanted to kill her.
[991.600 --> 993.040]  And I don't understand it.
[993.040 --> 994.440]  And I never will.
[994.440 --> 1000.000]  OK, there's no doubting the veracity of those emotions.
[1000.000 --> 1005.920]  Now the technology around what the truth looks like is progressing on the science of it.
[1005.920 --> 1012.120]  We know, for example, that we now have specialized eye trackers, infrared brain scans, MRIs that
[1012.120 --> 1016.360]  can decode the signals that our bodies send out when we're trying to be deceptive.
[1016.360 --> 1020.960]  And these technologies are going to be marketed to all of us as panaceas for deceit.
[1020.960 --> 1022.960]  And they will prove incredibly useful someday.
[1022.960 --> 1028.080]  But you've got to ask yourself in the meantime, who do you want on your side of the meeting?
[1028.080 --> 1032.280]  Someone who's trained in getting to the truth, or some guy who's going to drag a 400-pound
[1032.280 --> 1035.520]  electroencephalogram through the door?
[1035.520 --> 1039.560]  Lie spotters rely on human tools.
[1039.560 --> 1043.160]  They know, as someone once said, "Character's who you are in the dark."
[1043.160 --> 1047.920]  And what's kind of interesting is that today we have so little darkness.
[1047.920 --> 1050.160]  Our world is lit up 24 hours a day.
[1050.160 --> 1052.000]  It's transparent.
[1052.000 --> 1056.240]  With blogs and social networks broadcasting the buzz of a whole new generation of people
[1056.240 --> 1059.240]  that have made a choice to live their lives in public.
[1059.240 --> 1063.520]  It's a much more noisy world.
[1063.520 --> 1070.360]  So one challenge we have is to remember, oversharing, that's not honesty.
[1070.360 --> 1075.360]  Our manic tweeting and texting can blind us to the fact that the subtleties of human
[1075.360 --> 1078.120]  decency, character, integrity, that's still what matters.
[1078.120 --> 1079.960]  That's always what's going to matter.
[1079.960 --> 1086.160]  So in this much noisier world, it might make sense for us to be just a little bit more explicit
[1086.160 --> 1088.840]  about our moral code.
[1088.840 --> 1093.160]  When you combine the science of recognizing deception with the art of looking, listening,
[1093.160 --> 1096.600]  you exempt yourself from collaborating in a lie.
[1096.600 --> 1101.240]  You start up that path of being just a little bit more explicit because you signal to everyone
[1101.240 --> 1102.240]  around you.
[1102.240 --> 1105.280]  You say, hey, my world, our world.
[1105.280 --> 1106.720]  It's going to be an honest one.
[1106.720 --> 1110.720]  My world is going to be one where truth is strengthened and falsehood is recognized and
[1110.720 --> 1112.200]  marginalized.
[1112.200 --> 1117.560]  And when you do that, the ground around you starts to shift just a little bit.
[1117.560 --> 1118.560]  And that's the truth.
[1118.560 --> 1119.560]  Thank you.