Saturday, May 4, 2013

Why FBI and CIA didn't connect the dots

Excuses, excuses.

 


By Bruce Schneier, Special to CNN

updated 10:37 AM EDT, Thu May 2, 2013

http://www.cnn.com/2013/05/02/opinion/schneier-boston-bombing/index.html

 

    FBI, CIA criticized for not keeping better track of Tamerlan Tsarnaev before attack
    Bruce Schneier: Connecting the dots seems easy in hindsight, but in real life, it's not
    He says there are an enormous number of potential bad guys
    Schneier: If a terrorist plot succeeds, it doesn't mean law enforcement systems failed

 

Editor's note: Bruce Schneier is a security technologist and author of "Liars and Outliers: Enabling the Trust Society Needs to Survive."

 

(CNN) -- The FBI and the CIA are being criticized for not keeping better track of Tamerlan Tsarnaev in the months before the Boston Marathon bombings. How could they have ignored such a dangerous person? How do we reform the intelligence community to ensure this kind of failure doesn't happen again?

 

It's an old song by now, one we heard after the 9/11 attacks in 2001 and after the Underwear Bomber's failed attack in 2009. The problem is that connecting the dots is a bad metaphor, and focusing on it makes us more likely to implement useless reforms.

 

Connecting the dots in a coloring book is easy and fun. They're right there on the page, and they're all numbered. All you have to do is move your pencil from one dot to the next, and when you're done, you've drawn a sailboat. Or a tiger. It's so simple that 5-year-olds can do it.


 

But in real life, the dots can only be numbered after the fact. With the benefit of hindsight, it's easy to draw lines from a Russian request for information to a foreign visit to some other piece of information that might have been collected.

 


 

In hindsight, we know who the bad guys are. Before the fact, there are an enormous number of potential bad guys.

 

How many? We don't know. But we know that the no-fly list had 21,000 people on it last year. The Terrorist Identities Datamart Environment, also known as the watch list, has 700,000 names on it.

 

We have no idea how many potential "dots" the FBI, CIA, NSA and other agencies collect, but it's easily in the millions. It's easy to work backwards through the data and see all the obvious warning signs. But before a terrorist attack, when there are millions of dots -- some important but the vast majority unimportant -- uncovering plots is a lot harder.
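The scale problem described above is the classic base-rate effect. As a back-of-the-envelope sketch -- the numbers below are illustrative assumptions, not figures from the article -- even a screening process that is 99% accurate buries a handful of real plotters under an avalanche of false alarms when applied to millions of people:

```python
# Illustrative base-rate arithmetic; all numbers are hypothetical.
population = 10_000_000      # people generating "dots" worth a look
true_threats = 10            # actual plotters hidden among them
sensitivity = 0.99           # fraction of real threats the screen flags
false_positive_rate = 0.01   # fraction of harmless people wrongly flagged

flagged_threats = true_threats * sensitivity
flagged_innocent = (population - true_threats) * false_positive_rate
total_flagged = flagged_threats + flagged_innocent

# Of everyone flagged, what fraction is actually a threat?
precision = flagged_threats / total_flagged

print(f"people flagged: {total_flagged:,.0f}")
print(f"chance a flagged person is a real threat: {precision:.4%}")
```

With these assumed numbers, roughly 100,000 people get flagged and fewer than one in ten thousand of them is a genuine threat -- which is the arithmetic behind "millions of dots, the vast majority unimportant."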

 

Rather than thinking of intelligence as a simple connect-the-dots picture, think of it as a million unnumbered pictures superimposed on top of each other. Or a random-dot stereogram. Is it a sailboat, a puppy, two guys with pressure-cooker bombs or just an unintelligible mess of dots? You try to figure it out.

 

It's not a matter of not enough data, either.

 

Piling more data onto the mix makes it harder, not easier. The best way to think of it is as a needle-in-a-haystack problem; the last thing you want to do is increase the amount of hay you have to search through.

 

The television show "Person of Interest" is fiction, not fact.

 

There's a name for this sort of logical fallacy: hindsight bias.

 

First explained by psychologists Daniel Kahneman and Amos Tversky, it's surprisingly common. Since what actually happened is so obvious once it happens, we overestimate how obvious it was before it happened.

 

We actually misremember what we once thought, believing that we knew all along that what happened would happen. It's a surprisingly strong tendency, one that has been observed in countless laboratory experiments and real-world examples of behavior. And it's what all the post-Boston Marathon bombing dot-connectors are doing.

 

Before we start blaming agencies for failing to stop the Boston bombers, and before we push "intelligence reforms" that will shred civil liberties without making us any safer, we need to stop seeing the past as a bunch of obvious dots that need connecting.

 

Kahneman, a Nobel Prize winner, wisely noted: "Actions that seemed prudent in foresight can look irresponsibly negligent in hindsight." Kahneman calls it "the illusion of understanding," explaining that the past is only so understandable because we have cast it as simple, inevitable stories and left out the rest.

 

Nassim Taleb, an expert on risk engineering, calls this tendency the "narrative fallacy." We humans are natural storytellers, and the world of stories is much more tidy, predictable and coherent than the real world.

 

Millions of people behave strangely enough to warrant the FBI's notice, and almost all of them are harmless. It is simply not possible to find every plot beforehand, especially when the perpetrators act alone and on impulse.

 

We have to accept that there always will be a risk of terrorism, and that when the occasional plot succeeds, it's not necessarily because our law enforcement systems have failed.

 


 

 

 

 

 
