﻿1
00:00:12,680 --> 00:00:16,248
           NARRATOR:
         9/11 plus 15.

2
00:00:18,085 --> 00:00:20,252
         The footprints
      of the fallen towers

3
00:00:20,321 --> 00:00:24,823
  are now a haunting memorial
 to what and who was lost here.

4
00:00:28,996 --> 00:00:33,298
      The waterfalls flow,
      and so do the tears.

5
00:00:37,505 --> 00:00:41,040
The children who come here have
    lived their entire lives

6
00:00:41,108 --> 00:00:43,942
 under the shadow of terrorism.

7
00:00:45,913 --> 00:00:50,783
   What has become normal now
      was unheard of then.

8
00:00:50,851 --> 00:00:53,685
          JOHN CARLIN:
     We're in an incredibly
   complicated time right now

9
00:00:53,754 --> 00:00:55,254
         when it comes
   to the terrorist threats.

10
00:00:55,322 --> 00:01:00,125
 And what we've seen really is
a fundamental shift in strategy.

11
00:01:00,194 --> 00:01:04,797
           NARRATOR:
       Al Qaeda had aimed
     at this target before.

12
00:01:04,865 --> 00:01:10,135
 But in 1993, we didn't awaken
       from our slumber.

13
00:01:13,107 --> 00:01:16,008
     After 9/11, there was
      no ignoring the need

14
00:01:16,077 --> 00:01:17,476
       for urgent action.

15
00:01:19,780 --> 00:01:21,480
            CARLIN:
          We developed

16
00:01:21,549 --> 00:01:24,116
    an apparatus that became
  really good at figuring out

17
00:01:24,185 --> 00:01:25,951
  what they were trying to do
       and disrupting it

18
00:01:26,020 --> 00:01:27,052
   before they could succeed.

19
00:01:29,056 --> 00:01:31,557
           NARRATOR:
  We put boots on the ground,

20
00:01:31,625 --> 00:01:34,093
       drones in the air,

21
00:01:34,161 --> 00:01:36,829
   and systematically killed
        the ringleaders.

22
00:01:36,897 --> 00:01:38,931
          (explosions)

23
00:01:40,734 --> 00:01:43,135
     We tightened security
          at airports

24
00:01:43,204 --> 00:01:45,504
 and employed new technologies.

25
00:01:45,573 --> 00:01:48,874
          Al Qaeda is
     drastically weakened,

26
00:01:48,943 --> 00:01:52,978
  but terrorism as a strategy
      is still with us...

27
00:01:53,047 --> 00:01:55,147
          (explosion)

28
00:01:55,216 --> 00:01:58,750
        Benefitting from
    new technology as well.

29
00:01:58,819 --> 00:02:01,520
 The internet and social media
          specifically

30
00:02:01,589 --> 00:02:04,156
are really kind of game changers
         for extremism.

31
00:02:04,225 --> 00:02:06,391
They offer extremists advantages

32
00:02:06,460 --> 00:02:08,494
     that they don't offer
       mainstream people.

33
00:02:09,930 --> 00:02:13,198
           NARRATOR:
  The Islamic State perfected
       the pitch online--

34
00:02:13,267 --> 00:02:17,002
 professionally produced videos
       of warrior heroes

35
00:02:17,071 --> 00:02:21,573
  living in utopia, all aimed
   at recruiting new members.

36
00:02:21,642 --> 00:02:23,842
          HUMERA KHAN:
If you include all its branches,

37
00:02:23,911 --> 00:02:26,879
       ISIS has more than
       40 media companies

38
00:02:26,947 --> 00:02:28,981
      and each of them has
  a different specialization.

39
00:02:29,049 --> 00:02:32,251
       And so the volume
  of what is put out is huge.

40
00:02:32,319 --> 00:02:34,253
            CARLIN:
  By crowd-sourcing terrorism,

41
00:02:34,321 --> 00:02:36,054
  they just called upon people
      throughout the world

42
00:02:36,123 --> 00:02:38,423
 to, one, join them as foreign
       terrorist fighters

43
00:02:38,492 --> 00:02:40,225
       in Iraq and Syria

44
00:02:40,294 --> 00:02:42,294
   and, two, if they couldn't
     join them over there,

45
00:02:42,363 --> 00:02:43,529
     "Kill where you live."

46
00:02:45,699 --> 00:02:48,033
           NARRATOR:
 We saw the deadly consequences

47
00:02:48,102 --> 00:02:51,770
  of this new internet-fueled,
   self-radicalized terrorism

48
00:02:51,839 --> 00:02:58,377
   in Boston, San Bernardino,
       Orlando, and Nice.

49
00:02:58,445 --> 00:03:00,179
       It is a fact that
       many of the cases

50
00:03:00,247 --> 00:03:01,747
        that we've seen
      in the United States

51
00:03:01,815 --> 00:03:04,750
 simply would not have happened
  in the pre-social media era,

52
00:03:04,818 --> 00:03:06,818
      because the material
  just wasn't that accessible.

53
00:03:06,887 --> 00:03:12,724
           NARRATOR:
   It's no longer just a war
 of bullets, drones, and bombs.

54
00:03:12,793 --> 00:03:17,696
  Technology has created a new
      battlefield, online.

55
00:03:17,765 --> 00:03:20,465
   Are there new technologies
          to intervene

56
00:03:20,534 --> 00:03:25,070
    before vulnerable people
 answer the call of extremism?

57
00:03:25,139 --> 00:03:29,341
           NARRATOR:
    Can science take us into
    the mind of a terrorist?

58
00:03:29,410 --> 00:03:33,679
     "15 Years of Terror"--
       right now on<i> NOVA.</i>

59
00:03:50,097 --> 00:03:54,666
   Major funding for<i> NOVA</i> is
  provided by the following...

60
00:03:54,735 --> 00:03:59,171
  self-radicalized terrorists,
   empowered by social media.

61
00:04:00,641 --> 00:04:04,376
     The war on terror was
tailor-made to defeat al Qaeda.

62
00:04:04,445 --> 00:04:08,213
      But troops, drones,
      and tighter borders

63
00:04:08,282 --> 00:04:11,416
        offer no defense
     against the internet.

64
00:04:11,485 --> 00:04:14,453
    It is awash in violence
           and venom

65
00:04:14,521 --> 00:04:17,990
    produced and propagated
         by terrorists.

66
00:04:18,058 --> 00:04:21,126
    You can trace the roots,
       at least in part,

67
00:04:21,195 --> 00:04:23,095
 to a place you'd least expect.

68
00:04:28,602 --> 00:04:31,503
       Daphne, Alabama--

69
00:04:31,572 --> 00:04:35,240
   a city of 20,000 that sits
  across the bay from Mobile.

70
00:04:35,309 --> 00:04:38,644
It's everything you would expect
 from the American Bible Belt.

71
00:04:40,881 --> 00:04:44,516
   But it was also home to an
 unlikely, infamous resident...

72
00:04:47,488 --> 00:04:50,689
         Omar Hammami,
  an American who took up arms

73
00:04:50,758 --> 00:04:54,660
    with Islamic terrorists
 and took their propaganda war

74
00:04:54,728 --> 00:04:57,329
    into a whole new realm.

75
00:04:57,398 --> 00:04:58,997
What can his story tell us about

76
00:04:59,066 --> 00:05:02,467
     how social networking
        fuels terrorism?

77
00:05:02,536 --> 00:05:05,237
      The clues are there
       in his own words--

78
00:05:05,306 --> 00:05:09,775
a self-published autobiography.

79
00:05:09,843 --> 00:05:12,477
     HAMMAMI (dramatized):
   I was brought up like most
   of the privileged children

80
00:05:12,546 --> 00:05:13,845
          in America.

81
00:05:13,914 --> 00:05:17,582
    My mother was a typical
   Southern Protestant girl,

82
00:05:17,651 --> 00:05:20,252
  which attracted my father's
    conservative background.

83
00:05:20,321 --> 00:05:22,554
   An Arab from Syria marries

84
00:05:22,623 --> 00:05:24,890
    a little Southern belle
         from Alabama.

85
00:05:24,958 --> 00:05:27,693
         This is a very
      strange combination.

86
00:05:27,761 --> 00:05:28,894
         MITCH SILBER:
        He's the product

87
00:05:28,962 --> 00:05:30,696
     of a mixed marriage--

88
00:05:30,764 --> 00:05:33,999
   a father who is Muslim and
   a mother who is Christian,

89
00:05:34,068 --> 00:05:36,735
  a father who was an engineer

90
00:05:36,804 --> 00:05:40,939
    and grew up essentially
in an open, tolerant household.

91
00:05:41,008 --> 00:05:46,945
     HAMMAMI (dramatized):
 I was "saved" and baptized in
  the Perdido Baptist Church.

92
00:05:47,014 --> 00:05:50,415
   My mother used to take me
         and my sister.

93
00:05:50,484 --> 00:05:52,918
     I was the best student
        in Bible school.

94
00:05:52,986 --> 00:05:56,688
I didn't like getting less than
perfect grades from a young age.

95
00:05:56,757 --> 00:06:00,892
 My father was not a religious
       man in those days.

96
00:06:00,961 --> 00:06:03,995
        He did not pray
      or go to the mosque.

97
00:06:04,064 --> 00:06:06,898
 My mom used to tell us that we
have to keep our religion secret

98
00:06:06,967 --> 00:06:08,767
        from our father.

99
00:06:11,105 --> 00:06:15,273
           NARRATOR:
That inner conflict he described
   in his book was just that.

100
00:06:15,342 --> 00:06:19,978
    Outwardly, he was smart,
    popular, and easygoing.

101
00:06:20,047 --> 00:06:22,914
 He didn't seem to take himself
         too seriously.

102
00:06:22,983 --> 00:06:24,416
        SHAFIK HAMMAMI:
    He was very intelligent.

103
00:06:24,485 --> 00:06:28,019
       He's always happy.

104
00:06:28,088 --> 00:06:30,088
   He's an all-American boy.

105
00:06:30,157 --> 00:06:33,191
He liked sports, he liked music.

106
00:06:33,260 --> 00:06:37,662
     HAMMAMI (dramatized):
     By seventh grade I was
   the class vice president.

107
00:06:37,731 --> 00:06:42,033
 By eighth grade I think I was
the most popular guy in school.

108
00:06:42,102 --> 00:06:44,536
        The main reason
     was that I was funny.

109
00:06:47,941 --> 00:06:50,976
 It was the summer of my eighth
grade year when I went to Syria.

110
00:06:53,881 --> 00:06:56,782
   My cousins were very happy
           to see me,

111
00:06:56,850 --> 00:06:59,351
      but they didn't know
       who I was exactly.

112
00:06:59,420 --> 00:07:01,953
      They must have heard
 that my mother was teaching us

113
00:07:02,022 --> 00:07:05,690
Christianity so they started to
  try to teach me how to pray.

114
00:07:08,295 --> 00:07:11,029
    It was around that time
 that I prayed all five prayers

115
00:07:11,098 --> 00:07:13,265
  without missing any of them.

116
00:07:13,333 --> 00:07:14,699
    I felt so good that day

117
00:07:14,768 --> 00:07:19,805
 that I promised to always pray
      my prayers on time.

118
00:07:19,873 --> 00:07:22,107
  The trip to Syria really was
    his religious awakening.

119
00:07:22,176 --> 00:07:25,444
           NARRATOR:
         J.M. Berger is
      a former journalist

120
00:07:25,512 --> 00:07:27,679
  and now an author and fellow

121
00:07:27,748 --> 00:07:32,184
    at the George Washington
University Program on Extremism.

122
00:07:32,252 --> 00:07:34,686
   He had been a popular kid,
         confident kid,

123
00:07:34,755 --> 00:07:37,122
  who came back from this trip
        with a religion

124
00:07:37,191 --> 00:07:40,292
  that in Alabama was strange
       to his classmates.

125
00:07:40,360 --> 00:07:42,394
   I think that that made him
         feel isolated

126
00:07:42,463 --> 00:07:45,130
 and it may have encouraged him
      to think of himself

127
00:07:45,199 --> 00:07:47,466
 as special as a way to offset
         the rejection.

128
00:07:50,304 --> 00:07:52,237
     HAMMAMI (dramatized):
        When I came back
       from my vacation,

129
00:07:52,306 --> 00:07:54,673
      I had become a very
       different person,

130
00:07:54,741 --> 00:07:58,076
     but I was placed back
    into my old environment.

131
00:07:58,145 --> 00:08:00,545
     It was like a struggle
         of two worlds.

132
00:08:00,614 --> 00:08:04,015
     The drugs, the girls,
      the friends, the TV,

133
00:08:04,084 --> 00:08:07,052
     and everything hit me
        with a big slap.

134
00:08:07,120 --> 00:08:09,387
 Due to the blessings of Allah,

135
00:08:09,456 --> 00:08:12,357
      I managed to hold on
         to my prayers.

136
00:08:12,426 --> 00:08:14,326
    He converted from being
           a Baptist

137
00:08:14,394 --> 00:08:15,994
  to being a highly observant
            Muslim,

138
00:08:16,063 --> 00:08:17,562
     which you can imagine
        in rural Alabama

139
00:08:17,631 --> 00:08:21,433
 was not a typical decision and
 also brought a lot of disdain

140
00:08:21,502 --> 00:08:23,401
from his high school classmates.

141
00:08:23,470 --> 00:08:25,904
     HAMMAMI (dramatized):
    It was an upward battle,

142
00:08:25,973 --> 00:08:28,039
   but I had some new friends
        from the mosque

143
00:08:28,108 --> 00:08:31,009
  that used to give me support
        on the weekends.

144
00:08:31,078 --> 00:08:34,546
I began to feel that I was being
      flung into an ocean

145
00:08:34,615 --> 00:08:36,314
   and asked not to get wet.

146
00:08:39,820 --> 00:08:41,686
           NARRATOR:
  Finding extremists like Omar

147
00:08:41,755 --> 00:08:44,189
    as they test the waters
       of radicalization

148
00:08:44,258 --> 00:08:45,957
      is a huge challenge

149
00:08:46,026 --> 00:08:51,229
      for law enforcement
   and intelligence agencies.

150
00:08:51,298 --> 00:08:53,031
  In law enforcement circles,

151
00:08:53,100 --> 00:08:57,068
    they call it countering
   violent extremism, or CVE.

152
00:08:58,906 --> 00:09:02,374
     This room is designed
       to make it easier.

153
00:09:02,442 --> 00:09:05,744
 This is the room 9/11 built--

154
00:09:05,812 --> 00:09:09,180
  the operations center at the
National Counterterrorism Center

155
00:09:09,249 --> 00:09:12,684
 just outside Washington, D.C.

156
00:09:12,753 --> 00:09:15,820
    On a 24/7 basis, we have
officers here working in shifts

157
00:09:15,889 --> 00:09:19,291
  who are consuming, reading,
    analyzing, and assessing

158
00:09:19,359 --> 00:09:22,093
     every bit of available
   information that there is

159
00:09:22,162 --> 00:09:23,929
      to try to figure out
     what terrorist threats

160
00:09:23,997 --> 00:09:25,297
are aimed at the United States.

161
00:09:26,533 --> 00:09:29,868
           NARRATOR:
         Nick Rasmussen
     is the director here.

162
00:09:29,937 --> 00:09:32,404
     This is where they try
      to connect the dots.

163
00:09:34,841 --> 00:09:37,342
   The nature of the work has
      changed dramatically

164
00:09:37,411 --> 00:09:39,311
        in recent years.

165
00:09:39,379 --> 00:09:41,479
       These folks can get
    radicalized by one group

166
00:09:41,548 --> 00:09:43,214
   and the baton can be passed
        to another group.

167
00:09:43,283 --> 00:09:46,184
           NARRATOR:
        More lone wolves
  and encrypted communication.

168
00:09:46,253 --> 00:09:49,287
           REPORTER:
    The FBI had this man on
  its radar as early as 2013.

169
00:09:49,356 --> 00:09:51,856
           NARRATOR:
  Fewer face-to-face meetings
       and phone calls--

170
00:09:51,925 --> 00:09:55,927
  the internet as a source of
   inspiration and planning.

171
00:09:55,996 --> 00:09:57,362
       Self-radicalization
      doesn't have to take

172
00:09:57,431 --> 00:10:00,298
   many months or many years.

173
00:10:00,367 --> 00:10:03,602
 Increasingly, what connecting
      the dots means to me

174
00:10:03,670 --> 00:10:07,405
   is dealing with the huge,
          huge volume

175
00:10:07,474 --> 00:10:09,374
     of publicly available
         or open source

176
00:10:09,443 --> 00:10:12,677
  or unclassified information
        that's out there

177
00:10:12,746 --> 00:10:14,512
    that may have terrorism
           relevance.

178
00:10:14,581 --> 00:10:15,847
  And the work we're doing now

179
00:10:15,916 --> 00:10:17,582
    with our partners in the
     intelligence community

180
00:10:17,651 --> 00:10:21,419
 often doesn't involve really,
 really sensitive intelligence.

181
00:10:21,488 --> 00:10:23,455
 It involves looking at Twitter

182
00:10:23,523 --> 00:10:26,658
    or looking at some other
     social media platform

183
00:10:26,727 --> 00:10:28,627
    and trying to figure out

184
00:10:28,695 --> 00:10:30,695
   who that individual behind
       that screen name,

185
00:10:30,764 --> 00:10:33,198
      behind that handle,
       might actually be

186
00:10:33,266 --> 00:10:36,968
 and whether that person poses
 a threat to the United States.

187
00:10:37,037 --> 00:10:42,040
           NARRATOR:
  The term of art in the world
   of espionage is SOCMINT--

188
00:10:42,109 --> 00:10:44,509
   social media intelligence.

189
00:10:44,578 --> 00:10:46,478
      Open source spying.

190
00:10:46,546 --> 00:10:47,812
          JEFF WEYERS:
       Anybody can track

191
00:10:47,881 --> 00:10:50,715
    a war online, can track
   a terrorist group online,

192
00:10:50,784 --> 00:10:54,586
     can develop informants
      and contacts online.

193
00:10:54,655 --> 00:10:58,523
           NARRATOR:
  Police officer and terrorism
 analyst Jeff Weyers is expert

194
00:10:58,592 --> 00:11:02,394
    at gleaning intelligence
       from social media.

195
00:11:02,462 --> 00:11:05,864
    His "operations center"
        is in his home.

196
00:11:05,932 --> 00:11:09,467
            WEYERS:
   I can do more open-source
       intelligence work

197
00:11:09,536 --> 00:11:13,905
  from my living room than any
  analyst could have ever done

198
00:11:13,974 --> 00:11:15,206
       even 20 years ago.

199
00:11:17,878 --> 00:11:21,413
           NARRATOR:
       The data is hiding
         in plain view.

200
00:11:21,481 --> 00:11:23,481
   All it takes is patience,

201
00:11:23,550 --> 00:11:26,685
persistence, and a little bit of
 technical know-how to find it.

202
00:11:26,753 --> 00:11:29,421
         For instance,
      it's an open secret

203
00:11:29,489 --> 00:11:32,557
that many Islamic State fighters
         do not disable

204
00:11:32,626 --> 00:11:34,559
    the geographic tracking
           capability

205
00:11:34,628 --> 00:11:36,895
built into their mobile phones.

206
00:11:36,963 --> 00:11:42,167
The technology makes it possible
for anyone to track a terrorist.

207
00:11:42,235 --> 00:11:45,603
            WEYERS:
If he broadcasts from Raqqa and
 then I again see him in Turkey

208
00:11:45,672 --> 00:11:47,772
and then I again see him moving
          into Europe,

209
00:11:47,841 --> 00:11:49,708
    well, this is a way that
  we can potentially interdict

210
00:11:49,776 --> 00:11:53,645
  with somebody that is maybe
    looking to do an attack.

211
00:11:53,714 --> 00:11:57,148
           NARRATOR:
    That, combined with some
     selfies, might provide

212
00:11:57,217 --> 00:12:01,252
     plenty of intelligence
     needed for targeting.

213
00:12:01,321 --> 00:12:02,787
 If you're looking for a drone
    attack and you're seeing

214
00:12:02,856 --> 00:12:05,824
where they're going for morning
coffee, Twitter could tell you.

215
00:12:05,892 --> 00:12:07,592
           NARRATOR:
    When it comes to terror,

216
00:12:07,661 --> 00:12:09,928
       the problem isn't
        a lack of data,

217
00:12:09,996 --> 00:12:14,199
   it's separating the wheat
        from the chaff.

218
00:12:14,267 --> 00:12:16,901
            WEYERS:
         If you look at
      the Orlando shooting

219
00:12:16,970 --> 00:12:20,171
 or the recent cases in Germany
          and France,

220
00:12:20,240 --> 00:12:22,540
  just because the government
       has all this data

221
00:12:22,609 --> 00:12:24,642
     doesn't mean they have
          the capacity

222
00:12:24,711 --> 00:12:27,545
   to analyze all that data,
   and so how do you then go

223
00:12:27,614 --> 00:12:29,714
    and make a determination
   as to whether that person

224
00:12:29,783 --> 00:12:31,783
 poses a threat to the public?

225
00:12:33,754 --> 00:12:35,620
           NARRATOR:
    With so many electronic
          breadcrumbs

226
00:12:35,689 --> 00:12:38,456
   scattered out in the open,
    couldn't it be possible

227
00:12:38,525 --> 00:12:42,127
  for a computer scientist to
 harness the right combination

228
00:12:42,195 --> 00:12:45,463
    of software and hardware
    to see where they lead?

229
00:12:45,532 --> 00:12:47,732
        All right, Howard Marks,
                  where are you?

230
00:12:47,801 --> 00:12:49,801
           NARRATOR:
 And make pre-crime arrests...

231
00:12:49,870 --> 00:12:51,536
          (screaming)

232
00:12:51,605 --> 00:12:55,006
     ...as depicted in the
  2002 movie<i> Minority Report.</i>

233
00:12:55,075 --> 00:12:57,642
I'm placing you under arrest for
the future murder of Sarah Marks

234
00:12:57,711 --> 00:13:00,345
       and Donald Dubin that was
  to take place today, April 22,

235
00:13:00,413 --> 00:13:02,247
                at 0800 hours...

236
00:13:02,315 --> 00:13:04,983
           NARRATOR:
      Science fiction now,
     but maybe not forever.

237
00:13:06,353 --> 00:13:08,486
 At the University of Maryland,

238
00:13:08,555 --> 00:13:12,123
       computer scientist
 V.S. Subrahmanian is applying

239
00:13:12,192 --> 00:13:15,326
      a big data approach
     to fighting terrorism.

240
00:13:15,395 --> 00:13:18,263
      He is trying to put
    more objective analysis

241
00:13:18,331 --> 00:13:21,833
   into decisions about which
     terrorists to target.

242
00:13:21,902 --> 00:13:23,968
         SUBRAHMANIAN:
        I'm a scientist,

243
00:13:24,037 --> 00:13:26,371
    and when somebody says,
     "We degraded al Qaeda

244
00:13:26,439 --> 00:13:28,506
    by taking person X out,"

245
00:13:28,575 --> 00:13:31,142
you know, if I can't measure it,
      I don't believe it.

246
00:13:31,211 --> 00:13:33,645
              ♪ ♪

247
00:13:33,713 --> 00:13:37,182
           NARRATOR:
 He and his team focused on the
  Islamic terror organization

248
00:13:37,250 --> 00:13:39,717
        Lashkar-e-Taiba,

249
00:13:39,786 --> 00:13:43,955
   the group responsible for
  the 2008 attacks on Mumbai,

250
00:13:44,024 --> 00:13:46,457
   about a dozen coordinated
     shootings and bombings

251
00:13:46,526 --> 00:13:51,930
 lasting four days that killed
     more than 160 people.

252
00:13:54,768 --> 00:13:57,702
  So what you see here is the
terrorist network corresponding

253
00:13:57,771 --> 00:13:59,971
     to the terrorist group
        Lashkar-e-Taiba,

254
00:14:00,040 --> 00:14:04,742
and each node that you see here
 corresponds to an individual.

255
00:14:07,147 --> 00:14:13,017
           NARRATOR:
 They compiled 21 years of data
 on the group and its actions.

256
00:14:13,086 --> 00:14:15,987
     All of it is analyzed
 by some sophisticated software

257
00:14:16,056 --> 00:14:17,388
      that he calls STONE,

258
00:14:17,457 --> 00:14:21,926
     for Shaping Terrorist
 Organization Network Efficacy.

259
00:14:21,995 --> 00:14:25,330
        It's a schematic
     of a terrorist network

260
00:14:25,398 --> 00:14:30,635
    identifying individuals,
  subgroups, and affiliations.

261
00:14:30,704 --> 00:14:34,439
 The software assigns a number
    to measure the lethality

262
00:14:34,507 --> 00:14:36,875
  of the terror organization.

263
00:14:36,943 --> 00:14:41,112
     The higher the number,
the more dangerous the group is.

264
00:14:41,181 --> 00:14:44,649
  So what would happen if you
    targeted the leadership?

265
00:14:44,718 --> 00:14:48,319
Let's take a look at the leader
 of the group here, number one.

266
00:14:48,388 --> 00:14:49,621
   If you right-click on him,

267
00:14:49,689 --> 00:14:51,089
  we will see some information
           about him.

268
00:14:51,157 --> 00:14:54,292
           NARRATOR:
  He is Hafiz Muhammad Saeed,

269
00:14:54,361 --> 00:14:58,129
a man with a $10 million bounty
          on his head.

270
00:14:58,198 --> 00:15:00,231
Let's pretend we are in the role
         of an analyst

271
00:15:00,300 --> 00:15:02,867
   and we're considering the
 consequences of targeting him

272
00:15:02,936 --> 00:15:04,802
        and removing him
       from the network.

273
00:15:04,871 --> 00:15:06,170
           NARRATOR:
   Here's what's surprising:

274
00:15:06,239 --> 00:15:09,374
     the software predicts
   if you take out the boss,

275
00:15:09,442 --> 00:15:12,410
   the lethality of the group
       actually goes up.

276
00:15:12,479 --> 00:15:15,747
         SUBRAHMANIAN:
    You may be faced with a
 situation where the new leader

277
00:15:15,815 --> 00:15:19,684
 is either much more aggressive
 about carrying out operations

278
00:15:19,753 --> 00:15:23,187
      or much better liked
     or much more competent

279
00:15:23,256 --> 00:15:24,656
 in carrying out his operation.

280
00:15:26,660 --> 00:15:29,294
           NARRATOR:
 The software makes it possible
        to run scenarios

281
00:15:29,362 --> 00:15:32,563
  to figure out who to target.

282
00:15:32,632 --> 00:15:35,633
So what would happen if Saeed's
       three top deputies

283
00:15:35,702 --> 00:15:37,435
      were all taken out?

284
00:15:37,504 --> 00:15:39,570
   The number goes way down.

285
00:15:39,639 --> 00:15:42,540
    Lashkar-e-Taiba becomes
     much less of a threat.

286
00:15:42,609 --> 00:15:44,208
         SUBRAHMANIAN:
    You can have a much more
           efficient

287
00:15:44,277 --> 00:15:45,576
   counterterrorism operation

288
00:15:45,645 --> 00:15:48,046
   that significantly weakens
            a group

289
00:15:48,114 --> 00:15:49,647
          by targeting
     just the right people.

290
00:15:49,716 --> 00:15:54,018
           NARRATOR:
    So can the same software
       predict an attack?

291
00:15:54,087 --> 00:15:55,887
            Sort of.

292
00:15:55,956 --> 00:15:58,323
         SUBRAHMANIAN:
    We could have predicted
      the Mumbai attacks.

293
00:15:58,391 --> 00:16:00,591
     However, we could not
         have predicted

294
00:16:00,660 --> 00:16:02,393
 exactly where they would have
           occurred.

295
00:16:02,462 --> 00:16:04,629
   So we can say things like,

296
00:16:04,698 --> 00:16:08,533
   "We expect these kinds of
       targets to be hit

297
00:16:08,601 --> 00:16:12,770
     in the next one, two,
      three, four months."

298
00:16:12,839 --> 00:16:15,506
    But we cannot say, "This
  specific target will be hit

299
00:16:15,575 --> 00:16:17,041
     in the next one, two,
      three, four months."

300
00:16:19,245 --> 00:16:22,146
        If I could reach
     in the terrorism world

301
00:16:22,215 --> 00:16:25,183
  the level of sophistication
in predicting hurricanes today,

302
00:16:25,251 --> 00:16:26,751
     I would be very happy.

303
00:16:26,820 --> 00:16:28,086
    So we're not there yet.

304
00:16:33,660 --> 00:16:38,463
           NARRATOR:
   In 2001, Omar Hammami had
 enrolled in college in Mobile

305
00:16:38,531 --> 00:16:41,566
   and a storm was gathering
          inside him.

306
00:16:41,634 --> 00:16:42,834
       (thunder rumbles)

307
00:16:42,902 --> 00:16:45,236
   Isolated and out of place
           on campus,

308
00:16:45,305 --> 00:16:48,740
     the radicalism seeded
         in Syria grew.

309
00:16:48,808 --> 00:16:51,409
        He was ripe for
       an external event

310
00:16:51,478 --> 00:16:56,080
      to trigger something
         more sinister.

311
00:16:56,149 --> 00:17:00,752
     HAMMAMI (dramatized):
    I was in university when
     September 11 happened.

312
00:17:00,820 --> 00:17:03,955
I came to class one day and this
     non-practicing Muslim

313
00:17:04,024 --> 00:17:05,323
     told me to check CNN,

314
00:17:05,392 --> 00:17:07,625
   where I saw a plane going
        into the towers.

315
00:17:10,096 --> 00:17:12,797
      I was mixed between
    the hatred of terrorism

316
00:17:12,866 --> 00:17:15,733
and my real hatred for America,
       the disbelievers,

317
00:17:15,802 --> 00:17:18,669
      and their oppression
        of the Muslims.

318
00:17:18,738 --> 00:17:23,241
But 9/11 didn't "radicalize" me,
          as they say.

319
00:17:23,309 --> 00:17:27,712
    I took things a bit more
   intellectually than that.

320
00:17:27,781 --> 00:17:32,784
           NARRATOR:
   But that changed when U.S.
  troops marched into Baghdad.

321
00:17:32,852 --> 00:17:35,653
          (explosions)

322
00:17:37,891 --> 00:17:40,224
     HAMMAMI (dramatized):
    By the time the Iraq war
            started,

323
00:17:40,293 --> 00:17:44,128
I could not find any way for us
to say that it is anything less

324
00:17:44,197 --> 00:17:46,764
    than obligatory to fight
      the Americans there.

325
00:17:51,104 --> 00:17:54,472
One day I just couldn't take the
 futility of it all any longer.

326
00:17:54,541 --> 00:17:57,041
  I went to the dean's office
     and I withdrew my name

327
00:17:57,110 --> 00:17:58,509
      from the university.

328
00:18:01,214 --> 00:18:04,916
Eventually I became so averse to
America that I wanted to leave.

329
00:18:07,220 --> 00:18:09,954
            SILBER:
Subsequently, he left university
    and went up to Toronto,

330
00:18:10,023 --> 00:18:13,157
  where he started to explore
    literature and theology,

331
00:18:13,226 --> 00:18:17,428
      and started to adopt
       a Salafi ideology.

332
00:18:17,497 --> 00:18:19,330
            BERGER:
 Salafists believe that there's

333
00:18:19,399 --> 00:18:22,266
 a mythical, pure form of Islam
     that they can restore.

334
00:18:22,335 --> 00:18:23,401
     It's very puritanical.

335
00:18:25,138 --> 00:18:26,337
     And Salafi jihadists--

336
00:18:26,406 --> 00:18:28,239
     al Qaeda and ISIS and
     movements like that--

337
00:18:28,308 --> 00:18:31,209
believe that not only does this
  mythical, pure form of Islam

338
00:18:31,277 --> 00:18:33,478
  exist but that they need to
  achieve it through violence.

339
00:18:33,546 --> 00:18:36,614
They need to fight to institute
      that form of Islam.

340
00:18:36,683 --> 00:18:39,450
           (gunfire)

341
00:18:41,921 --> 00:18:45,690
           NARRATOR:
  When Omar arrived in Toronto
      in the fall of 2004,

342
00:18:45,758 --> 00:18:50,828
  he entered the radical phase
     of his metamorphosis.

343
00:18:52,599 --> 00:18:56,834
   He was trying extremism on
 for size and it seemed to fit.

344
00:18:59,305 --> 00:19:02,540
  Omar married a newly arrived
       Somali immigrant--

345
00:19:02,609 --> 00:19:05,910
          19-year-old
    Sadiyo Mohamed Abdille.

346
00:19:05,979 --> 00:19:09,280
        In short order,
  they were expecting a baby.

347
00:19:09,349 --> 00:19:13,718
    But Omar was in no mood
        to settle down.

348
00:19:13,786 --> 00:19:15,853
            SILBER:
 He wanted to travel overseas,

349
00:19:15,922 --> 00:19:18,890
       to a land that was
     sufficiently Islamic.

350
00:19:18,958 --> 00:19:22,360
           NARRATOR:
        Less than a year
   after arriving in Toronto,

351
00:19:22,428 --> 00:19:25,730
  he decided he wanted to move
           to Egypt.

352
00:19:25,798 --> 00:19:29,433
   Sadiyo reluctantly agreed.

353
00:19:32,438 --> 00:19:37,275
   They arrived in Alexandria
        in June of 2005.

354
00:19:37,343 --> 00:19:41,779
     HAMMAMI (dramatized):
  When I got there, I realized
    it was a terrible place.

355
00:19:41,848 --> 00:19:45,049
I looked at the face of my wife
    and she was devastated.

356
00:19:45,118 --> 00:19:47,418
    But I still didn't care.

357
00:19:47,487 --> 00:19:50,788
           NARRATOR:
  Not long after they arrived,

358
00:19:50,857 --> 00:19:56,027
 Omar's wife Sadiyo gave birth
   to a baby girl, Taymiyyah.

359
00:19:56,095 --> 00:19:59,197
   Omar had planned to study
     at a local university,

360
00:19:59,265 --> 00:20:01,465
     but it didn't pan out.

361
00:20:01,534 --> 00:20:07,205
 He spent his days at internet
   cafés reading and posting

362
00:20:07,273 --> 00:20:09,340
       on jihadi forums.

363
00:20:09,409 --> 00:20:12,009
     HAMMAMI (dramatized):
 I was surfing the net one day
      and I found someone

364
00:20:12,078 --> 00:20:13,344
 who sounded like an American.

365
00:20:16,316 --> 00:20:18,683
      About an hour later,
  I met one of my best friends

366
00:20:18,751 --> 00:20:22,320
     and closest brothers:
    Abu Muhammad al-Amriki,

367
00:20:22,388 --> 00:20:26,657
       Daniel Maldonado.

368
00:20:26,726 --> 00:20:30,394
           NARRATOR:
       Daniel Maldonado,
an American from New Hampshire,

369
00:20:30,463 --> 00:20:33,831
   was a high school dropout
 who converted to Salafi Islam

370
00:20:33,900 --> 00:20:36,100
            in 2000.

371
00:20:36,169 --> 00:20:39,370
The two men became fast friends.

372
00:20:39,439 --> 00:20:41,439
            BERGER:
    They're in the same city
      but they met online.

373
00:20:41,507 --> 00:20:43,274
     I think it's important
         to understand

374
00:20:43,343 --> 00:20:46,043
  the power of these networks
     in making connections

375
00:20:46,112 --> 00:20:49,747
 really helps extremist groups.

376
00:20:49,816 --> 00:20:52,016
     HAMMAMI (dramatized):
      Abu Muhammad managed
      to give me guidance

377
00:20:52,085 --> 00:20:55,453
about which books are necessary
      to read about jihad.

378
00:20:55,521 --> 00:20:59,257
      Any remaining doubts
       had been removed.

379
00:20:59,325 --> 00:21:02,360
     I had become a jihadi.

380
00:21:03,930 --> 00:21:06,197
           NARRATOR:
        Omar had learned
      an important lesson

381
00:21:06,266 --> 00:21:07,798
  in the value of the internet

382
00:21:07,867 --> 00:21:10,701
   to promote and facilitate
           extremism.

383
00:21:10,770 --> 00:21:14,672
      It is a lesson that
   he clearly took to heart.

384
00:21:14,741 --> 00:21:18,876
    He and Maldonado decided
   to go to Somalia to fight

385
00:21:18,945 --> 00:21:22,713
 with the Islamic terror group
      known as al Shabaab.

386
00:21:22,782 --> 00:21:26,917
     They departed for the
      battlefield in 2006.

387
00:21:26,986 --> 00:21:31,155
       Omar left his wife
      and daughter behind.

388
00:21:34,027 --> 00:21:38,496
 Does this fit into the profile
    of a typical terrorist?

389
00:21:38,564 --> 00:21:40,531
     Is there such a thing?

390
00:21:40,600 --> 00:21:41,766
   It does not work that way;

391
00:21:41,834 --> 00:21:44,001
    there is no one profile
       of the terrorist.

392
00:21:44,070 --> 00:21:46,504
           NARRATOR:
But psychologist Arie Kruglanski
            believes

393
00:21:46,572 --> 00:21:49,607
 they share an important trait.

394
00:21:49,676 --> 00:21:51,642
They are looking for certainty,

395
00:21:51,711 --> 00:21:54,779
     for clear-cut answers
      in a chaotic world.

396
00:21:54,847 --> 00:21:58,849
   The psychological term is
       cognitive closure.

397
00:21:58,918 --> 00:22:00,318
          KRUGLANSKI:
 The need for cognitive closure

398
00:22:00,386 --> 00:22:02,053
   is the need for certainty

399
00:22:02,121 --> 00:22:04,689
  and the need to be confident
         about a topic,

400
00:22:04,757 --> 00:22:06,424
   the need to know for sure.

401
00:22:06,492 --> 00:22:10,828
           NARRATOR:
  Kruglanski and his team have
   authored reams of research

402
00:22:10,897 --> 00:22:12,997
       on the Sri Lankan
       terror group known

403
00:22:13,066 --> 00:22:16,801
  as the Liberation Tigers of
Tamil Eelam-- the Tamil Tigers.

404
00:22:16,869 --> 00:22:20,071
       The group invented
        the suicide belt

405
00:22:20,139 --> 00:22:23,307
     and pioneered the use
   of women in those attacks.

406
00:22:23,376 --> 00:22:26,444
     Kruglanski's team has
     interviewed thousands

407
00:22:26,512 --> 00:22:28,412
  of these former terrorists,

408
00:22:28,481 --> 00:22:30,614
   conducting one of the few
      longitudinal studies

409
00:22:30,683 --> 00:22:32,717
   of the terrorist mindset.

410
00:22:32,785 --> 00:22:38,055
   He discovered a clear link
 between feelings of self-worth

411
00:22:38,124 --> 00:22:40,591
and the desire to join a group.

412
00:22:40,660 --> 00:22:43,494
          KRUGLANSKI:
You feel that you're humiliated,
     you're insignificant,

413
00:22:43,563 --> 00:22:45,363
       you do not matter.

414
00:22:45,431 --> 00:22:47,798
  And that predisposes people
    to listen to ideologies

415
00:22:47,867 --> 00:22:50,101
 that tell you, "I'll tell you
  how you're going to matter.

416
00:22:50,169 --> 00:22:51,702
   You're going to matter"--

417
00:22:51,771 --> 00:22:54,305
    and this is in the case
  of ISIS and radicalization--

418
00:22:54,374 --> 00:22:56,107
    "You're going to matter
     by joining the fight."

419
00:22:56,175 --> 00:22:59,210
           NARRATOR:
        In other words,
  when people are struggling,

420
00:22:59,278 --> 00:23:03,381
    they are more vulnerable
         to groupthink.

421
00:23:03,449 --> 00:23:05,649
You guys can go ahead
and come on in.

422
00:23:05,718 --> 00:23:08,886
           NARRATOR:
   Kruglanski has tested this
theory with a simple experiment,

423
00:23:08,955 --> 00:23:11,355
  which he replicated for us.

424
00:23:11,424 --> 00:23:15,760
 Our subjects: four University
  of Maryland undergraduates.

425
00:23:15,828 --> 00:23:17,395
             The purpose of this
             experiment is to...

426
00:23:17,463 --> 00:23:20,398
           NARRATOR:
    They all played a simple
 video game called Duck Hunt.

427
00:23:20,466 --> 00:23:22,299
     Okay, Ben, do you want
          to come on?

428
00:23:24,036 --> 00:23:27,905
           NARRATOR:
     The game was set to be
impossibly hard for two of them

429
00:23:27,974 --> 00:23:31,075
      and incredibly easy
       for the other two.

430
00:23:31,144 --> 00:23:34,979
 They were told a score of 100
            or more

431
00:23:35,047 --> 00:23:37,047
       predicts all kinds
      of success in life.

432
00:23:37,116 --> 00:23:39,483
     Scores lower than a hundred
       strongly predict failure.

433
00:23:39,552 --> 00:23:41,385
           So you're going to be
        playing that game today.

434
00:23:41,454 --> 00:23:43,587
           NARRATOR:
   Ben Weinberg had it easy.

435
00:23:45,625 --> 00:23:49,126
     He was knocking ducks
 out of the sky right and left

436
00:23:49,195 --> 00:23:52,963
and waltzed to the hundred-point
           threshold.

437
00:23:55,001 --> 00:23:58,002
   Afterward, he took a brief
 survey and had a quick debrief

438
00:23:58,070 --> 00:24:00,371
     with graduate student
       Marina Chernikova.

439
00:24:00,440 --> 00:24:02,840
           WEINBERG:
   It felt good when I got it
       on the first try.

440
00:24:02,909 --> 00:24:05,609
It was a little more frustrating
    when I took a few clicks

441
00:24:05,678 --> 00:24:06,744
        to get the duck.

442
00:24:08,481 --> 00:24:11,382
           NARRATOR:
But when it was Mara Lins' turn
        in the hot seat,

443
00:24:11,451 --> 00:24:12,850
  there were no sitting ducks.

444
00:24:12,919 --> 00:24:14,485
        Not even close.

445
00:24:14,554 --> 00:24:18,456
     (metal music playing)

446
00:24:21,494 --> 00:24:23,160
      It was really hard.

447
00:24:23,229 --> 00:24:27,598
I felt really frustrated because
the duck was just going so fast

448
00:24:27,667 --> 00:24:29,733
  I couldn't ever really click
        on it that well

449
00:24:29,802 --> 00:24:33,103
 and the score just kept going
           more down.

450
00:24:33,172 --> 00:24:34,805
It made me really uncomfortable,
           actually.

451
00:24:34,874 --> 00:24:36,307
CHERNIKOVA:
Okay.

452
00:24:36,375 --> 00:24:39,043
           NARRATOR:
      The survey included
      two dozen questions

453
00:24:39,111 --> 00:24:43,547
designed to assess people's need
   for support from a group.

454
00:24:43,616 --> 00:24:45,249
          KRUGLANSKI:
   So this person seems to be
            scoring

455
00:24:45,318 --> 00:24:47,218
very high on interdependence--

456
00:24:47,286 --> 00:24:49,787
do you know what
condition he was in?

457
00:24:49,856 --> 00:24:52,189
            Yes, this one was in
          the failure condition.

458
00:24:52,258 --> 00:24:54,124
          KRUGLANSKI:
What we find time and time again

459
00:24:54,193 --> 00:24:55,459
 is that if you're successful,

460
00:24:55,528 --> 00:24:57,895
you feel relatively independent
         of your group.

461
00:24:57,964 --> 00:24:59,163
  You can hack it on your own.

462
00:24:59,232 --> 00:25:01,665
  But when you feel humiliated
         and weakened,

463
00:25:01,734 --> 00:25:05,236
    that's the circumstances
   that lead you to undertake

464
00:25:05,304 --> 00:25:08,205
  sacrifices on behalf of the
group in order to feel rewarded

465
00:25:08,274 --> 00:25:12,142
    by the group by a sense
  of heroism and significance.

466
00:25:14,046 --> 00:25:16,313
           NARRATOR:
    So what about lone-wolf
          terrorists,

467
00:25:16,382 --> 00:25:19,884
 those who are self-radicalized
            online?

468
00:25:19,952 --> 00:25:22,820
          KRUGLANSKI:
    Yes, the group is there
           virtually.

469
00:25:22,889 --> 00:25:26,657
 The group does not need to be
 physically present and salient

470
00:25:26,726 --> 00:25:28,592
   and they can imagine that
     the group will approve

471
00:25:28,661 --> 00:25:31,362
       of their deeds and
   they can pick up a knife,

472
00:25:31,430 --> 00:25:36,800
    a machete, or a vehicle
  and go out and kill people.

473
00:25:42,074 --> 00:25:46,176
           NARRATOR:
 In 2006, Omar Hammami arrived
     in Mogadishu, Somalia,

474
00:25:46,245 --> 00:25:50,781
 where he joined a very deadly
      group of terrorists:

475
00:25:50,850 --> 00:25:54,385
     the al Qaeda affiliate
          al Shabaab--

476
00:25:54,453 --> 00:25:56,854
   Islamic terrorists waging
         an insurgency

477
00:25:56,923 --> 00:25:59,156
   against government forces.

478
00:26:00,426 --> 00:26:02,026
   Al Shabaab was well known

479
00:26:02,094 --> 00:26:04,395
  for aggressively recruiting
           Americans.

480
00:26:04,463 --> 00:26:08,432
    Omar Hammami was quickly
   welcomed into their ranks.

481
00:26:12,838 --> 00:26:14,572
     HAMMAMI (dramatized):
       That night leaving
         for al Shabaab

482
00:26:14,640 --> 00:26:19,710
   was the night I was given
  my AKM, which I still have.

483
00:26:19,779 --> 00:26:22,947
  I felt like I had just been
      given an atomic bomb

484
00:26:23,015 --> 00:26:24,882
 that might blow at any second.

485
00:26:29,555 --> 00:26:33,591
           NARRATOR:
The leadership of the terrorist
 organization quickly tapped in

486
00:26:33,659 --> 00:26:37,461
    to Omar's unique mix of
   charisma, computer skills,

487
00:26:37,530 --> 00:26:41,332
     and fluency in English
          and Arabic.

488
00:26:41,400 --> 00:26:45,803
He made his debut as a terrorist
        in October 2007

489
00:26:45,871 --> 00:26:48,472
 in an Al Jazeera news report.

490
00:26:48,541 --> 00:26:53,744
     His new nom de guerre:
     Abu Mansoor Al-Amriki.

491
00:26:53,813 --> 00:26:56,013
    All Muslims of America,

492
00:26:56,082 --> 00:26:59,483
  take into deep consideration
    the example of Somalia.

493
00:26:59,552 --> 00:27:02,686
    After 15 years of chaos
      and oppressive rule

494
00:27:02,755 --> 00:27:05,122
by the American-backed warlords,

495
00:27:05,191 --> 00:27:07,591
     your brothers stood up
  in order to establish peace

496
00:27:07,660 --> 00:27:09,126
   and justice in this land.

497
00:27:09,195 --> 00:27:11,161
When I first saw the interview,

498
00:27:11,230 --> 00:27:16,800
 I knew that was the end of...
      life as we know it.

499
00:27:16,869 --> 00:27:18,902
We will never be the same again.

500
00:27:18,971 --> 00:27:24,174
It's devastating for both of us;
       he's our only son.

501
00:27:24,243 --> 00:27:25,609
     We only have one son.

502
00:27:25,678 --> 00:27:26,810
       Now we have none.

503
00:27:30,783 --> 00:27:33,150
      (singing in Arabic)

504
00:27:35,488 --> 00:27:39,156
           NARRATOR:
  In 2009 al Shabaab released
      a widely distributed

505
00:27:39,225 --> 00:27:44,595
  propaganda video in English
featuring an ambush in Somalia.

506
00:27:44,664 --> 00:27:49,333
 Starring Omar, it was tailored
     to recruit Americans.

507
00:27:49,402 --> 00:27:51,902
  We're waiting for the enemy
            to come.

508
00:27:51,971 --> 00:27:56,607
 We heard that the numbers are
  close to a thousand or more.

509
00:27:56,676 --> 00:28:00,244
 So, what we're planning to do
   is put them in an ambush,

510
00:28:00,312 --> 00:28:02,813
try to blow up as many of their
      vehicles as we can,

511
00:28:02,882 --> 00:28:04,581
    and kill as many of them
           as we can,

512
00:28:04,650 --> 00:28:06,950
and take everything they've got,
           inshallah.

513
00:28:07,019 --> 00:28:11,588
        (rapid gunfire)

514
00:28:16,562 --> 00:28:17,861
            BERGER:
When the Ambush at Bardale video

515
00:28:17,930 --> 00:28:21,131
   came out, he became a bit
     of a media sensation.

516
00:28:21,200 --> 00:28:24,635
            HAMMAMI:
        ♪ Bomb by bomb,
        blast by blast ♪

517
00:28:24,704 --> 00:28:27,337
           NARRATOR:
   It featured a rap written
     and performed by Omar.

518
00:28:27,406 --> 00:28:30,340
            HAMMAMI:
        ♪ Word by word,
      Bush said it true ♪

519
00:28:30,409 --> 00:28:32,910
       ♪ You with him or
you're with the Muslim group. ♪

520
00:28:32,978 --> 00:28:35,145
            BERGER:
 Al Shabaab was very impressed

521
00:28:35,214 --> 00:28:37,147
       with the traction
     these rap videos got.

522
00:28:37,216 --> 00:28:41,018
 The only reason we're staying
 here, away from our families,

523
00:28:41,087 --> 00:28:44,288
away from the cities, away from,
   you know, ice, candy bars,

524
00:28:44,356 --> 00:28:45,489
    all these other things,

525
00:28:45,558 --> 00:28:47,758
is because we're waiting to meet
        with the enemy.

526
00:28:47,827 --> 00:28:50,828
            BERGER:
 At the time, I thought of him
    as kind of a novelty act

527
00:28:50,896 --> 00:28:54,631
   with the rap video and his
   sometimes awkward attempts

528
00:28:54,700 --> 00:28:57,401
 to sort of morph into more of
  a kind of a scholarly role.

529
00:28:57,470 --> 00:29:01,405
 One of the things that we seek
    for in this life of ours

530
00:29:01,474 --> 00:29:03,107
     is to die as a martyr.

531
00:29:03,175 --> 00:29:06,143
So the fact that we got two, uh,
    martyrs is nothing more

532
00:29:06,212 --> 00:29:07,745
than a victory in and of itself.

533
00:29:07,813 --> 00:29:10,147
    So if you can encourage
     more of your children

534
00:29:10,216 --> 00:29:12,216
   and more of your neighbors
     and anyone around you

535
00:29:12,284 --> 00:29:14,017
    to send people like him
         to this jihad,

536
00:29:14,086 --> 00:29:15,986
   it would be a great asset
            for us.

537
00:29:16,055 --> 00:29:17,554
            BERGEN:
   Omar Hammami was somebody

538
00:29:17,623 --> 00:29:20,758
   that Shabaab put front and
center to try and recruit people

539
00:29:20,826 --> 00:29:22,126
         into the group
   because he spoke English.

540
00:29:22,194 --> 00:29:25,496
            HAMMAMI:
 ♪ Night by night, day by day ♪

541
00:29:25,564 --> 00:29:28,766
     ♪ Mujahidin spreading
      all over the place ♪

542
00:29:28,834 --> 00:29:31,969
     HAMMAMI (dramatized):
The real fear that the Americans
 feel when they see an American

543
00:29:32,037 --> 00:29:36,006
in Somalia talking about jihad,
   is not how skillful he is

544
00:29:36,075 --> 00:29:39,476
  at sneaking back across the
 borders with nuclear weapons.

545
00:29:39,545 --> 00:29:43,547
 The Americans fear that their
cultural barrier has been broken

546
00:29:43,616 --> 00:29:46,416
    and now jihad has become
     a normal career choice

547
00:29:46,485 --> 00:29:48,886
        for any youthful
        American Muslim.

548
00:29:48,954 --> 00:29:50,320
            BERGEN:
         Shabaab lured

549
00:29:50,389 --> 00:29:52,823
    up to about 40 Americans
  to come and fight with them.

550
00:29:52,892 --> 00:29:55,559
  And they had a whole foreign
      fighter kind of crew

551
00:29:55,628 --> 00:29:57,294
       from people around
       the Muslim world.

552
00:29:57,363 --> 00:30:02,199
    It was kind of an early
  precursor of what ISIS did.

553
00:30:02,268 --> 00:30:04,268
           NARRATOR:
        The recruitment
      and propaganda push

554
00:30:04,336 --> 00:30:06,770
   by terrorist organizations
             online

555
00:30:06,839 --> 00:30:08,138
   has put a lot of pressure

556
00:30:08,207 --> 00:30:11,809
  on the big social networking
           companies.

557
00:30:11,877 --> 00:30:13,310
But how should they crack down?

558
00:30:13,379 --> 00:30:16,613
       It wasn't long ago
   that these companies took

559
00:30:16,682 --> 00:30:21,018
    a laissez-faire approach
     to terrorist content.

560
00:30:21,086 --> 00:30:24,621
 They claimed they didn't want
  to hinder freedom of speech.

561
00:30:24,690 --> 00:30:27,958
   But that started to change
            in 2013.

562
00:30:28,027 --> 00:30:33,664
During a terrifying assault at a
shopping mall in Nairobi, Kenya,

563
00:30:33,732 --> 00:30:35,132
   attackers with al Shabaab

564
00:30:35,201 --> 00:30:40,637
 live-tweeted for hours as they
   shot more than 175 people,

565
00:30:40,706 --> 00:30:43,674
          killing 67.

566
00:30:43,742 --> 00:30:45,576
            WEYERS:
    That was the first time

567
00:30:45,644 --> 00:30:48,512
         where Twitter
     was actively removing

568
00:30:48,581 --> 00:30:49,680
          the content
    that they were posting.

569
00:30:49,748 --> 00:30:51,315
  And the reason for that is,

570
00:30:51,383 --> 00:30:53,717
  they were actively tweeting
      their attack online,

571
00:30:53,786 --> 00:30:56,386
   and it was the first time
      we really saw that.

572
00:30:56,455 --> 00:30:58,589
   ISIS completely blew that
       out of the water.

573
00:30:58,657 --> 00:31:04,962
   They took that concept and
   magnified it by a million.

574
00:31:05,030 --> 00:31:07,731
           NARRATOR:
     Today, Twitter claims
   it aggressively takes down

575
00:31:07,800 --> 00:31:09,466
   accounts linked to terror,

576
00:31:09,535 --> 00:31:13,971
        360,000 of them
   since the middle of 2015.

577
00:31:14,039 --> 00:31:20,310
But repeat offenders simply open
  new accounts time and again.

578
00:31:20,379 --> 00:31:25,482
            WEYERS:
   And now they start to talk
about reverting back to Facebook

579
00:31:25,551 --> 00:31:26,550
       so they're talking

580
00:31:26,619 --> 00:31:28,185
        about reopening
    their Facebook account,

581
00:31:28,254 --> 00:31:31,121
 and here's a link to go to it.

582
00:31:34,059 --> 00:31:37,094
           NARRATOR:
    Facebook is the largest
   social networking platform

583
00:31:37,162 --> 00:31:39,529
         on the planet.

584
00:31:39,598 --> 00:31:43,967
It says it has a zero-tolerance
     policy for extremists,

585
00:31:44,036 --> 00:31:48,438
      but it must contend
   with a tsunami of content.

586
00:31:48,507 --> 00:31:50,874
       Facebook has more
     than one billion users

587
00:31:50,943 --> 00:31:54,344
  actively posting every day.

588
00:31:54,413 --> 00:31:58,749
The company says about one-half
of one percent of flagged items

589
00:31:58,817 --> 00:32:01,051
    are linked to terrorism,

590
00:32:01,120 --> 00:32:05,255
        but that's still
       a lot of material.

591
00:32:05,324 --> 00:32:07,758
         Monika Bickert
       is Facebook's head

592
00:32:07,826 --> 00:32:12,029
  of global policy management.

593
00:32:12,097 --> 00:32:13,864
We use photo-matching technology

594
00:32:13,933 --> 00:32:17,768
  to identify when somebody's
  trying to upload to Facebook

595
00:32:17,836 --> 00:32:19,736
            an image
   that we've already removed

596
00:32:19,805 --> 00:32:21,838
  for violating our policies.

597
00:32:21,907 --> 00:32:25,575
    Of course, the image may
or may not violate our policies

598
00:32:25,644 --> 00:32:27,044
    when it's uploaded again

599
00:32:27,112 --> 00:32:29,913
  because it could be somebody
who's sharing a terrorist image

600
00:32:29,982 --> 00:32:32,883
    as part of a news story
    or to condemn violence.

601
00:32:32,952 --> 00:32:35,953
      So we use automation
        to flag content

602
00:32:36,021 --> 00:32:38,188
       that we will then
     have our teams review.

603
00:32:38,257 --> 00:32:41,625
           NARRATOR:
         But are there
       more advanced ways

604
00:32:41,694 --> 00:32:44,928
to stop the extremists' messages
        from spreading?

605
00:32:44,997 --> 00:32:47,664
       Is there a better
    technological solution?

606
00:32:47,733 --> 00:32:50,200
          HANY FARID:
     We have the technology

607
00:32:50,269 --> 00:32:53,437
   to disrupt, not eliminate,

608
00:32:53,505 --> 00:32:57,040
         but to disrupt
    the global transmission

609
00:32:57,109 --> 00:32:58,675
 of extremism-related content.

610
00:32:58,744 --> 00:33:03,280
           NARRATOR:
    Hany Farid is a computer
scientist at Dartmouth College.

611
00:33:03,349 --> 00:33:06,483
 His challenge is significant:

612
00:33:06,552 --> 00:33:09,186
    how to identify and stop
      the spread of images

613
00:33:09,254 --> 00:33:13,323
made by, of, and for terrorists
        on the internet.

614
00:33:13,392 --> 00:33:17,594
The sheer volume of the problem
          is daunting.

615
00:33:17,663 --> 00:33:20,731
             FARID:
   So a video is just a bunch
  of images stacked together.

616
00:33:20,799 --> 00:33:22,265
 A short video, a few minutes,

617
00:33:22,334 --> 00:33:24,735
 you're talking about thousands
 of images you have to analyze,

618
00:33:24,803 --> 00:33:27,671
and you have to do this fast and
 you have to do it accurately.

619
00:33:27,740 --> 00:33:30,240
   And it is a spectacularly
       difficult problem

620
00:33:30,309 --> 00:33:32,843
because really, somebody turned
    on the firehose of data

621
00:33:32,911 --> 00:33:34,444
    and you are just trying
           to keep up

622
00:33:34,513 --> 00:33:37,314
    with this massive number
      of pixels coming in.

623
00:33:37,383 --> 00:33:39,282
           NARRATOR:
   Billions of uploads a day,

624
00:33:39,351 --> 00:33:43,120
   each of them with millions
           of pixels.

625
00:33:43,188 --> 00:33:45,422
Can a computer program possibly
           be capable

626
00:33:45,491 --> 00:33:47,791
   of sorting through it all

627
00:33:47,860 --> 00:33:52,129
      and finding the images
  that inspire new recruits...

628
00:33:52,197 --> 00:33:54,064
           (singing)

629
00:33:54,133 --> 00:33:55,565
           NARRATOR:
   ...incite new violence...

630
00:33:55,634 --> 00:33:57,934
              MAN:
         <i>Allahu akbar!</i>

631
00:33:58,003 --> 00:33:59,469
           NARRATOR:
     ...and horrify us all?

632
00:34:01,740 --> 00:34:05,876
So here is the actual raw frame
      that you're seeing,

633
00:34:05,944 --> 00:34:07,911
processing one frame at a time.

634
00:34:07,980 --> 00:34:09,212
        And in a frame,
      we actually analyze

635
00:34:09,281 --> 00:34:10,480
   multiple blocks within it.

636
00:34:10,549 --> 00:34:13,784
     The yellow crosshairs
       that you're seeing

637
00:34:13,852 --> 00:34:17,020
        are enumerating
the various blocks of the video

638
00:34:17,089 --> 00:34:18,955
     that we're analyzing.

639
00:34:19,024 --> 00:34:22,092
    This yellow histogram is
 a distribution of measurements

640
00:34:22,161 --> 00:34:25,429
       that we're making
  from each individual block,

641
00:34:25,497 --> 00:34:28,198
 and then that gets translated
into an actual digital signature

642
00:34:28,267 --> 00:34:30,667
     which I visualize here
        with a stemplot.

643
00:34:30,736 --> 00:34:34,471
           NARRATOR:
 He got the idea ten years ago.

644
00:34:34,540 --> 00:34:35,806
    The internet had become
           a platform

645
00:34:35,874 --> 00:34:38,442
    for child pornographers.

646
00:34:38,510 --> 00:34:42,512
    The technology is called
       "robust hashing."

647
00:34:42,581 --> 00:34:44,648
          All that is,
     is a very simple idea,

648
00:34:44,716 --> 00:34:48,385
is that from an image or a video
     or an audio recording,

649
00:34:48,454 --> 00:34:50,787
          you extract
     a distinct signature.

650
00:34:50,856 --> 00:34:54,758
           NARRATOR:
       As the images move
     through the internet,

651
00:34:54,827 --> 00:34:57,060
  the signatures never change,

652
00:34:57,129 --> 00:35:00,464
    no matter how many times
    the images are modified.

653
00:35:00,532 --> 00:35:02,265
             FARID:
  So if there's just one image

654
00:35:02,334 --> 00:35:04,034
     in an upload of yours

655
00:35:04,103 --> 00:35:05,535
  that has child pornography,
             we...

656
00:35:05,604 --> 00:35:07,537
   the account can be frozen,

657
00:35:07,606 --> 00:35:09,539
  the contents of that account
        can be assessed,

658
00:35:09,608 --> 00:35:11,141
        and new content
       can be discovered.

659
00:35:11,210 --> 00:35:13,910
       People don't trade
       one or two images.

660
00:35:13,979 --> 00:35:15,979
      They trade hundreds
    and thousands of images.

661
00:35:16,048 --> 00:35:20,217
  And you can very organically
grow the space of known content.

662
00:35:22,621 --> 00:35:23,820
           NARRATOR:
           It worked.

663
00:35:23,889 --> 00:35:26,056
     The commercial product
     that he helped create

664
00:35:26,125 --> 00:35:28,558
      is called PhotoDNA.

665
00:35:28,627 --> 00:35:31,428
     It has greatly reduced
       child pornography

666
00:35:31,497 --> 00:35:35,632
  on the big social networking
             sites.

667
00:35:35,701 --> 00:35:37,167
  Today, PhotoDNA is deployed

668
00:35:37,236 --> 00:35:39,436
 on almost every major internet
 company both here and abroad.

669
00:35:39,505 --> 00:35:42,305
  It is, by my understanding,
          eliminating

670
00:35:42,374 --> 00:35:46,009
    upwards of four million
child pornography images a year

671
00:35:46,078 --> 00:35:47,711
   from being redistributed.

672
00:35:47,779 --> 00:35:52,716
           NARRATOR:
 Farid is advocating a similar
     approach to terrorism,

673
00:35:52,784 --> 00:35:57,787
 but will the social networking
      platforms go along?

674
00:35:57,856 --> 00:35:59,556
            BICKERT:
         Our mission is
       to connect people,

675
00:35:59,625 --> 00:36:02,659
so we do want people to be able
        to share content

676
00:36:02,728 --> 00:36:05,295
that may even be controversial,

677
00:36:05,364 --> 00:36:06,663
   if it is important to them

678
00:36:06,732 --> 00:36:08,331
       and it's something
 that they want to communicate.

679
00:36:08,400 --> 00:36:10,367
     However, we also know

680
00:36:10,435 --> 00:36:11,801
that people won't share anything
        about themselves

681
00:36:11,870 --> 00:36:13,303
if they're not in a safe place.

682
00:36:13,372 --> 00:36:15,672
We don't allow beheading videos.

683
00:36:15,741 --> 00:36:17,374
      We also don't allow
        any terror group

684
00:36:17,442 --> 00:36:19,943
     to maintain a presence
  on our site for any reason.

685
00:36:20,012 --> 00:36:23,780
           NARRATOR:
   But persistent terrorists
          find a way,

686
00:36:23,849 --> 00:36:27,117
     and extremist content
     is readily available.

687
00:36:27,186 --> 00:36:30,020
Social networking companies say
     they have a hard time

688
00:36:30,088 --> 00:36:34,791
 drawing the line when it comes
     to defining extremism.

689
00:36:34,860 --> 00:36:37,093
             FARID:
          What we have
     is a problem of will.

690
00:36:37,162 --> 00:36:38,795
   They do not want to be put

691
00:36:38,864 --> 00:36:41,131
          at the nexus
   of criminal organizations,

692
00:36:41,200 --> 00:36:42,732
    extremist organizations,

693
00:36:42,801 --> 00:36:44,634
      and law enforcement
     and national security.

694
00:36:44,703 --> 00:36:47,637
 They feel like they don't have
    a responsibility there.

695
00:36:51,710 --> 00:36:56,146
           NARRATOR:
     It was March of 2012,

696
00:36:56,215 --> 00:36:58,315
     long before the social
      networking companies

697
00:36:58,383 --> 00:37:00,750
    cracked down on terror.

698
00:37:00,819 --> 00:37:02,152
     Somewhere in Somalia,

699
00:37:02,221 --> 00:37:05,155
  Omar Hammami was once again
       using the internet

700
00:37:05,224 --> 00:37:07,657
  to reach a global audience,

701
00:37:07,726 --> 00:37:12,429
    this time posting a plea
  on YouTube for his life...

702
00:37:12,497 --> 00:37:15,165
       (speaking Arabic)

703
00:37:15,234 --> 00:37:17,767
           NARRATOR:
    Speaking first in Arabic
      and then in English.

704
00:37:17,836 --> 00:37:20,270
     It is plainly evident
        to the world...

705
00:37:20,339 --> 00:37:22,472
            HAMMAMI:
    To whomever it may reach
      from the Muslims...

706
00:37:22,541 --> 00:37:25,041
           NARRATOR:
 ...he has worn out his welcome

707
00:37:25,110 --> 00:37:27,277
      with the leadership
         of al Shabaab.

708
00:37:27,346 --> 00:37:29,546
            HAMMAMI:
  I record this message today

709
00:37:29,615 --> 00:37:32,882
because I feel that my life may
  be endangered by al Shabaab

710
00:37:32,951 --> 00:37:35,819
    due to some differences
    that occurred between us

711
00:37:35,887 --> 00:37:38,555
regarding matters of the sharia
    and matters of strategy.

712
00:37:38,624 --> 00:37:40,557
      That was an extremely
         unusual break.

713
00:37:40,626 --> 00:37:42,892
 Prior to that, jihadi disputes

714
00:37:42,961 --> 00:37:46,496
 tended to be carefully managed
       behind the scenes.

715
00:37:46,565 --> 00:37:48,765
     So this was a big deal
       when he showed up

716
00:37:48,834 --> 00:37:50,267
     and made the statement

717
00:37:50,335 --> 00:37:53,670
   that al Shabaab was trying
          to kill him.

718
00:37:53,739 --> 00:37:55,705
In order to promote that video,

719
00:37:55,774 --> 00:37:58,108
 he had signed up for a number
   of social media platforms.

720
00:37:58,176 --> 00:38:01,911
           NARRATOR:
 He also had a book to promote.

721
00:38:01,980 --> 00:38:04,581
     HAMMAMI (dramatized):
Due to the unpredictable nature
       of the environment

722
00:38:04,650 --> 00:38:06,383
     in the lands of jihad,

723
00:38:06,451 --> 00:38:09,552
I decided now is as good a time
             as any

724
00:38:09,621 --> 00:38:12,489
   to release the first part
      of my autobiography.

725
00:38:12,557 --> 00:38:14,357
   Although nothing special,

726
00:38:14,426 --> 00:38:16,893
     I thought my addition
     to the jihadi library

727
00:38:16,962 --> 00:38:19,763
     could at least provide
         some benefit.

728
00:38:19,831 --> 00:38:23,466
           NARRATOR:
       He took to Twitter
 like no terrorist had before,

729
00:38:23,535 --> 00:38:26,636
 interacting with a wide range
    of analysts, reporters,

730
00:38:26,705 --> 00:38:28,805
     and terrorism experts.

731
00:38:28,874 --> 00:38:33,910
    He did change the flavor
 of the environment of Twitter

732
00:38:33,979 --> 00:38:36,346
and the accessibility of Twitter
          for people.

733
00:38:36,415 --> 00:38:38,148
He engaged with a lot of people.

734
00:38:38,216 --> 00:38:39,683
     He was trying to talk.

735
00:38:39,751 --> 00:38:43,887
            BERGEN:
    We didn't see terrorists
tweeting in this manner before.

736
00:38:43,955 --> 00:38:45,188
   He was writing in English,

737
00:38:45,257 --> 00:38:47,157
   you know, had a reasonably
      good sense of humor.

738
00:38:47,225 --> 00:38:49,359
   He's tweeting about jihad,

739
00:38:49,428 --> 00:38:52,929
  and he's an accessible guy,
  and it's easy to follow him.

740
00:38:52,998 --> 00:38:54,264
       He would show up,
      and he was talking,

741
00:38:54,333 --> 00:38:56,199
and I was, like, "Well, I should
      just keep talking."

742
00:38:56,268 --> 00:39:02,339
           NARRATOR:
At first, J.M. Berger approached
 Omar with journalistic intent.

743
00:39:02,407 --> 00:39:04,874
     At first, I was trying
  to pump him for information

744
00:39:04,943 --> 00:39:07,444
about what was going on with him
        and al Shabaab,

745
00:39:07,512 --> 00:39:09,079
     and, you know, it was
      sort of utilitarian.

746
00:39:09,147 --> 00:39:13,450
           NARRATOR:
He began his dialogue with Omar
        in May of 2012,

747
00:39:13,518 --> 00:39:16,786
        first via email,
  and then moving to Twitter,

748
00:39:16,855 --> 00:39:21,024
   long before any crackdown
    on tweeting terrorists.

749
00:39:21,093 --> 00:39:23,993
   They debated the rationale
    for targeting civilians,

750
00:39:24,062 --> 00:39:27,263
     how religious scholars
         justify jihad,

751
00:39:27,332 --> 00:39:30,433
        and the morality
       of drone strikes.

752
00:39:30,502 --> 00:39:32,335
            BERGER:
    It just kind of turned,
         after a while,

753
00:39:32,404 --> 00:39:33,436
  into a regular conversation

754
00:39:33,505 --> 00:39:34,971
     like I have with lots
        of other people,

755
00:39:35,040 --> 00:39:36,172
 colleagues that I have online,

756
00:39:36,241 --> 00:39:39,242
     except that he is not
          a colleague.

757
00:39:39,311 --> 00:39:40,977
    Kind of an extraordinary
     regular conversation,

758
00:39:41,046 --> 00:39:42,145
         I'd say, huh?

759
00:39:42,214 --> 00:39:44,013
   It was... it was strange;
        it was surreal.

760
00:39:44,082 --> 00:39:47,584
           NARRATOR:
    And, at times, humorous.

761
00:39:47,652 --> 00:39:50,320
         At one point,
   Omar jokingly asked Berger

762
00:39:50,389 --> 00:39:53,022
     if he ever considered
        switching sides.

763
00:39:53,091 --> 00:39:57,627
            BERGER:
 "I'd miss the music, bikinis,
      and bacon too much."

764
00:39:57,696 --> 00:39:58,828
 (computer making tweet sound)

765
00:39:58,897 --> 00:40:00,330
     HAMMAMI (dramatized):
       I see your bikinis

766
00:40:00,399 --> 00:40:02,465
         and raise you
    four wives in this life,

767
00:40:02,534 --> 00:40:05,335
        72 in the next!

768
00:40:05,404 --> 00:40:07,771
       When Omar emerged
       onto social media,

769
00:40:07,839 --> 00:40:09,939
 he was not the first jihadist
       to get on Twitter,

770
00:40:10,008 --> 00:40:11,808
      but it was something
  that hadn't really been done

771
00:40:11,877 --> 00:40:14,411
       by somebody who is
  in a war zone, representing.

772
00:40:14,479 --> 00:40:18,381
           NARRATOR:
 A terrorist on the front lines
           of jihad,

773
00:40:18,450 --> 00:40:22,552
 speaking, debating, cajoling,
          even joking,

774
00:40:22,621 --> 00:40:27,457
     an AK-47 in one hand,
a global megaphone in the other.

775
00:40:27,526 --> 00:40:29,726
            BERGEN:
  I think he was a harbinger.

776
00:40:29,795 --> 00:40:32,862
ISIS didn't come into existence
          until 2014,

777
00:40:32,931 --> 00:40:34,130
    and they took that model

778
00:40:34,199 --> 00:40:37,367
 and they kind of amplified it
         significantly.

779
00:40:40,806 --> 00:40:43,106
           NARRATOR:
    Omar was now on the run,

780
00:40:43,175 --> 00:40:46,376
    taunting the leadership
   of al Shabaab via Twitter

781
00:40:46,445 --> 00:40:49,846
 even as he tried to evade them
         in the forest.

782
00:40:49,915 --> 00:40:53,249
     HAMMAMI (dramatized):
 Shabaab has changed strategy--

783
00:40:53,318 --> 00:40:55,652
         from choosing
  the best legitimate targets

784
00:40:55,720 --> 00:40:57,687
   to hitting whatever target
            they can

785
00:40:57,756 --> 00:41:00,156
and then legitimizing it later.

786
00:41:00,225 --> 00:41:03,960
           NARRATOR:
         In March 2013,
      the U.S. government

787
00:41:04,029 --> 00:41:08,364
    put a $5 million bounty
    on Omar Hammami's head.

788
00:41:08,433 --> 00:41:11,901
      Al Shabaab assassins
  came for him a month later.

789
00:41:11,970 --> 00:41:14,237
    Because he was creating

790
00:41:14,306 --> 00:41:17,173
   a huge amount of publicity
 and bad press for al Shabaab,

791
00:41:17,242 --> 00:41:18,975
   al Shabaab had to respond.

792
00:41:19,044 --> 00:41:22,412
     HAMMAMI (dramatized):
   Just been shot in the neck
      by Shabaab assassin.

793
00:41:22,481 --> 00:41:24,180
       Not critical yet.

794
00:41:24,249 --> 00:41:25,181
  (computer makes tweet sound)

795
00:41:25,250 --> 00:41:27,116
            BERGER:
     "Seriously? You shot?"

796
00:41:27,185 --> 00:41:28,218
  (computer makes tweet sound)

797
00:41:28,286 --> 00:41:29,385
     HAMMAMI (dramatized):
          Yeah, sucks.

798
00:41:29,454 --> 00:41:32,489
        He live-tweeted
   an assassination attempt.

799
00:41:32,557 --> 00:41:35,091
            BERGER:
He uploaded a couple of pictures
         of his injury.

800
00:41:35,160 --> 00:41:36,092
      He had been grazed.

801
00:41:36,161 --> 00:41:37,827
  It wasn't a serious injury.

802
00:41:37,896 --> 00:41:39,262
       (computer tweets)

803
00:41:39,331 --> 00:41:40,730
            BERGER:
"If you want to get out of this,

804
00:41:40,799 --> 00:41:42,899
    I'd do whatever I could
  to get you a liveable deal."

805
00:41:42,968 --> 00:41:44,167
       (computer tweets)

806
00:41:44,236 --> 00:41:45,902
     HAMMAMI (dramatized):
    You know I'm not on it.

807
00:41:45,971 --> 00:41:48,338
  I appreciate the compassion,
            though.

808
00:41:48,406 --> 00:41:49,973
  I think he would have gotten
        out of Somalia,

809
00:41:50,041 --> 00:41:51,508
    maybe, if he could have,

810
00:41:51,576 --> 00:41:54,410
 but he did feel like this was
 a fight that he had a part in

811
00:41:54,479 --> 00:41:55,812
    and that he should stick
            with it.

812
00:41:55,881 --> 00:41:57,347
             KHAN:
   He said, "If I come back,

813
00:41:57,415 --> 00:41:59,215
        "I'll be in jail
    for the rest of my life.

814
00:41:59,284 --> 00:42:01,150
   So why would I come back?"

815
00:42:01,219 --> 00:42:02,485
            BERGER:
     It was kind of amazing

816
00:42:02,554 --> 00:42:05,188
 that Omar managed to hold out
       as long as he did.

817
00:42:05,257 --> 00:42:07,490
    Once he made that break,
he was eventually going to die.

818
00:42:07,559 --> 00:42:10,226
            HAMMAMI:
      <i> The reason why I'm</i>
    <i> in the forest right now</i>

819
00:42:10,295 --> 00:42:12,829
      <i> is because I'm one</i>
 <i> of the few people in Somalia</i>

820
00:42:12,898 --> 00:42:14,564
     <i> who stood out against</i>
          <i> the Shabaab</i>

821
00:42:14,633 --> 00:42:17,567
<i> blowing up innocent civilians.</i>

822
00:42:17,636 --> 00:42:20,203
           NARRATOR:
       September 3, 2013:

823
00:42:20,272 --> 00:42:22,505
          Omar granted
     a telephone interview

824
00:42:22,574 --> 00:42:24,274
   with the Voice of America.

825
00:42:24,342 --> 00:42:25,909
          INTERVIEWER:
      Are you a terrorist?

826
00:42:25,977 --> 00:42:27,844
            HAMMAMI:
  I'm definitely a terrorist.

827
00:42:27,913 --> 00:42:30,880
But I'm not a member of al Qaeda
    or a member of Shabaab.

828
00:42:30,949 --> 00:42:33,616
   I do believe in following
          my religion

829
00:42:33,685 --> 00:42:36,753
    even if that requires me
       to use explosives

830
00:42:36,821 --> 00:42:39,422
        or use an AK-47.

831
00:42:39,491 --> 00:42:42,258
          INTERVIEWER:
     What about coming back
         to your family

832
00:42:42,327 --> 00:42:43,693
   here in the United States?

833
00:42:43,762 --> 00:42:45,895
            HAMMAMI:
That's definitely not an option

834
00:42:45,964 --> 00:42:48,197
   unless it's in a body bag.

835
00:42:50,001 --> 00:42:53,870
           NARRATOR:
 Nine days later, Omar Hammami
    was killed in an ambush.

836
00:42:57,275 --> 00:42:58,541
   The meaning of Omar's life

837
00:42:58,610 --> 00:43:01,044
   ended up being conflicted
            at best

838
00:43:01,112 --> 00:43:03,313
  and kind of empty at worst.

839
00:43:03,381 --> 00:43:06,883
 He literally gave his life for
this kind of jihadist movement,

840
00:43:06,952 --> 00:43:10,053
  and yet his story is really
     just a cautionary tale

841
00:43:10,121 --> 00:43:14,557
    about why you shouldn't
       join these groups.

842
00:43:14,626 --> 00:43:17,226
      His death was a boon
  for counterterrorism efforts

843
00:43:17,295 --> 00:43:19,796
and countering violent extremism
            efforts.

844
00:43:19,864 --> 00:43:22,765
 He gave us a narrative to use

845
00:43:22,834 --> 00:43:26,803
           to counter
    this recruitment pitch.

846
00:43:26,871 --> 00:43:29,105
           NARRATOR:
   Omar Hammami seemed intent

847
00:43:29,174 --> 00:43:32,241
          on his path
   toward violent extremism.

848
00:43:32,310 --> 00:43:36,646
But could a person so determined
   be thwarted along the way?

849
00:43:40,952 --> 00:43:44,721
     Does terrorism respond
      to an intervention?

850
00:43:44,789 --> 00:43:47,523
The idea is gaining new traction
          in the West.

851
00:43:47,592 --> 00:43:52,629
    In Toronto, Mubin Shaikh
     is a leading advocate

852
00:43:52,697 --> 00:43:57,266
        of what's known
     as "deradicalization."

853
00:43:57,335 --> 00:44:00,803
     He has walked the walk

854
00:44:00,872 --> 00:44:02,905
    and walked it all back.

855
00:44:02,974 --> 00:44:04,841
            SHAIKH:
I think we've long acknowledged

856
00:44:04,909 --> 00:44:07,577
  we cannot kill an ideology.

857
00:44:07,646 --> 00:44:09,579
    We can kill a whole lot
           of people

858
00:44:09,648 --> 00:44:11,114
 who subscribe to the ideology.

859
00:44:11,182 --> 00:44:13,182
    If you are going to have
       a battle of ideas,

860
00:44:13,251 --> 00:44:14,283
       better ideas win.

861
00:44:14,352 --> 00:44:16,819
         That's proven.

862
00:44:16,888 --> 00:44:20,523
           NARRATOR:
      He is living proof.

863
00:44:20,592 --> 00:44:23,326
 He is the son of conservative
        Islamic parents

864
00:44:23,395 --> 00:44:24,861
   who emigrated from India.

865
00:44:24,929 --> 00:44:30,299
   As a teenager, he embraced
    secular Western culture.

866
00:44:30,368 --> 00:44:37,006
It led to a rift that eventually
propelled him toward extremism.

867
00:44:37,075 --> 00:44:41,878
  Ironically, six years later,
 9/11 made him rethink it all.

868
00:44:41,946 --> 00:44:43,913
     Mubin, I think, is an
interesting example of somebody

869
00:44:43,982 --> 00:44:45,982
      who, you know, went
   all the way down that path

870
00:44:46,051 --> 00:44:47,850
      and then came back,

871
00:44:47,919 --> 00:44:49,352
     and now, is attempting
    to dissuade other people

872
00:44:49,421 --> 00:44:52,055
   from doing the same thing.

873
00:44:52,123 --> 00:44:54,123
   He understands the process
 by which some of these people

874
00:44:54,192 --> 00:44:55,224
   have gone down this path,

875
00:44:55,293 --> 00:44:57,694
          and I think
     that's very powerful.

876
00:44:57,762 --> 00:44:59,829
He can talk about this in a way
     that no one else can.

877
00:45:03,134 --> 00:45:06,402
           NARRATOR:
        He went to Syria
      to study the Koran,

878
00:45:06,471 --> 00:45:08,771
to understand where he had been

879
00:45:08,840 --> 00:45:11,541
    and where he was going.

880
00:45:11,609 --> 00:45:14,110
   And he met the right imam
       at the right time.

881
00:45:14,179 --> 00:45:16,179
            SHAIKH:
      We started talking,

882
00:45:16,247 --> 00:45:19,716
and he realized that, you know,
    I was this Western kid,

883
00:45:19,784 --> 00:45:23,119
 looking the way that I looked,
     big beard, long robe,

884
00:45:23,188 --> 00:45:24,220
    and for whatever reason,
            decided,

885
00:45:24,289 --> 00:45:25,855
    "Hey, I'm going to work
         on this guy."

886
00:45:25,924 --> 00:45:28,324
             And...

887
00:45:28,393 --> 00:45:31,027
You know, I spent a lot of time
           with him,

888
00:45:31,096 --> 00:45:32,795
          and he led me
     through the Koran, man,

889
00:45:32,864 --> 00:45:35,498
    verse by verse by verse.

890
00:45:35,567 --> 00:45:39,902
           NARRATOR:
Mubin Shaikh soon saw the Koran
     in a whole new light.

891
00:45:39,971 --> 00:45:43,206
            SHAIKH:
   I always give this example
  of chapter nine, verse five.

892
00:45:43,274 --> 00:45:44,741
       You know, it says,
     "Kill the unbelievers

893
00:45:44,809 --> 00:45:46,943
    wherever you find them."

894
00:45:47,011 --> 00:45:49,178
    The sheikh who taught me
          said to me,

895
00:45:49,247 --> 00:45:52,014
     "Do you normally start
        from verse five

896
00:45:52,083 --> 00:45:54,117
"or do you start from verse one?

897
00:45:54,185 --> 00:45:56,652
  Let's start from verse one."

898
00:45:56,721 --> 00:46:00,089
 And then you get the context,
   "This is about the treaty

899
00:46:00,158 --> 00:46:04,727
  that we had with the pagans
         at that time."

900
00:46:04,796 --> 00:46:07,463
       Oh, so it's a very
       specific context.

901
00:46:07,532 --> 00:46:10,166
    Then verse four, the one
    right before five, says,

902
00:46:10,235 --> 00:46:14,470
      "This does not apply
      to those polytheists

903
00:46:14,539 --> 00:46:15,972
 "who did not break the treaty

904
00:46:16,040 --> 00:46:20,109
     and did not fight you
    because you're Muslims."

905
00:46:20,178 --> 00:46:22,779
  By the end of the two years,
          I realized,

906
00:46:22,847 --> 00:46:25,581
"Man, I had it wrong all along."

907
00:46:25,650 --> 00:46:30,720
Now I'm empowered with this new
   understanding that I have.

908
00:46:30,789 --> 00:46:32,989
      And the guy told me,
       he says, "Go back.

909
00:46:33,057 --> 00:46:34,590
 "Go back and teach the people,

910
00:46:34,659 --> 00:46:36,659
  "and keep your people safe.

911
00:46:36,728 --> 00:46:39,061
     "This is not our way.

912
00:46:39,130 --> 00:46:41,197
      Show them the way."

913
00:46:41,266 --> 00:46:42,999
           NARRATOR:
     Which is what he did.

914
00:46:43,067 --> 00:46:45,968
A father of five, he has devoted
      his life to the idea

915
00:46:46,037 --> 00:46:50,306
      that what he learned
    can be taught to others.

916
00:46:50,375 --> 00:46:52,375
            SHAIKH:
     You have to show them
    that what they're doing

917
00:46:52,443 --> 00:46:54,310
 is actually not Islam at all.

918
00:46:54,379 --> 00:46:56,879
     It's this other thing
     that they've created,

919
00:46:56,948 --> 00:46:58,414
   thinking that it's Islam,

920
00:46:58,483 --> 00:47:00,983
 thinking that it's a solution,
 but in fact it's the problem.

921
00:47:01,052 --> 00:47:02,819
           REPORTER:
         ISIS was quick

922
00:47:02,887 --> 00:47:04,987
    to claim responsibility
     for today's attack...

923
00:47:05,056 --> 00:47:06,956
           NARRATOR:
     As the world searches
   for answers to extremism,

924
00:47:07,025 --> 00:47:09,325
      more and more people
         are listening

925
00:47:09,394 --> 00:47:11,494
   to messages like Mubin's,

926
00:47:11,563 --> 00:47:13,462
 asking whether what helped him

927
00:47:13,531 --> 00:47:17,333
       can be implemented
  on a more widespread basis.

928
00:47:17,402 --> 00:47:19,402
     In the United States,
        the idea is new,

929
00:47:19,470 --> 00:47:23,039
   but it's now being tested
       in the heartland.

930
00:47:27,011 --> 00:47:28,077
          Minneapolis.

931
00:47:28,146 --> 00:47:30,213
     The metro area is home

932
00:47:30,281 --> 00:47:33,249
 to the largest Somali-American
     community in the U.S.:

933
00:47:33,318 --> 00:47:36,319
       25,000 live here.

934
00:47:36,387 --> 00:47:37,987
   Many came in the mid-'90s,

935
00:47:38,056 --> 00:47:42,658
when their home country was torn
         by civil war.

936
00:47:42,727 --> 00:47:44,894
         In this refuge
      from their homeland,

937
00:47:44,963 --> 00:47:48,097
   they found relative peace
        and prosperity,

938
00:47:48,166 --> 00:47:51,267
   a peace that was recently
           shattered.

939
00:47:51,336 --> 00:47:52,335
           REPORTER:
       These are the type

940
00:47:52,403 --> 00:47:53,536
  of terrorism-related arrests

941
00:47:53,605 --> 00:47:55,838
  that we're seeing more often
          in the U.S.

942
00:47:55,907 --> 00:47:58,574
           NARRATOR:
        Nine young men,
all in their teens or early 20s,

943
00:47:58,643 --> 00:48:00,042
         were arrested.

944
00:48:00,111 --> 00:48:02,178
 They planned to make their way
           into Syria

945
00:48:02,247 --> 00:48:03,279
       to fight for ISIS.

946
00:48:03,348 --> 00:48:05,548
...dozens of whom have traveled,

947
00:48:05,617 --> 00:48:07,617
    or attempted to travel,
          overseas...

948
00:48:07,685 --> 00:48:10,653
           NARRATOR:
  The case of these young men
         is one chapter

949
00:48:10,722 --> 00:48:14,090
      in a long, sad story
       here in Minnesota.

950
00:48:14,158 --> 00:48:16,125
   The exodus began in 2007,

951
00:48:16,194 --> 00:48:20,396
  as young Somalis were called
  by the likes of Omar Hammami

952
00:48:20,465 --> 00:48:23,165
to join the ranks of al Shabaab

953
00:48:23,234 --> 00:48:25,334
         in the country
    of their ethnic origin.

954
00:48:25,403 --> 00:48:26,669
     We need more like him,

955
00:48:26,738 --> 00:48:28,571
    so if you can encourage
     more of your children

956
00:48:28,640 --> 00:48:30,239
 and more of your neighbors...

957
00:48:30,308 --> 00:48:33,276
           NARRATOR:
 But these nine young men were
    called to Syria by ISIS.

958
00:48:35,079 --> 00:48:39,282
   Minnesota has the greatest
number of terrorism prosecutions

959
00:48:39,350 --> 00:48:42,151
of any of the federal districts
     in the United States.

960
00:48:42,220 --> 00:48:44,687
           NARRATOR:
    Chief Judge John Tunheim
            believes

961
00:48:44,756 --> 00:48:47,356
 it's time for a new approach.

962
00:48:47,425 --> 00:48:51,093
But this is uncharted territory.

963
00:48:51,162 --> 00:48:53,062
 There is no national protocol,

964
00:48:53,131 --> 00:48:56,465
       no evaluation tool
   that we are able to find.

965
00:48:56,534 --> 00:48:59,468
    So that's why we decided
     we would take the lead

966
00:48:59,537 --> 00:49:01,837
  on trying to develop tools,

967
00:49:01,906 --> 00:49:05,174
so that we can provide that kind
    of assistance to judges

968
00:49:05,243 --> 00:49:09,845
   and ultimately, hopefully,
   to the Bureau of Prisons.

969
00:49:09,914 --> 00:49:11,681
           NARRATOR:
Right now, U.S. federal prisons

970
00:49:11,749 --> 00:49:15,618
          do not have
 any deradicalization programs,

971
00:49:15,687 --> 00:49:18,721
    but the judge is pushing
       for them to start.

972
00:49:18,790 --> 00:49:20,356
      For now, he's trying
          to determine

973
00:49:20,425 --> 00:49:24,660
    which of these young men
    could be deradicalized.

974
00:49:24,729 --> 00:49:29,165
 To find out, he turned to this
   man in Stuttgart, Germany.

975
00:49:29,233 --> 00:49:30,633
        DANIEL KOEHLER:
  It's like peeling an onion.

976
00:49:30,702 --> 00:49:32,201
        Layer by layer,

977
00:49:32,270 --> 00:49:34,103
            you try
  to work yourself to the core

978
00:49:34,172 --> 00:49:35,338
 and offer something that gets

979
00:49:35,406 --> 00:49:37,273
    more and more attractive
        to that person,

980
00:49:37,342 --> 00:49:40,876
  to compete with a narrative
of groups like ISIL or al Qaeda.

981
00:49:40,945 --> 00:49:43,212
           (chanting)

982
00:49:43,281 --> 00:49:47,850
           NARRATOR:
Daniel Koehler has deradicalized
      neo-Nazis for years,

983
00:49:47,919 --> 00:49:50,586
    and he says the approach
       is much the same,

984
00:49:50,655 --> 00:49:56,359
but the enticement to religious
 extremism is very compelling.

985
00:49:56,427 --> 00:50:00,629
 It's the opportunity to become
  a hero, to become a martyr,

986
00:50:00,698 --> 00:50:03,699
        to serve a cause
     greater than your own.

987
00:50:03,768 --> 00:50:06,402
           NARRATOR:
  Psychologist Arie Kruglanski
            believes

988
00:50:06,471 --> 00:50:10,339
 it is very difficult to offer
    an alternative to that,

989
00:50:10,408 --> 00:50:14,677
   but he has data that shows
    deradicalization works.

990
00:50:14,746 --> 00:50:18,981
  In Sri Lanka, he studied the
Tamil Tigers at different times

991
00:50:19,050 --> 00:50:23,352
  during their first year home
    after a long civil war.

992
00:50:23,421 --> 00:50:25,888
  Some were exposed to a full
   deradicalization program,

993
00:50:25,957 --> 00:50:29,492
        others were not.

994
00:50:29,560 --> 00:50:32,428
          KRUGLANSKI:
 We found a significant decline
          in violence

995
00:50:32,497 --> 00:50:35,464
   in the experimental group
  that received the treatment,

996
00:50:35,533 --> 00:50:37,466
as compared to the control group

997
00:50:37,535 --> 00:50:38,601
         that received
    only minimal treatment.

998
00:50:40,805 --> 00:50:44,340
   Human minds, human psyches
         are malleable.

999
00:50:44,409 --> 00:50:45,975
       They are pliable.

1000
00:50:46,044 --> 00:50:48,411
        In the same way
 as a person gets radicalized,

1001
00:50:48,479 --> 00:50:49,879
    changes from, you know,

1002
00:50:49,947 --> 00:50:52,982
  a mainstream kind of person
  to a fringe kind of person,

1003
00:50:53,051 --> 00:50:54,350
   they can be brought back,

1004
00:50:54,419 --> 00:50:55,851
     and also, they can be
        re-radicalized.

1005
00:50:55,920 --> 00:50:59,789
           NARRATOR:
    This is risky business.

1006
00:50:59,857 --> 00:51:01,757
   A failed deradicalization
            attempt

1007
00:51:01,826 --> 00:51:04,427
  can make things even worse.

1008
00:51:04,495 --> 00:51:07,396
            KOEHLER:
If you fail to convince someone

1009
00:51:07,465 --> 00:51:09,932
     that a certain ideology
        or that narrative

1010
00:51:10,001 --> 00:51:11,700
      is inherently wrong,

1011
00:51:11,769 --> 00:51:15,171
 you will inoculate that person
    against these arguments.

1012
00:51:15,239 --> 00:51:20,209
That person will leave the room
    as much more radicalized

1013
00:51:20,278 --> 00:51:23,512
    and much more convinced
that he or she is actually right

1014
00:51:23,581 --> 00:51:25,581
      about their beliefs,
    about their viewpoints,

1015
00:51:25,650 --> 00:51:28,417
      and they can go on,
       radicalize others

1016
00:51:28,486 --> 00:51:30,019
    and spread that message.

1017
00:51:30,088 --> 00:51:32,621
    That would be one risk.

1018
00:51:32,690 --> 00:51:36,492
           NARRATOR:
 Indeed, prisons in Europe have
  become jihadi universities.

1019
00:51:36,561 --> 00:51:41,497
And Judge Tunheim wants to make
 sure that doesn't happen here.

1020
00:51:48,406 --> 00:51:51,507
       Still, setting up
   a deradicalization program

1021
00:51:51,576 --> 00:51:53,709
   is neither cheap nor easy,

1022
00:51:53,778 --> 00:51:56,846
  and there are many possible
          approaches.

1023
00:51:56,914 --> 00:52:00,149
     But there is no doubt
     in Mubin Shaikh's mind

1024
00:52:00,218 --> 00:52:03,452
  that these efforts can work,
    if they are done right.

1025
00:52:03,521 --> 00:52:06,489
            SHAIKH:
  It's closer to an art form
      than it is a science.

1026
00:52:06,557 --> 00:52:09,892
  A lot of it is interpersonal
         communication.

1027
00:52:09,961 --> 00:52:12,995
  It's, a lot of it is, like,
     mediation principles,

1028
00:52:13,064 --> 00:52:16,499
talking to people, understanding
     where they come from.

1029
00:52:16,567 --> 00:52:19,235
Principles of social work apply
            to this.

1030
00:52:19,303 --> 00:52:22,538
          So it draws
   from multiple disciplines,

1031
00:52:22,607 --> 00:52:24,039
   but at the end of the day,

1032
00:52:24,108 --> 00:52:26,108
  it comes down to the person
     that's delivering it.

1033
00:52:26,177 --> 00:52:31,514
           NARRATOR:
 Unfortunately, this potential
  solution moves a lot slower

1034
00:52:31,582 --> 00:52:35,351
       than the wildfire
       it aims to douse.

1035
00:52:35,419 --> 00:52:38,754
  Deradicalizing an individual
          takes time.

1036
00:52:38,823 --> 00:52:40,623
         Do we have it?

1037
00:52:40,691 --> 00:52:45,027
            SHAIKH:
 There're so many young people
  that are being lost to this.

1038
00:52:45,096 --> 00:52:46,795
    What are we waiting for?

1039
00:52:46,864 --> 00:52:49,098
  Stop waiting; we don't have
       the luxury of time.

1040
00:53:12,190 --> 00:53:14,890
     <i> This</i> NOVA<i> program is</i>
       <i> available on DVD.</i>

1041
00:53:14,959 --> 00:53:20,296
 <i> To order, visit shopPBS.org,</i>
    <i> or call 1-800-play-PBS.</i>

1042
00:53:20,364 --> 00:53:22,965
     NOVA<i> is also available</i>
    <i> for download on iTunes.</i>

