The online history of the Highland Park suspect reveals a fascination with violence

LEILA FADEL, HOST:

In Highland Park, Ill., the suspect in the July Fourth parade shooting was charged with seven counts of first-degree murder. And the Lake County prosecutor says he expects the accused 21-year-old will face more charges. Investigators believe he planned this mass killing for weeks, that he climbed onto a roof and used a high-powered rifle to fire more than 70 rounds into the crowd below. And he wore women's clothing and possibly a wig to hide his identity. He purchased that rifle legally, along with four other weapons. And this was after police were called to his home twice over threats of violence and suicide in 2019. The accused killer also posted online a lot - the content often violent. Researchers say it fits into an emerging profile of new extremist activity online. Joining us now to discuss this is Alex Newhouse. He's the deputy director of the Center on Terrorism, Extremism and Counterterrorism at the Middlebury Institute of International Studies. Alex, welcome to the program.

ALEX NEWHOUSE: Thanks for having me.

FADEL: So what jumped out at you? When you were looking at what the Highland Park suspect was doing online, what jumped out?

NEWHOUSE: So I think the first thing to note is that there's no clear ideological or even political motive to what the suspect was doing online and how he ended up carrying out the attack. What we can say, though, is that his online activity fits the profile of a person immersed in these deep, deep internet communities at the very, very fringes of certain types of community activity, including communities that are organized around collaborative fiction and what we call alternate reality games, which are basically giant, large-scale community puzzles. And the thing to note about these communities is that at the very, very fringe of them, the participants often basically lose track of their own identities and end up not even being able to distinguish between what is real and what is fake. And at that point, they start sharing content that's designed to transform their minds and make them more amenable to using violence. This is a pattern that we've seen in a couple of other recent mass shootings, and it fits the pattern of the online activity of the suspect in the Highland Park shooting.

FADEL: Are these online communities actually organized or is it just a group of people posting gore online? I mean, what are they?

NEWHOUSE: They are pretty disorganized. They are pretty decentralized. They basically come about and evolve out of a shared interest in a particular type of collaborative puzzle, a particular type of fiction. And they end up organizing themselves almost like extreme forms of fandom. So, you know, you have your "Star Wars" fandom, which is sort of a mainstream version of this. But at the very, very extreme edge of it, you can end up with these fandoms of people who basically fuse their own personal identity with the type of content that they're organizing around and the community that they're organizing with.

FADEL: Why?

NEWHOUSE: It's part of, you know, trying to find some sort of meaning in a world where they oftentimes feel very alienated and isolated. One of the things we know about radicalization and mobilization to violence is that alienation and isolation are the core drivers of it in a lot of cases. So for this suspect, for instance, what we can say is that about two years ago, he seems to have become increasingly isolated from his in-person support networks and even from his music community. But then he ended up increasing his activity in these fringe spaces, these spaces that are obsessed with gore, obsessed with puzzles and numbers. And, again, that fits this form of extremist mobilization that we've been seeing.

FADEL: But this stuff - depictions of violence, shootings, racist, offensive posts - is so common online. So when do you know that this is a cause for concern, that there's this blurring of reality and fiction, and when is it just people posting awful stuff online?

NEWHOUSE: Yeah, and that's still an open question in a lot of cases. But what we can say is that these types of communities share an aesthetic in the content they're posting that we can point to and say, OK, this is the type of content that we're talking about. This is the warning sign. It's content that's purposely designed to be incoherent, cobbled together from a mishmash of different influences. And oftentimes it can seem completely politically contradictory and paradoxical. So it's content that includes things like basically incoherent static, really, really fast cuts between different things, a lot of neon lights, a lot of very, very loud and pretty aggressive audio. And all of that is designed to break down a person's reluctance to commit violence. So as researchers of this trend, we have observed this shared aesthetic style developing, and we can point to it as an indicator of when these communities are moving to a point where violence becomes more likely.

FADEL: So is that what you recommend when you work with tech companies that are trying to figure out how to handle this content?

NEWHOUSE: Yeah, we work with tech companies to try to identify this aesthetic style more quickly. And we also work with them to understand that in a lot of cases, violence today, extremist violence, isn't necessarily ideological or political. So we have to change the way we think about it. In a lot of cases, what we'll do is work with them to understand that all of these types of violence are social issues. They're issues built around relationships between people and community dynamics. So this suspect, for instance, although he acted alone in person, in reality, what he was doing online indicated a deep connection with other people and other communities. So we work with tech companies and policymakers to say, OK, let's go look for the communities of people, the networks of people, that are organizing around this type of content and producing it, so we can better detect and disrupt those in the future.

FADEL: In the few seconds we have left, the Highland Park mayor has called this an act of terrorism. Does it fit that description for you?

NEWHOUSE: It doesn't fit the definition of an act of terrorism because it doesn't necessarily have a political motive. But what we can say is it was designed to cause panic, and it was designed to be copied.

FADEL: Alex Newhouse, deputy director of the Center on Terrorism, Extremism and Counterterrorism at the Middlebury Institute, thank you so much for your time.

NEWHOUSE: Thank you.

Transcript provided by NPR, Copyright NPR.