When the Screen Becomes the Recruiter: How Americans Are Being Radicalized — and Why It’s Accelerating
A troubling pattern is showing up across headlines, court filings, and intelligence briefings: radicalization no longer needs a physical pipeline. No charismatic recruiter. No secret meeting. No in-person “community.” Increasingly, it just needs a phone and enough time alone with an algorithm.
Former U.S. diplomat Simon Hankinson recently argued that we’re seeing a rise in what he called “lone wolf amateur terrorism,” where violence or attempted violence is pursued by individuals who appear to have been pulled into extremist ideologies primarily through online interactions. Whether or not one agrees with every framing or conclusion, the underlying reality is hard to ignore: “screens alone” can now supply the propaganda, the social reinforcement, the tactical guidance, and the sense of belonging that used to require real-world proximity.
A warning shot from Australia
Hankinson pointed to the December 14, 2025 mass shooting at Bondi Beach in Sydney, where attackers opened fire at a Hanukkah celebration, killing 15 people and injuring dozens—an event Australian authorities have treated as terrorism. The details are horrifying, but the broader takeaway matters: even in a country known for strict gun laws and relatively rare mass shootings, a small number of determined actors can still produce catastrophic harm—especially when motivated by extremist ideology.
The modern challenge isn’t only weapons access. It’s ideological acceleration. The “why” travels faster than the “how,” and online ecosystems are built to move “why” at the speed of outrage.
The American cases that should sober us
Two recent U.S. cases underscore the point Hankinson raised: people who are not recent immigrants and were not raised in households shaped by a foreign terrorist organization’s worldview, yet who allegedly attempted to support the Islamic State after becoming immersed in online propaganda and contacts.
1) John Michael Garza Jr. (Texas)
The U.S. Department of Justice announced charges against a Texas man accused of providing bomb components and money to individuals he believed were acting on behalf of ISIS, including providing bomb-making instructions to an undercover agent.
2) Christian Sturdivant (North Carolina)
In North Carolina, DOJ announced charges against an 18-year-old accused of attempting to provide material support to ISIS, after communicating with undercover personnel and allegedly discussing a planned New Year’s Eve attack using knives and hammers.
It’s important to say plainly: allegations are not convictions, and every defendant deserves due process. But the pattern law enforcement keeps describing is consistent: radicalization and mobilization can happen largely online, with “micro-communities” that feel intensely real to the participant even when they’re entirely digital.
This isn’t only an “Islamist terrorism” problem
We need to be careful and honest here. The temptation in public discourse is to treat “radicalization” as a single-lane road that always ends in the same ideological neighborhood. But U.S. government assessments have been warning for years that lone offenders and small groups—motivated by a range of ideologies and personal grievances—pose a persistent threat, and that online ecosystems are key accelerants.
The FBI has explicitly noted that radicalization to violence increasingly takes place online, where propaganda, recruitment, and incitement can be distributed at scale.
In other words: the machine is the problem. Extremist ideologies differ, but the digital pathway—rage, isolation, identity hunger, reinforcement, escalation—looks eerily similar across causes.
What actually happens online: the radicalization “loop”
Most radicalization is not a single epiphany. It’s a loop. Here’s a common pattern researchers and security agencies repeatedly describe (even if they label the steps differently):
1) A wound meets a story. Loneliness, humiliation, resentment, fear, boredom, grief—then a narrative appears that explains it all and offers someone to blame.
2) Curiosity becomes consumption. A teen watches one clip “just to see.” Then another. The feed learns what spikes emotion and serves more of it.
3) Community becomes simulation. Likes, comments, DMs, “follower” status, inside jokes, and constant content create a feeling of belonging without accountability.
4) Identity hardens. The person becomes what they consume. The ideology is no longer an opinion; it’s a self.
5) Escalation becomes a virtue. Moderation is mocked as cowardice. Restraint is framed as betrayal. Violence becomes “meaning.”
DHS bulletins have repeatedly warned that extremist actors use online messaging and calls for violence to motivate supporters, and that lone offenders are difficult to detect and disrupt.
The new frontier: AI companions and synthetic persuasion
Hankinson’s most unsettling point wasn’t just about social media. It was about AI-mediated relationships—convincing avatars and chatbots that can simulate empathy, attraction, validation, and shared purpose.
Even without assuming sinister conspiracies, the risk is real: a system optimized for engagement can become a personalized persuader. It can mirror a user’s anger, intensify it, and present radical content as “understanding” rather than manipulation. The question Hankinson asked—“who controls them?”—is worth taking seriously, because even without a controller, the incentives are dangerous. A profit motive can push people toward increasingly extreme content simply because extremity retains attention.
This is where Christian reflection gets painfully practical.
A Christian lens: formation is never neutral
Scripture’s warnings about the heart and the mind aren’t abstract. They’re diagnostic.
“The heart is deceitful above all things…” (Jeremiah 17:9)
“Do not be conformed to this world, but be transformed by the renewal of your mind.” (Romans 12:2)
“Bad company corrupts good morals.” (1 Corinthians 15:33)
“Guard your heart, for everything you do flows from it.” (Proverbs 4:23)
In earlier generations, “company” meant people you physically spent time with. Today, “company” includes what you scroll, what you binge, what you argue with at midnight, what you let narrate your life. If a person can spend six hours a day in digital “companionship” with unseen voices—and those voices are optimized to inflame, flatter, and isolate—then we shouldn’t be surprised when minds deform.
The incarnational logic of the Christian faith matters here: God didn’t send a viral message; He entered the world in flesh. That’s not just theology—it’s a rebuttal to disembodied formation. When “the screen becomes confessor, teacher, and companion,” the question isn’t merely what are we watching? It’s who is discipling us?
Parents, churches, and communities: what counter-radicalization looks like in real life
Hankinson’s proposed countermeasure was profoundly simple: reclaim in-person time and connection. And it aligns with what many threat assessments imply: isolation is gasoline; embodied relationships are firebreaks.
Here are concrete, non-hysterical steps that actually help:
1) Make “offline presence” a spiritual discipline, not a punishment.
Family meals. Shared work. Real conversations. Eye contact. Serving together. These are not quaint traditions; they’re resilience training.
2) Teach discernment the way you teach literacy.
Kids need a framework:
Who is speaking?
What do they want from me?
How do they make money or gain power?
What emotions are they trying to trigger?
What would they say if the goal were truth rather than control?
3) Normalize confession before crisis.
If a young person is afraid to say, “I’m watching dark things” or “I’m obsessed,” you’ve lost the chance to intervene early. The goal is honesty, not interrogation.
4) Replace algorithmic belonging with covenant belonging.
Church is supposed to be a community where people are known—not performed. When people don’t feel known, they go find a tribe that will take them, even if it’s destructive.
5) Treat devices like power tools.
Power tools require training, limits, and supervision because they can build—or they can maim. Phones are power tools for the mind.
The deeper unease
The extreme cases—terror plots, mass killings—are not the daily reality for most Americans. But they function like a flare in the night sky: they reveal how vulnerable the human person can be when identity is outsourced to feeds, grievances are amplified without correction, and companionship is simulated rather than lived.
So the final question isn’t only, “Can online echo chambers create terrorists?” It’s also, “What are they doing to the rest of us—quietly?” If a screen can move someone from curiosity to violent allegiance, it can certainly move millions from patience to contempt, from empathy to dehumanization, from truth-seeking to permanent suspicion.
Hour by hour, scroll by scroll—who is forming us?
References
Holliday, S. (2026, January). Are Americans being radicalized online and converting to Islam? The Washington Stand.
Reuters. (2026, January 22). Australia begins day of mourning for victims of Bondi Beach attack.
Reuters. (2025, December 17). Alleged Bondi gunman charged with 15 murders as funerals begin for victims.
U.S. Department of Justice, Office of Public Affairs. (2025, December 29). Texas man charged with providing bomb components and funding to individuals he believed were involved with foreign terrorist organization (Press release).
U.S. Department of Justice, Office of Public Affairs. (2026, January 2). FBI disrupts alleged New Year’s Eve attack, man charged with attempting to provide material support to ISIS (Press release).
Associated Press. (2026, January 2). FBI says it disrupted a New Year’s Eve attack plan inspired by Islamic State group.
Federal Bureau of Investigation. (n.d.). Terrorism.
Federal Bureau of Investigation, & U.S. Department of Homeland Security. (2021, May). Strategic intelligence assessment and data on domestic terrorism.
U.S. Department of Homeland Security. (2024). Homeland Threat Assessment 2025.