Before you try to convince someone else they've been manipulated — read this carefully
What you're about to read is not a conspiracy theory. The mechanisms described on this page are documented, peer-reviewed, and in several cases — admitted by governments themselves. We've just put them together. In order. Side by side. The question isn't whether influence operations exist. The question is: are you already inside one?
You've had the conversation. Maybe at Christmas dinner, over a beer, or in a text thread that turned ugly fast. Someone you know believes something you find incomprehensible — a government cover-up, a shadowy global elite, a media conspiracy so vast it beggars belief. You present the facts. They have an answer ready. You show them evidence. They dismiss the source. You ask how they came to believe what they believe. They stare at you like you've insulted their mother.
Eventually you walk away shaking your head, utterly convinced of one thing: that person has been brainwashed.
Brainwashing is a word that conjures Cold War imagery — shadowy interrogation rooms, cult compounds, men in white coats. The modern reality is far more mundane, far more pervasive, and far harder to detect. It happens on your phone, while you're eating breakfast.
Before we go any further, let's establish something that gets lost in the noise: governments have, as a matter of documented, declassified historical fact, run programs specifically designed to manipulate what their own citizens believe. This is not a theory. It is in the files.
In June 2007, the CIA released a 702-page document known as the "Family Jewels" — compiled in response to a 1973 directive asking agency employees to report activities inconsistent with the CIA's charter. Among its contents: confirmation of a program called Project Mockingbird — a telephone intercept operation conducted March–June 1963, targeting Washington-based journalists publishing classified materials. Approved at the highest levels, including coordination with Attorney General Robert F. Kennedy and Secretary of Defense Robert McNamara.
That is the confirmed program. Congressional investigators found considerably more. The Church Committee — the Senate investigation convened in the mid-1970s to examine intelligence abuses — concluded: "The first [concern] is the potential, inherent in covert media operations, for manipulating or incidentally misleading the American public. The second is the damage to the credibility and independence of a free press which may be caused by covert relationships with U.S. journalists and media organisations."
Frank Wisner, the CIA officer who ran its covert operations arm and a key architect of the broader effort, reportedly referred to the agency's media network as his "Mighty Wurlitzer" — an instrument, he claimed, he could play to produce almost any propaganda effect desired, across a network of recruited journalists at major newspapers and broadcasters.
The lesson here is not paranoia. It is precision. It is the habit of asking, about every piece of information you consume: who produced this, why, and what do they want me to believe as a result? That habit — rigorous, consistent, uncomfortable — is the only real defence. And most people have never developed it.
The blunt instruments of the Cold War — recruited journalists, planted stories, wiretapped phones — have been superseded by something far more elegant and far more total.
Every social media platform you use is built on a system that learns what keeps you engaged and serves you more of it. Outrage keeps you engaged. Fear keeps you engaged. The feeling of being part of an embattled, truth-knowing minority keeps you very engaged. The algorithm is indifferent to whether what it's showing you is true. It only cares that you don't put the phone down.
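To make that concrete, here is a deliberately simplified sketch (in Python, with invented names, fields, and weights, and not any platform's actual code) of what an engagement-optimised ranker looks like. Notice what the scoring function never asks: whether the post is true.

```python
# Simplified illustration of an engagement-optimised feed ranker.
# All names, weights, and fields are invented for illustration; real
# systems are far more complex, but the objective has the same shape.

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_outrage: float  # 0..1, hypothetical model output
    predicted_dwell: float    # 0..1, predicted time-on-screen
    accuracy: float           # 0..1, present in the data, absent from the score

def engagement_score(post: Post) -> float:
    # The objective rewards whatever keeps you scrolling; accuracy never appears.
    return 0.6 * post.predicted_outrage + 0.4 * post.predicted_dwell

def rank_feed(posts: list[Post]) -> list[Post]:
    return sorted(posts, key=engagement_score, reverse=True)

if __name__ == "__main__":
    feed = rank_feed([
        Post("Calm, accurate explainer", 0.1, 0.3, accuracy=0.9),
        Post("Outrage bait, dubious sourcing", 0.9, 0.8, accuracy=0.2),
    ])
    for p in feed:
        print(f"{engagement_score(p):.2f}  {p.text}")
```

In this toy example the outrage bait wins the ranking despite being the least accurate item in the list, not because anyone decided it should, but because accuracy is simply not part of the objective.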
Nudge units — departments of behavioural scientists whose explicit job is to design environmental cues that steer your decisions without you noticing — operate at the national level across governments worldwide, as well as through the World Bank, the United Nations, and the European Commission. Nobel laureate Daniel Kahneman popularised the distinction between two modes of human thinking: the fast, automatic, emotional brain he calls System 1, and the slow, deliberate, rational System 2. Every sophisticated manipulation technique, from advertising to propaganda to algorithmic curation, is designed to activate the first and bypass the second. By the time your rational mind gets involved, the emotional conclusion has already been reached. The reasoning you then construct is not analysis. It is justification.
Now scale that up. Apply it not to trivial choices like where the cutlery sits in a canteen — but to who you vote for. Who you distrust. What you believe constitutes a threat. What kind of person deserves your sympathy. This is not an accident. It is a design choice, made by people who understand your psychology better than you do.
Once you know what to look for, the signs are unmistakable. Pay careful attention — because some of these will feel uncomfortably familiar.
Raise a topic that challenges their worldview and watch what happens before a single word is spoken. A flash of contempt. A jaw that tightens. Eyes that harden. A brainwashed person does not engage your argument — they react to the threat of it. The emotion arrives before the thought, every time, because the belief is not held intellectually. It is held as identity. Challenging it does not feel like a debate. It feels like an attack on the self. This is by design. When belief becomes identity, it becomes essentially immune to evidence.
Ask someone who has genuinely reasoned their way to a conclusion how they got there and they will walk you through it — the sources they weighed, the doubts they wrestled with, the moment something clicked. Ask a brainwashed person the same question and something strange happens: they recount the topic. More detail. More fervour. More certainty. But the origin is simply not there. The belief has no traceable roots because it was never grown — it was installed. This inability to reconstruct the reasoning process is one of the most reliable indicators that no reasoning process ever took place.
You offer a counter-argument. Instead of engaging the argument, they engage you. Suddenly you're naive. Compromised. Brainwashed. One of them. The ad hominem — attacking the person rather than the argument — is the hallmark of reasoning that has no other defence. It is what you reach for when the evidence isn't there. A person who cannot defeat your argument but can make you seem untrustworthy, stupid, or corrupt has achieved the same practical outcome: the belief survives intact. Watch how quickly the conversation shifts from what you said to who you are.
Phrases like "everyone knows" and "experts say so" may be the most insidious in the brainwashed vocabulary — precisely because they sound like appeals to evidence while being the exact opposite. A genuine appeal to authority involves a named, verifiable, credentialed source whose reasoning can be examined and challenged. When someone cannot name the expert, cannot cite the study, cannot point you to the original source — but speaks with total confidence about what everyone knows — they are not reasoning. They are reciting. The belief was handed to them pre-certified, and they accepted the certification without ever checking the credentials. "Experts say so" is not an argument. It is the end of an argument, specifically designed to make further questioning feel ignorant or dangerous.
These four signs are not a profile of a stranger. They are a mirror. Every single one of them describes a process that happens to intelligent, well-intentioned, educated people — every day — because the systems engineering these responses are not aimed at the foolish. They are aimed at everyone. The foolish are just easier. You, reading this and feeling certain these signs apply to someone else and not to you — that certainty is exactly what the machine is counting on.
Everything described on this page — the nudge programs, the manipulation of public belief, the use of federal power to shape what citizens think and do — has been discussed in academic papers and government policy documents for years. In April 2024, it was said out loud, on hidden camera, by someone who claimed to know how it works from the inside.
Note: The individual named in the original recording is identified in publicly available reporting. We have chosen to use a false name here — "David Marsh" — out of an abundance of caution and to keep focus on what was said, not who said it. The source video is publicly available and verifiable.
"David Marsh," identified in the underlying investigation as a Contracting Officer for the CIA and former FBI official, was recorded in an undercover investigation by Sound Investigations. In the footage, he boasted that the FBI "can put anyone in jail… set 'em up."
"We call it a nudge," he said — describing the bureau's capacity to target individuals it considers problematic, including named journalists and public figures. He claimed the FBI "did what we wanted" in the case of one high-profile media personality, saying they "took his money away" and "chopped his legs off."
He also claimed the FBI uses "embellished" news and "fake social media" to "really get people mad" — describing a deliberate strategy of emotional manipulation through fabricated or distorted online content.
He stated that at least 20 undercover FBI agents were present at the US Capitol on January 6, 2021, saying: "There always are when there's a big protest in DC. Just in case it gets out of hand."
Let that settle for a moment. Not a whistleblower filing a formal complaint. Not a declassified document released fifty years later. A man describing, casually, in what appears to be a social setting, the institutional capacity to manufacture the conditions to imprison anyone — framed with the same language used by behavioural scientists and government policy units worldwide. "We call it a nudge."
The word "nudge" in this context is not metaphorical. It is the precise technical term from Kahneman's behavioural economics — the same framework embedded in government policy units across the democratic world. Its use here, in this context, by someone describing the targeting of private citizens, is the connection this page has been building toward. The academic language and the operational language are the same language. They describe the same toolkit. The question is only who is being pointed at, and why.
"David Marsh" has not been charged with any offence in connection with these statements. The CIA and FBI have not officially commented on the specific claims made in the recording. The video exists. The words were said. What you do with that information is your choice — but apply the same standard you would apply to anything else on this page: find the original source. Watch it yourself. Form your own view. Do not take our word for it. Do not take anyone's word for it.
Here is a question worth sitting with: when you feel the weight of public opinion pushing you toward a belief — the sense that everyone seems to think a certain way — how much of that crowd is real?
Astroturfing is the deliberate creation of fake grassroots support — organised campaigns or paid actors designed to look like spontaneous public opinion. The term comes from AstroTurf, the synthetic substitute for real grass. The analogy is precise: just as artificial turf is engineered to look like something it isn't, astroturfing manufactures the appearance of mass public sentiment that does not exist.
It exploits one of the deepest cognitive tendencies in human beings: conformity. If you believe the crowd supports something, you are far more likely to align your own views accordingly, even to the point of ignoring evidence and overriding your own prior beliefs. The manufactured consensus does not need to convince you directly. It simply needs to make dissent feel lonely, fringe, and socially dangerous.
Research on documented astroturfing campaigns has found that roughly 74% of participating accounts engage in coordinated co-tweeting and co-retweeting — synchronised patterns that are rare among real users and leave fingerprints researchers can detect. But by the time detection happens, the damage to public perception is already done.
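Those fingerprints are not mysterious. A minimal sketch of the idea, using invented account data and an arbitrary threshold rather than anything from the studies themselves, looks like this: compare which posts each pair of accounts amplifies, and flag pairs whose overlap is implausibly high for independent users.

```python
# Illustrative sketch of coordination detection through co-retweet overlap.
# Account data and the similarity threshold are invented for this example.

from itertools import combinations

def jaccard(a: set, b: set) -> float:
    """Overlap between two sets of amplified post IDs (0 = none, 1 = identical)."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

def flag_coordinated_pairs(retweets: dict[str, set[str]], threshold: float = 0.8):
    """retweets maps an account name to the set of post IDs it amplified.
    Pairs with near-identical amplification histories are flagged, because
    genuinely independent users rarely share almost exactly the same posts."""
    flagged = []
    for (u1, s1), (u2, s2) in combinations(retweets.items(), 2):
        similarity = jaccard(s1, s2)
        if similarity >= threshold:
            flagged.append((u1, u2, round(similarity, 2)))
    return flagged

if __name__ == "__main__":
    sample = {
        "account_a": {"p1", "p2", "p3", "p4"},
        "account_b": {"p1", "p2", "p3", "p4"},  # same posts, same pattern
        "account_c": {"p2", "p9"},
    }
    print(flag_coordinated_pairs(sample))  # [('account_a', 'account_b', 1.0)]
```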
Modern astroturfing uses social media, forums, and comment sections to flood discussions with manufactured support or opposition, effectively drowning out the voices of real citizens. State actors have entered this space — China's government, for instance, employs paid online commentators known as the "50 Cent Party," tasked with seeding pro-regime narratives across forums and presenting them as spontaneous public sentiment, making fabricated support a tool of state-level information warfare.
This is not limited to authoritarian states. In democratic contexts, astroturfing functions as a mechanism for manufacturing consent — power reproduced not through censorship but through the orchestration of discourse, where the appearance of grassroots consensus is shaped by elite interests. You are not being told what to think. You are simply being shown — repeatedly, convincingly — that everyone else already thinks it.
The echo chamber works in tandem with astroturfing. The algorithm learns what you engage with and feeds you more of it. Over time, your feed becomes a perfect reflection of your existing beliefs — amplified, validated, and stripped of challenge. Every scroll confirms what you already think. Every post you see has already been pre-selected to agree with you. The world, as your phone presents it, seems to be overwhelmingly on your side. That sensation of consensus is engineered. The people who disagree with you have simply been algorithmically removed from your view — and you from theirs.
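A toy simulation makes the feedback loop visible. The numbers below are assumptions chosen purely for illustration (a mild 70/30 preference between two topics, and a feed that reweights toward whatever was clicked in the previous round); they are not measurements of any real platform.

```python
# Toy model of the echo-chamber feedback loop. Two topics, a mild user
# preference, and a feed that reweights toward whatever got clicked.
# All parameters are illustrative assumptions, not platform data.

import random

def simulate_feed(rounds: int = 10, items_per_round: int = 50,
                  user_pref_a: float = 0.7, seed: int = 1) -> list[float]:
    random.seed(seed)
    feed_share_a = 0.5                    # the feed starts balanced
    history = []
    for _ in range(rounds):
        clicks_a = clicks_b = 0
        for _ in range(items_per_round):
            shows_topic_a = random.random() < feed_share_a
            click_prob = user_pref_a if shows_topic_a else (1 - user_pref_a)
            if random.random() < click_prob:
                clicks_a += shows_topic_a
                clicks_b += not shows_topic_a
        total = clicks_a + clicks_b
        if total:
            # Reweight the feed toward last round's clicks.
            feed_share_a = 0.5 * feed_share_a + 0.5 * (clicks_a / total)
        history.append(round(feed_share_a, 2))
    return history

if __name__ == "__main__":
    # Prints the share of topic A in the feed after each round.
    print(simulate_feed())
```

The point of the toy model is the direction of travel: a mild preference, fed back into what gets shown, compounds round after round into a feed that looks far more one-sided than the person reading it ever was.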
Chase Hughes is a retired US Navy Chief and one of the best-known behaviour and influence trainers working with military, intelligence, and law-enforcement audiences, with 20 years spent developing behaviour-skills courses used by government agencies. He trains elite groups on behaviour profiling, deception detection, interrogation, and advanced influence techniques. He is the bestselling author of The Ellipsis Manual and The Behavior Operations Manual. Unlike almost everyone working at this level, Hughes has chosen to make what he knows public. His reasoning is simple: these tools are already being deployed on you. You deserve to understand the mechanism.
Central to Hughes' work is a distinction most people never think about: you do not have one brain making decisions. You have, broadly, three layers — the rational neocortex (the part that reads this article), the emotional mammalian brain (the part that feels), and the ancient brainstem (the part that keeps you alive). Manipulation, at its most sophisticated, does not target your rational mind at all. It targets the mammalian layer — the part that does not speak language, cannot be argued with, and responds only to primal signals of threat, belonging, authority, and tribe.
Hughes identifies four ancient survival mechanisms hardwired into the mammalian brain, passed down because they kept our ancestors alive. He calls them the FATE model. They are your fate because they operate beneath conscious awareness, and anyone who knows how to trigger them can steer your behaviour without touching your rational mind at all.
F — Focus. Novelty creates automatic human focus. Anything new, surprising, or unexpected hijacks attention before the rational brain can intervene. Social media feeds are engineered as novelty machines — every scroll delivers something new before the previous stimulus has been processed. Your focus is being harvested before you have formed a single conscious thought about what you are looking at.
A — Agitation. Changing the emotional environment causes hyperfocus. When the mammalian brain detects threat or disturbance, it narrows attention to the source. Outrage, fear, moral disgust — these are agitation signals. They do not just make you angry. They make you attentive in a way that bypasses critical screening. The content that most reliably produces agitation is therefore the content most reliably consumed, shared, and believed.
T — Tribal Signals. The mammalian brain cannot distinguish between real social consensus and manufactured consensus. It only reads the signal: is the tribe with me or against me? Hughes notes that social media falsifies tribal agreement — showing you that vast numbers of people believe a particular thing, triggering the ancient conformity instinct regardless of whether those numbers are real. Your mammalian brain does not fact-check. It counts.
E — Emotion. Anything experienced with strong emotion is stored deeply and durably in memory. The mammalian brain uses emotional intensity as a marker for importance — the stronger the feeling, the more permanent the impression. This is why propaganda is not delivered as dry information. It is delivered as story, as outrage, as fear, as pride. Emotion is not a side effect of the message. Emotion is the delivery mechanism.
The following is taken directly from Chase Hughes' own explanation, as delivered on camera:
"If your critical part of your brain — where you can criticise or scrutinise what's really going on — it's made to rip that out of your head and make you extremely suggestible. So you'll see here's the pattern that you're going to see on social media: you'll see extreme fear, then you'll see a baby elephant video, then some sad video, then a dog getting rescued. So it's up and down and up and down. Then a dog getting eaten, or a dog biting a pig to death.
There's an animal inside of our brain. We call this the mammalian part of our brain. So I'm going to pump that animal up, and then take it back down. Pump it up and take it down. Professional hypnotists — who hypnotise people for a living — they have a word for this. It's called fractionation. Fractionation means I'm vacillating between a high..."
Fractionation, in clinical hypnosis, describes the process of repeatedly bringing a subject in and out of a heightened emotional or focused state. Each cycle deepens suggestibility. Each transition lowers the critical threshold a little further. By the time a professional hypnotist has run this cycle several times, the subject will accept suggestions they would never have entertained at the start of the session.
Your social media feed runs this cycle dozens of times per minute. The sequence Hughes describes — extreme fear, cute animal, tragedy, rescue, violence — is not random. It is the fractionation cycle, industrialised and automated. The mammalian brain is pumped up, brought down, pumped up, brought down. The critical, scrutinising part of the brain — the part reading this article — is progressively disconnected from the process. What remains is a brain in a state of open, unguarded suggestibility. And in that state, the next piece of information it receives goes in deep — without the screening, without the doubt, without the questions it would normally ask.
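One way to see the pattern Hughes describes is to measure the emotional swing between consecutive items in a feed. The sketch below uses hand-assigned valence scores as an assumption for illustration (where -1 is distressing and +1 is soothing); it is not a validated psychological measure.

```python
# Illustrative measure of emotional "whiplash" in a feed sequence.
# Valence labels are invented for the example; this is not a clinical metric.

def mean_swing(valences: list[float]) -> float:
    """Average absolute change in valence between consecutive items."""
    swings = [abs(b - a) for a, b in zip(valences, valences[1:])]
    return sum(swings) / len(swings) if swings else 0.0

if __name__ == "__main__":
    # The sequence Hughes describes: fear, cute animal, tragedy, rescue, violence.
    fractionated = [-0.9, +0.8, -0.7, +0.9, -0.8]
    # A steadier, topic-coherent sequence for comparison.
    steady = [+0.2, +0.1, +0.3, +0.2, +0.1]
    print(f"fractionated sequence, average swing: {mean_swing(fractionated):.2f}")
    print(f"steady sequence, average swing:       {mean_swing(steady):.2f}")
```

In this illustration the fractionated sequence swings more than ten times as hard as the steady one. That repeated oscillation, run continuously, is the cycle Hughes is pointing at.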
Hughes puts it plainly: "Your brain versus a one trillion dollar computer — you're going to lose. I'm going to lose. And I can spot all of the things — and I'm still going to lose."
Hughes describes a technique developed in intelligence operations in the 1950s and 1960s called the Alice in Wonderland technique — deliberate confusion used to make the brain grab onto the first solid idea it is offered.
The principle: when a person is confused, their brain behaves like someone falling. Limbs flail. And the first solid object they contact, they grip — regardless of what it is, even if it is a thorn bush. "So the brain corollary to this is: if a person is confused, the first logical piece of information they hear after being confused will be automatically accepted — or more automatically accepted — without being screened or scrutinised by the brain."
Fast-moving content, contradictory information, information overload, constant breaking news — these are not failures of the modern media environment. They are the Alice in Wonderland technique at scale. A confused population grabs the first coherent narrative offered to it. Whoever controls the narrative offered at the moment of maximum confusion controls what the population believes.
Hughes draws on the work of anthropologist Robin Dunbar, who proposed that the human brain evolved to manage meaningful social relationships with approximately 150 people — the size of ancestral hunter-gatherer groups. Beyond that number, our capacity for genuine social processing breaks down.
Social media has blown this number into the millions. Your mammalian brain — which has not changed in 200,000 years — is now attempting to process tribal signals from thousands of strangers simultaneously. It cannot do this. Instead it defaults to shortcuts: who seems to be in the majority? Who is the authority figure? What does the tribe appear to believe? These shortcuts are the exact levers that algorithmic manipulation is designed to pull. You are not failing to think clearly. Your hardware was never built for this environment — and the environment was built knowing that.
Hughes maps the progression that effective psyops follow — and that effective social media manipulation replicates at scale: Idea → Ideology → Identity.
First, you encounter an idea. It is just a thought — held loosely, open to challenge. Then, through repetition, emotional reinforcement, and tribal validation, the idea becomes an ideology — a framework through which you interpret the world. Then, at the final stage, the ideology becomes identity. It becomes who you are. And at that point, challenging the belief does not feel like an intellectual disagreement. It feels like a personal attack — triggering the hardwired social fear of rejection by the tribe.
Hughes is blunt about the result: "We are being engineered to have an us-versus-them tribalistic mentality." And the social fear of how others perceive us — one of the most powerful forces in human psychology — has been "hardcore weaponised" over the past 15 to 20 years to divide people who, in reality, have far more in common with each other than with those funding the divisions.
Hughes is not a fringe figure or a conspiracy theorist. He trains government agencies, law enforcement, and Fortune 500 companies. His work is grounded in peer-reviewed neuroscience and decades of operational intelligence experience. When he describes these techniques, he is not speculating about what might be happening to you. He is describing the documented operational toolkit — the same one being deployed on you right now, at scale, by platforms and actors who have invested billions into understanding exactly where your psychological architecture is weakest.
Now consider what governments around the world have begun doing — quietly, with relatively little public debate about the real reason why.
Australia — In December 2025, Australia's ban on social media for children under 16 took effect. It covers ten major platforms, including YouTube, Instagram, TikTok, X, Reddit, and Snapchat. Platforms are expected to verify users' ages through multiple methods, which can include government IDs and facial-recognition checks, and face fines of tens of millions of dollars for repeated breaches.
Denmark — The Danish government announced in November 2025 that political parties had agreed to ban social media for children under 15, with the ban expected to become law as soon as mid-2026.
Norway, Spain, Slovenia, Portugal, France, Poland — Norway has begun work on legislation to set an absolute minimum age of 15 for social media. Spain will ban under-16s entirely. Slovenia is drafting a ban for under-15s. France's National Assembly approved legislation banning children under 15. Poland is preparing legislation banning under-15s and holding platforms responsible for age verification.
United Kingdom — Britain is considering an Australia-style ban for children under 16, with the technology minister signalling it could arrive as early as 2026.
United States — Multiple states have enacted age verification and parental consent laws. California, New York, Florida, Nebraska, Virginia, Louisiana, Utah, and Oregon have all passed legislation restricting children's social media access in various forms. Federal legislation has advanced but not yet passed.
The official justification offered by every government on this list is some version of the same thing: mental health, cyberbullying, online predators, screen addiction. These are real concerns. They are not the whole story.
What is happening here is governments acknowledging, in the language of child protection, something they are not quite willing to say directly: these platforms are extraordinarily powerful behaviour-shaping systems, and developing minds are particularly vulnerable to them. The same algorithmic manipulation, astroturfing, echo chamber mechanics, and nudge systems described on this page work even more effectively on brains that have not yet developed the capacity for sustained critical thinking. The platforms know this. They have always known it. Now, apparently, so do the governments.
The Australian Government stated that its ban will protect young Australians from risks arising from "design features that encourage them to spend more time on screens, while also serving up content that can harm their health and wellbeing." Note the phrase: design features. Not accidents. Not unintended side effects. Features. Deliberately built. Working as intended. On your children first — and, with only slightly less efficiency, on you.
Compare the public messaging of the information ecosystem — what it presents as neutral, objective reality — with the operational mechanics underneath. When the two diverge, pay attention.
Now. Take every single one of those four warning signs. And apply them honestly to yourself.
Nobody believes they are the one who has been manipulated. That is not a coincidence. It is the point. The most effective influence operation leaves no fingerprints. It feels like clarity. It feels like finally seeing what others are too blind or too cowardly to face.
The machinery shaping your beliefs and the machinery shaping the beliefs of the person you're convinced has lost their mind — it is the same machinery, run by the same class of people, optimised for the same outcome: a population that is certain, emotional, tribal, and easy to move.
The trick of effective manipulation is that your own never feels like manipulation at all. The first step is the hardest one: genuine doubt. Not about them. About yourself. Create distance from your feed. Seek out the original source, not the summary of the summary. Ask how you know what you know. That discomfort you're feeling right now — that's your System 2 finally getting a word in.
All claims on this page are sourced from: declassified CIA documents (Family Jewels, 2007) · Church Committee Senate Report (1976) · Published behavioural science and nudge theory literature · Cognitive psychology research (Kahneman) · MIT Technology Review · Peer-reviewed studies in political psychology and coercive persuasion · Undercover video footage, Sound Investigations, April 2024. This page is for public information and community education. It is journalism. It is information. What you do with it is your choice.
Shinysideout.com.au ◆ Compiled March 2026 ◆ Updated as events develop ◆ All sources verifiable ◆ No redactions required