Or: How to Turn "This Might Kill You" Into "Scientists Disagree" With Just a Few Million Dollars
It's 2 AM. You're doom-scrolling through your phone (as one does) when you stumble across a study claiming your favourite energy drink isn't just harmless—it's basically liquid vitamins. The research sparkles with prestigious university logos. The lead scientist has more degrees than a thermometer. CNN is already running headlines like "Energy Drinks: The Heart-Healthy Beverage Doctors Don't Want You to Know About."
You're feeling smugly vindicated about your caffeine dependency when you spot it. That tiny footnote trying to hide like a pimple under foundation: "Study funded by MegaCorp Energy Drinks Inc."
Record scratch. Freeze frame.
Welcome to fake science, where billion-dollar industries have discovered something revolutionary: it's way easier to buy the research you want than to make products that won't, you know, slowly turn your customers into cautionary tales.
Think of it as commissioning custom Yelp reviews. Except instead of your neighbourhood sushi place, you're buying scientific legitimacy for chemicals that might be turning your liver into modern art.
The Economics of Manufactured Truth
Here's what's genuinely chef's kiss about this whole operation: fake science isn't some cartoonish villain in a basement lab mixing suspicious chemicals while cackling maniacally.
That would be too obvious. Too risky. Too much like actual fraud that attracts people with badges and search warrants.
No, this is elegant.
You take the entire scientific process—this beautiful, chaotic system humans built for separating truth from wishful thinking—and give it just enough of a corporate "makeover" that it spits out exactly the conclusions your quarterly earnings need.
Real science works like your favourite detective show. Start with a mystery. Follow the evidence. Accept whatever truth emerges, even when it's inconvenient, complicated, or makes everyone look stupid.
Fake science? Same show, different ending. Someone's already written the conclusion in permanent marker. The detective's job is working backwards, manufacturing "evidence" that supports the predetermined verdict.
You're still technically doing detective work. Gathering clues, writing reports, and presenting findings with appropriate academic gravitas. You're just doing it with the intellectual integrity of someone trying to prove their ex was definitely the toxic one.¹
The genius is in the optics. Same universities. Same peer-reviewed journals. Same press conferences with serious-looking people in lab coats.
The only difference? Who's signing the checks and what they expect for their investment.
Once you understand this basic economic reality, the specific techniques start making perfect sense. In that horrible, inevitable way that most human corruption does.
¹ Spoiler alert: it was probably you.

The Four-Act Corporate Science Drama
Act I: Manufacturing Your Evidence
(The "Totally Legitimate Research" Phase)
This is where you seize control of the research process like a director who's tired of actors going off-script.
You don't falsify data—that's amateur hour and potentially federal-prison-adjacent. Instead, you design studies with the same care a magician designs tricks. The outcome isn't rigged through fraud. It's rigged through creative methodology.
Some popular techniques:
The Time Warp: Test your sketchy product for six months when the real damage takes six years to show up. (It's like testing whether smoking is dangerous by only studying people who started yesterday.)
The Homeopathic Dose: Use amounts so small that even a hummingbird would say "that's barely noticeable." Then conclude your product is "safe at tested levels."²
The Worse Alternative: Compare your questionable chemical to something so obviously dangerous that battery acid would look heart-healthy by comparison.
It's like conducting a study on whether jumping off buildings is dangerous, but limiting your research to falls from ground-floor windows. Then concluding that "gravity-assisted building exits show minimal risk in controlled environments."
² This is where you start recognizing patterns in studies that mysteriously always favour whoever funded them. Wild coincidence, surely.
Act II: Purchasing Academic Credibility
(The "Why Forge Harvard When You Can Buy It?" Phase)
Manufacturing evidence is only half the magic trick. You also need people to believe it.
Which brings us to a delicious irony about universities. For all their noble rhetoric about "truth" and "knowledge," they also have budgets. And fundraising goals. And professors who'd prefer their research funded by someone other than the Department of Gradually Shrinking Government Grants.
So you start writing checks. Magnificent ones.
You don't just fund random studies. You endow entire academic programs with names like "The ExxonMobil Center for Totally Objective Climate Research." You sponsor conferences where your hand-picked experts can debate burning questions like "Is our product dangerous?" and "Define 'dangerous,' really."
You're not technically buying specific results—that would be gauche, like wearing socks with sandals or putting pineapple on pizza.³ You're just creating an ecosystem where researchers who happen to find favourable results are the ones who get continued funding, prestigious positions, and invitations to conferences that don't serve terrible coffee.
The Purdue Pharma playbook here is particularly elegant. They didn't just fund random opioid research. They created entire pain management programs at medical schools. Complete with educational materials that treated addiction risk like a minor footnote and pain management like a sacred calling.
When doctors graduated, they weren't thinking, "I learned this from Big Pharma." They were thinking, "I learned this at Johns Hopkins."
(Which is technically accurate. Johns Hopkins just learned it from Big Pharma first.)
³ Fight me.
Act III: Manufacturing Uncertainty
(The "If You Can't Control Truth, Control What People Think About Truth" Phase)
But here's where reality becomes inconveniently... real.
Sometimes legitimate scientists, funded by legitimate sources, produce legitimate research that makes your product look about as safe as juggling chainsaws during a thunderstorm.
When this happens, you pivot to the most psychologically sophisticated part of the operation. You stop trying to prove your product is safe. Instead, you make it impossible to prove it's dangerous.
You weaponize science's greatest strength—its natural caution and self-correction.
Legitimate researchers are always careful with conclusions. They acknowledge limitations. They call for more research. They use words like "suggests" and "indicates" instead of "definitely proves beyond any shadow of doubt."
These are features, not bugs. They make science reliable over time.
But they're also vulnerabilities. And corporations have both funding and creativity in terrifying abundance.
So you amplify that natural scientific uncertainty until it becomes paralyzing doubt. You fund studies that "seem" to contradict inconvenient research. You hire scientists to question methodology with the intensity of film critics dissecting Marvel movies. You organize conferences where carefully selected experts debate whether correlation implies causation while Rome burns.
You create a media narrative where "scientists disagree" becomes the story. Even when 95% of scientists agree, and the 5% who disagree all happen to cash checks from the same mysterious benefactor.
It's like turning a minor disagreement about pizza toppings into a decade-long philosophical symposium on whether food exists. Complete with peer-reviewed papers and expert testimony. Until everyone gives up and orders from the restaurant sponsoring the debate.
Act IV: The Nuclear Option
(The "Silence the Troublemakers" Phase)
When manufacturing evidence, buying credibility, and creating doubt still aren't enough to protect quarterly projections, you deploy the nuclear option.
This is where things get ugly in that "I can't believe this is legal" way.
You're not attacking research anymore. You're attacking researchers. Personally.
The toolkit reads like a masterclass in institutional bullying:
- Question their methodology, funding, credentials, and mother's maiden name
- Bury them in lawsuits that cost more than most people's houses
- Use freedom of information laws to demand every email they've ever sent
- Get friendly politicians to threaten their funding
- Make their professional lives so miserable that "shutting up" starts looking like the smart career move
The Monsanto approach to inconvenient cancer research is textbook. When international health organizations suggested their weed killer might cause cancer, they didn't just fund counter-studies.
They went full stalker-ex-boyfriend on individual scientists.
They demanded personal communications. Lobbied Congress to defund research organizations. Made it crystal clear that studying glyphosate's cancer risks was going to be expensive, time-consuming, and career-limiting.
Nothing illegal happened. Scientists just discovered that having professional opinions is surprisingly expensive when certain people disagree with them.
Your Bullshit Detection System
Time to transform you into a cynical, sophisticated consumer of scientific information. You're welcome.
Good news: Once you know what to look for, spotting fake science becomes embarrassingly easy. Like learning to identify bad toupees—once you develop the eye, you see them everywhere and wonder how you ever missed them.
Red Flag #1: Follow the Money
(No, Seriously, Always)
Make this as automatic as checking the expiration date on questionable seafood.
If the Sugar Association funds sugar safety research, your internal alarm should start playing death metal. If ExxonMobil sponsors climate studies, that warning bell should sound like a smoke detector with a dying battery.
This doesn't automatically invalidate research—even industry-funded studies can be legitimate. But it means you should interrogate that research like it's trying to sell you a timeshare.
Red Flag #2: Suspiciously Convenient Timing
Pay attention to when research gets published with the intensity of a presidential campaign.
If vaccine safety studies drop right before policy votes, that's interesting timing. If climate research gets massive media attention precisely when environmental regulations are being debated, either the universe developed dramatic irony or someone's playing chess while we're playing checkers.
Real research follows scientific rhythm: hypothesis, experiment, peer review, publication, then maybe someone notices.
Fake science follows corporate calendars. Real research says, "We finally have enough data." Fake research says, "We need something by Thursday's Senate hearing."
Red Flag #3: Conclusions That Sound Like Marketing Copy
Real science is messy, complicated, and full of caveats that would make lawyers proud.
If studies conclude controversial products are not just safe but beneficial—with no meaningful limitations mentioned—you might be reading research designed to reach that conclusion rather than discover it.
Legitimate papers are littered with phrases like "findings suggest," "further research needed," and "limitations include approximately everything."
Fake science sounds like marketing copy. Complete with definitive safety claims that go way beyond what any reasonable data interpretation would support.
Red Flag #4: The Same Experts, Everywhere, All the Time
When the same handful of "experts" keep appearing in media, congressional testimony, and industry conferences—all singing identical tunes about controversial products being totally fine—start asking uncomfortable questions.
Who's booking their travel? Paying speaking fees? Funding research?
Legitimate consensus emerges organically from many independent researchers reaching similar conclusions through different studies across different institutions over time.
Fake consensus gets manufactured by putting the same industry-friendly voices everywhere until their minority opinion sounds mainstream through sheer repetition.
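If you wanted to be playful about it, the four red flags above could be sketched as a toy checklist. This is strictly a tongue-in-cheek illustration, not a real vetting tool; every field name here is invented, and actually evaluating a study takes human judgment, not a dictionary lookup.

```python
# A toy sketch of the four red flags as a checklist.
# All field names are invented for illustration only.

def red_flag_score(study: dict) -> int:
    """Count how many of the four red flags a study trips."""
    flags = 0
    # Red Flag 1: the funder is also the beneficiary of the conclusion.
    if study.get("funder") and study.get("funder") == study.get("beneficiary"):
        flags += 1
    # Red Flag 2: publication suspiciously timed to a policy decision.
    if study.get("days_before_policy_vote", 999) < 30:
        flags += 1
    # Red Flag 3: marketing-copy certainty, no hedging or limitations.
    if not study.get("mentions_limitations", True):
        flags += 1
    # Red Flag 4: no independent research groups reaching the same result.
    if study.get("independent_research_groups", 99) < 2:
        flags += 1
    return flags

suspicious = {
    "funder": "MegaCorp Energy Drinks Inc.",
    "beneficiary": "MegaCorp Energy Drinks Inc.",
    "days_before_policy_vote": 3,
    "mentions_limitations": False,
    "independent_research_groups": 1,
}
print(red_flag_score(suspicious))  # trips all four flags: 4
```

Anything scoring 3 or 4 on this joke-o-meter deserves the timeshare-salesman level of interrogation described above.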

The Greatest Hits
Let's take a nostalgic trip through history's most successful "science" campaigns.
The Tobacco Industry: They invented the playbook when "Mad Men" was a documentary rather than a drama. Their internal research showed cancer links clearer than neon signs, but their public position remained "more research needed" for decades.
They funded studies designed to muddy crystal-clear waters and promoted, with sports-fan enthusiasm, any scientist who questioned the cancer connection. They made coherent public discussion about cigarette dangers impossible while knowing exactly how dangerous cigarettes were.
The Fossil Fuel Industry: Took detailed notes and ran the same play with climate change. Funded think tanks and contrarian scientists who created the appearance of ongoing debate decades after actual scientific consensus.
So successful we're still having "debates" about settled science while the planet literally catches fire.
The Pharmaceutical Industry: Elevated this to art. They fund academic centers and "key opinion leaders" who consistently find their products safe and effective for increasingly broad populations with increasingly creative diagnoses.
The opioid crisis? That's what happens when an entire medical community gets convinced addiction isn't a serious risk for prescription painkillers.⁴
⁴ Spoiler: It was.
Your Defence Against Scientific Gaslighting
1. Follow the Money Like Your Life Depends on It
(Because It Might)
Make checking funding sources as automatic as looking both ways before crossing streets.
Who paid? What financial relationships exist? Who benefits if these findings become gospel?
This isn't paranoia. It's understanding that financial incentives affect human behaviour, including people with PhDs and good intentions.
2. Look for Actual Consensus
(Not the Manufactured Kind)
Individual studies can be flawed, biased, or wrong—even legitimate ones by honest researchers with the best intentions.
What matters is evidence weight across multiple studies by different researchers using different methods funded by different sources over time.
If one study says Product X is safe and fifteen say it's dangerous, you're not witnessing genuine controversy. You're seeing one strategic study amplified by an orchestrated media campaign.
3. Read the Fine Print
(Yes, the Boring Parts)
Most people only read headlines and conclusions. That's like judging movies by posters alone.
Methodology sections tell you how the research was conducted. Limitations sections reveal what studies couldn't determine. Conflict-of-interest sections show who might want specific outcomes.
Basic questions you don't need PhDs to ask: How long? How many people? Proper controls? Do conclusions match findings?
4. Be Suspicious of Unnatural Certainty
Real science is comfortable with uncertainty and limitations. That's how knowledge advances—through careful, incremental progress with course corrections.
Fake science presents itself with reassuring certainty that isn't scientifically justified.
If research claims to have definitively settled complex questions with single studies, be suspicious. If conclusions don't acknowledge limitations or call for future research, you might be looking at advocacy cosplaying as inquiry.
Real scientists know how much they don't know. Fake scientists know exactly what clients need them to conclude.
Why This Matters
Look, I know this sounds exhausting. Like I'm suggesting you become a part-time detective every time you encounter scientific studies.
But this isn't just about becoming a more informed consumer of scientific information (though that's important).
It's about preserving evidence-based decision-making in a world where evidence can be manufactured to order with fast-food efficiency.
When industries successfully muddy waters around basic facts—product safety, climate reality, medication effectiveness—we lose the ability to make rational collective decisions about anything important.
It's like navigating with a GPS hacked by people who profit from you getting lost. Technically, you still have navigation. It's just not helping you reach destinations. And the longer you drive around lost, the more money they make selling gas and roadside assistance.
The stakes aren't academic. Real people get sick. Ecosystems collapse. Democratic institutions crumble when we can't distinguish genuine scientific inquiry from elaborate corporate marketing in lab coats.
When we can't trust information needed for good decisions—about health, environment, economy—we make bad decisions. And people who benefit from bad decisions are usually the same ones who worked so hard to confuse us.
Funny how that works.

The Bottom Line
Fake science exploits our natural trust in scientific institutions and desire for simple answers to complicated questions. It takes advantage of most people not having time to dig into methodology and funding with investigative journalist intensity.
The defence isn't becoming cynical about all scientific research—that would be throwing out humanity's most valuable reality-understanding tool.
It's becoming more sophisticated consumers of scientific information. Better at distinguishing genuine inquiry from performance pieces designed to advance particular agendas.
Science at its best is humanity's attempt to figure out what's true, regardless of what we prefer to be true.
Fake science is what happens when people with enough money decide they'd rather manufacture the truth they need than discover the truth that exists.
The difference matters more than we want to admit. Recognizing it might be one of the most essential modern survival skills.
Because in a world where reality itself has become a competitive marketplace, distinguishing authentic from artificial truth isn't just useful.
It's survival.
I don't sell memberships or anything, but if you want to buy me a beer, I won't refuse.