Thorneverse
Essays · Part I of VIII

Built On Her Back

How Social Media Built Its Empire on Women's Labor and Is Now Squeezing Them Out

April 2026 · 22 Min Read · By Yaznil.Cross
— § —

A note on how I noticed this. While researching for a dark romance novel I was writing — Enjoying Them Both, a standalone set inside an OnlyFans-style creator platform I called ThornyFans — I had to study how that kind of business actually works from the inside. How subscribers behave. How creators price their content. The strange one-sided relationship between a creator and her audience. How the camera changes what's intimate, what's work, and what's risky. Somewhere in that research, while watching how indie creators ran their social media, I started seeing that OnlyFans and similar adult-platform links were quietly disappearing from Instagram bios — moved to highlights, hidden behind redirect services, buried in coded workarounds. I went looking for when that happened, and the rabbit hole opened up.

This essay uses Instagram as the main example because Meta's actions are the most documented and the timeline is the cleanest, but the same pattern is happening across the major social platforms in different forms. OnlyFans and Fansly are the example adult-creator platforms here — the same dynamics apply to similar services. My concern isn't only what's already happened to creators in those categories. It's the direction. As a self-published dark romance author building out a Patreon to reach my readers more directly, I'm watching what platform suppression looks like one rung over from where I work. Indie authors who write dark content, transgressive fiction, or anything a platform one day decides reads as adult are one policy update away from the same treatment. I'm watching what happens next door because next door is closer than most people think.

One more note before the body. The platform suppression I'm describing affects creators of every kind — male, female, non-binary, trans, gay, straight, every walk of life. But this essay's main point lands specifically on the harm to women, because that's where the historical pattern, the financial evidence, and the cultural weight all line up. The structural argument applies to everyone. The thesis is gendered on purpose.

— § —

I. The Bio Link That Disappeared

Look at any OnlyFans creator's Instagram profile right now. Notice what's missing.

The bio used to say "link in bio" with an actual link to her page. Now it says "links in highlights" or "VIP in stories" or some carefully worded redirect. The link itself moved. It got pushed one layer deeper into the app — into Stories, into Highlights, into a Linktree, into a coded landing page that exists specifically to throw Instagram's automated systems off the trail.

This isn't a fashion choice. It's a survival move.

Instagram has been tightening its rules on direct adult-platform links for years, and the squeeze sped up in October 2025 when Meta rolled out PG-13 content limits for teen accounts. Accounts with OnlyFans links in their bios disappear from teen feeds entirely — and the trigger isn't the content, it's the link. Teens can't follow these accounts, can't see their content, can't be followed back. The platform never announced a hard ban on adult-platform links for adult users. It didn't have to. The fear of getting punished did the work. Creators noticed reach dropping, links being removed, accounts getting shadowbanned, and the workaround spread on its own. Move the link. Bury the funnel. Survive.

That's the surface story. But the surface story is a cover for something bigger, and once you start pulling on it, the whole machine becomes visible.

— § —

II. What the Platforms Allow vs. What They Block

Open Instagram Reels. Scroll for sixty seconds. What do you see?

Women in workout sets, dancing in panties, bending over for the camera with a caption about "2026 goals" or "AI/ChatGPT" stamped over the visual. The image is doing the work that gets the views. The caption is doing the misdirection. The algorithm sees nothing in the text to flag and serves the post to millions. Some of these thumbnails — the thumbnails, the still frame Instagram itself picks to represent the video — are deliberately rear-angled, thong-framed, body-focused. They are not accidents. They are the product.

Meta's image scanning tools can absolutely detect when someone's showing a lot of skin. They've had this technology since at least 2018. They use it selectively. Posts that get good engagement get a longer leash. Posts that don't get pulled. The platform isn't failing to moderate — it's choosing which content to push and which to bury, and the choice consistently favors content that gets the most views while keeping advertisers and lawyers comfortable.

Then look at what does get blocked. A link in a bio. A link that, by the way, doesn't lead to nudity. An OnlyFans link — or a Fansly link, or any similar adult-creator-platform link — leads to a paywall. A profile page with a clean photo, a subscribe button, and a payment processor. To see anything explicit, a user has to enter payment information, agree to age verification, and actively consent. That's more steps than walking past a Playboy on a 1990s convenience-store counter. More steps than clicking an R-rated movie on Netflix. Way more steps than the average Reels feed already provides for free.

So: the watching stays. The views stay. The algorithm still pushes the content. The thumbnail stays.

The thing being squeezed is the link. The thing the link does is route the customer to where the creator gets paid.

That's the whole change. Not less adult content on Instagram. Less adult money leaving Instagram.

— § —

III. The Kids Defense Doesn't Hold

The official reason given is child protection. It's the only reason that gets traction in congressional hearings, with parental advocacy groups, and in news coverage. So let's actually look at it.

If the goal were really to keep minors from seeing sexual content, the obvious move would be to crack down on the sexual content itself — the Reels, the dance trends, the thirst traps, the body-focused thumbnails. The platforms don't do this. They can't do this without crashing their own engagement numbers. So they block the link instead.

But blocking the link doesn't reduce a single minute of exposure. The kid scrolling Reels for six hours a day is being served the same content whether the creator's OnlyFans is one tap away or three. The image, the dance, the framing, the audio cues — all of it stays. What changes is whether the creator gets paid for the attention that was harvested from that kid, that adult, that bot, that scroll.

The platforms know this. The lawmakers, in the moments when they're being honest, know this. The advertisers definitely know this — which is why the policy works for them. Their brands stay clean. Their ads run next to "lifestyle content." The fact that the lifestyle content is basically the same thing that would have been called softcore in 2005 gets covered up with hashtags and captions.

Meanwhile, the actual harm to children — if we're being serious about it — has nothing to do with OnlyFans links. It has to do with kids using social media at all. The U.S. Surgeon General put out a formal warning in 2023 calling teen social media use a "profound risk" to youth mental health. Jonathan Haidt's 2024 book The Anxious Generation pulled together the existing research, and the turning point sits right around 2010-2012, when smartphones became standard among kids. Brain imaging studies show developing teen brains literally rewire around heavy social media use — the brain's reward system gets dulled, attention spans shrink, the parts that handle planning and impulse control don't develop properly.

The science is settled enough that there's no real debate left. What's left is people refusing to act on it.

So if everyone agrees kids shouldn't be on these platforms in the amounts they currently are, then "make the platforms safer for kids" is solving the wrong problem. It's like demanding kid-friendly cigarettes. The honest answer is to age-gate the platform at the door — verify ID, refuse access, the same way bars, casinos, R-rated theaters, and tattoo shops have always operated. Adults inside, kids outside, no need to clean up the experience for the adults because the kids aren't there.

Australia tried it. In late 2024, Australia passed a law banning social media accounts for users under 16. The platforms have to verify age and refuse access. The screaming was loud and brief, then most people complied — because the privacy concerns people pretend to have when convenient already went out the window the moment their phones started scanning their faces forty times a day.

If platforms wanted to protect kids, the model exists. They don't want to use it, because the model removes the kids. And the platforms do not want to remove the kids. Children's attention is too valuable. So they keep the kids, clean up the visible policies, squeeze the adult creators, and call it protection.

It is not protection. It is containment.

— § —

IV. The Public Square Got Sterilized

Step back from the platform for a second and look at what happened to physical space over the same period.

Twenty years ago, there were places adults went to be adults. Dive bars, billiard halls, smoke shops, lounges, social clubs, adult arcades, Blockbuster's curtained back room. There were carnivals with paintball and shooting galleries and rides that made you sign a waiver. There were record stores with chairs, bookstores with reading rooms, barbershops where men hung out for three hours over a single haircut.

Most of that is gone. Not because anyone voted on it. Not because adults stopped wanting it. The third places — what sociologist Ray Oldenburg named in the 1980s, meaning the spaces that aren't home and aren't work — collapsed under a mix of new regulations, gentrification, fear of lawsuits, and the fact that social life moved into the smartphone. What's left has been rebuilt around children. Bowling alleys with kid arcades over the pool tables. Movie theaters with playpens. Carnivals that are 90% squirt guns and funnel cake. Water parks where any adult bathing suit makes someone uncomfortable.

For an adult without children, going out in public has become quietly unwelcoming. The world isn't built for them. Everything is set up assuming everyone is either parenting or trying to. If a couple wants a Saturday afternoon that isn't a kid factory or a $200 night out, the options are vanishing.

This is not a separate thing from the platform suppression. It's the same thing in a different form.

The rule running both real-world and online public spaces is now: children might encounter this, therefore adults cannot have this. This flips how public life used to work. For most of human history, public space was an adult space, and parents managed their children inside it — taught them what to look at, what to ignore, what to ask about, what to walk past. Now it's the opposite. The space gets cleaned up so parenting becomes optional, and the adults who used to inhabit those spaces are quietly pushed out.

The cost of being pushed out hits hardest on people whose work, art, expression, or income depended on adult spaces. Sex workers. Adult artists. Writers of dark fiction. Musicians whose lyrics aren't radio-safe. Bar owners. Smoke shop operators. Anyone whose business needed adult third places to exist.

You don't have to like every one of those businesses to see what's happened. Adult life has been pushed out of the public commons and into the private home, where it can be watched, taxed, or judged at will.

— § —

V. An American Tradition: Use Them, Then Demonize Them

There is a name for what the platforms are doing right now. Call it the extractive cycle: the shape economists describe when institutions are built to pull value out of a group rather than share it with that group. America has run this play four documented times.

First with chattel slavery. African labor was kidnapped, transported, and worked to death building the farming and industrial base of American wealth — cotton, tobacco, sugar, rice, infrastructure, money that compounded for two hundred years. When slavery became politically and financially impossible to keep going, the country pivoted. The same labor was reframed as a problem. Reconstruction got shut down. Sharecropping and convict leasing replaced formal slavery. The wealth built on Black backs stayed where it was. The Black labor that built it was demonized into the present day.

Second with immigrant labor. Chinese workers built the railroads. Italian and Irish workers built the cities. Mexican workers built the farming industry and most of the construction industry. Each wave was used while the building was happening. Each wave was demonized once the building was done. The Chinese Exclusion Act of 1882. The "No Irish Need Apply" era. The current immigration debate, in which the same political movement that benefited from cheap immigrant labor for fifty years now treats those workers as a huge danger. The pattern is exact.

Third with overseas sweatshop labor. American corporations spent thirty years moving manufacturing to Asia and Latin America, where they could pay pennies on the dollar, ignore worker protections, and avoid environmental rules. The Nikes and Apples and Walmarts were built on that arrangement. Once the empires were established and the supply chains were locked in, the conversation shifted. Now outsourcing is bad. Now we should support American manufacturing. The wealth is already booked. The workers in Bangladesh and Vietnam and Mexico get blamed for taking jobs they were paid pennies to do.

Fourth — and this is the one no one has named yet — with women's bodies on social media platforms.

Instagram, TikTok, Snapchat, Twitter (now X), and the broader creator economy were built on viewing data mostly generated by women posting body-focused content. The dancing trend on TikTok wasn't an accident — it was the engine that made TikTok inescapable. The Instagram Reels pivot happened because Reels engagement, fed by women in workout sets and bikinis, beat every other type of content they had ever measured. The recommendation algorithms that now power the global advertising economy were trained on the viewing patterns of women shaking their bodies in front of phones for free.

The platforms didn't just tolerate that content. They were built on it. Their stock value comes from it. Their billionaire founders are billionaires because of it. The infrastructure for serving you a Coca-Cola ad in 2026 was paid for by a generation of women whose unpaid algorithm-feeding labor trained the system that now serves you that ad.

Now the empires are built. The user bases are captured. The ad rates are locked in. Network effects have done their work. And — predictably — the platforms have begun to quietly squeeze the labor that made them.

Same play. New industry.

The OnlyFans link is the cleanest evidence. The women whose content trained the algorithm are now being prevented from turning that algorithmic exposure into independent income. Their attention stays on Meta's platform. Their revenue gets routed away from any path that doesn't pass through Meta's pocket.

If Instagram could legally take a five percent cut of OnlyFans revenue, every one of these restrictions would evaporate inside a week. Not "be reviewed." Not "be studied." Disappear. The whole moral story falls apart the moment money is involved. Meta has reversed every restrictive policy it has ever held the moment a revenue path opened up. They allowed cryptocurrency promotion when they tried to launch their own. They allowed gambling promotion in markets they had partnerships in. They allowed political ads, banned them, allowed them again, on revenue logic dressed up as policy logic.

The morality is the costume. The revenue protection is the body underneath.

— § —

VI. A Small Business Hiding in Plain Sight

Strip the cultural coding off and look at the actual business.

An OnlyFans creator runs a sole proprietorship. She produces content (the product), markets it on social media (getting customers), maintains a subscriber base (keeping customers), files Schedule C taxes (legal compliance), manages her own books (operations), handles customer service (retention), and lives or dies by repeat business (revenue model). She has variable monthly income, slow seasons, customer churn, platform fees, equipment costs, and burnout risk.

She is, by every business measure, the same kind of business owner as any other small business owner in America.

The convenience store owner has a physical location, inventory, employees, and overhead. He sells legal products to consenting adults. He pays self-employment tax. He markets to his community. He competes with other stores. His revenue varies month to month. He files Schedule C. He, too, is a small business owner.

The corner muffin shop. The mobile detailer. The freelance graphic designer. The Etsy seller. The independent author publishing through Amazon KDP. All of them are running the same kind of business. The product differs. The structure does not.

I am self-published through Amazon KDP. KDP takes a cut of my royalties and provides infrastructure: hosting, distribution, payment processing, customer access, copyright protection on the platform side, a global storefront. In exchange, I keep most of the revenue and own my work. That arrangement is fair. It would be insane for me to expect KDP to charge me nothing while doing all of that.

OnlyFans takes 20% from creators. For that 20%, the platform provides hosting, age verification, payment processing, copyright protection, fraud protection, customer subscription management, and a built-in audience. Compared to a traditional publisher keeping 75-85% of revenue and a literary agent taking 15% of what's left, 20% is a steal. Compared to Spotify's roughly 30% cut, on top of per-stream rates that pay artists fractions of a cent, OnlyFans's deal is favorable. Compared to Uber's roughly 25% cut for far less infrastructure, OnlyFans is generous.
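To make those cuts concrete, here is a minimal sketch of creator take-home per $100 of gross revenue. It assumes the ballpark percentages above (an 80% publisher cut as the midpoint of 75-85%, with the agent's 15% taken from the author's share); these are the essay's illustrative figures, not the terms of any specific contract.

```python
def take_home(gross, platform_cut, agent_cut=0.0):
    """Creator's share after the platform's cut, then an agent's cut (if any)."""
    after_platform = gross * (1 - platform_cut)
    return after_platform * (1 - agent_cut)

gross = 100.0  # $100 of subscriber/reader revenue

# OnlyFans-style platform: 20% cut, no agent in the chain
onlyfans = take_home(gross, platform_cut=0.20)

# Traditional publishing, using the essay's ballpark figures:
# publisher keeps ~80%, agent takes 15% of the author's remaining share
traditional = take_home(gross, platform_cut=0.80, agent_cut=0.15)

print(f"OnlyFans-style creator keeps ${onlyfans:.2f} of every $100")        # $80.00
print(f"Traditionally published author keeps ${traditional:.2f} of $100")   # $17.00
```

Under those assumptions, the direct-to-audience creator keeps roughly four and a half times as much of each dollar as the traditionally published author, which is the arithmetic behind calling 20% a steal.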

The OnlyFans creator and I are running the same kind of business. We both rejected traditional gatekeepers — publishers and studios respectively — and went directly to audiences through platforms. We both keep most of our revenue. We both own our work. We both face stigma from people invested in the gatekeeper systems we bypassed. We both saw a wave of women joining our industries because the gatekeepers were particularly hostile to women.

The only meaningful difference between us is the type of content. And the type of content is being used to legitimize my business and stigmatize hers.

That is not a moral distinction. It's a class distinction. A respectability distinction. She is doing the same thing the porn industry did for fifty years; the porn industry was tolerated because men ran it. The Playboy magazine empire was tolerated because Hugh Hefner ran it. Strip clubs are tolerated because (overwhelmingly male) owners run them. Studios are tolerated because executives run them. The friction begins specifically when women cut out the male middleman.

FOSTA-SESTA in 2018 — passed under "anti-trafficking" framing — destroyed independent escort advertising platforms exactly when independent sex workers were beginning to disrupt the pimp-and-trafficker model. Backpage and Craigslist Personals were eliminated. Trafficking did not measurably decrease. Independent sex workers lost their main tools for vetting clients and were pushed back into less safe conditions.

The pattern is consistent across decades and across countries. Restrictions get tightest exactly when women's independence gets highest.

— § —

VII. The Harm Hierarchy Is Upside Down

If we're going to restrict things in the name of public welfare, let's at least restrict the things that are actually killing people.

Cigarettes kill approximately 480,000 Americans per year, according to the CDC — a figure that includes roughly 41,000 deaths from secondhand smoke. The product has no nutritional value, no point at which you've had enough fun and stop, and is addictive from first regular use. The entire industry's revenue model depends on the addiction continuing. Nicotine on first use produces nausea and dizziness in most people, not pleasure. The "relief" smokers describe is the relief of withdrawal symptoms ending — relief from a problem the cigarette itself created. The product is a chemical addiction installed up front, then a lifetime subscription to managing the addiction. It is legal, taxed, marketed in stadiums and on television, and sold at every corner store in the country.

Alcohol kills approximately 178,000 Americans per year directly. That number doesn't include the damage that follows. About 37% of all sexual assaults in the U.S. involve alcohol. Roughly 13,000 people die annually in alcohol-related vehicle crashes, separate from the direct medical deaths. About 40% of all violent crimes committed in America involve alcohol. Around 55% of domestic violence incidents involve alcohol. Bar fights and assaults around drinking establishments generate something on the order of half a million reported incidents per year. The Super Bowl runs Budweiser ads. Every stadium in America is sponsored by a beer company. Every grocery store has an aisle of it.

Obesity-related illnesses — heart disease, type 2 diabetes, stroke, certain cancers — kill an estimated 300,000 Americans per year. Fast food and heavily processed food are not the only causes, but they are major ones. America's obesity rate is around 42% of adults, and the density of fast-food advertising in American urban and suburban environments is a well-documented contributor. Industries that profit from products people get hooked on deliberately saturate the environments where their biggest customers live. This is the same logic that places liquor stores densely in poor neighborhoods and casino advertisements heavily in working-class media.

Guns: approximately 47,000 Americans died from gun-related causes in 2023 — more than half of them suicides, the rest mostly homicides plus a smaller number of accidents. Countries with high gun ownership and tight regulation — Switzerland, the Czech Republic, Finland — do not have America's gun violence rates. The variable isn't whether guns exist. The variable is easy access combined with no real system of accountability. Annual safety inspections, ballistic registration, mandatory loss reporting, and serious consequences for transferring firearms to disqualified individuals — the same kind of accountability America already applies to cars — would not violate the Second Amendment and would dramatically reduce the body count.

Pornography consumption: the actual research consensus is that there is no clear evidence linking porn use to increased sexual violence. Some studies have found inverse correlations during periods of legalization and openness. The strongest version of the harm research is narrow — it concerns specific violent or coercive content and heavy users — and does not support the common cultural claim that porn drives sexual crime.

OnlyFans-style solo content creation: zero documented deaths. The performer is alone. There is no risk of catching anything from a partner. There is no one being pressured on set. There is no producer drugging anyone. There are no contracts that trap performers. There is no scene shot when someone changed their mind but felt unable to walk off set. From a public health, bodily autonomy, and worker rights standpoint, it is the cleanest version of sex work that has ever been technologically possible.

Roughly a million Americans die per year from the things at the top of an honest harm-ranked list. Solo sex work contributes essentially zero. Yet the cultural conversation devotes more outrage to the latter than the former. Beer ads run during the Super Bowl. OnlyFans links cannot run in an Instagram bio.

The harm ranking and the moral ranking are upside down.

— § —

VIII. The Real Thesis

Here is what this is actually about.

Modern platform morality is corporate financial strategy with a child-safety mask, and its main victims are women who figured out how to make money without a corporate middleman.

The platforms built their empires on women's bodies. Now that the empires are built, those same women are being suppressed under the language of protection. The kids are not the reason. The advertisers are not the reason. The morality is not the reason. The control is the reason. It always was.

Every wave of moral panic laws in American history — the Comstock Laws of the 1870s, the Hays Code of the 1930s, the anti-pornography crusades of the 1980s, FOSTA-SESTA in 2018, the OnlyFans link suppression of the 2020s — has been sold as protecting women and children. Every one has, in practice, mostly ended up restricting women's ability to make money and control their own bodies. What they say and what they actually do don't match. They have never matched. Because what they say isn't describing the result. It's selling it.

The harm we tolerate at huge scale — cigarettes, alcohol, fast food, easy gun access — kills nearly a million Americans a year. The harm we loudly restrict — adult content, sex work, women running their own businesses — kills essentially no one. We have organized a culture in which the things that destroy us are sponsored at halftime and the things that don't are quietly removed from public visibility.

This is not a defense of any specific industry. It is not an argument that everyone should have an OnlyFans, or that pornography is good, or that sex work is morally fine, or that the platforms should show explicit content. Those are separate values arguments and people can hold them however they want.

This is an argument about who decides, on what evidence, and in whose interest.

The current arrangement is: platforms decide, based on revenue, in their own interest, with moral language as the marketing. That is not a public-health policy. That is a corporate strategy with a press release attached.

If we want to actually protect children, we age-gate the platforms at the door. We take the Australian model seriously. We stop letting eleven-year-olds spend six hours a day in an adult attention environment and then claim we're protecting them by deleting some links.

If we want to actually reduce harm, we look at what's killing people and we regulate based on that. Cigarettes first. Alcohol marketing second. Junk food saturation third. Gun friction fourth. Honest accountability for industries that profit from products people get hooked on.

If we want to actually support women — and not just say we do — we leave the small businesses they have built for themselves alone. We stop treating their independence like a problem to be managed. We stop accepting "protection" rhetoric from institutions whose financial interests are the actual driver. We let women own their own businesses, keep their own revenue, and exist in a public square that doesn't quietly evict them every time they figure out a way through.

The bio link is small. The pattern under it is not.

The bio link is just the place the pattern got visible.

— § —

Yaznil Cross