Menopause, menstrual cups, vulva, endometriosis, abortion, and UTIs—these are essential health topics, not inappropriate content. Yet, many women’s health companies and creators report that platforms like Meta, Amazon, Google, LinkedIn, and TikTok regularly restrict or remove such content.
In response, in a powerful show of collective resistance, six women-led health-tech startups have filed formal complaints with the European Commission, accusing tech giants of systematically censoring women's health content.
The companies have cited repeated incidents of shadowbanning, content takedowns, and ad restrictions.
This action marks one of the first high-profile uses of the EU’s new Digital Services Act (DSA), legislation aimed at curbing digital harms and holding tech platforms accountable for how they moderate online content.
The complaints, coordinated by advocacy campaign CensHERship and investment firm The Case For Her, signal a turning point in the fight against digital discrimination in healthcare.
Who’s taking a stand?
The six companies leading the charge include:
- Bea Fertility (at-home fertility kits)
- Aquafit Intimate (vaginal health products)
- Geen (sexual and reproductive health education)
- HANX (sexual wellness and condoms)
- Lactapp (breastfeeding support)
- Daye (gynaecological diagnostics)
These startups allege that despite following advertising policies and sharing medically accurate content, their posts and ads are often flagged, restricted, or outright banned—while male-focused content (e.g., erectile dysfunction ads) continues to be approved without issue.
“We can’t improve women’s health if we can’t talk about our bodies using anatomically correct language. We urge the platforms to help level the playing field for women’s health content.”
-Clio Wood and Anna O’Sullivan, co-founders of CensHERship.
What are they demanding?
The coalition has made several demands to the EU Commission and tech companies:
- Investigate systemic bias in platform content moderation
- Establish transparent, fair policies for health-related content
- Provide a clear and accessible appeals process for content removals
- Create non-discriminatory advertising guidelines that allow safe, open dialogue around women’s health
A pattern of suppression
These aren't isolated incidents. A 2025 report by the Center for Intimacy Justice paints a troubling global picture. Surveying 159 businesses, organizations, and content creators working in sexual and reproductive health, the report found:
- 63% had organic content removed from Meta
- 66% had ads rejected on Google
- 64% faced takedowns on Amazon
- 55% had organic content removed from TikTok

These restrictions led to estimated revenue losses ranging from $10,000 to $5 million per company per year.

Similarly, a 2024 report by CensHERship found that 95% of respondents—ranging from doctors and charities to wellness startups—had faced some form of censorship across platforms like Instagram, TikTok, YouTube, X, and LinkedIn.
Real business impact
The effects of this censorship go beyond inconvenience. Many of these startups report serious hits to revenue and brand reach. This is particularly damaging in a sector that already struggles with limited access to capital.
“When femtech companies face digital suppression and censorship by the big tech platforms, they struggle to reach customers, limiting their ability to advertise and generate revenue.”
-Cristina Ljungberg, co-founder of The Case For Her.
Specific cases cited
Each startup has documented multiple examples:
- Bea Fertility: Content using the words "vagina" and "vaginal canal" was blocked, while "semen" was allowed
- Aquafit Intimate: LinkedIn removed educational posts about endometriosis and postpartum care
- Geen: Content about vulva diversity and gender bias was labelled “political”
- HANX: Condom ads were banned, despite platform policies allowing them
- Lactapp: Instagram muted the word "teta" (Spanish for "breast") in breastfeeding videos
- Daye: Ads for diagnostic tampons were flagged under unrelated policies
What are the platforms saying?
Amazon stated it has a “robust appeals process” and encourages sellers to contact support if they believe content was misclassified.
Google said it encourages advertisers to appeal if they disagree with ad rejections.
TikTok noted that its community guidelines apply to everyone and that content creators can appeal decisions.
Meta has not released an official statement in response to these concerns.
What’s next?
With these formal complaints submitted, the European Commission is now under pressure to investigate and take action. The startups hope the DSA will prove an effective tool for holding tech platforms accountable, not only for censorship but for the broader structural inequalities in how digital space is regulated.
This case is about more than content moderation. It’s about women’s access to healthcare, entrepreneurs’ access to fair markets, and society’s ability to talk openly about the human body without stigma or shame.
As this unfolds, it may reshape the future of femtech, digital policy, and how we define fairness in the online health ecosystem.
-By Rinkle Dudhani and the AHT Team