Why TikTok's For You page isn't a safe space for women


TikTok's For You page has revolutionised the way we consume content online. Creators catapult to virality overnight, and algorithms dictate the videos we consume, rather than our following lists. This is a positive for creators and influencers in some ways, as it can make it simpler to find new and relevant audiences and grow. But as clever as these algorithms are, they don't always get it right. And it's all too easy for a video to end up on the screen of the wrong person.
This is a particular concern for female creators, with the rise of misogyny online. In fact, 60 percent of teen girls in the U.S. have encountered harassment on social media platforms, according to a 2023 survey by Pew Research Center. In the UK, the Office for National Statistics found that 23 percent of women aged 16 to 24 have reported incidents of sexual harassment, with the UK government recently recognising violence against women, which includes online abuse, as a national threat.
Dealing with misogyny online is difficult for anyone, but it becomes a particular issue when posting content online is your job. Not only do you rely on the income you make on social media, but you also must spend a lot of time on these apps. When you're dealing with misogynistic abuse on a daily basis, going online becomes unenjoyable, unsafe, and a threat to your career.
Misogyny on the FYP
When influencer and author Hari Beavis's recipe content and vlogs started to go viral on TikTok, she quickly became subject to abuse. "If one of my videos hits a male audience, I will not look at the comments because I know there will be hate or misogynistic comments," she tells Mashable. This includes comments sexualizing her body: "There are endless comments about my breasts on some videos," she says.
Beavis isn't alone in her experiences, either. Eliza Hatch is the founder of Cheer Up Luv, a feminist platform she launched in 2017 to address the public sexual harassment women face across the world. But it was on her personal social media page, not that platform, that Hatch found herself subject to an extreme amount of online misogyny after she started posting videos of herself skateboarding. "When I started posting about my skateboarding journey on TikTok, I initially received a lot of negative comments from men telling me how shit I was and that skateboarding isn't for girls," she says. "I was really surprised at that level of vitriol I was experiencing just from posting about my personal experiences of skateboarding."
One of the reasons online misogyny is such a concern right now is because of the rise of incel culture online, and social media algorithms play a big part in this. A 2024 algorithmic study by University College London found that the amount of misogynistic content shown to teen users on TikTok could increase by 400 percent in just five days, which has a huge knock-on effect for women sharing content online.
Certain incel groups online promote networked misogyny, in which high-profile figures encourage others to actively harass women online, explains Dr Carolina Are, a social media researcher at Northumbria University's Centre for Digital Citizens.
"Some of them may just be reacting to content they see, while others may search keywords or be part of groups where content is dropped to trigger harassment," Dr Are says, adding: "This is to destabilise individual creators, but it's also more sinister: it's an active campaign to silence specific users, and to create a homogenous idea of acceptable femininity based on archaic gender roles."
One of the issues with algorithms that push recommended content, rather than showing your posts only to people who have chosen to follow you, is that it's almost impossible to keep your posts away from audiences you don't want to reach. Algorithms are also driven by engagement, and it doesn't matter why or how someone engages with the content. When someone who uses a social media app to promote misogyny sees and engages with a video, it's pushed out to a similar audience because the algorithm predicts they'll engage with it too. This becomes a toxic cycle in which the creator is suddenly exposed to a huge amount of trolling.
Furthermore, many people use apps like TikTok anonymously. "On TikTok, a lot of people have faceless accounts because not everyone makes and posts TikToks, so they feel they can say anything they want, and in my case, it's always from men about my body," Beavis says. Hatch agrees that she feels safer on apps where she's interacting with people who have chosen to follow her: "On Instagram, I've got such an amazing community that I always feel protected — everyone stands up for each other," she says.
"A lot of accounts, particularly on platforms like TikTok, are anonymous and just exist to lurk, either for harassment purposes or because they're not interested in creating content," Dr Are explains. "Without a stable identity, these users aren't encouraged to act in a civil way."
Making social media safer for women
Finding safe spaces online is therefore becoming increasingly difficult for women, as more and more social media apps switch to algorithmic feeds built around recommended posts. For Beavis, this meant she stopped posting on TikTok for a period.
"I didn't post on TikTok for a few months as it really got me down and I didn't see the point in posting if my videos were just going to an audience that [was] going to objectify me," she says. It was also one of the reasons she made a private community specifically for women, called The Big Tittie Committee. As you might have guessed from the name, the community is designed for women with larger breasts, sharing tips, recommendations, and experiences. For Beavis, the obvious decision was to make this account private with an admissions process. "I wanted this to be my private female empowerment page — a place without men on social media," she says.
Aside from making private accounts, there's no simple solution to eradicating this type of misogyny from social media for the creators experiencing it. But is there any way of making these algorithms safe for women? "I don't think these mechanisms are inherently harmful because they are similar to offline life — they just act on a much more heightened scale and with [fewer] checks and balances," Dr Are says. "If I have one person on the street harassing me, it's just one person and I can hopefully report them to the police," she continues. "If I go viral, it's like a whole city is harassing me all at once without any consequences for them and no protection for me."
Relying on ourselves, not platforms
According to the TikTok Community Guidelines, misogyny is classified as a hateful ideology. When asked for comment, a TikTok spokesperson echoed some of this definition and told Mashable: "Misogyny is a hateful ideology that is not tolerated on TikTok, and we remove 88 percent of content found to break these rules before it is reported to us." The app will take action against comments that violate its policies.
But the creators Mashable spoke to noted that they feel as though a lot of the responsibility falls on them to manage these comments themselves, by deleting and reporting them. "In my own experience, following some of my videos going viral, I was the one who had to insert keywords such as 'rape' in my comment filters," Dr Are says.
The added responsibility of moderating an overwhelmingly hateful comment section is a particular issue when social media is your main source of income: creators must choose between continuing to earn money by posting or refraining from doing so to protect themselves from misogyny. "With a larger following, as it grows, you become susceptible to more unwanted comments," says Beavis, who has over 300,000 followers on Instagram.
It's unfortunate, given that influencing is a career that's dominated by women (an estimated 77 percent of influencers globally are women), that online spaces feel far from safe. "Social media can be so amazing in so many ways, but for a woman, it can be really destructive," Beavis says. Hatch agrees that as much as she's tried to ignore the abuse, it has changed her experience of using social media: "I definitely don't post as much about skateboarding anymore — it has had a lasting impact," she says.
Ultimately, all of this goes to show just how novel algorithms like the For You page are, and the potential dangers of relying on automated systems, as well as artificial intelligence, to keep us safe online. And it leads us to ask: Is there a world in which social media platforms can prioritise healthy, meaningful usage and interactions over the pursuit of maximal, and ultimately unhealthy, levels of engagement? The idea already feels utopian, but it's essential if we're to hope for a feminist digital world.