As many of us grapple with another four weeks of lockdown, social media has proved a welcome escape from the mundane realities of everyday life. When you can’t go outside and travel remains off the cards, sometimes only endless scrolling seems to fill the void, or at least deliver a chuckle or two. But for all the dance trends and memes that platforms like Instagram and TikTok have provided since the pandemic came to grip the world, there is a darker side to these platforms that deserves attention.
A joint investigation by triple j Hack and Four Corners has revealed that TikTok can expose users to pro-eating disorder content with real-life harms, while simultaneously appearing to ‘shadow-ban’ other content it deems problematic. As the report from journalists Avani Dias, Jeanavive McGregor and Lauren Day notes, “the TikTok algorithm is exposing Australians to dangerous content while controlling which people and political movements get users’ attention.”
As anyone who frequents TikTok will know, you don’t pick the videos you watch. Instead, they are served to you in an endless stream. It might seem innocent enough, with the app claiming that each feed is unique to the user, but things get problematic when a set of computerised instructions works out what you want and gives you more of it, drawing on your location, gender, age and even facial data.
The report finds that the algorithm is more powerful than anything the world has seen before, and that its design, which ensures users never really reach the end of the content, makes the cycle very hard to break. In the investigation, the reporters spoke with two young Aussie women who had pro-eating disorder content repeatedly appear in their feeds. As one 19-year-old told them, “Before TikTok, calorie counting had never crossed my path.”
Another 22-year-old, who had been in and out of hospital with an eating disorder over the past five years, explained, “As I got sicker and I got more obsessive, all I could do was just flick through my phone and look at this footage.” She added, “I spent hours on it and just fixated on it.”
Despite TikTok’s claims that mechanisms are in place to stop the spread of this kind of content, it appears users can get around these safeguards. Currently, searching for terms related to eating disorders doesn’t return any actual videos, but instead links to the Butterfly Foundation’s helpline. “Our teams consult with NGOs and other partners to continuously update the list of keywords on which we intervene,” a TikTok spokesperson told the ABC. The app also bans “content depicting, promoting, normalising, or glorifying activities that could lead to suicide, self-harm, or eating disorders.”
Still, by using deliberate misspellings and coded language, users are able to navigate around these safety measures, allowing harmful content to keep popping up in people’s For You pages. It’s estimated that 4 per cent of Australians – or roughly one million people – are affected by eating disorders. Of these, almost two thirds (63 per cent) are thought to be female. Teenagers with eating disorders are more likely to experience poor mental health and impaired functioning in social environments, and research has repeatedly shown that social media can exacerbate these issues, flooding teens with filtered images of “ideal” body types that are often unattainable.
Interestingly, on the flip side, while the app fails to moderate this harmful content, it can easily hide content that constructively discusses issues like racism and disability. Perth TikToker Unice Wani told the publication that despite having over 595,000 followers, her videos did not perform well when she spoke about race and racism as a Black woman. “You tend to get a lot of shadow bans for speaking up about stuff such as racism,” she told Four Corners. “I guess they focus more on the white girls dancing and stuff like that.” It comes after a number of Black creators from the US went on strike earlier this year, protesting the fact that white TikTokers were becoming hugely successful off their dance moves, while the same success rarely extended to the creators themselves.
To read the full investigation, visit the story at the ABC’s official website.
If you need support, give Butterfly Foundation a call on 1800 33 4673 or chat online.
If you are in distress, please call Lifeline on 13 11 14 or chat online.
Under 25? You can reach Kids Helpline at 1800 55 1800 or chat online.