Social media accelerates trolling—just look at Raygun. How can we stop viral moments from spreading so quickly?

For Australian breakdancer Rachael “Raygun” Gunn, the 2024 Paris Olympics were marred by a flood of online trolling. Her performance sparked fierce reactions on social media.

Viral videos and memes mocking Gunn’s unconventional style spread quickly, along with a torrent of racist, sexist and body-shaming comments. Trolls accused her of “dishonoring” the Olympics and called for her disqualification. Days after the event ended, anonymous online critics continued to attack Gunn’s looks, talent and identity.

The Raygun incident is a perfect example of how spectacular events, the design of social media platforms and human psychology can reinforce one another and create a storm.

The factors that lead to trolling are complex, but they also point us to a solution.

Algorithmic acceleration

The crux of the problem lies in the way social media platforms are designed. Algorithms that prioritize engagement and virality can amplify and spread content—good or bad—quickly.

This amplification can turn an initial handful of trolls into a mob. Posts and memes that elicit strong emotional reactions, whether positive or negative, are more likely to be shared and commented on, increasing their momentum and reach. As criticism of Raygun spread across platforms, more and more people joined in.
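
To make the mechanism concrete, here is a minimal sketch in Python of an engagement-first ranking rule. The Post fields, the weights and the example posts are all invented for illustration; no platform publishes its real ranking code, but most rank on engagement signals of this kind.

```python
# A minimal, hypothetical sketch of an engagement-first feed ranker.
# Reactions of any kind, positive or negative, raise a post's score
# and therefore its reach.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    shares: int
    comments: int

def engagement_score(post: Post) -> float:
    # Assumed weights: shares and comments (the reactions outrage tends
    # to drive) count for more than passive likes.
    return post.likes * 1.0 + post.comments * 3.0 + post.shares * 5.0

def rank_feed(posts: list[Post]) -> list[Post]:
    # The feed simply orders posts by engagement, with no notion of
    # whether that engagement is admiration or abuse.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Great routine at the Olympics!", likes=120, shares=4, comments=10),
    Post("Mocking meme about the routine", likes=80, shares=60, comments=150),
])
print([p.text for p in feed])  # the mocking meme ranks first
```

Because the score is blind to sentiment, a wave of mocking comments and shares pushes abusive content up the feed just as effectively as praise would.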

Such “algorithmic acceleration” creates echo chambers where extreme views are normalized. In an echo chamber, users are shown only information that reinforces their existing beliefs.

This can amplify negative sentiments and create a more hostile online environment. Dissenting voices are often silenced.

Anonymity, disinhibition and dark leisure time

The relative anonymity of the online space also plays an important role. People feel emboldened to say things they would never dare to say in person. This “online disinhibition effect” reduces their restraint and empathy and makes trolling more likely.

Surprisingly, “fun” also plays an important role in trolling. Trolls are not only motivated by malice. For some, trolling is a dark form of recreation.

They find enjoyment in humiliating others, inflicting pain and provoking reactions from victims and onlookers. This often leads trolls to seek out topics and spaces where they can “play” and indulge their sadistic tendencies.

Meme culture and virality

The viral spread of content such as memes is another typical feature of online trolling. Trolls weaponize the visual language of memes, creating satirical or derogatory images that can be easily shared and remixed.

These can quickly take on a life of their own, become ingrained in the cultural consciousness and make the original abuse even more pervasive.

In Raygun’s case, trolls took her outfit and moves and turned them into endlessly reproduced caricatures. Such meme-driven harassment reinforces the personal nature of the attacks.

How can we combat trolling?

Combating online trolling is a complex challenge that requires platforms, policymakers and communities to tackle it from different angles.

Technological solutions

Improved moderation tools. Social media platforms are investing in AI-powered tools to detect and remove malicious content, but these tools can struggle with context, leading to false positives and false negatives (see the sketch after this list).

Improved user reporting and blocking features. To help people manage their online interactions more effectively, we need better reporting mechanisms and the ability to mute and block trolls.
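
To see why context matters, here is a deliberately naive, purely illustrative keyword filter. Real moderation systems are far more sophisticated than this, but they can fail in analogous ways.

```python
# A toy illustration (not any platform's actual system) of why
# context-blind moderation produces both false positives and false negatives.
BLOCKED_TERMS = {"disgrace", "embarrassment"}

def naive_filter(comment: str) -> bool:
    """Return True if the comment would be removed."""
    words = set(comment.lower().split())
    return bool(words & BLOCKED_TERMS)

# False positive: a supportive comment quoting the abuse gets removed.
print(naive_filter("Calling her an embarrassment is unfair, she was brave"))  # True

# False negative: sarcastic abuse with no blocked word slips through.
print(naive_filter("Wow, truly world-class talent on display there"))  # False
```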

Behavioral change

It is equally important to consider the psychology of human behavior. If we want people to change their behavior online, who should we target and how?

Psychological support services. Providing mental health resources to individuals who are victims of online abuse can help mitigate the personal impact of trolling.

Awareness campaigns. Raising awareness of the impact of trolling and promoting digital literacy can help users and society in general navigate the internet more safely. This includes understanding how algorithms work and recognizing the signs of echo chambers.

Involve those around you in a positive way. Research highlights the critical role of bystanders in curbing harmful behavior, including trolling. When trolls cannot be stopped directly, a social media community can set rules and expectations, and engaged bystanders can call out abuse, setting a tone that deters it.

Regulatory measures. Policymakers should consider introducing strict regulations to deter individuals from engaging in online harassment and trolling.

All of these changes can be difficult to implement. It’s hard to change human behavior on a large scale, and providing comprehensive psychological support requires resources. But the measures above would likely deter occasional trolls from engaging in this nasty behavior.

Where to from here?

The above solutions can provide some relief, but they have limitations. Awareness campaigns may not reach all users. The global and decentralized nature of the internet makes it difficult to enforce regulations consistently across jurisdictions. Bystanders may be unwilling to engage.

To combat trolling, technology companies, governments and ordinary internet users must work together to create safer online environments.

This includes developing more sophisticated AI moderation tools, encouraging cross-platform collaboration to curb trolling, and fostering a culture of accountability and respect online.

Gunn has spoken about the “devastating” impact of the online hate she has experienced in the wake of the Olympics. It’s just one incident among many, but it highlights the urgent need for better ways to combat toxic online behavior on a global scale.
