Cyberbullying: Is Your Brand Safe by Design?

Last Updated: December 16, 2021

The future of branding is “taking a stand,” according to customer experience analyst Brian Solis. Larger consumer brands must consider the type of brand experience that social media platform users will have as they move through their day. Carlos Figueiredo, Director of Community, Trust & Safety at Two Hat Security, validates the theory that brands need to think about how they are viewed in the context of the overall conversation happening in a social community. Do they want to be aligned with a positive user experience, or are they okay with toxicity and negativity being associated with their ad or logo? Carlos recently spoke at the International Bullying Prevention Conference in Chicago and has offered marketers these key takeaways.

Snapchat, Kik and Facebook have all openly struggled with issues of cyberbullying on their platforms, and while they may be some of the biggest and most ubiquitous, they are hardly alone. Social networks, game communities, and even online schools contend with cyberbullying, incitement to self-harm, solicitation of nude images, and other threatening online behaviors to which children often fall victim. Some of these platforms collaborate openly to try to solve the problem; others prefer to work quietly behind the scenes, perhaps unsure how to confront the problems in their communities.

Learn More: Is It Time to Rethink Corporate Websites — Again? 

In the near future, platform operators including online game communities and messaging apps could be forced into greater levels of transparency. Public reporting for all operators may become law in the United Kingdom, and historic fines are now being levied against social networks in the EU and elsewhere for allowing online harms to occur in their communities. Network operators and others are being held responsible for bad experiences that users suffer on their servers. Could a brand sponsoring that experience also be held responsible? What if the brand is the creator of the experience and the app or network is simply the host in the eyes of the law?

Clarifying these somewhat complex relationships and determining how to assign responsibility are daunting challenges, as are the potentially serious legal and financial penalties for networks that fail to address cyberbullying and illegal content on their platforms. But the real discussion brands need to have about online harms is not about risk; rather, we need to talk about the opportunity created when the user experience is designed for enjoyment and safety. 

The New Duty of Care

As a matter of law, Duty of Care in physical spaces is well established: the owners and operators of places like nightclubs or stadiums are held reasonably responsible for the safety and security of their patrons. In a place like a school or a public park, where children are present, the Duty of Care on the operator is both different and greater. Online, the concept of Duty of Care is increasingly applied in much the same way.

Governments in London, Sydney, Washington, Paris and Ottawa are considering or introducing new laws, financial penalties and even prison time for those who fail to stop or prevent online harms. This has caused major platform operators to begin redesigning products – or the entire way they build new products – by adopting a new twist on an old principle: Safety by Design. Australia’s Office of the eSafety Commissioner, led by Julie Inman Grant, is a leader in introducing Safety by Design to digital communities. Brands that wish to be at the vanguard of protecting children from online bullying should consider adopting the same principle when planning the user’s experience with their brand.

Safety by Design

A future internet where cyberbullying is no longer a problem is achievable, but to get there we must consciously design and purposefully create each experience to inherently minimize the opportunity for bullying (or abuse, extortion, etc.).

The challenge is that currently neither brands nor large social platforms are organically inclined towards Safety by Design — they are, in fact, hardwired for Growth by Design. To bridge the gap, bear in mind that safety and profit are not mutually exclusive. In fact, they are interdependent. 

We need to adapt the way brands create products and experiences.

Chat filters are a good start
As a starting point, every experience your brand creates, sponsors, or aligns with should have standard filters in place to protect users against cyberbullying, hate speech, terrorist imagery and propaganda that incites violence, child sexual abuse imagery, and any encouragement of suicide or self-harm. Absent these filters, even the most positive and engaging brand messages will be lost or negatively recalled.

Chat and image filters are available by API and offer support for 20-plus languages. They work across the web, as well as in apps, and even in encrypted environments. Requiring these filters on every brand experience is an early to-do on your Safety by Design checklist, but there are many more associated tasks, and you will need help to complete them.
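To make the checklist item concrete, here is a minimal, purely illustrative sketch of what a chat-filter check looks like. Real moderation services (such as Two Hat’s) expose this through a hosted API backed by trained multilingual models; the category names, phrases, and functions below are hypothetical stand-ins, not any vendor’s actual interface.

```python
# Toy chat filter: keyword matching per harm category.
# A production API would use trained models, context, and 20+ languages;
# this only illustrates the allow/block decision flow.

HARM_CATEGORIES = {
    "bullying": {"loser", "nobody likes you"},
    "self_harm": {"kill yourself"},
}

def classify_message(text: str) -> list:
    """Return the harm categories a message matches (empty list if clean)."""
    lowered = text.lower()
    return [
        category
        for category, phrases in HARM_CATEGORIES.items()
        if any(phrase in lowered for phrase in phrases)
    ]

def moderate(text: str) -> dict:
    """Decide whether a message is allowed or blocked before it is shown."""
    matches = classify_message(text)
    return {
        "text": text,
        "categories": matches,
        "action": "block" if matches else "allow",
    }
```

In a brand experience, a check like `moderate()` would sit between message submission and display, so harmful content never reaches other users in the first place.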

It’s not an in-house job
Your brand may be well known and far-reaching, with deep pockets and excellent technical competency. But even if you spend millions every day on content moderation, as Facebook does, keeping cyberbullying and child sexual predators out of every digital brand experience is not something you can expect to accomplish alone. Why? It takes years of developing datasets and training AI models to identify subversive text and images. It takes international cooperation to standardize definitions of online harms and draft effective global policy. Simply put, it takes a village to protect our children.

Brands are savvy when it comes to acquiring marketing and advertising technologies, but where does content moderation sit in that ecosystem? Will brands demand that major ad networks guarantee only vetted, safe content around their brand? Will agencies position themselves as the Safe by Design creative engineering alternative for brands that want to lead the way in bullying prevention? Will purposefully built Safe by Design brands become the standard? With the trend toward future enforcement and increased consumer awareness, the market will respond, and it likely won’t take long.

Learn More: The Secret to Great UX: Put the User Last

In summary

While brands are not yet legally compelled to adopt Duty of Care principles, the social pressure from consumers tired of feeling bullied or harassed online is tremendous. Brands have the option to remain static and let social networks and app makers worry about cyberbullying on their platforms, but they also have the opportunity to create a new kind of experience based on authentic concern and available advances in technology for the safety and well-being of users. Proactively addressing these truths will benefit both consumers and the brands with which they interact.

Carlos Figueiredo

Director of Community, Trust & Safety, Two Hat Security

Committed to cultivating industry collaboration and fostering strong communities around digital citizenship and online safety, Carlos has spent the last ten years (including six at Disney Interactive Studios) keeping online communities healthy. A co-founder of the Fair Play Alliance, where he serves on the Steering Committee, he currently works alongside brilliant people at Two Hat Security and partner organizations to tackle the big online behavior challenges of our connected times. Carlos moderated a panel at the Game Developers Conference 2018, spoke on panels at the LA Games Conference 2018 and the FOSI Conference, and was a speaker at both RovioCon 2018 and Game UX Summit 2018.