Dec 12, 2018 | Uncategorized

Anonymity on the Internet: The Perils of Privacy


Guardians of the Internet

Aside from your money and your information, there is one more deeply personal thing the internet can strip away from you: your mental stability.

With millions of people connected to the internet, there's no doubt that some of them will represent the more heinous side of humanity: a side lacking the most human of capacities, such as empathy, sympathy, or even a conscience. And there's no doubt that these soulless individuals jump at the chance to show the world how cruel they can be.

While we might never understand the intentions of this particular demographic beyond spreading hate, fear, or pain, we're lucky to have content moderators who keep our news feeds from becoming tainted with this gut-wrenching material.

From beheadings, dick pics, animal cruelty, and murders carried out by child soldiers, to far worse scenes that you couldn't, and wouldn't want to, imagine, content moderators filter out the bad so we can enjoy the good. But while the general population remains blissfully ignorant of the atrocities circulating across the world wide web, the moderators whose job it is to assess this material are affected in an entirely different way.

Imagine sitting at a computer every day, for 8-9 hours, sifting through some of the sickest and most twisted ideations to come from the minds of even sicker, even more twisted individuals. It goes without saying that the resulting psychological and emotional toll is high. The problem is, so is these moderators' importance to society.

Psychological counseling and high turnover rates

At industry giants like YouTube and Facebook, where user submissions pour in by the thousands every day, the reliance on content moderators is extremely high. Ever since the 2016 election scandal, in which Russian efforts to influence Donald Trump's campaign came to light, content moderation has only grown in importance.

YouTube also has a battered history of letting unfiltered content slip past its moderators. Take, for example, Staci Burns and her 3-year-old son Isaac, who just last year happened across a demented version of the children's animated series "Paw Patrol," in which the characters demonstrated different ways to commit suicide. The difference with YouTube, however, was that it relied more heavily on a filtering algorithm than on actual human beings to determine what is and isn't suitable for viewing. But given the new attention being paid to content moderation, the company is hurrying to bring more real-life people on board.

The problem is, there's a serious shortage of people willing to become, or remain, content moderators. But after watching "a war victim being gutted" or "a cat thrown into a microwave," how can you blame them? Companies like Facebook and Microsoft have even offered psychological counseling for their moderators, but many find it isn't enough. Some moderators have even filed lawsuits claiming they weren't properly protected or given effective ways to cope with the onslaught of disturbing content they had to witness. All of which are issues that marketing companies in Texas don't really have to deal with.

In other cases, moderators become desensitized to graphic material through the sheer volume of content they review. Sarah Katz, a former moderator for Facebook, said she grew numb from the experience. She also noted that for accounts more than 30 days old, only the offending content would be taken down rather than the whole account, pointing to yet another possible flaw in Facebook's moderation procedures.

Perspective from an agency

As regular audience members, parents, and civic-minded beings, we're extremely appreciative of content moderators and their efforts to protect us. But as a digital marketing firm in Austin, our relationship with moderators is a little more strained.

Even though most of our clients have products that the general population's soft-as-s*** psyches can handle, some of the people we work with don't sit well with others. While this could spin off into a whole other conversation about the evolution of PC culture and how it hurts mankind more than it benefits it, our discussion revolves around how our media advertising agency interacts with moderators. Actually, let me rephrase: our discussion revolves around how moderators TORMENT us.

One of our most loyal clients, Austin Labiaplasty and Vaginal Rejuvenation (ALVR), is a vaginal reconstructive surgery practice, and despite its desire to contribute to society and empower women, it faces a never-ending barrage of scrutiny from content moderators. So while your intentions might be good, it's left up to the moderators and network guidelines to determine whether your content lives or dies.

Another client our digital marketing firm in Austin has faced censorship issues with is a tantric sex coach by the name of Sarrah Rose. Promoting Tantric Activation has proven even more difficult than ALVR, as the moderators filter our activity even more aggressively than before. While this might be frustrating from a media advertising agency's point of view, as creatives and problem solvers, we embrace it.

Great problems drive great solutions

For our copy team, it was difficult to craft messaging and content that promoted ALVR's services without getting flagged. But rather than wallow in defeat, the challenge led to smarter, quicker copy that got the message across in a far more coy and subtle manner. Overall, the content was better than it would have been had the moderators never been considered. It pushed us to find a better way to advertise a product, which, as a digital marketing agency, is one of our main objectives.

The situation with Tantric Activation, however, was a little more difficult. Since moderators also factor in where advertisements lead and the content of landing pages, we had to find a new platform altogether, one where we could advertise with fewer restrictions. This led us to Reddit, where we found a thriving marketplace for sex coaches and tantric sex courses. Had we not been forced to consider moderators, we never would have found this untapped revenue stream.

In conclusion, we would like content moderators, and our current population in general, to stop being such pussies. But until that day comes, we will keep finding ways to beat the algorithms and get our content through the filters, because we stand behind the products we promote. We genuinely believe they can improve people's lives and benefit our society. Otherwise, we'd never work with them.

Our media advertising agency looks at issues like these as the crest of a hill: once we get over it, we're charging downhill at an extraordinary pace. And that's how we perceive every problem in our digital marketing world, as just another trail that leads to a solution.

So the next time you're grumbling about internet censorship, just remember: it's so little Sophia doesn't watch ISIS recruitment videos. And ask yourself, can the system be beaten?
