29th June 2017
San Francisco intern Max Reichardt considers the immense responsibilities that come with immense scale.
As many of us are now aware, Facebook recently surpassed 2 billion users - roughly 27% of the world’s population. Milestones aside, what might seem like just a newsworthy statistic is more than mere spectacle. It serves as a reminder of the burgeoning presence - and growing influence - of social networks in our lives, and in the world at large.
For any company, corporate social responsibility (CSR) is an important part of marketing, public relations, and simple moral fiber. For a behemoth like Facebook, CSR assumes new meaning; Facebook must consider its massive role in facilitating the communications of over a fourth of the world, in addition to following the simpler elements of CSR, such as philanthropy and ethical advertising practices.
Ethics, in Facebook’s realm, are embodied in its choices about what users can see, hear, or post; luckily, we still determine what we think - though there is no shortage of influence on that front, as well. Facebook faces a unique challenge in determining what appears in each user’s News Feed, whether that news is “fake” or not.
Across conventional accounts for people, groups, and businesses, Facebook’s rules are fairly clear-cut: no spam, nothing illegal, nothing pornographic, and restrictions on terror and hate speech. However, what constitutes “hate speech” - and what Facebook’s role in preventing it should be - has proved complex and controversial.
During the 2016 presidential election, criticism of its “Trending Topics” algorithm surfaced. The trending function was not only accused of a liberal programming bias but was also prone to showing users more of the stories they seemed to like. While the latter is a seemingly harmless oversight - by a poor AI only trying to do its job, mind you - such a model invites confirmation bias, which can restrict the free, open thought and discussion that drive a healthy democratic election.
Facebook has since modified the algorithm, which now offers users a choice among a variety of left- and right-leaning publications for fairness’ sake. Authenticity and relevance of content are Facebook’s aims for each user, though the subjective analysis behind such determinations makes this problem difficult to solve at Facebook’s global, governmentally diverse scale.
As Facebook inevitably grows, so do its unique challenges and ethical considerations. Creating a communications platform for a quarter of the world is no small feat, and maintaining it is no easy task, but the company must face the challenge it has created - lie in the bed it made, so to speak. Until this great ethical challenge of our time is solved, most of us will likely continue to use the world’s largest social network to circulate memes, watch videos of cute animals, and post about our lunches. Bon appétit.
Max Reichardt is an undergraduate at California Polytechnic State University, San Luis Obispo, and is participating in Grayling’s summer internship program.