29th June 2017
Grayling San Francisco intern Max Reichardt considers the immense responsibilities that come with immense scale.
As many of us are now aware, Facebook recently surpassed 2 billion users, roughly 27% of the world's population. Milestones aside, what might seem like just a newsworthy statistic is more than mere spectacle: it serves as a reminder of the burgeoning presence, and growing influence, of social networks in our lives and in the world at large.
For any company, corporate social responsibility (CSR) is an important part of marketing, public relations, and simple moral fiber. For a behemoth like Facebook, CSR assumes new meaning: beyond simpler elements of CSR like philanthropy or ethical advertising practices, Facebook must consider its massive role in facilitating the communications of over a fourth of the world.
Ethics, in Facebook’s realm, are embodied in choices about what users can see, hear, or post; luckily, we still determine what we think, though there is no shortage of influence on that front as well. Facebook faces a unique challenge in determining what appears in each user’s unique News Feed, whether that news is “fake” or not.
Across conventional accounts for people, groups, and businesses, Facebook’s rules are fairly clear-cut: no spam, nothing illegal, nothing pornographic, and restrictions on terror and hate speech. However, what constitutes “hate speech,” and what Facebook’s role in preventing it should be, have proved complex and controversial.
During the 2016 presidential election, criticism of Facebook’s “Trending Topics” algorithm surfaced. The trending function was accused not only of liberal programming bias, but also of showing users more of the stories they seemed to like. While the latter is a seemingly harmless oversight (by a poor AI only trying to do its job, mind you), such a model invites confirmation bias, which can restrict the free, open thought and discussion that drives a healthy democratic election.
Facebook has since modified the algorithm, which now gives users a choice between a variety of left- and right-wing publications for fairness’ sake. Authenticity and relevancy of content are Facebook’s aim for each unique user, though the subjective analysis behind such determinations makes this problem difficult to solve at Facebook’s global and governmentally diverse scale.
As Facebook inevitably and exponentially grows, so do its unique challenges and ethical considerations. Creating a communications platform for a quarter of the world is no small feat, and maintaining it is no easy task, but Facebook must face the challenge it has created, and lie in the bed it has made, so to speak. Until this great ethical challenge of our time is solved, most of us will likely continue to use the world’s largest social network to circulate memes, watch videos of cute animals, and post about our lunches. Bon appétit.
Max Reichardt is an undergraduate at California Polytechnic State University, San Luis Obispo, and is participating in Grayling's summer internship program.