29th June 2017
San Francisco intern Max Reichardt considers the immense responsibilities that come with immense scale.
As many of us are now aware, Facebook recently surpassed 2 billion users, roughly 27% of the world's population. Milestones aside, what might seem like just a newsworthy statistic is much more than mere spectacle. It serves as a reminder of the burgeoning presence, and growing influence, of social networks in our lives and in the world at large.
For any company, corporate social responsibility is an important part of marketing, public relations, and simple moral fiber. For a behemoth like Facebook, CSR assumes new meaning: beyond simpler elements of CSR like philanthropy and ethical advertising practices, Facebook must consider its massive role in facilitating the communications of over a quarter of the world's population.
Ethics, in Facebook’s realm, are embodied in its choices about what users can see, hear, or post; luckily, we still determine what we think. But there is no shortage of influence there, either. Facebook faces a unique challenge in determining what appears in each user’s News Feed, whether that news is “fake” or not.
Between conventional accounts for people, groups, and businesses, Facebook’s rules are fairly clear-cut. No spam, nothing illegal, nothing pornographic, and restrictions on terror and hate speech. However, what constitutes “hate speech” and Facebook’s role in preventing it has proved complex and controversial.
During the 2016 US presidential election, criticism of Facebook’s “Trending Topics” algorithm surfaced. The trending function was not only accused of a liberal programming bias, but was also prone to showing users more of the stories they seemed to like. While the latter is a seemingly harmless oversight (by a poor AI only trying to do its job, mind you), such a model invites confirmation bias, which can restrict the free, open thought and discussion that drive a healthy democratic election.
Facebook has since modified the algorithm, which now offers users a choice among a variety of left- and right-leaning publications for fairness’ sake. Authenticity and relevance of content are Facebook’s aim for each user, though the subjective analysis behind such determinations makes this a difficult problem to solve at Facebook’s global, governmentally diverse scale.
As Facebook inevitably and exponentially grows, so do its unique challenges and ethical considerations. Creating a communications platform for a quarter of the world is no small feat, and maintaining it is no easy task, but Facebook must face the challenge it has created, and lie in the bed it has made, so to speak. Until this great ethical challenge of our time is solved, most of us will likely continue to use the world’s largest social network to circulate memes, watch videos of cute animals, and post about our lunches. Bon appétit.
Max Reichardt is an undergraduate at California Polytechnic State University, San Luis Obispo, and is participating in Grayling's summer internship program.