29th June 2017
San Francisco intern Max Reichardt considers the immense responsibilities that come with immense scale.
As many of us are now aware, Facebook recently surpassed 2 billion users - roughly 27% of the world's population. Milestones aside, what might seem like just a newsworthy statistic is much more than mere spectacle. It serves as a reminder of the burgeoning presence - and growing influence - of social networks in our lives, and in the world at large.
For any company, corporate social responsibility (CSR) is an important part of marketing, public relations, and simple moral fiber. For a behemoth like Facebook, CSR assumes new meaning; Facebook must consider its massive role in facilitating the communications of over a fourth of the world, beyond simply ensuring it follows more conventional elements of CSR such as philanthropy and ethical advertising practices.
Ethics, in Facebook’s realm, are embodied in its choices about what users can see, hear, or post; luckily, we still determine what we think. But there is no shortage of influence there, either. Facebook faces a unique challenge in determining what appears in each user’s unique News Feed, whether that news is “fake” or not.
For conventional accounts - people, groups, and businesses - Facebook’s rules are fairly clear-cut: no spam, nothing illegal, nothing pornographic, and restrictions on terrorist content and hate speech. However, what constitutes “hate speech,” and what Facebook’s role in preventing it should be, have proved complex and controversial.
During the 2016 US presidential election, criticism of Facebook’s “Trending Topics” algorithm surfaced. The trending function was not only accused of a liberal programming bias, but was also prone to showing users more of the stories they seemed to like. While the latter is a seemingly harmless oversight - by a poor AI only trying to do its job, mind you - such a model invites confirmation bias, which can restrict the free, open thought and discussion that drive a healthy democratic election.
Facebook has since modified the algorithm, which now gives users a choice between a variety of left- and right-leaning publications for fairness’ sake. Authentic, relevant content for each unique user is Facebook’s aim, though the subjective analysis behind such determinations makes the problem difficult to solve at Facebook’s global scale, across diverse governments and jurisdictions.
As Facebook inevitably and exponentially grows, so do its unique challenges and ethical considerations. Creating a communications platform for a quarter of the world is no small feat, and maintaining it is no easy task, but the company must face the challenge it has created - lie in the bed it has made, so to speak. Until this great ethical challenge of our time is solved, most of us will likely continue to use the world’s largest social network to circulate memes, watch videos of cute animals, and post about our lunches. Bon appétit.
Max Reichardt is an undergraduate at California Polytechnic State University, San Luis Obispo, and is participating in Grayling's summer internship program.