29th June 2017
San Francisco intern Max Reichardt considers the immense responsibilities that come with immense scale.
As many of us are now aware, Facebook recently surpassed 2 billion users - roughly 27% of the world's population. Milestones aside, what might seem like just a newsworthy statistic is much more than mere spectacle. It serves as a reminder of the burgeoning presence - and growing influence - of social networks in our lives, and in the world at large.
For any company, corporate social responsibility is an important part of marketing, public relations, and simple moral fiber. For a behemoth like Facebook, CSR assumes new meaning: beyond the simpler elements of CSR, like philanthropy or ethical advertising practices, Facebook must reckon with its massive role in facilitating the communications of over a fourth of the world.
Ethics, in Facebook's realm, are embodied in its choices about what users can see, hear, or post; we still determine what we think, luckily. But there is no shortage of influence there, either. Facebook faces a unique challenge in deciding what appears in each user's News Feed, whether that news is "fake" or not.
Between conventional accounts for people, groups, and businesses, Facebook’s rules are fairly clear-cut. No spam, nothing illegal, nothing pornographic, and restrictions on terror and hate speech. However, what constitutes “hate speech” and Facebook’s role in preventing it has proved complex and controversial.
During the 2016 presidential election, criticism of Facebook's "Trending Topics" algorithm surfaced. The trending function was not only accused of liberal programming bias, but was also prone to showing users more of the stories they seemed to like. While the latter is a seemingly harmless oversight - by a poor AI only trying to do its job, mind you - such a model invites confirmation bias, which can restrict the free, open thought and discussion that drives a healthy democratic election.
Facebook has since modified the algorithm, which now offers users a choice among a variety of left- and right-wing publications for fairness' sake. Authenticity and relevance of content are Facebook's aim for each user, though the subjective analysis behind such determinations makes this problem difficult to solve at Facebook's global, governmentally diverse scale.
As Facebook inevitably and exponentially grows, so do its unique challenges and ethical considerations. Creating a communications platform for a quarter of the world is no small feat, and maintaining it is no easy task, but Facebook must face the challenge it has created - lie in the bed it made, so to speak. Until this great ethical challenge of our time is solved, most of us will likely continue to use the world's largest social network to circulate memes, watch videos of cute animals, and post about our lunches. Bon appétit.
Max Reichardt is an undergraduate at California Polytechnic State University, San Luis Obispo, and is participating in Grayling's summer internship program.