Facebook’s Thin Line: Is This the New Normal?

By Troy Hudson

When Facebook announced last month that personal data from approximately 50 million users had been scraped by the data firm Cambridge Analytica, the number was so staggeringly large that it was easy to treat it as if it almost didn’t matter. Like the amount of consumer plastic that ends up in the ocean each year (8 million metric tons), the number of people displaced by climate change since 2008 (21.5 million), or the distance to our closest neighboring galaxy (2.5 million light years), a figure like 50 million people is so large as to seem almost inconsequential.

Only it does matter — not just because it may or may not have helped Cambridge Analytica unduly influence the 2016 presidential election in favor of Donald Trump, but because of what it implies about a company that touches the lives of almost everyone on the planet, yet has remained frustratingly opaque regarding the privacy concerns of its users.

As of April 5, Facebook now estimates that closer to 87 million users were affected by the privacy loophole exploited by Cambridge Analytica and others. As Facebook CEO and founder Mark Zuckerberg put it, “I would assume if you had that setting turned on, that someone at some point has accessed your public information in this way.” In other words, given Facebook’s ever-changing privacy controls, odds are very good that your information has been compromised at some point.

As troubling as the breach itself and its potential implications are, Facebook’s handling of it raises still more concerns. Facebook knew about the issue as early as 2011, but because every user had technically agreed at some point (or at least had never explicitly refused) to let their public profiles be searchable in this way, the company did nothing to stop it. It was only when former Cambridge Analytica data scientist Christopher Wylie came forward as a whistleblower that Facebook began to take the problem seriously.

In the wake of the scandal, many are questioning Zuckerberg’s leadership (he has insisted he is not stepping down). Facebook has always existed on the bleeding edge of culture and technology, a step or two ahead of government regulation, and since its inception Zuckerberg has led the company with near-total autonomy. He is Facebook’s CEO, its chairman of the board, and the controller of a majority of its voting shares, meaning he is effectively answerable to no one. Is this an appropriate corporate structure for a company that handles the valuable personal information of almost 2 billion users?

We trust Facebook with an enormous amount of important personal data, yet most of us rarely think about it or take steps to limit access to our public profiles. Facebook’s public stance is that the more data we give the site access to, the better tailored our experience will be. That is undoubtedly true, but it does not follow that the functionality is worth the privacy we surrender, or the risk that comes with it. Facebook has hardly proven to be a trustworthy steward of our data, so why do we keep placing our trust in it?

So much of our economy relies on trust — from credit-card companies, to car manufacturers like Tesla, to lodging services like Airbnb and VRBO — and trust in these institutions has been eroding rapidly following one scandal after another in 2018. Have we put too much trust in corporations? And if so, can we right the ship before another major catastrophe is upon us? Or are we content to maintain the current course and call this the new normal?
