
Is Facebook a Career Liability?

Published: Mar 20, 2018

 Consulting       Job Search       MBA       Technology       Workplace Issues       

"You should delete Facebook." "Why you shouldn't delete Facebook." "Should you delete Facebook?"

All of those are actual headlines that have flashed across my computer screen at some point in the last few days, and they raise real questions about our digital lives—questions that go far beyond the career-related concerns we typically cover here (in a nutshell: locking down your feed and not posting stuff that will give recruiters an excuse to write you off).

Rather, the details now emerging from the variety of stories swirling around Facebook suggest that many of those privacy settings offered little more than a fig leaf for users—and they paint one of the world's most attractive employers in a very dim light indeed.

Here's a quick summary of just two of the stories—both interlinked—that have rocked the firm of late:

1. The Cambridge Analytica Allegations

The story: According to Facebook's account, it allowed an academic—Aleksandr Kogan—to harvest its user data through a personality test app—a fairly common occurrence, and a major reason why you should think twice before you provide your authentication details to any third party on Facebook. However, where the story gets murkier is that Kogan used the app to collect location data, and to harvest data from each user's contacts—provided that those users had not updated their privacy settings to explicitly prevent such a thing from occurring. It is estimated that the data of up to 50 million Facebook users was harvested in this way, with the vast majority of users being unaware that their profile data was being accessed at all. 

According to Facebook, Kogan then breached its rules by sharing that information with Cambridge Analytica, a data analysis company that specializes in aiding electoral candidates. 

Why it's significant: There are many threads to untangle in this story, but from Facebook's perspective, the biggest one has to be the "data leak," and how the company has responded to it. To date, the company has not admitted that the leak touched anywhere close to 50 million accounts, despite credible reports to the contrary. And it has become entangled in the discussions over Cambridge Analytica's role in the 2016 US election, plus the firm's activities elsewhere in the world, including Kenya and, most likely, the Brexit vote. At best, at this point Facebook looks like a company that had no idea of the power or reach of the network it had created. At worst, it looks complicit in some extremely shady work related to politics and dark money.

Gullible and/or guilty: neither is a good look for a company that has positioned itself at the heart of the hyperconnected world we live in.

2. Lack of transparency over fake news and disinformation

The story: In an era where "fake news" has come to be a byword for "I don't agree with what you're saying," it's easy to forget that it began as a term to describe a very specific issue: stories and memes about political candidates that spread far and wide during the 2016 campaign despite not being true. One of the places where those stories spread most successfully: Facebook—and largely, it would seem, thanks to the work of firms like Cambridge Analytica, which was able to figure out exactly which users were most likely to spread those stories.

To date, Facebook has not made any significant strides towards dealing with the issue, and seems, again, to have been less than forthcoming in admitting its own role in perpetuating it. As a NY Times story today pointed out, that resistance goes all the way to the top of the firm: Alex Stamos, Facebook's chief security officer, "had advocated more disclosure around Russian interference of the platform and some restructuring to better address the issues, but was met with resistance by colleagues," up to and including COO Sheryl Sandberg.

Why it's significant: It's long been common knowledge that Facebook has a problem with bad actors using its site to seed and spread disinformation. Much like the Cambridge Analytica problem—which is both a subset of the fake news problem and a separate issue thanks to the data leak involved—Facebook's biggest difficulty is that it has no good answer to give the public. Again, neither of the options for describing the firm's behavior paints it in a good light: either it knew that these kinds of dark influence campaigns were going on and chose to do nothing about them, or it was played by people who understood the potential of its network better than the company did.

Gullible and/or guilty: neither is a good look for a company that has positioned itself at the heart of the hyperconnected world we live in. 

Would you take a job at Facebook?

At this point, anyone who's working for Facebook, or thinking about joining the firm, has to be having second thoughts about having their name associated with a brand that is becoming more tarnished by the day. While we're not there yet, it's suddenly not unthinkable to imagine a world where Facebook isn't one of the most powerful and influential companies on the planet—a world where elite talent doesn't want to work there because of the stain of allegations of corruption, political meddling, and/or downright incompetence. That alone is a remarkable shift in a very short period of time.

There's obviously a lot of life left in both of these stories, and no telling where the revelations will take us. While there's no way of knowing what shape the company will be in when the dust settles, one thing is certain: it's going to have a much harder time selling itself to the kinds of candidates it has thrived on to get to where it is today—and that will only add to its difficulties as it faces an uncertain future.

***