Investors and influencers of Facebook and Apple have openly challenged the tech giants, beseeching them to acknowledge and address the damage being done to children, adults, and even the very social fabric of society. The critics charge that these companies ignore, and even intentionally exploit, the addictive nature of Facebook and other social media platforms, those platforms' openness to tampering, and the addictive nature of the iPhone and other electronic devices.
Comparisons have been drawn to big tobacco and to the Tylenol tampering incident, both in terms of malfeasance and as models for how to take responsibility for damaging events and turn them around.
As former Vice President for User Growth at Facebook (he was at Facebook from 2005 to 2011), Chamath Palihapitiya, put it back in November, “I think we have created tools that are ripping apart the social fabric of how society works… The short-term, dopamine-driven feedback loops we’ve created are destroying how society works.” (You can view Palihapitiya’s full talk on YouTube here.)
Last month (January 2018), Roger McNamee, a major investor in Facebook and, by many accounts, one of Mark Zuckerberg’s early mentors (he is credited with, among other things, convincing Zuckerberg not to sell the early version of Facebook to Yahoo, with urging Zuckerberg to hire Sheryl Sandberg, and, more generally, with mentoring Zuckerberg for several years), pointed a finger squarely at Facebook in both an op-ed in the Washington Post and remarks quoted in the Guardian, saying that Facebook needs to acknowledge that it has “some responsibility for what others do on its platform and that it is prepared to make fundamental changes to limit future harm.”
Riffing on the tainted Tylenol example, McNamee said in his Washington Post op-ed that “I recommend that Facebook follow the example of Johnson & Johnson during the Tylenol poisonings in 1982. Johnson & Johnson did not cause the tampering. It was not technically required to take responsibility, but it knew it was the right thing to do. The company took immediate and aggressive action to protect its customers. It took every bottle of Tylenol off every retail shelf and redesigned the packaging to make it tamper-proof. There was a substantial economic cost in the short run, but the company built trust with customers that eventually offset it.”
McNamee then outlined the steps he believes Facebook should take both to correct the issues and to regain user confidence in the platform.
Then, he adds, “The same tools that make Facebook so addictive for users and so effective for advertisers are dangerous in the hands of bad actors… The company needs to change the priorities of its algorithms and retool its business model. It needs to act like Johnson & Johnson.”
(Back in November we told you about another Facebook founder, Sean Parker, acknowledging that Facebook was intentionally addicting people. “And that means that we need to sort of give you a little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever,” said Parker, adding that “The thought process that went into building these applications, Facebook being the first of them, … was all about: ‘How do we consume as much of your time and conscious attention as possible?’ “)
Going on, McNamee then says that “Facebook also owes its users a personal apology. Thanks to Facebook’s negligence, 126 million Americans were exposed to Russian manipulation, and most of them do not realize it. To compensate, Facebook must notify every user touched by Russian election interference with a personal message explaining how the platform was manipulated and how that manipulation harmed users and the country. They should include copies of every post, group, event and ad each user received. Facebook is the only entity able to break through to users trapped in its filter bubbles.”
We think this last point is worth repeating: Facebook is the only entity that can tell its users, “Hey, we showed you something that may have influenced you yet was patently, intentionally false. Here it is…”
And McNamee is almost certainly correct that there are millions of Facebook users who have no idea that they were shown false items, designed to influence them.
Last week, in an interview with CNBC, McNamee, along with NYU professor Scott Galloway, suggested that Instagram should be split off from Facebook, and that Google and Amazon should likewise be broken into smaller, discrete companies.
“These companies have run unchecked for a very long time. The unchecked power produces really bad outcomes,” McNamee told CNBC.
Meanwhile, but relatedly, two of Apple’s largest shareholders, Jana Partners and the California State Teachers’ Retirement System (who knew?) exhorted Apple to recognize that it is “no secret that social media sites and applications for which the iPhone and iPad are a primary gateway are usually designed to be as addictive and time-consuming as possible,” and “that there is also a growing societal unease about whether at least some people are getting too much of a good thing when it comes to technology…”
“In fact,” the open letter to Apple points out, “even the original designers of the iPhone user interface and Apple’s current chief design officer have publicly worried about the iPhone’s potential for overuse, and there is no good reason why you should not address this issue proactively.”
So, with increasing recognition of the addictive nature of pocket devices and social media, and with the recognition that the addictiveness is intentional, and therefore unlikely to get better (and quite likely to get worse) in the near term, what are parents and others to do to address these concerns?
This seems a near-perfect time to invoke that age-old wisdom in the Serenity Prayer which is, of course, oft-invoked in addiction programs:
God grant me the serenity
to accept the things I cannot change;
courage to change the things I can;
and wisdom to know the difference.
Accept, as has Chamath Palihapitiya, the things you cannot change, and take the things you can change into your own hands. Says Palihapitiya, “I can’t control them (Facebook and other social media platforms). I can control my decision, which is that I don’t use that sh*t. I can control my kids’ decisions, which is that they’re not allowed to use that sh*t.”