The Social-media Dilemma: Manipulation and Alteration of Society by Big Tech
Written by C. Mitchell Shaw
Earlier this month, Netflix released a documentary titled The Social Dilemma. Featuring former insiders from Big Tech and social-media platforms, the documentary is an exposé of the ways a “handful of tech designers” exert “control over the way billions of us think, act, and live our lives,” according to the official website for the film.
This writer obtained the documentary from an alternative source, since — given Netflix’s propensity for sexualizing children with projects such as Cuties — I do not have (and would never have) a Netflix account.
The film begins by listing the credentials and bona fides of the people who — in their interviews — lay out the evidence that Big Tech companies, including social-media platforms used by billions around the globe, control, manipulate, polarize, distract, and divide the users they promise to connect — all while monetizing those users. The panel includes those who have been employed by Facebook, Instagram, Google, YouTube, Apple, Twitter, Palm, Mozilla, and Pinterest. Some of these former Big Tech employees worked near the top of the organizational charts at the companies where they were employed: a senior vice president of engineering at Twitter, a president of Pinterest (who was also a director of monetization at Facebook), a head of development and head of consumer product at Twitter, and a co-inventor of Google Drive, Gmail Chat, Facebook Pages, and the ubiquitous Facebook Like button.
Given the caliber of the people speaking, what they have to say carries quite a bit of weight. And what they have to say is damning indeed to the oft-repeated claims of Big Tech that the platforms used by billions of Earthlings are neutral sources of information and open platforms for free speech. In fact, the truth is in direct opposition to those claims.
One interviewee — Tim Kendall, whose résumé includes being a former executive at Facebook in charge of monetization, former president at Pinterest, and CEO at Moment — made the point that while tech has made many lives better and richer — by reuniting lost family members, finding organ donors, and making “meaningful, systemic changes” — the fact remains that many of the people who helped develop the platforms that brought about those positive things “were naive about the flip side of that coin.”
This writer has long said that tech is a double-edged sword: While one side makes users’ lives richer and fuller — by innovative approaches that allow greater access to information and communication and doing in minutes what would have previously taken years — it has also allowed greater manipulation, isolation, and surveillance. Kendall’s “flip side of the coin” is becoming more and more problematic for society and individuals. Tristan Harris, former design ethicist at Google, is seen going over his remarks for a presentation and saying, “If you ask people, ‘What’s wrong in the tech industry right now?’ there’s a cacophony of grievances and scandals.” Those include harvesting of users’ data, tech addiction, fake news, polarization, and election hacking. But after listing those issues, Harris asks, “But is there something that is beneath all these problems that’s causing all these things to happen at once?”
Harris continues, honing the purpose of his intended remarks for the presentation: “I want people to see there’s a problem happening in the tech industry, and it doesn’t have a name, and it has to do with one source.”
Harris said that after working at Google for a while, he realized — as someone who himself was addicted to e-mail — that no one at Google was working to make it less addictive. So, he put together a presentation, a “call to arms.” He describes it as basically saying, “Never before in history have 50 designers — 20- to 35-year-old white guys in California — made decisions that would have an impact on two billion people.” He goes on to say, “Two billion people will have thoughts that they didn’t intend to have because a designer at Google said, ‘This is how notifications work on that screen that you wake up to in the morning.’” He sent the presentation to 15 to 20 of his closest colleagues at Google and was “nervous” about it because he “wasn’t sure how it was going to land.”
The next day, he was barraged with positive comments. People said they agreed, that they could see how it affects their kids, and that something should be done about it. He learned that Larry Page, one of the co-founders of Google, had been made aware of the presentation in three separate meetings that day. He saw it as a “cultural moment” in which Google was going to have to take this seriously. “And then, nothing.”
The reason for this is apparent. Nothing happened to fix the problem because the problem is by design. Google doesn’t want e-mail to be less addictive because Google wants e-mail to be more addictive. Because if it were less addictive, there would be less data for Google to harvest, less ad revenue, and less opportunity to manipulate people’s choices based on what Google knows about them from that data harvesting. Google’s entire business model is based on more use, not less.
One segment of the documentary really homes in on this point. Building on the principle that states, “If you are not paying for the product, then you are the product,” several of the interviewees make points such as (1) Google, Facebook, Twitter, Instagram, et al. make billions of dollars offering free services and platforms, (2) the user is not the customer paying to make the product available; advertisers are, (3) Big Tech sells something to those advertisers for those billions of dollars in profits, and (4) your data — much of it very personal — is what is sold to those advertisers.
Enter the addictive nature of social media, e-mail, etc. that is deliberately baked into those services. Harris states that Big Tech companies are “competing for your attention.” He goes on to say, “Their business model is to keep people engaged on the screen.” Kendall adds that the goal is to “figure out how to get as much of this person’s attention as we possibly can.” He phrases it as, “How much time can we get you to spend? How much of your life can we get you to give to us?”
Jaron Lanier, author of Ten Arguments for Deleting Your Social Media Accounts Right Now, says the idea that it is merely our attention that is the product is “a little too simplistic.” In actuality, “It’s the gradual, slight, imperceptible change in your own behavior that is the product.”
And that is the real point.
After all, if advertisers can merely show you a trinket, left-handed widget, or service and hope you will find it interesting enough to purchase, the pool is too small. If, though, they can — based on a profile they have built on you, knowing your likes, dislikes, habits, and morals — shift the way you think, they can sell you something you may never have considered otherwise. So, that is precisely what they do. They create cleverly designed ad campaigns to manipulate your desires and convince you to change your behavior.
That would be bad enough if the end result were merely mounting debt to cover clothes, furniture, electronics, and other things. But the implications are staggering. Consider elections, legislation, societal norms. By allowing data-mining and data analysis of your most personal thoughts and ideas, you have handed the keys to the kingdom over to people who have their own agenda. And that agenda is much larger than billions of dollars in profit.
The Social Dilemma makes the point — and drives it home — that modern digital advertising is based on accurate predictions about users’ responses to ad campaigns. But is it really a prediction if the data to which a person is exposed is filtered to alter their response? By way of analogy, if this writer predicts that you will be angry and then insults your mother, is it a wonder that your response to that insult is to get angry? Causing a particular behavior is not the same as predicting it.
By manipulating users’ perception — via filtered feeds, timelines, comments, and more — Big Tech is altering your thoughts, ideas, and choices. Big Tech is altering you. And while individual developers create the tech that accomplishes that manipulation, it is the tech itself — powerful algorithms free from human error — that actually implements the manipulation to change your mind and your behavior.
Social-media interaction is less about personal communication between two or more people and more about psychological hacking by an impersonal computer program. Users are controlled by the devices they think they own. And those devices are in turn controlled by developers and the companies that employ them.
An interesting quote that serves as a transition from one segment to another in the documentary says, “There are only two industries that call their customers ‘users’: illegal drugs and software.” It is worth noting that one of those industries openly acknowledges that the way to keep “users” coming back for more is to make them addicted. The other knows it is true, but will not openly acknowledge that truth.
Kendall says that he realized he was addicted to the tech he had helped build. He says that one moment really stands out to him: When he was the president of Pinterest, he came home one day and said, “I couldn’t get off my phone once I got home — despite having two young kids who needed my love and attention.” He found “classic irony” in the fact that he was “falling prey to” something he had created for others. “Some of those moments, I couldn’t help myself,” he said, adding, “It’s interesting that knowing what was going on behind the curtain, I still wasn’t able to control my usage.”
That addictive problem is far worse in younger people. So are its effects. Jonathan Haidt, Ph.D., points to the correlation between social-media usage and emotional problems in preteens and early teens. He shows that the increased rate of self-harm among those age groups — including suicides and attempted suicides — follows the same trend as the growth in the numbers of those age groups who have social-media accounts. The basic idea is that seeking approval on the scale afforded by social media is a losing proposition that leaves the vulnerable — do you remember being that age? — feeling even more exposed and empty. Add the extent of Internet bullying to the equation and the prospect is bleak indeed.
Another aspect is the polarizing effect of social-media timelines and news feeds. Since what is filtered for you is based on Big Tech’s algorithms taking what they already know about you into account before showing you anything, users tend to live in a bubble. Liberals see more and more liberal posts while conservatives see more and more conservative posts. The result is that person A cannot imagine why person B — who must see the same information — cannot see why his position is wrong. But that’s just the thing: Person B does not see the same information. His feed is tailored to keep him engaged and slightly shifting in his views.
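The feedback loop just described can be sketched in a few lines of code. This is purely illustrative: no platform publishes its ranking code, and the function names and the one-dimensional “lean” score below are invented for the example. Under those assumptions, it shows how ranking posts by similarity to past engagement, then updating the profile from each click, narrows a feed a little more with every cycle.

```python
# Illustrative sketch only (not any platform's real code): a feed ranker
# that scores posts purely by alignment with what the user already clicked.

def rank_feed(posts, user_profile):
    """Order posts by closeness to the user's inferred lean.

    posts: list of (post_id, lean), where lean runs from -1.0 to +1.0
    user_profile: the user's current average lean, inferred from past clicks
    """
    # Smaller distance from the profile => shown earlier in the feed.
    return sorted(posts, key=lambda post: abs(post[1] - user_profile))

def update_profile(user_profile, clicked_lean, rate=0.3):
    # Each click nudges the inferred profile toward the clicked content.
    return user_profile + rate * (clicked_lean - user_profile)

# A user with a slight lean (+0.2) sees the most-aligned post first...
posts = [("a", -0.8), ("b", 0.1), ("c", 0.9)]
feed = rank_feed(posts, user_profile=0.2)

# ...clicks it, and the profile shifts toward that post, so the next
# ranking is filtered even more narrowly around the same viewpoint.
profile = update_profile(0.2, clicked_lean=feed[0][1])
```

Run repeatedly, the loop never surfaces the far-away post first, which is the bubble the article describes: each user's feed converges on what that user already believes.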
As The Social Dilemma shows, this is a valuable tool for destabilizing nations. By leading people down “rabbit holes” of ideas that are similar enough to what those people already believe — and just different enough — polarization becomes radicalization.
The solution is so simple that nearly everyone knows it and misses it at the same time. Refuse to cooperate with an evil system designed to manipulate you, isolate you, and polarize you while promising to connect you. If that means deleting your social-media accounts, then so be it.
For everything wrong with Netflix, they got this one right.
C. Mitchell Shaw is a freelance writer and public speaker who addresses a range of topics related to liberty and the U.S. Constitution. A strong privacy advocate, he was a privacy nerd before it was cool. He hosts and produces the popular Enemy of the [Surveillance] State podcast.
Courtesy of The New American