Recent much-hyped studies are largely inconclusive, but they do prove one thing: Facebook’s algorithm pushes hate.
Background: Facebook spent many years thwarting efforts by social scientists to obtain data required to analyze the impact the social media giant has on society. So it was a bit of a shock to people who care about this stuff to learn last week that the newly named Meta had cooperated with academic researchers on not one but four rather extensive studies.
All four studies looked into Facebook’s influence on its users’ political attitudes and moods, seeking to address a long-held concern that the massive social network feeds the polarization that sees Americans more and more at one another’s throats.
Three of the four studies were published in the journal Science, which headlined its cover “Wired to Split.” (The fourth study was published in the journal Nature.) Science's accompanying illustration depicted “blue” and “red” caricatures staring at screens while facing away from one another. Each group was perched on its own bubble, which was formed by the top of Meta’s infinity-ribbon logo.
Obviously, the headline and the illustration suggested that Science’s three studies had reached a damning conclusion regarding Facebook’s impact on the electorate. While that is not the case, exactly, and the pieces in the journal were quite balanced, it is also certainly not the case that any of these studies let Meta off the hook.
However, in a press release accompanying the publication of the studies, Meta’s president of global affairs, Nick Clegg, wrote that the studies add to a “growing body of research showing there is little evidence that key features of Meta’s platforms alone cause harmful ‘affective’ polarization.”
Journalists writing in a variety of outlets, from The Wall Street Journal to The Atlantic, took strenuous issue with Clegg’s characterization, while others, including the libertarian monthly Reason, attacked Science. (Pun intended.)
Rather than litigate this debate, I would like to focus on one finding that has not been getting the attention I believe it deserves.
One of the newly published papers deals with an experiment in which Facebook’s algorithm, which uses mysterious techno-magic to determine what users find in their feeds, was replaced with an unfiltered feed that simply showed the most recent posts first—in "reverse chronological" order.
This resulted in the spread of more untrustworthy content than usual, because the algorithm does filter misinformation. However, eliminating the algorithm “cut hateful and intolerant content almost in half.”
That was one of the biggest impacts reported from any of the studies. I believe this study proves that Facebook's vaunted algorithm pushes that kind of polarizing content.
Outrage and Addiction
It has been known for several years that Facebook intentionally feeds its users hateful and intolerant messages. This is not an accident—in Silicon Valley terms, “it’s not a bug—it’s a feature.”
Social media algorithms have been intentionally rigged to favor the basest human emotions: fear, outrage, anger, etc. Tristan Harris, former Google ethics guru, calls this “a race to the bottom of the brainstem.”
The social media platforms do this for one reason: By accessing human beings’ basest emotions, they can turn us into addicts. They do this because online, the coin of the realm is attention, and in the attention economy, every online business is in constant competition for our eyeballs.
The code designed to trigger fear, anger, shame, even depression, is baked into the technology.
The question of how to attract and keep us online and engaged is being answered at Tristan Harris’s alma mater, the Persuasive Technology Lab at Stanford University, where a professor named B.J. Fogg, along with the author Nir Eyal, contributes to the field of behavioral engineering. Eyal wrote the book on the topic, unapologetically titled Hooked.
The blurb on that book's dust jacket cheerfully introduces this field of “psychological marketing.”
Ever wonder why some gadgets and apps are so addictive? Do you sometimes feel you are not fully in control of using them? The answer is often found in the design. ... Strategic product design nurtures customer engagement and takes control of user behavior. Companies such as Twitter, Instagram, Pinterest and others are getting quite good at this.
It’s difficult to read that now and take the word “addictive” as a playful metaphor. And please note that the goal is “engagement”—Kara Swisher, the best-loved and most-feared journalist in Silicon Valley, notes that engagement “used to be such a nice word. Now it’s the nicotine of tech.”
Eyal and his army of acolytes (he has taught at Stanford’s Graduate School of Business and School of Design, and has a gazillion views on YouTube) call what they do “designing consumer behavior.”
This kind of behavioral-modification-as-business might sound scary outside Silicon Valley, home of the cutthroat libertarian ethos captured in Facebook’s longtime motto, “move fast and break things,” and the place where “disruption” is an ultimate good. In the Valley, Eyal’s prescriptions became dogma.
Matt Mullenweg, founder of WordPress, gave Eyal a one-sentence review: “Read Hooked or the company that replaces you will.”
Virtually everyone who designs the experiences that occupy the average American for countless hours a day has read Hooked or been influenced by its ideas.
From Hooked to Zucked
Roger McNamee, the esteemed Silicon Valley VC who was Mark Zuckerberg’s first mentor—he introduced Zuck to his longtime COO Sheryl Sandberg—says Facebook’s intentional move to the dark side inspired him to write his book: Zucked: Waking Up to the Facebook Catastrophe.
“Facebook realized that appealing to outrage and fear was much more successful than appealing to happiness,” McNamee says. “Because one person’s joy is another person’s jealousy. Whereas if you’re afraid or outraged, you share stuff in order to make other people also afraid or outraged, because that just makes you feel better.” Because…misery loves company, as grandpa used to say.
According to McNamee, Facebook learned this nefarious trick from Cambridge Analytica, the disgraced company co-owned by Trump funder Robert Mercer, where Steve Bannon served as vice president and board member. Christopher Wylie, Cambridge Analytica's chief data analyst-turned-whistleblower, gave investigators documents proving that the company was mainly interested in discovering what he would later call Facebook users’ “demons.”
In addition to stealing and revealing information about whether users were introverts or extroverts, Cambridge Analytica also secured information about what it called “sensational interests.” The Wylie documents show that the company harvested data revealing people’s interest in the following: “…militarism, guns and shooting, martial arts, crossbows, knives, violent occultism, drugs, black magic, paganism ... credulousness, the paranormal, the environment, flying saucers…”
McNamee says he started noticing “something really wrong” two years before the Cambridge Analytica scandal, between January and October 2016, during the Democratic primary and Brexit debate, “where it was clear that Facebook had an influence that was really negative because it gave an advantage to inflammatory and hostile messages.”
“Civility is a mask, and they want to strip you of that, and get to your raw underlying emotions,” McNamee says. And they do so “to find out what your real biases are, because that's where all the behavioral prediction value is.”
This, as we have seen, has resulted in the amplification of the most hostile voices in society.
McNamee, who made a lot of money with his early investment in Facebook, and has made tons more all over the Valley, is now calling for radical government intervention.
“I believe we can get rid of the harm without having to eliminate what we'd like about the products,” he says. “They're going to be a lot less profitable, but tough noogies. I mean, corporations are not allowed to destroy civilization just because it's more profitable than building civilization.”