Radiolab Podcast: Facebook’s Supreme Court

Since its inception, the perennial thorn in Facebook’s side has been content moderation: deciding what you and I are allowed to post on the site and what we’re not. Missteps by Facebook in this area have fueled everything from a genocide in Myanmar to viral disinformation surrounding politics and the coronavirus. This past year, however, conceding its failings, Facebook shifted its approach. It established an independent body of twenty jurors that will make the final call on many of Facebook’s thorniest decisions. This body has been called: Facebook’s Supreme Court.

So today, in collaboration with the New Yorker magazine and the New Yorker Radio Hour, we explore how this body came to be, what power it really has, and how the consequences of its decisions will be nothing short of life or death.

https://www.wnycstudios.org/podcasts/radiolab/articles/facebooks-supreme-court


Netflix Documentary: The Social Dilemma…and related articles

Here are a couple of posts around the theme of Knowledge and Technology. Netflix has recently put out a documentary called “The Social Dilemma” (trailer linked below). It touches upon some commonly discussed themes around the dangers of communications technologies and social media. 

What’s interesting is that although people agree the outcomes are problematic, they disagree about the root causes.

This is just a great line from a New York Times article:

The trouble with the internet, Mr. Williams says, is that it rewards extremes. Say you’re driving down the road and see a car crash. Of course you look. Everyone looks. The internet interprets behavior like this to mean everyone is asking for car crashes, so it tries to supply them. 

from: ‘The Internet Is Broken’: @ev Is Trying to Salvage It


From “The Social Dilemma Fails to Tackle the Real Issues in Tech,” which takes a critical view of the argument put forward in the film:

Focusing instead on how existing inequalities intersect with technology would have opened up space for a different and more productive conversation. These inequalities actually influence the design choices that the film so heavily focuses on—more specifically, who gets to make these choices.

https://slate.com/technology/2020/09/social-dilemma-netflix-technology.html

From “The Risk Makers: Viral hate, election interference, and hacked accounts: inside the tech industry’s decades-long failure to reckon with risk”

The internet’s “condition of harm” and its direct relation to risk is structural. The tech industry — from venture capitalists to engineers to creative visionaries — is known for its strike-it-rich Wild West individualistic ethos, swaggering risk-taking, and persistent homogeneity. Some of this may be a direct result of the industry’s whiteness and maleness. For more than two decades, studies have found that a specific subset of men, in the U.S. mostly white, with higher status and a strong belief in individual efficacy, are prone to accept new technologies with greater alacrity while minimizing their potential threats — a phenomenon researchers have called the “white-male effect,” a form of cognition that protects status. In the words of one study, the findings expose “a host of new practical and moral challenges for reconciling the rational regulation of risk with democratic decision making.”

https://onezero.medium.com/the-risk-makers-720093d41f01


Facebook is out of control. If it were a country it would be North Korea

This is a company that facilitated an attack on a US election by a foreign power, that live-streamed a massacre then broadcast it to millions around the world, and helped incite a genocide.

I’ll say that again. It helped incite a genocide. A United Nations report says the use of Facebook played a “determining role” in inciting hate and violence against Myanmar’s Rohingya, which has seen tens of thousands die and hundreds of thousands flee for their lives.

https://www.theguardian.com/technology/2020/jul/05/facebook-is-out-of-control-if-it-were-a-country-it-would-be-north-korea

OkCupid: “We Experiment On Human Beings!”

The ethics of human experimentation on the internet has been hotly debated, especially in light of the revelation that Facebook ran experiments on its users without their consent. Another site, OkCupid, proudly states that it experiments on humans, and that whether or not you realize it, if you’re on the internet you’re being experimented on all the time.

“We noticed recently that people didn’t like it when Facebook ‘experimented’ with their news feed. Even the FTC is getting involved. But guess what, everybody: if you use the Internet, you’re the subject of hundreds of experiments at any given time, on every site. That’s how websites work.”

http://blog.okcupid.com/index.php/we-experiment-on-human-beings/

Radiolab Podcast: The Trust Engineers

“When we talk online, things can go south fast. But they don’t have to. Today, we meet a group of social engineers who are convinced that tiny changes in wording can make the online world a kinder, gentler place. So long as we agree to be their lab rats.

Ok, yeah, we’re talking about Facebook. Because Facebook, or something like it, is more and more the way we share and like, and gossip and gripe. And because it’s so big, Facebook has created a laboratory of human behavior the likes of which we’ve never seen. We peek into the work of Arturo Bejar and a team of researchers who are tweaking our online experience, bit by bit, to try to make the world a better place. And along the way we can’t help but wonder whether that’s possible, or even a good idea.”

http://www.radiolab.org/story/trust-engineers/

Ethics of social experiments on Facebook

In 2014, there was an uproar over the revelation that Facebook had been conducting social experiments on its users, for example by changing (or manipulating) users’ newsfeeds to see the effect of showing more positive versus more negative stories. The issue raised questions about the ethics of experimenting on subjects without their knowledge or consent.

With over a billion users from all races, social classes, and nations, encompassing every possible cross-section of the human race, Facebook allows for experimentation on a scale never before possible. Social science experiments require large samples to increase the validity of their findings, and Facebook offers just that.

What if asking people for consent somehow changed the validity of the results? What if people chose not to participate, and our ability to use this potentially revolutionary tool (Facebook) were limited as a result?

These are some of the many issues to consider. Below are a few articles about the controversy.

1. Facebook sorry – almost – for secret psychological experiment on users

“Facebook published the results of a 2012 study in the Proceedings of the National Academy of Sciences. Unbeknown to users, Facebook had tampered with the news feeds of nearly 700,000 people, showing them an abnormally low number of either positive or negative posts. The experiment aimed to determine whether the company could alter the emotional state of its users.”

http://www.theguardian.com/technology/2014/oct/02/facebook-sorry-secret-psychological-experiment-users

2. Furor Erupts Over Facebook’s Experiment on Users

“A social-network furor has erupted over news that Facebook Inc., in 2012, conducted a massive psychological experiment on nearly 700,000 unwitting users.”

http://www.wsj.com/articles/furor-erupts-over-facebook-experiment-on-users-1404085840

3. Facebook Experiments Had Few Limits

“Thousands of Facebook Inc. users received an unsettling message two years ago: They were being locked out of the social network because Facebook believed they were robots or using fake names. To get back in, the users had to prove they were real.”

http://www.wsj.com/articles/facebook-experiments-had-few-limits-1404344378

4. On the ethics of Facebook experiments

“Facebook found itself in the hot seat once again this week following the publication of a study that experimentally manipulated the content of more than 600,000 users’ newsfeeds. The study finds that increasing positive content in users’ newsfeeds makes them post more positive content themselves. Likewise, increasing the amount of negative content a user sees increases the number of negative posts.”

http://www.washingtonpost.com/blogs/monkey-cage/wp/2014/07/03/on-the-ethics-of-facebook-experiments/

5. Facebook emotion study breached ethical guidelines, researchers say

“Researchers have roundly condemned Facebook’s experiment in which it manipulated nearly 700,000 users’ news feeds to see whether it would affect their emotions, saying it breaches ethical guidelines for ‘informed consent’.”

http://www.theguardian.com/technology/2014/jun/30/facebook-emotion-study-breached-ethical-guidelines-researchers-say