
Leaked documents show Facebook puts profit before the public good


November 8, 2021 – A trove of documents leaked from within Facebook shows that the social media giant’s internal research uncovered a myriad of problems on the platform related to public health and other issues, but that the company has done virtually nothing about them.

The files were leaked by a whistleblower, former Facebook employee Frances Haugen, who shared tens of thousands of documents with the Securities and Exchange Commission, Congress, and a consortium of news organizations. She has since testified before the Senate Commerce Subcommittee on Consumer Protection and before European lawmakers.

Reinforcing ‘anti-vaxxers’ and other misinformation

President Joe Biden caused a stir in July when he said that, thanks to rampant misinformation about the COVID-19 vaccine, social media platforms like Facebook “kill people.” “I mean they really, look, the only pandemic we have is among the unvaccinated,” he said. “And they’re killing people.”

While he was forced to walk back the statement, the leaked documents indicate that he was not necessarily wrong.

According to the documents, in March – a time when the White House was preparing a $1.5 billion campaign against vaccine misinformation – some Facebook employees thought they had found a way to counter those lies on the platform while prioritizing legitimate sources such as the World Health Organization.

“Given these results, I assume we hope to start as soon as possible,” one employee wrote.

But Facebook ignored some of the proposals, and executives dragged their heels on implementing others. Another proposal, aimed at anti-vaccine comments, was also ignored.

“Why would you not remove comments? Because engagement is the only thing that matters,” Imran Ahmed, CEO of the Center for Countering Digital Hate, an internet watchdog group, told The Associated Press. “It attracts attention, and attention equals eyeballs, and eyeballs equal advertising revenue.”

Facebook’s algorithms – which determine the content you see in your feed – also help spread misinformation.

“It’s not like the anti-vax contingent was created by Facebook,” said Dean Schillinger, MD, director of the Health Communication Research Program at the University of California, San Francisco. “The algorithm said, ‘OK, let’s find certain people with certain political beliefs and let’s link them to anti-vaxxers,’” which reinforces the misinformation. “It’s definitely something new.”

As if that were not enough, it appears that Facebook may have misled Congress about the company’s understanding of how COVID misinformation spread on the platform. In July, two top House Democrats wrote to Facebook CEO Mark Zuckerberg asking for details on how many users had seen COVID misinformation and how much money the company made from those posts.

“At this time, we have nothing to share in answering the questions you raised, other than what Mark said in public,” the company said in response.

But the leaked papers show that by that time, Facebook’s researchers had conducted several studies on COVID misinformation and produced major internal reports. Employees were able to calculate the number of views a widely shared piece of misinformation received. But the company did not disclose this to Congress.

Keeping this knowledge secret was a huge missed opportunity to ensure that science-backed information reaches the general public, says Sherry Pagoto, PhD, director of the UConn Center for mHealth and Social Media.

“We know how to spread misinformation, so how can we think more about spreading good information?” she says. “They have all kinds of data about the characteristics of messages that go far. How can we use what they know in the field of health communication to come up with a plan?”

In an email, a Meta spokesperson (amid the uproar, Facebook announced a new corporate name) said: “There is no silver bullet in fighting misinformation, so we’re taking a comprehensive approach, which includes removing more than 20 million pieces of content that violate our COVID misinformation policies, permanently banning thousands of repeat offenders from our services, connecting more than 2 billion people with reliable information about COVID-19 and vaccines, and working with independent fact-checkers.”

Ignoring Instagram’s effect on vulnerable teens’ mental health

Combating misinformation is not the only way Facebook and its subsidiaries could have acted to protect public health. The company was also aware of its negative impact on young people’s mental health, but has publicly denied it.

Instagram, which is owned by Facebook, is extremely popular among teenage girls. But the photo-sharing application repeatedly exposes them to images of idealized bodies and faces, which can lead to negative self-comparisons and pressure to look perfect.

Pro-eating disorder content is also widely available on the platform. Social science and mental health researchers have been looking at the effect of social media on mental health for years, especially for adolescents. Studies have found links between Instagram use and depression, anxiety, low self-esteem, and eating disorders.

The Facebook Papers revealed what Instagram researchers called a “teen mental health deep dive.” And there were serious problems: The internal research showed that the platform made body image issues worse for 1 in 3 teenage girls, and 14% of teenage boys said Instagram made them feel worse about themselves. The data linked use of the app with anxiety and depression. And among teens who reported suicidal thoughts, 6% of American users and 13% of British users traced that impulse directly to Instagram.

Jean Twenge, PhD, author of iGen: Why Today’s Super-Connected Kids Are Growing Up Less Rebellious, More Tolerant, Less Happy – and Completely Unprepared for Adulthood, has been studying the effects of social media on young people for almost a decade.

“I was not surprised that Facebook found that social media can have significant links to depression and self-harm. The academic research has shown this for years,” she says. “I was amazed at how in-depth their research was on exactly how teenage girls who use Instagram feel. Their research really built on what we already knew.”

As with Facebook’s findings on misinformation, the company has publicly downplayed Instagram’s negative impact – including in comments to Congress – and has done little to adjust teen users’ experience on the app.

“I think given what they knew about Instagram and mental health, it would definitely have been the right thing to do to make changes to the platform,” says Twenge.

In the email, the Meta spokesperson said: “Our research does not lead to the conclusion that Instagram is inherently bad for teenagers. While some teens have told us that Instagram makes them feel worse when they struggle with issues such as loneliness, anxiety, and sadness, more teens have told us that Instagram makes them feel better when they experience the same problems.”

A responsibility towards the public interest?

While Facebook users may be surprised to find out how the company regularly puts profits above its customers’ health, those who study public health are anything but.

“This is not a problem in any way unique to social media platforms,” says Schillinger. “Corporate entities regularly pursue policies that encourage the public to participate in activities, to buy or consume products, or to adopt behaviors that are unhealthy for themselves, for others, or for the planet. … Do you think Facebook behaves differently than any other company in that space?”

This is where the potential for regulation comes in, said Haugen, the whistleblower. She has called for it, as have many lawmakers in the wake of her revelations.

“Large organizations that have influence and access to many people should be responsible for the well-being of that population, just as a matter of principle,” says sociologist Damon Centola, PhD, author of Change: How to Make Big Things Happen.

He compares the explosion of social media to the history of television, which has been regulated in many ways for decades.

“I think it provides us with a parallel to social media and the ability of the medium to influence the population,” he says. “It seems to me that organizations cannot get away with saying they will not consider public welfare.”

The so-called Facebook Papers are all the more damning, some experts believe, because of the company’s defense: that its research was meant only for product development, and so proves nothing.

That argument ignores all the peer-reviewed papers, published in respected journals, that reinforce the findings of the company’s internal research. Together, the two bodies of research leave little room for doubt, and little doubt that something needs to change.

“Think of it like environmental pollution,” Centola says. “Companies can know they are polluting, but they can also say that it did not really matter, it did no harm. But then you get the documentation that says no, it has big consequences. That’s when it really matters.”

Social media as a force for good

But there is one potential benefit of the Facebook Papers, according to the experts: It is clear that the company knows a lot about how to distribute messages effectively. With enough pressure, Facebook and other social media platforms can now start using those insights in a positive direction.

“Facebook should develop a strong partnership with trusted entities to develop content that is both true and promotes public health, while also being engaging and algorithmically driven,” says Schillinger. “If we can use the platform and the reach and the [artificial intelligence] Facebook has for health-promoting content, the sky is the limit.”

And such efforts may be on the horizon.

“We are focused on building new features to help people who are struggling with negative social comparison or negative body image,” the Meta spokesperson wrote in the email. “We are also continuing to seek opportunities to work with more partners to publish independent studies in this area, and we are working on how we can give external researchers more access to our data in a way that respects people’s privacy.”

This is not to say that Facebook will voluntarily put public health ahead of its need to make money without regulations forcing it to do so.

“I do think Facebook is interested in making their platform better for users. But their first interest will always be that as many users as possible spend as much time on the platform as possible,” says Twenge. “Those two desires are often at cross purposes.”


