Behind The Screen
Startup

Why Social Media Amplifies Extreme Views – And How To Stop It

May 4, 2023

Peacebuilder and Ashoka Fellow Helena Puig Larrauri co-founded Build Up to transform conflict in the digital age, in places from the U.S. to Iraq. With the exponential growth of viral polarizing content on social media, a key systemic question emerged for her: What if we made platforms pay for the harms they produce? What if we taxed polarization the way we tax carbon? A conversation about the root causes of online polarization, and why platforms should be held responsible for the negative externalities they cause.


Konstanze Frischen: Helena, does technology help or harm democracy?

Helena Puig Larrauri: It depends. There is great potential for digital technologies to include more people in peace processes and democratic processes. We work on conflict transformation in many regions across the globe, and technology can really help include more people. In Yemen, for instance, it can be very difficult to incorporate women’s viewpoints into the peace process. So we worked with the UN to use WhatsApp, a very simple technology, to reach out to women and have their voices heard, avoiding security and logistical challenges. That’s one example of the potential. On the flip side, digital technologies bring about immense challenges – from surveillance to manipulation. And here, our work is to understand how digital technologies are impacting conflict escalation, and what can be done to mitigate that.

Frischen: You have staff working in countries like Yemen, Kenya, Germany and the US. How does it show up when digital media escalates conflict?

Puig Larrauri: Here is an example: We worked with partners in northeast Iraq, analyzing how conversations happen on Facebook, and it quickly became clear that what people said, and how they positioned themselves, had to do with how they spoke about their sectarian identity, whether they said they were Arab or Kurdish. But at a deeper level, users started to associate a person’s opinion with their identity, which means that in the end, what matters is not so much what is being said, but who is saying it: your own people, or other people. As a result, the conversations on Facebook were extremely polarized, and not in a healthy way, but by identity. We all must be able to disagree on issues in a democratic process, or in a peace process. But when identities or groups start opposing each other, that’s what we call affective polarization. It means that no matter what you actually say, I’m going to disagree with you because of the group you belong to, or, on the flip side, no matter what you say, I’m going to agree with you because of the group you belong to. When a debate reaches that state, conflict is very likely to be destructive, and to escalate to violence.

Frischen: Are you saying social media makes your work harder because it drives affective polarization?

Puig Larrauri: Yes, it certainly feels like the odds are stacked against our work. Offline, there may be space, but online it often feels like there’s no way we can start a peaceful conversation. I remember a conversation with Caleb, the leader of our work in Africa. During the recent election cycle in Kenya, he said to me: “When I walk the streets, I feel like this is going to be a peaceful election. But when I read social media, it’s a war zone.” I remember this because even for us, professionals in this space, it is unsettling.

Frischen: The standard way for platforms to react to hate speech is content moderation — detecting it, labeling it, depending on the jurisdiction, perhaps removing it. You say that’s not enough. Why?

Puig Larrauri: Content moderation helps in very specific situations – it helps with hate speech, which is in many ways the tip of the iceberg. But affective polarization is often expressed in other ways, for example through fear. Fear speech is not the same as hate speech: it can’t be so easily identified, and it probably won’t violate a platform’s terms of service. Yet we know that fear speech can be used to incite violence, and it still wouldn’t fall foul of the content moderation guidelines of platforms. That’s just one example; the point is that content moderation will only ever catch a small part of the content that is amplifying divisions. Maria Ressa, the Filipino journalist and Nobel Peace Prize winner, put it well recently. She said something along the lines of: the issue with content moderation is that it’s like fetching a cup of water from a polluted river, cleaning the water, and then pouring it back into the river. So I say we need to build a water filtration plant.

Frischen: Let’s talk about that – the root cause. What does the underlying architecture of social media platforms have to do with the proliferation of polarization?

Puig Larrauri: There are actually two reasons why polarization thrives on social media. One is that it invites people to manipulate others and to deploy harassment en masse. Troll armies, Cambridge Analytica – we’ve all heard these stories, so let’s put that aside for a moment. The other aspect, which I think deserves a lot more attention, is the way social media algorithms are built: they look to serve you content that is engaging. And we know that affectively polarizing content, which positions groups against each other, is very emotive, and very engaging. As a result, the algorithms serve it up more. So social media platforms provide incentives to produce polarizing content, because it will be more engaging, which incentivizes people to produce more content like that, which makes the platform more engaging, and so on. It’s a vicious circle.
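The feedback loop described here can be sketched in a few lines of illustrative Python. Everything below is invented for illustration – the posts, the scores, and the toy engagement model are not drawn from any real platform – but it shows the mechanism: if predicted engagement rewards divisiveness, ranking by engagement systematically surfaces the most divisive content.

```python
# Hypothetical posts with made-up "quality" and "divisiveness" scores.
posts = [
    {"id": "calm-explainer", "quality": 0.9, "divisiveness": 0.1},
    {"id": "balanced-debate", "quality": 0.7, "divisiveness": 0.4},
    {"id": "us-vs-them-rant", "quality": 0.3, "divisiveness": 0.9},
]

def predicted_engagement(post):
    # Toy model: emotive, divisive content drives more clicks and replies,
    # so divisiveness is weighted more heavily than quality.
    return 0.4 * post["quality"] + 0.6 * post["divisiveness"]

# An engagement-optimized feed simply ranks by that prediction.
feed = sorted(posts, key=predicted_engagement, reverse=True)
print([p["id"] for p in feed])
# → ['us-vs-them-rant', 'balanced-debate', 'calm-explainer']
```

The lowest-quality post ranks first purely because it is the most divisive – and since creators see what gets amplified, the ranking itself becomes an incentive to produce more of it.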

Frischen: So the spread of divisive content is almost a side effect of this business model that makes money off engaging content.

Puig Larrauri: Yes, that’s the way social media platforms are designed at the moment: to engage people with content, any kind of content – we don’t care what it is, unless it’s hate speech or something else that violates a narrow policy, in which case we will take it down – but in general, what we want is more engagement on anything. And that is built into their business model. More engagement allows them to sell more ads and collect more data. They want people to spend more time on the platform. So engagement is the key metric. It’s not the only metric, but it’s the key metric the algorithms are optimizing for.

Frischen: What framework could force social media companies to change this model?

Puig Larrauri: Great question – but to understand what I’m about to propose, the main thing to grasp is that social media is changing the way we understand ourselves and other groups. It is creating divisions in society, and amplifying existing political divisions. That’s the difference between focusing on hate speech and focusing on this idea of polarization. Hate speech and harassment are about the individual experience of being on social media, which is very important. But when we think about polarization, we’re talking about the impact social media is having on society as a whole, regardless of whether I’m being personally harassed. I am still affected by the fact that I’m living in a more polarized society. It is a negative externality for society: something that affects all of us, regardless of whether we are individually targeted.

Frischen: Negative externality is an economics term that – I’m simplifying – describes a cost generated in a production or consumption process that is not captured by market mechanisms, and that harms someone else.

Puig Larrauri: Yes, and the key here is that this cost is not included in the production costs. Take air pollution. Traditionally, in industrial capitalism, people produced things like cars and machines, and in the process also produced environmental pollution. At first, nobody had to pay for the pollution. It was as if that cost didn’t exist, even though it was a real cost to society – it just wasn’t being priced by the market. Something very similar is happening with social media platforms right now. Their profit model isn’t to create polarization; they just have an incentive to promote content that is engaging, regardless of whether it’s polarizing. But polarization happens as a by-product, and there’s no incentive to clean it up, just as there was no incentive to clean up pollution. That’s why polarization is a negative externality of this platform business model.

Frischen: And what are you proposing we do about that?

Puig Larrauri: Make social media companies pay for it, by bringing the societal pollution they cause into the market mechanism. That’s in effect what we did with environmental pollution: we said it should be taxed, that there should be carbon taxes or some other mechanism, like cap and trade, that makes companies pay for the negative externality they create. And for that to happen, we had to measure things like CO2 output and carbon footprints. So my question is: could we do something similar with polarization? Could social media platforms – or perhaps any platform driven by an algorithm – be taxed for their polarization footprint?

Frischen: Taxing polarization is such a creative, novel way to think about forcing platforms to change their business model. I want to acknowledge there are others out there – in the U.S., there’s a discussion about reforming Section 230, which currently shields social media platforms from liability, and….

Puig Larrauri: Yes, and there’s also a very big debate – which I’m very supportive of, and part of – about how to design social media platforms differently, by making algorithms optimize for something other than engagement, something that might be less polluting and produce less polarization. That’s an incredibly important debate. The question I have, however, is: how do we incentivize companies to actually take that on? How do we get them to say: yes, I’m going to make those changes, I’m not going to use this simple engagement metric anymore, I’m going to take on these design changes in the underlying architecture? I think the way to do that is to provide a financial disincentive for not doing it, which is why I’m so interested in this idea of a tax.

Frischen: How would you ensure that taxing content is not seen as undermining free speech protections? That’s a big argument, especially in the U.S., where you can spread disinformation and hate speech under this umbrella.

Puig Larrauri: I don’t think a polarization footprint necessarily needs to look at speech. It can look at metrics that have to do with the design of the platform – for example, the connection between belonging to a group and only seeing certain types of content. So it doesn’t need to get into issues of hate speech or free speech, or the censorship debate that comes with them. It can look simply at design choices around engagement. As I said before, I don’t actually think content moderation and censorship will work particularly well to address polarization on platforms. What we now need to do is set to work measuring this polarization footprint, and find the right metrics that can be applied across platforms.
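One way to make the idea of such a metric concrete is to measure how strongly group membership predicts what content users are shown. The sketch below is purely hypothetical – it is not Build Up’s methodology or any proposed regulatory standard – and the exposure numbers are invented. It computes the total variation distance between two groups’ feed distributions: 0.0 means both groups see the same mix of content, 1.0 means each group sees only its own content.

```python
# Hypothetical exposure data: the fraction of each group's feed devoted
# to each content cluster (all numbers invented for illustration).
exposure = {
    "group_a": {"cluster_1": 0.8, "cluster_2": 0.2},
    "group_b": {"cluster_1": 0.1, "cluster_2": 0.9},
}

def exposure_divergence(a, b):
    # Total variation distance between two feed distributions:
    # half the sum of absolute differences across all content clusters.
    clusters = set(a) | set(b)
    return 0.5 * sum(abs(a.get(c, 0.0) - b.get(c, 0.0)) for c in clusters)

footprint = exposure_divergence(exposure["group_a"], exposure["group_b"])
print(round(footprint, 2))
# → 0.7: the two groups see largely disjoint feeds
```

A design-level metric like this looks only at who is shown what, never at what the content says, which is why it can sidestep the free-speech debate entirely.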

For more, follow Helena Puig Larrauri and Build Up.



© 2025 behindthescreen.uk - All rights reserved.