Levelling the Digital Playing Field

Why Platforms Must Embrace Content Accountability

Published

Apr 11, 2025

Topic

Media

The digital public square has become dangerously unbalanced. Social media platforms, once celebrated as democratic spaces of free expression, have evolved into opaque powerhouses with little accountability for the content they amplify. Shielded by Section 230 of the U.S. Communications Decency Act, these platforms have enjoyed legal immunity for user-generated content, fuelling their rise as media juggernauts while hollowing out traditional journalism.

It’s time to rethink the system. To rebuild trust in the information we consume, we must demand radical transparency and algorithmic reform. Here’s why—and how—it can be done.

Journalism Has Been Defunded by Design

For generations, journalism played a vital role in holding power to account. Investigative reporting, fact-checking, and in-depth analysis all demand time and money. Historically, this was funded by advertising revenues. But the emergence of social platforms disrupted that model.

Platforms like Facebook and Google didn’t just compete with news outlets—they absorbed their audiences. Publishers became dependent on the algorithms that dictated what content was seen. Then, once platforms had control, they pivoted: hosting content within their own ecosystems and monopolising ad revenues.

The impact has been devastating:

  • Over 1,800 newspapers in the U.S. shut down between 2004 and 2019.

  • The UK lost nearly 220 local papers between 2005 and 2015.

Quality journalism was replaced with clickbait, low-effort articles, and outright falsehoods. And this isn’t just a media crisis—it’s a democratic one. Without strong journalism, disinformation flourishes and public trust withers.

Section 230: Innovation’s Double-Edged Sword

Originally written to let young internet services host and moderate user content without incurring publisher liability, Section 230 now enables platforms to benefit from viral content while avoiding responsibility for its impact. Disinformation spreads rapidly on social media—not because it’s true, but because it’s emotionally engaging.

"False information spreads six times faster on Twitter than factual stories."

From conspiracy theories to COVID-19 hoaxes, platforms profit from virality without bearing the societal costs.

Algorithms Are Driving Division

Social media algorithms are engineered for one thing: engagement. And divisive, extreme content keeps users clicking. This creates a feedback loop of outrage and misinformation, reinforcing echo chambers and deepening political divides.

We’ve seen the results:

  • Facts become subjective.

  • Dialogue becomes polarised.

  • Lies, like “alternative facts,” become mainstream.

This digital ecosystem is eroding shared truth, and the journalistic institutions that once called out falsehoods have been weakened in the process.

A Roadmap to Radical Transparency

Change is possible—and essential. Social platforms can no longer operate like black boxes. They must become transparent, accountable, and fair in how they moderate content and shape public discourse. Here’s how:

1. Public Moderation Reports

Platforms should publish regular, detailed reports (a machine-readable sketch follows this list) outlining:

  • Content moderation policies

  • Use of automation vs. human reviewers

  • Volume and type of content flagged or removed
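
To make these disclosures comparable across platforms and reporting periods, the reports could be machine-readable. Below is a minimal sketch in Python of what one reporting period might contain; every field name is invented for illustration, not drawn from any platform’s actual schema.

    from dataclasses import dataclass

    @dataclass
    class ModerationReport:
        """One reporting period of public moderation disclosures.

        All fields are illustrative; no platform publishes this exact schema.
        """
        period: str                 # e.g. "2025-Q1"
        policy_version: str         # which moderation policy was in force
        items_flagged: int          # items reported by users or auto-flagged
        removed_by_automation: int  # takedowns with no human review
        removed_by_humans: int      # takedowns confirmed by a human reviewer
        appeals_received: int
        appeals_upheld: int         # removals reversed on appeal

    report = ModerationReport(
        period="2025-Q1",
        policy_version="2024.3",
        items_flagged=1_200_000,
        removed_by_automation=250_000,
        removed_by_humans=60_000,
        appeals_received=18_000,
        appeals_upheld=4_500,
    )

    # Two metrics a regulator might track: how much takedown happens with
    # no human in the loop, and how often removals are reversed on appeal.
    total_removed = report.removed_by_automation + report.removed_by_humans
    print(f"Automation share: {report.removed_by_automation / total_removed:.0%}")
    print(f"Appeal reversal rate: {report.appeals_upheld / report.appeals_received:.0%}")

Standardised fields like these would let auditors and researchers compare platforms directly, instead of parsing each platform’s bespoke PDF report.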

2. Algorithmic Transparency

Users deserve to understand what they’re seeing and why. Platforms should disclose (an illustrative sketch follows this list):

  • The criteria used to prioritise content

  • Whether engagement, ad revenue, or user preferences drive feed curation
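
Disclosure could be as simple as publishing the signals a ranker combines and the weight each carries. The sketch below is purely illustrative; the signal names and weights are invented, since no major platform currently discloses its real formula.

    # Hypothetical published ranking criteria. The point of transparency
    # rules is that numbers like these stop being trade secrets.
    RANKING_WEIGHTS = {
        "predicted_engagement":  0.5,  # clicks, watch time, replies
        "ad_revenue_potential":  0.2,
        "source_reliability":    0.2,  # e.g. independent fact-check record
        "user_stated_interests": 0.1,
    }

    def feed_score(signals: dict[str, float]) -> float:
        """Combine disclosed signals (each in [0, 1]) into one ranking score."""
        return sum(RANKING_WEIGHTS[name] * value for name, value in signals.items())

    post = {"predicted_engagement": 0.8, "ad_revenue_potential": 0.6,
            "source_reliability": 0.3, "user_stated_interests": 0.4}
    print(feed_score(post))  # 0.62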

3. Independent Audits

External regulators and third-party auditors must have access to platform data (a toy audit metric follows this list) to:

  • Evaluate moderation consistency

  • Expose manipulation or bias in algorithms
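
Even a simple audit can surface inconsistency once auditors have the raw decisions. Here is a toy Python check, with invented categories, that computes per-category removal rates; large unexplained gaps between similar categories, or sudden swings between periods, are prompts for deeper review.

    from collections import defaultdict

    def removal_rates(decisions: list[tuple[str, bool]]) -> dict[str, float]:
        """Per-category removal rate from (category, was_removed) decisions."""
        flagged = defaultdict(int)
        removed = defaultdict(int)
        for category, was_removed in decisions:
            flagged[category] += 1
            removed[category] += was_removed  # bool counts as 0 or 1
        return {c: removed[c] / flagged[c] for c in flagged}

    sample = [("health", True), ("health", False), ("politics", True),
              ("politics", True), ("politics", False)]
    print(removal_rates(sample))  # {'health': 0.5, 'politics': 0.666...}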

Rewriting the Rules of Engagement

Transparency alone won’t solve the problem. We also need to change the algorithms that fuel the disinformation economy. Here’s what meaningful regulation could look like:

Deprioritise Sensationalism

Algorithms should no longer favour the loudest or most outrageous content. Regulators must require platforms to dial down sensationalism and promote content that informs rather than inflames.
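
Mechanically, “dialling down sensationalism” means changing the objective the ranker optimises. Here is a toy sketch, with invented signal names and weights, in which the boost engagement can contribute shrinks as the outrage signal grows:

    def reranked_score(predicted_engagement: float,
                       outrage_signal: float,
                       source_reliability: float) -> float:
        """Toy ranking objective; all inputs assumed normalised to [0, 1].

        Engagement still matters, but its weight decays with the outrage
        signal, so inflammatory content cannot buy reach with anger.
        """
        engagement_weight = 0.6 * (1.0 - outrage_signal)
        return engagement_weight * predicted_engagement + 0.4 * source_reliability

    # An inflammatory post with sky-high engagement no longer outranks
    # a reliable, moderately engaging one.
    print(reranked_score(0.9, outrage_signal=0.9, source_reliability=0.2))  # ~0.13
    print(reranked_score(0.5, outrage_signal=0.1, source_reliability=0.9))  # ~0.63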

Promote Quality Journalism

Regulations could incentivise the prioritisation of content from reputable, verified sources. This creates a healthier information ecosystem and supports the work of real journalists.

Mandatory Risk Assessments

Platforms should be required, as the EU’s Digital Services Act already demands, to conduct risk assessments for harmful content, with meaningful penalties for non-compliance.

Give Users More Control

Let users choose how their feed is curated. Offer algorithm-free or user-customisable feed options. WeAre8 is already leading the way on this front, showing that empowering users is not only possible—it’s preferable.
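
Architecturally, this is the simplest reform on the list: the feed becomes a strategy the user selects rather than one the platform imposes. A minimal sketch, with illustrative mode names:

    def build_feed(posts: list[dict], mode: str = "chronological") -> list[dict]:
        """Order a feed by a user-chosen strategy.

        Each post dict carries 'timestamp' (a sortable value), 'score'
        (the platform's engagement prediction), and 'followed' (bool).
        The mode names are illustrative, not any platform's real options.
        """
        if mode == "chronological":   # the algorithm-free option
            return sorted(posts, key=lambda p: p["timestamp"], reverse=True)
        if mode == "following_only":  # only accounts the user chose to follow
            followed = [p for p in posts if p["followed"]]
            return sorted(followed, key=lambda p: p["timestamp"], reverse=True)
        if mode == "ranked":          # the platform's default engagement ranking
            return sorted(posts, key=lambda p: p["score"], reverse=True)
        raise ValueError(f"unknown feed mode: {mode}")

The hard part is not the code; it is requiring platforms to offer the choice and to treat the non-default modes as first-class options.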

“Profits should be built on innovation, not societal decay.”

Free Speech vs. Amplified Harm

This isn’t about censorship. Free speech doesn’t mean platforms must amplify every voice equally—especially when those voices are divisive or false. Today’s algorithms elevate the most extreme content not because it’s right, but because it’s profitable.

The real threat to free speech is the dominance of a few opaque platforms that decide what we see without oversight or fairness.

A Global Movement for Digital Accountability

Section 230 may be American law, but its consequences are global. The UK’s Online Safety Act and the EU’s Digital Services Act are early attempts to rein in the chaos. But without international cooperation, regulation will remain a patchwork full of holes.

Platforms that operate across borders must be held to global standards of transparency and responsibility.

The Future We Choose

Unchecked, social media platforms will continue to destabilise democracies, erode public trust, and reward outrage over truth. But we still have a choice.

By embracing transparency, reforming algorithms, and supporting quality journalism, we can rebuild a digital public square grounded in truth and accountability.

The question isn’t whether we can afford to regulate these platforms.
It’s whether we can afford not to.
