NEWS

Insights from the Policy Workshop – Prosocial Tech Design Governance: Exploring Policy Innovations

BY Tasha Adamsky

A few weeks ago, I had the privilege of participating in the Policy Workshop – Prosocial Tech Design Governance at the European University Institute for Transnational Governance. The discussions were insightful, and there are key takeaways I believe are critical for both policymakers and tech leaders to consider.

Unfortunately, many in governance and tech may not even be familiar with the concept of prosocial tech design – what it means, why it matters, and how it can help address some of the most pressing challenges posed by today’s digital platforms. So, let’s dive into what prosocial tech is, followed by a real-world case study and my takeaways from the workshop.
What is Prosocial Tech Design Governance?

Prosocial tech design governance aims to create technology that serves not just users, but society at large – by embedding ethical principles into the design and functionality of digital platforms. This goes beyond simply minimizing harm; it actively promotes positive social outcomes by reshaping the way platforms operate and the incentives that drive their design.

To understand why this is necessary, let’s consider some of the current challenges that technology giants face globally. These challenges fall into three major categories:

  • Unwanted/Harmful Usage: Platforms are optimized to capture attention and keep users engaged for as long as possible through techniques like infinite scrolling, autoplay, and personalized content feeds. While these features increase user engagement, they can also lead to extended, compulsive usage that can have detrimental effects on mental health.
  • Harmful Content: Algorithms that prioritize engagement often promote content that triggers strong emotional responses – frequently, this means harmful or divisive content. Features like ephemeral content and a lack of effective content moderation allow harmful narratives to spread unchecked (the topic of misinformation and polarization is something we all know too well in our highly divided world today).
  • Privacy Violations: Users are often unknowingly exposed to privacy risks due to design choices that prioritize data collection and visibility. For instance, platforms frequently make it difficult for users to manage who can view their personal information or how their data is used, which can lead to misuse or breaches.

Case Study: Tech Regulation in Africa

One particularly illuminating case discussed during the workshop was the situation in Africa, where tech giants like Meta, Google, and YouTube have come under scrutiny. In this instance, one platform was found to be advertising products that claimed to promote women’s health but were, in fact, harmful and illegal in many other countries. Instead of focusing on regulating the design elements that allowed such advertising, governments in some countries, such as Uganda, responded by imposing social media taxes on users or passing ‘fake news’ laws.

This approach highlights a dangerous trend: blaming users rather than addressing the platforms themselves. Rather than holding tech companies accountable for enabling harmful products or content, the responsibility is often placed on the users who are least equipped to protect themselves from these sophisticated design traps. The solution lies partly in educating users, but more importantly, in pushing for creative legislation that targets the underlying design flaws in tech platforms.

Key Takeaways (PS – personal opinion):

  1. Collaboration is Essential: Governments cannot compete with tech giants – they need to work together. Policymakers and tech leaders must bridge the knowledge gap to create a digital environment that protects users without stifling innovation.
  2. Measure Positive Impact: While we are adept at identifying and measuring harm, we lag in understanding how to measure the positive impact of tech innovations. Developing frameworks that assess prosocial outcomes will be key to fostering ethical tech.
  3. Market Forces Alone Aren’t Enough: Market dynamics will not ensure ethical tech design. Policymakers must establish regulatory frameworks that enforce prosocial tech practices from the outset, focusing on privacy and user well-being.
  4. Accountability Over Good Intentions: Hoping that tech CEOs will act as benevolent leaders isn’t a sustainable approach. We need clear, enforceable policies that ensure platforms are designed in ways that mitigate harmful outcomes – whether that means preventing the spread of misinformation or protecting user privacy.

Co-organised with the Global PeaceTech Hub, the Council on Technology and Social Cohesion, the CCDP – Centre on Conflict, Development and Peacebuilding at the Geneva Graduate Institute, and the reState Foundation, this workshop analysed the evolution of prosocial tech governance in Europe, North America, and the Global Majority.