    Rethinking Regulation through Prosocial Design Approaches

    Digital Peacebuilding and Platform Accountability: Insights from Lena Slachmuijlder

    Lena Slachmuijlder is a prominent figure in the realm of digital peacebuilding, currently serving as a senior advisor for digital peacebuilding at Search for Common Ground. With her extensive experience as a practitioner fellow at the USC Neely Center and as co-chair of the Council on Tech and Social Cohesion, she is at the forefront of discussions on digital platform governance.

    The Urgent Need for Digital Platform Regulation

    Across the globe, the regulation of digital platforms remains stagnant or entirely absent. As harmful content proliferates online, pressure mounts from governments, civil society, and the media. However, platforms frequently evade accountability for the very designs that enable such content. This issue is particularly acute in conflict-affected regions, where the potential for online harm translates into real-world violence. For many, the stakes are not merely theoretical; they are life or death.

    Most countries lack comprehensive legal frameworks that dictate how platforms operate, influence user behavior, or shape public discourse. While there is a valid concern that excessive government oversight could lead to censorship, a burgeoning global consensus recognizes that the roots of online harm lie not just with “bad users” but also in platform designs optimized for attention, engagement, and data extraction.

    A New Blueprint for Accountability

    The foundational insight behind the guide, titled Prosocial Tech Design Regulation: A Practical Guide, emerges from this recognition. This document was crafted with the contributions of 20 experts from various regions—including Asia, Africa, Latin America, North America, and Europe. It represents a collaborative effort, developed in partnership with influential organizations such as the USC Neely Center and the Integrity Institute.

    The guide offers actionable steps for regulators, affirming that many proposed changes can be made without resorting to surveillance or infringing on free speech. Instead of policing individual pieces of content, it focuses on governing the mechanics of the platforms themselves, making the online environment safer by design.

    Key Recommendations from the Guide

    1. Ban Addictive and Manipulative Design

    One of the primary recommendations is to eliminate features that drive compulsive engagement, such as infinite scrolling and autoplay functions. These should be turned off by default, especially for children. Simple prompts to encourage users to pause or reflect on their usage can mitigate compulsive patterns.

    2. Default to Privacy and Safety for Children

    New accounts for minors should automatically default to private, with strict limitations on communication from strangers. Before the introduction of new features, child impact assessments are essential, reinforcing the principles laid out by the 5Rights Foundation, which advocates for children’s rights in the digital landscape.

    3. Reform Recommender Systems for Long-Term Value

    Recommender systems should prioritize quality, diversity, and user well-being over mere engagement. Algorithms must also be transparent, allowing users to understand and customize their experiences to foster healthier online environments.

    4. Require Transparency and Testing

    Major design changes need to undergo rigorous testing, with public change logs established as standard practice. This transparency allows researchers, regulators, and users to track evolving risks associated with platform changes.

    5. Use Real User Experience to Guide Policy

    Regulators should implement rolling, privacy-safe surveys to collect data on user experiences related to time spent, instances of abuse, polarization, and overall well-being. Such feedback loops can help inform policy and improve user trust in platform governance.

    The Equity Question: Safety Disparities Among Children

    A significant issue raised during consultations in Africa and Asia is the disparity in online safety for children across different nations. While the UK’s Age-Appropriate Design Code has led to substantial platform changes, similar measures are woefully lacking in lower-income countries. This inconsistency raises critical questions about justice and equity in the digital realm.

    Civil society leaders from various regions argue that children’s online safety should not depend on geography. It is indefensible that children in some countries navigate the internet with tools designed to protect them while children elsewhere are left unprotected.

    Platforms Don’t Have to Wait

    Many of the changes proposed in the guide can be implemented now, without legislative action. Platforms like Pinterest have already adapted their designs to minimize engagement-based dynamics for minors, showcasing what’s possible when user well-being is prioritized.

    However, scalability poses a challenge; without significant public pressure, many companies may hesitate to adopt such reforms on a global scale. This highlights the pressing need for coordinated advocacy from civil society and proactive regulatory measures.

    Governance as a Collective Endeavor

    The guide envisions governance as a shared responsibility involving multiple stakeholders, including regulators, civil society, and researchers. It aligns with existing global frameworks emphasizing collaborative governance. For genuine progress to occur, platforms must also play a role in shaping these rules.

    While Africa already possesses several frameworks aimed at better digital governance, the proposed African Digital Experience Observatory would bring together various stakeholders to monitor harms, correlate them with design features, and glean real-time insights from users.

    By emphasizing rights and safety, stakeholders can collaborate to strengthen platform accountability, ultimately creating a more equitable and safer digital landscape for all. The guide provides adaptable frameworks to foster understanding and cooperation among the diverse actors working toward healthy online interactions and design accountability.
