Proposal

APrIGF 2025 Session Proposal Submission Form
Part 1 - Lead Organizer
Contact Person
Ms. Angela Thomas
Email
Organization / Affiliation (Please state "Individual" if appropriate) *
SFLC.in
Designation
Counsel
Gender
Female
Economy of Residence
India
Stakeholder Group
Civil Society
Part 2 - Session Proposal
Session Title
The Growing Threat of Personal Liability in Platform Regulation: A Necessary Measure or Overreach?
Thematic Track of Your Session
  • Primary: Resilience
  • Secondary: Security & Trust
Description of Session Formats
Lightning Talk (30 minutes)
Where do you plan to organize your session?
Onsite at the venue (with online moderator for questions and comments from remote participants)
Specific Issues for Discussion
In an era where governments are increasingly holding platform owners and employees personally accountable for user-generated content, the landscape of digital free speech is rapidly evolving. Recent events, such as the arrest of Telegram CEO Pavel Durov and the suspension of platform X (formerly Twitter) in Brazil, demonstrate how authorities are compelling digital platforms to comply with content moderation laws under the threat of legal consequences. From account takedowns to potential imprisonment, intermediaries are being forced to navigate a precarious balance between safeguarding employee safety and upholding users' rights to free expression.
This session will critically examine the trend of imposing personal liability on platform executives and employees, analyzing its impact on free speech and the broader digital ecosystem. Governments worldwide, including the U.K., have introduced legislative proposals to fine or prosecute senior executives for failure to remove specific types of content within tight deadlines. These measures create a "hostage-taking" effect, where legal authorities use liability statutes to pressure platforms into swift compliance, often at the expense of open discourse.
The session will also explore the legal principles behind intermediary liability, the doctrine of the corporate veil, and when it may be pierced to hold executives personally responsible. While intermediaries traditionally enjoy safe harbor protections, there is an increasing tendency for governments to bypass these safeguards, leading to pre-emptive censorship and content over-policing by platforms. This chilling effect on speech has also driven some social media companies to withdraw from certain jurisdictions, as seen with Google’s exit from China and the bans imposed on Telegram in various countries.
Describe the Relevance of Your Session to APrIGF
The session aligns with the theme of Resilience as it explores the increasing personal liability imposed on platform owners and employees for user-generated content. This issue directly impacts rights and freedoms, as it raises concerns about how legal pressures on digital platforms can lead to over-compliance and a chilling effect on free speech. Additionally, it falls under media and content governance, as governments worldwide are enforcing stricter regulations on content moderation, sometimes compelling platforms to take excessive actions to avoid legal repercussions. The session will also touch on regulatory diligence, analyzing the balance between platform accountability and intermediary protections under existing legal frameworks. By addressing these concerns, the discussion contributes to a broader understanding of how evolving regulations shape digital governance, trust, and resilience in online spaces.
Methodology / Agenda (Please add rows by clicking "+" on the right)
Time frame (e.g. 5 minutes, 20 minutes; should add up to the time limit of your selected session format) / Description
  • Introduction (5 mins): Overview of the platform economy and its governance challenges. Emergence of personal liability as a regulatory tool in various jurisdictions.
  • Case studies and global trends (5 mins): Focus on India, Brazil, Russia, and the EU.
  • Risks of personal liability (5 mins): Chilling effects on free speech and platform operations. Impact on employees' willingness to work in high-risk environments. Potential for misuse of liability laws to target dissent or opposition.
  • Balancing Accountability and Innovation (5 mins): Alternative mechanisms such as institutional oversight, independent boards, and collaborative governance; the role of transparent algorithms, content moderation policies, and user-centric approaches; and fostering a global consensus on fair regulation without undermining fundamental rights.
Moderators & Speakers Info (Please complete where possible) - (Required)
  • Moderator (Primary)

    • Name: Angela Thomas
    • Organization: SFLC.in
    • Designation: Counsel
    • Stakeholder Group: Civil Society
    • Expected Presence: Select One
    • Status of Confirmation: Select One
  • Moderator (Facilitator)

    • Stakeholder Group: Select One
    • Expected Presence: Select One
    • Status of Confirmation: Select One
  • Speaker 1

    • Stakeholder Group: Select One
    • Expected Presence: Select One
    • Status of Confirmation: Select One
  • Speaker 2

    • Stakeholder Group: Select One
    • Expected Presence: Select One
    • Status of Confirmation: Select One
  • Speaker 3

    • Stakeholder Group: Select One
    • Expected Presence: Select One
    • Status of Confirmation: Select One
  • Speaker 4

    • Stakeholder Group: Select One
    • Expected Presence: Select One
    • Status of Confirmation: Select One
  • Speaker 5

    • Stakeholder Group: Select One
    • Expected Presence: Select One
    • Status of Confirmation: Select One
Please explain the rationale for choosing each of the above contributors to the session.
I am a lawyer and public policy professional working at the intersection of technology, law, and policy, with a strong focus on digital rights, privacy, data protection, free speech, and net neutrality. At SFLC.in, a civil society organisation, I lead the Free Speech Tracker and Platform Governance verticals and engage in projects spanning technology, law, and policy. Through the Free Speech Tracker, we have developed a dynamic platform to monitor and document instances of free speech violations across India, focusing on censorship of books, movies, art, and other forms of content takedowns, along with the reasons behind such actions. I am also involved in platform governance issues, exploring the legal implications of generative AI, data privacy, intellectual property rights, and liability concerns for platforms.
If you need assistance to find a suitable speaker to contribute to your session, or an onsite facilitator for your online-only session, please specify your request with details of what you are looking for.
No
Please declare if you have any potential conflict of interest with the Program Committee 2025.
No
Are you or other session contributors planning to apply for the APrIGF Fellowship Program 2025?
No
Upon evaluation by the Program Committee, your session proposal may only be selected under the condition that you will accept the suggestion of merging with another proposal with similar topics. Please state your preference below:
Yes, I am willing to work with another session proposer on a suggested merger.
Brief Summary of Your Session
In recent years, governments around the world have increasingly turned their attention toward the personal liability of platform executives and employees, marking a dramatic shift in the conversation on platform accountability. The arrest of Telegram’s founder Pavel Durov in Paris and the suspension of X (formerly Twitter) in Brazil spotlighted a new frontier in digital regulation, one where the lines between corporate responsibility and individual culpability are being redrawn.
This session explores whether holding individuals, including local compliance officers and executives, personally or even criminally liable for a platform's systemic failures is an appropriate and effective regulatory response, or whether it risks overreach, chilling innovation, and discouraging responsible leadership.
To unpack this, the discussion traces the evolution of platform regulation: from the early internet era, when safe harbour provisions such as Section 230 (U.S.), the EU E-Commerce Directive (2000), and India's IT Act (2000) prioritized innovation and free speech by shielding intermediaries from liability for user-generated content, to the growing global demand for accountability and transparency in the age of disinformation, data breaches, and online harms.
By the mid-2010s, a series of watershed moments like the Arab Spring, the Cambridge Analytica data scandal, and the Christchurch attacks revealed the immense social and political power of platforms. Governments began adopting proactive models like the EU's Digital Services Act (2022) and the UK's Online Safety Act (2023), which impose a duty of care on companies to mitigate systemic risks. Meanwhile, countries like India, Turkey, and Russia went further, introducing laws that make local representatives personally liable for non-compliance, blurring the line between organizational and individual accountability.
However, this growing reliance on personal criminal liability has sparked intense debate. In the Global South, where platform oversight is already limited, these measures often expose local employees to disproportionate risk without addressing structural issues like inadequate content moderation in regional languages, opaque algorithms, and jurisdictional evasion by tech giants headquartered abroad.
Substantive Summary of the Key Issues Raised and the Discussion
The issues with digital platforms are multifaceted:

Compliance and Accountability Gaps: Unlike in Europe or North America, many platforms operating in the Global South avoid establishing local offices, grievance redressal teams, or transparent compliance frameworks, allowing them to evade oversight and disregard lawful government orders.

Delayed or Insufficient User Remedies: Users face slow or inadequate responses to takedown requests, harassment complaints, or appeals, resulting in systemic denial of equal access to justice.

Cultural and Linguistic Blind Spots: Content moderation is often based on Western-centric standards, neglecting local languages and context. Automated moderation tools enforce “one-size-fits-all” community standards, enabling misinformation and hate speech to proliferate, especially during elections or conflicts.

Jurisdictional Evasion by Platforms: During the 2000s and 2010s, major platforms like Facebook (now Meta), Google, and Twitter operated globally but were legally based in the U.S. This created a significant regulatory loophole: when governments or courts in other countries sought compliance or accountability, these companies often invoked "lack of jurisdiction."

Key reasons this argument often succeeded:

Foreign Incorporation: Companies were legally registered outside the country seeking enforcement, often in the U.S. or Ireland (sometimes for tax optimization).

Governing Law in Terms of Service: User agreements typically specified U.S. law and courts as the controlling jurisdiction.

Enforcement Challenges: Local courts struggled to implement cross-border orders, particularly for global takedowns or content removals.

Effect: Platforms enjoyed global reach without proportional accountability, creating a regulatory gap where local governments could demand compliance in principle but rarely obtain it in practice.
Over time, countries began devising legal strategies to close this "jurisdictional gap" and ensure effective regulation of multinational tech companies.

Appointment of Local Legal Representatives: Many countries now require platforms to appoint local representatives who are responsible for compliance with local law. This has numerous benefits: it provides a clear point of contact for government agencies and citizens, ensures that platforms can be held accountable under local jurisdiction, and facilitates effective enforcement of government orders or court decisions. However, it also carries potential downsides. Risk of overreach: some governments may misuse this requirement to suppress dissent or control information. Personal risk to representatives: there is a risk of reprisals, threats, or even arrests, especially in countries with unpredictable legal systems.

Data Localization & Sovereign Internet Models: Some countries have adopted data localization laws, which require platforms to store user data within their borders. Proponents contend that this safeguards citizens' data, facilitates law enforcement access, and enhances local control.
Conclusions and Suggestions of Way Forward
Ideally, directors and founders should not be held personally liable for platform-level decisions, as this risks discouraging innovation and deterring responsible leadership. However, where there is clear evidence of direct complicity, willful negligence, or knowledge of wrongdoing, criminal liability may be justified. Even then, the threshold must remain high, backed by substantial evidence and due process safeguards.
Going forward, accountability should not rely solely on personal liability or fines. Instead, regulators should focus on building robust systemic mechanisms, such as:
(i) Increasing fine amounts and strengthening fine recovery systems to ensure penalties have real deterrent value.
(ii) Independent algorithmic audits to assess bias, risk, and compliance.
(iii) Mandatory transparency reports to make moderation and data practices publicly visible.
(iv) Cross-border regulatory cooperation for consistent global enforcement.
(v) Public accountability frameworks that engage civil society and researchers in oversight.
Number of Attendees (Please fill in numbers)
    • Online: 55
Gender Balance in Moderators/Speakers (Please fill in numbers)
  • Moderators

    • Female: 1
How were gender perspectives, equality, inclusion or empowerment discussed? Please provide details and context.
NA
Consent
I agree that my data can be submitted to forms.for.asia and processed by APrIGF organizers for the program selection of APrIGF 2025.