APrIGF 2024 Session Proposal Submission Form
Part 1 - Lead Organizer
Contact Person
Ms. Liza Garcia
Organization / Affiliation (Please state "Individual" if appropriate) *
Foundation for Media Alternatives
Designation
Executive Director
Gender
Female
Economy of Residence
Philippines
Primary Stakeholder Group
Civil Society
List Your Organizing Partners (if any)
Ms. Sabhanaz Rashid Diya, Founder and Executive Director, Tech Global Institute, srdiya@techglobalinstitute.com
Part 2 - Session Proposal
Session Title
Platform Accountability in South and Southeast Asia
Session Format
Panel Discussion (60 minutes)
Where do you plan to organize your session?
Onsite at the venue (with online moderator for questions and comments from remote participants)
Specific Issues for Discussion
Accountability of social media and user-to-user communication platforms (“platforms”) for their speech and privacy practices is a longstanding, albeit contested, issue. Countries have adopted different approaches to holding platforms accountable, from legislation (Singapore) to coalition-led voluntary commitments (Australia). India, meanwhile, has taken platforms to court, demanding more transparency and safety against hate speech for their communities. Efficacy measurements around these approaches are fragmented, so it is difficult to identify what works and what does not, and, subsequently, how to make more concerted efforts based on those lessons. This panel discussion brings together experts from Bangladesh, Nepal, the Philippines and South Korea to understand the varied approaches to accountability and to making online spaces safer, discuss what works, and provide recommendations on specific interventions that civil society and policymakers can adopt based on shared, cross-regional experiences. The panel discussion builds on a series of civil society-led conversations in the Asia Pacific and the Global Majority that have increasingly recognized that accountability efforts are fragmented. As a result, the uneven power dynamic between private entities, policymakers and non-governmental voices is exacerbated, and communities are left unsafe and unheard. There is a growing urgency to de-duplicate efforts and share learnings, ideally contributing towards coalition-building in APAC and globally. The panel is both a culmination of these discussions, bringing them to a wider audience, and a starting point for more cross-regional learning and coalition-building.
Describe the Relevance of Your Session to APrIGF
This year’s APrIGF touches on both the evolution of digital ecosystems and the principles that govern them, with the hope of building more consensus around responsible technologies. The panel discussion fits directly into this year’s theme by: (a) taking stock of lessons learnt by civil society groups on platform governance, (b) recognizing the evolution of various governance tactics across countries, (c) outlining the converging and diverging principles underpinning these tactics, and (d) providing an opportunity for consensus-building around tactics based on efficacy, inclusivity and replicability. The session also demonstrates the importance of a multi-stakeholder, cross-regional approach to internet governance by bringing together experts from different backgrounds, industries and countries. The outcome of the panel discussion is an APAC-focused white paper on platform accountability and governance that civil society organizations and policymakers can use as a reference to build consensus with one another, both within and between countries.
Methodology / Agenda (Please add rows by clicking "+" on the right)
Moderators & Speakers Info (Please complete where possible)
Please explain the rationale for choosing each of the above contributors to the session.
Professor Kyung-Sin (K.S.) Park is one of the founders of Open Net Korea, a forum for discussion and collaboration on ICT freedoms and human rights since 2013. From 2006 through 2016, he also served as the executive director of the PSPD Law Center, a non-profit entity that organized several high-impact litigations in the areas of free speech, privacy, and copyright in South Korea.

Sabhanaz Rashid Diya is the founder of Tech Global Institute, a tech policy nonprofit focused on advancing equity and accountability for the Global Majority on the Internet. From 2019 through 2023, she headed public policy for Meta in the Asia-Pacific region, and she brings 20 years of experience in shaping technology regulations in South Asia and Sub-Saharan Africa. She is a visiting policy fellow at the Oxford Internet Institute, University of Oxford.

Grace Salonga is a lawyer by profession. She is the Executive Director of the Movement Against Disinformation (MAD), a broad non-partisan coalition of members from the academe, the legal profession, civil society groups, international and local non-government organizations and other advocacy groups that are united in pushing back against the systematic and unregulated spread of disinformation on social media platforms.

Prateek Waghre is the Executive Director of the Internet Freedom Foundation. His research and work cover the impact of technology in democratic networked societies, the role of misinformation and disinformation in the information ecosystem, the governance of digital communication networks, data privacy, internet shutdowns, and major issues affecting the internet policy space in India.

Liza Garcia is the Executive Director of the Foundation for Media Alternatives. She specializes in women’s rights and ICT and has been actively promoting internet governance in the Philippines. She currently sits as one of the board members of the Internet Society – Philippines Chapter.
If you need assistance to find a suitable speaker to contribute to your session, or an onsite facilitator for your online-only session, please specify your request with details of what you are looking for.
Thank you, we have it covered.
Please declare if you have any potential conflict of interest with the Program Committee 2024.
No
Are you or other session contributors planning to apply for the APrIGF Fellowship Program 2024?
Yes
APrIGF offers live transcript in English for all sessions. Do you need any other translation support or any disability related requests for your session? APrIGF makes every effort to be a fully inclusive and accessible event, and will do its best to fulfill your needs.
None
Brief Summary of Your Session
The session looked into the accountability of social media and communication platforms for their speech and privacy practices. It raised questions on what platform accountability should be and presented several approaches to accountability and to making online spaces safer. The four panelists from the Philippines, South Korea, Bangladesh and India provided examples of platform governance in their respective countries to surface the different approaches to accountability and to making online spaces safer, and discussed the impact and limitations of governance.

KS Park said that while we have to hold platforms accountable, we also need to ask what we are holding them accountable for. Is it for speech, or for content? Is it for content they are not aware of, for content they were made aware of, or both? Perhaps, instead of content-specific activity, platforms can be held accountable for privacy and censorship. He suggested considering the Digital Markets Act (DMA) and Digital Services Act (DSA) in the EU and seeing how these can be useful to Asia.

Grace Salonga from the Philippines stressed that platforms are a source of news and information for many, and thus have a moral and legal obligation to their users, especially when a platform is used again and again for nefarious activities that harm users. She cited the red-tagging of journalists in the Philippines as an example. She also said that platforms can be held accountable in the Philippines as service providers, and that one journalist has filed a case against Meta through the Philippines’ National Privacy Commission.

Diya from Bangladesh said there is a need to define what a platform is, and also what accountability means. She suggested taking a broader lens when looking at platform accountability and not viewing it only in terms of speech; in Asia, for instance, we should also see it through the lens of women. She also encouraged people to consider whether legislation is the only way to hold platforms accountable.

Prateek Waghre from India said that we also need to interrogate what we mean when we talk about accountability, because it can be weaponized. He also cited the tension between holding platforms accountable and the executive branch imposing more control and restrictions, and the need to balance these and bring safeguards for users to the fore.
Substantive Summary of the Key Issues Raised and the Discussion
The session proposed that we step back and look at the architecture of platforms when raising the issue of accountability. Other issues raised were as follows:

- Platform accountability varies across regions. What metrics are being used, and would a metric used in one region or country work for others, too?
- Platforms create conditions that in turn create risks for their users. Should platforms set aside a fund to address this?
- Can engagement with platforms work? Civil society organizations are already burdened.
- Platforms have been collecting data from users and profiting from it. We do not have control over our data.
- Social media companies have no representation in some countries, which makes it difficult for people in those countries to file complaints and make requests.
- How do we regulate legal content to prevent potential harms, and how do we protect privacy?
- There is tension between news organizations and social media companies as social media becomes a major source of news. An example cited is the News Media Bargaining Code in Australia, where the government introduced laws requiring tech companies to pay for the news on their platforms.
- Harms to minority and marginalized groups, including women and LGBTQI persons. How can the rights of users, especially the marginalized, be safeguarded online? How do we make platforms accountable for such harms? While there are existing community standards, they have proven inadequate in many instances.
- Transparency over content moderation. Who is to decide what is harmful or not? The public should have the right to challenge what is being done to them. If content is harmful, what is the recourse?
Conclusions and Suggestions of Way Forward
There are various approaches to accountability across countries, but these are fragmented. The uneven power dynamics between private entities, policymakers and non-governmental voices are exacerbated and, as a result, communities are left unsafe and unheard. There is a need to look further into what we mean by platform accountability, including understanding the metrics and complaints mechanisms that can be adopted to drive improvements in governance. People also need to engage with platforms so that their rights are safeguarded online.
Number of Attendees (Please fill in numbers)
Gender Balance in Moderators/Speakers (Please fill in numbers)
How were gender perspectives, equality, inclusion or empowerment discussed? Please provide details and context.
Platforms amplify different forms of discrimination, including hate, that threaten our safety, civil rights, privacy and our democracy. These harms have a particular bearing and impact on women, the LGBTQIA+ community and other marginalized and minority groups, and should therefore be addressed. A gender lens should be used when looking at platform accountability, considering that in South and Southeast Asia there are many instances and experiences of harm faced by women when using platforms, many of which are left unaddressed.
Consent
I agree that my data can be submitted to forms.for.asia and processed by APrIGF organizers for the program selection of APrIGF 2024.