A platform policy implementation audit of actions against Russia's state-controlled media

INTERNET POLICY REVIEW (2023)

Abstract
The information influence of Russia's state-controlled media outlets, such as RT and Sputnik, on multimillion global audiences has been a major concern for Western democracies and Ukraine over the last decade. With the start of Russia's full-scale invasion of Ukraine in 2022, these outlets were recognised as a threat to international security, and several major bans were imposed on RT and Sputnik, including bans in the EU and its member states and in non-EU countries such as Canada, the UK and Australia, alongside reinforced content moderation by digital platforms globally. Digital platforms had to step up as new arbiters of digital public spheres in this crisis event, which led us to ask how major digital platforms (Twitter, YouTube, Facebook, Instagram, TikTok, and Telegram) implemented their content moderation policies towards RT and Sputnik accounts across ten countries following Russia's full-scale invasion of Ukraine in 2022. We present a platform policy implementation audit method for analysing such content moderation measures and demonstrate its application by six coders two months after the start of the full-scale invasion. Our audit shows largely inconsistent trends in platform policy implementation towards RT and Sputnik, as well as a wide catalogue of measures taken by the tech giants. We conclude with a discussion of the further implications and effectiveness of such content moderation measures for global digital audiences.
Keywords
platform policy implementation audit, policy implementation, media, state-controlled