research note

Educating or performing? TikTok literacy activities in the Nordics

TikTok’s popularity makes the platform both a significant site of online information disorders and a potentially powerful educator in media and information literacy. Yet a monitoring study of its transparency reports shows that TikTok’s media and information literacy activities in the Nordics are highly limited and largely performative.

Authors: Minna Horowitz & Mervi Pantti

TikTok has become a major global social media platform used for entertainment, communication, and, increasingly, news. Its rapid growth has also made it a focus of public concern over its impact on youth mental health, the viral spread of disinformation, and breaches of data and national security. These debates are prominent even in the Nordic countries: high-trust societies with advanced media and information literacy (MIL) policies and practices. MIL is generally understood as the ability to access, critically analyze, evaluate, create, and safely use media and information across all platforms.

In the European Union, TikTok has sought to present itself as a platform proactively combating information harms by joining the EU’s co-regulatory Code of Practice on Disinformation. In 2025, the Code was integrated into the Digital Services Act (DSA), becoming a co-regulatory Code of Conduct applicable to Very Large Online Platforms (VLOPs). Under the Code, platforms must report twice yearly through “transparency reports” that describe, among other things, their actions aimed at improving media literacy and user empowerment.

This research note—drawing on broader monitoring of the transparency reports by the European Digital Media Observatory (EDMO)—focuses on what TikTok communicates about its literacy-related activities in four Nordic countries: Denmark, Finland, Norway, and Sweden.

Worries in the Nordics

While the Nordic countries rank among the best in MIL, national surveys from 2023 indicate that roughly 60% of their populations report encountering disinformation somewhat often or often. Survey respondents express concern about both foreign information interference and the overall chaotic information environment. According to the 2025 Nordic Media Literacy Survey, people in the Nordics are relatively satisfied with their own literacy skills, as measured by their trust in journalism and media in general. Yet both the oldest respondents, who spend less time online than the average user, and the youngest, who engage in online activities frequently, are less confident than the average respondent in their ability to evaluate media trustworthiness.

These findings create both a demand for literacy and, at least in theory, an opportunity: platform-based literacy interventions could be highly valuable in countering online information disorders and could serve a wide range of target groups.

TikTok’s position in the Nordics demonstrates such a need. By late 2025, TikTok’s reach had become substantial, with estimates of roughly 31% in Denmark, 35% in Finland, 40% in Norway, and 41% in Sweden. TikTok is growing in importance as a platform for news consumption, yet it has also been embroiled in political controversy. Denmark’s Defence Ministry banned TikTok on official devices in 2023 due to cybersecurity concerns; Norway’s parliament restricted the app on work devices after warnings from the Justice Ministry; Finland’s parliament followed with a similar restriction in 2024. In Sweden, political parties began reassessing their use of TikTok amid international bans. In addition, public debate and research attention have focused on TikTok’s potential effects on elections and democratic processes, as well as its impact on young people.

Transparency, the TikTok way

Against this backdrop, TikTok’s transparency reports offer a timely account of its literacy and disinformation responses. To systematically analyse the VLOP reports, the EDMO has developed an assessment framework aligned with the Code’s commitments, including Commitment 17 on MIL. The framework examines (1) the quality and completeness of what platforms report—documentation, transparency, and meaningful qualitative and quantitative metrics—and (2) the strength of claims—whether information is verifiable or corroborated by external sources. Under Commitment 17, platform efforts are grouped into three categories: tools for critical thinking and media literacy, activities and campaigns to promote literacy, and partnerships with MIL experts.

Based on the analysis of TikTok’s 2024 reporting, the platform’s activities seem extensive. These include materials in TikTok’s Transparency Center that explain recommendation systems and responsible practices for AI-generated content. TikTok’s Safety Center provides information addressing misinformation and election integrity, while the Newsroom publishes updates and monthly reports on covert influence operations. 

TikTok also describes various literacy tools: topic-specific interventions with notice tags that prompt users to learn more, search interventions that guide users to trusted sources, and public service announcements that link to authoritative information. Additional practices include an “unverified content” label that discourages sharing, labels for state-controlled media, and labels that creators can use to indicate AI-generated content.

Numbers without meaning

TikTok reports country-level data on how often its media literacy tools were shown and used. Metrics include impressions (how often an intervention appears), clicks, and click-throughs (following links to external sources). However, without contextualisation—such as how these exposures relate to total national user bases, typical engagement patterns, or baseline levels of news consumption—these figures do not convincingly demonstrate impact. TikTok does not show in any detail that users meaningfully engaged with the tools, or that engagement changed behaviour. More importantly, independent verification is not possible: external researchers cannot easily validate the platform’s figures, and the reports do not provide sufficient methodological detail or supporting evidence.

TikTok also reports specific MIL campaigns in the Nordics. For January–June 2024, it highlights election-related literacy campaigns in Denmark, Finland, and Sweden (though details for Sweden are limited), as well as a general literacy campaign in Denmark. These campaigns were produced by Logically Facts, an Ireland-registered company that operates in fact-checking and online harm reduction. 

TikTok frames its campaigns as topic-tailored: some are localised around national elections and partnerships, while others—such as those related to Ukraine—aim for a broad reach across countries. Yet reach appears limited: low landing-page impressions, modest search impressions and clicks, and very low click-throughs to further resources. Election campaigns reportedly perform better, but again, the data is unverifiable. 

What kinds of partnerships?

The most opaque dimension concerns partnerships. TikTok states that it engages qualified external experts and lists numerous literacy partners, yet it provides limited transparency on how partners are selected, what collaboration entails, and how expert insights are integrated into platform decisions. It is also unclear what, if any, data access partners receive to help them create impactful literacy interventions. In the Nordic context, it is notable that TikTok’s MIL content creator is an Irish company, even though there are potential Nordic partners—such as the NORDIS project under EDMO—that could offer regionally specific expertise.

Overall, TikTok’s 2024 transparency reporting does not demonstrate innovation or measurable impact in the Nordics. Instead, the reporting is vague, difficult to verify, and insufficiently contextualised. Context specificity is especially important because literacy interventions often address politically sensitive, country-specific topics. The lack of depth in reporting reflects a broader structural issue: platform accountability depends not just on self-reported actions but also on researchers’ and civil society’s ability to assess claims through access to data. EDMO researchers and other research efforts have reported similar barriers across platforms.

The Code: a powerful tool, in theory

Currently, TikTok is under fire. The platform has been found to be in breach of the DSA. It is considered a security risk in many countries, and its addictive design draws criticism, yet it serves as a key news gateway for many. A potential shift to U.S. ownership may bring further challenges. At the same time, U.S.-based platforms such as X and Meta have dismantled most of their measures against disinformation. Wars and other severe geopolitical crises abound. This is the moment when robust, evidence-based accountability for the VLOPs’ media literacy commitments should be demanded, monitored, and upheld in the EU.

In principle, transparency reporting under the DSA and the Code of Conduct is an important step toward greater accountability. Yet this case suggests that without stronger verification mechanisms, clearer reporting standards, and improved data access, transparency becomes mainly performative. Under Article 40 of the DSA, researchers may request access to data in order to detect and understand VLOPs’ “systemic risks”, including negative effects on fundamental rights and civic discourse. It remains to be seen whether this provision, coupled with the new mandatory DSA reporting under the Code of Conduct, will lead to a better understanding of the platforms’ impact on and approach to MIL. After all, boosting MIL is one of the EU’s policy priorities in its actions to support societal resilience against disinformation.

Authors: 

Minna Horowitz is a University Researcher and an Adjunct Professor in Media and Communication studies. She studies epistemic rights from the perspective of citizens as well as national and European media policy, with a particular focus on public service and media and information literacy. She serves as the interaction coordinator for DECA, a research consortium of the Strategic Research Council, and is also part of the Nordic NORDIS project of the European Digital Media Observatory (EDMO).

Mervi Pantti is Professor in Media and Communication Studies at the University of Helsinki, Finland. Her research concerns media responsibility, the relationship between media and emotion, conflict and crisis reporting, and the epistemic rights of marginalized groups. She leads DECA, a research consortium of the Strategic Research Council.

Disclaimer: Copilot has been used to check the language.
