25.02.2025

How to raise awareness of for-profit media and search engine platforms?

Published in October 2024, the dissertation introduces a new concept: platform awareness. According to the dissertation, platform awareness is a broad understanding of the power structures of for-profit media and search engine platforms, consisting of a complex interplay of social, technological, economic and political perspectives. Platform awareness is presented as foundational to the development of different literacies (media literacy, data literacy, digital literacy, algorithm literacy).

The conversation about the social impact of platform giants has been active in the early part of the year. This ongoing discussion raises awareness of their influence. What kinds of tools does recent media education research offer? Guna Spurava’s dissertation, Towards Platform Awareness in Media Education, explores how media education can support agency in a media environment dominated by for-profit platform giants.

In the second phase of digitalization, the media landscape entered the era of digital media platforms. Global operators such as Facebook, Instagram, YouTube, TikTok, X and Google Search became part of the media by offering services and products. These platforms are run by a handful of international companies that are not traditional media companies and therefore fall outside traditional media regulation. Today, platforms curate their users’ feeds and provide a stage for media agency, making them important structures for millions of users. Researchers stress that awareness of the platform economy and of how user data is used for profit should be increased.

The media landscape shaped by algorithms and platforms is partly an artificial social fabric. Picture: Pixabay

Consumer attention is limited, and the struggle for it among platforms keeps intensifying. Platforms exploit their position as gatekeepers of information by restricting the agency of their users, which also threatens democracy and well-being. Although the platforms’ starting point is to limit and organize the flow of information into entities more suitable for users, their ultimate aim is not their users’ well-being. Nevertheless, people appreciate the everyday ease provided by the platforms’ invisible processes and readily hand over part of their continuous decision-making to machines and algorithms (e.g., Anderson & Rainie 2023).

Different literacies help in understanding how platforms work, but they also have their limits

The theoretical part of the dissertation discusses whether, for example, new literacies can be expanded to address the effects of platforms. Media education has focused on introducing literacy-based concepts (Hobbs 2020; Wuyckens, Landry and Fastrez 2022) and frameworks, and on developing an individual’s capacity to create and interpret digital media content, with particular emphasis on inclusion and creative expression in digital media. However, the concepts of literacy have their limits, such as how well they manage to address different aspects of digital media platforms: hardware, user interfaces, algorithms or finances (Nichols and LeBlanc 2020). It is not only a question of critically reviewing the content delivered by algorithms, but also of understanding the platforms’ economic operating models. Literacy is important, but it is difficult to teach users to read something that is not open and transparent.

Viewed critically, different literacies are an individual-level solution to a problem that is largely systemic (Beckett and Livingstone 2018). In this situation, responsibility for operating in complex media environments is transferred to the individual, which at worst deepens a new kind of digital divide. Some users are more aware of the effects of algorithms and are thus able to view the content they see critically, while others are clearly more vulnerable to the way algorithms work. In addition, presenting individual-level solutions slows down the search for systemic-level ones.

One of the dissertation’s background studies notes that programming skills would be useful in developing algorithmic literacy. Advanced algorithm literacy is therefore quite technical and approaches the job description of an algorithm programmer. Yet awareness of how platforms work does not require a programmer-level understanding of, for example, how algorithms function technically, but rather an awareness of when our actions are being influenced. In fact, users have developed various coping mechanisms to resist the effects of algorithms.

Who decides on your media feed?

Guna Spurava presents a framework for platform awareness consisting of four modules: political, social, economic and technological. Each module contains its own thematic areas, and together they form a holistic view of platform awareness. All of them are interconnected, so understanding, for example, the political, social or technological whole requires an understanding of economic mechanisms and the workings of digital capitalism.

The aim of the model is to present this complex and interdependent whole and to suggest that it should be treated as a whole, not as individual parts. According to the dissertation, platform awareness is a necessary step towards digital media agency, which in this study is understood as the freedom to choose, create and share information.

While reading the dissertation, I remembered a slide from 1979 that was circulating on the internet:

“A computer can never be held accountable, therefore a computer must never make a management decision. – IBM 1979.”

Whether or not the slide is real, it speaks to current practices such as the use of artificial intelligence in recruitment. Responsibility for the machine’s mistakes lies with the human being. The idea can be brought down to the individual level: a computer can never be held accountable, so a computer must never make a decision that belongs to you. However, according to the dissertation, many people prefer a reality in which platform and search engine algorithms make decisions for us, because it is easier. The challenge is the opacity of their operation: users should know which decisions the machine is making in order to assess who is responsible for them. Platform awareness and literacies are media education for the individual, but systemic solutions are also needed alongside them.

Senior Specialist Outi Laiti

This article is based on Guna Spurava’s doctoral dissertation.

Sources:

Anderson, J., & Rainie, L. (2023). The future of human agency. Pew Research Center.

Beckett, C., & Livingstone, S. (2018). Tackling the information crisis: A policy framework for media system resilience. The report of the LSE Commission on Truth, Trust and Technology.

Hobbs, R. (2020). Propaganda in an age of algorithmic personalization: Expanding literacy research and practice. Reading Research Quarterly, 55(3), 521–533.

Nichols, T. P., & LeBlanc, R. J. (2020). Beyond apps: Digital literacies in a platform society. The Reading Teacher, 74(1), 103–109.

Spurava, G. (2024). Towards Platform Awareness in Media Education. Doctoral dissertation, Tampere University.

Wuyckens, G., Landry, N., & Fastrez, P. (2022). Untangling media literacy, information literacy, and digital literacy: A systematic meta-review of core concepts in media education. Journal of Media Literacy Education, 14(1), 168-182. https://doi.org/10.23860/JMLE-2022-14-1-12