Using the algorithm to obtain information or profit?

Modern social networks have come a long way over the past two decades and have fundamentally changed both us and the world we live in. What began in 2004 as Facebook's idea of connecting Harvard students has grown into a platform for third-party interference in other countries' electoral processes. After the initial momentum and success in democratizing information, the dark side of these platforms came to light. Disinformation campaigns and hate speech found refuge on Facebook, with direct and tangible implications for democratic processes as well as for citizens' mental health. Especially during periods of significant socio-political events at the global, regional, and local level, the networks have become a popular place for extreme groups to gather and organize. Ever since the Cambridge Analytica scandal in 2018, the collection and trade of user data has cast a shadow of doubt over the sincerity and sustainability of the story of free expression and connecting people. A legitimate question therefore arises: is the coexistence of liberal democracy and such social networks possible?

Transparency is one of the central themes in any discussion of the platforms, tied to the phenomenon that social psychologist Shoshana Zuboff calls surveillance capitalism. On social networks, users pay for free services by unwittingly handing over their data to unknown third parties, who then expose them to ads targeting their preferences and personal characteristics, often at the expense of the users' own interests. As the influence of social networks grew, so did their manipulative and harmful effects.

Companies such as Facebook, Instagram, Twitter, YouTube, and TikTok rely heavily on artificial intelligence, that is, on algorithms that rank and recommend content. These algorithms reward not veracity but virality (rapid and wide circulation on the Internet), and popularity does not necessarily imply quality of information. The root of the problem is that the algorithms are designed to maximize advertising revenue. They are owned by private corporations that exploit data on people's behavior and manipulate their attention in order to increase profits.

The use of algorithms in delivering media and other services through online social platforms has direct implications for the functioning of democratic processes. A healthy democracy is one in which citizens participate and make free political decisions based on accurate information, verified facts, and reliable evidence. Their role as mediators between users and media makes social networks de facto providers of media content. Given how often unprofessional, untrue, and dangerous content appears on social networks, users are not merely exposed to online disinformation but especially vulnerable to it.

Social networks have developed algorithms for selecting news and directing it to the groups and individuals the algorithms estimate will find it interesting. The number of clicks has become an important measure of success on networks and portals, one often achieved with content that disinforms rather than informs, says the rector of VERN University, prof. Dr. Vlatko Cvrtila.

He also points out that social networks, as new ways of producing and disseminating news, have enabled various actors to manage public opinion in certain countries, confusing the public and mobilizing it against democratic structures and the development goals of certain societies.

Social networks have enabled the creation and execution of various hybrid influence operations aimed at destabilizing contemporary societies and steering their fate. That must become one of the key issues of national security if we want to preserve our independence and sovereignty, Cvrtila concludes.

Behind the scenes, Facebook programmed the algorithm that decides what people see in their news feeds to use emoji reactions as signals promoting more provocative content. Thanks to revelations by whistleblower Frances Haugen, we now know that Facebook's feed-ranking algorithm favored content that provokes anger, making it five times more visible than content that provokes happiness. The theory was simple: posts with many wow, angry, sad, and haha reactions kept users more engaged and present on the platform, and user retention was the key to Facebook's business. Favoring controversial and polarizing posts, however, opened the door to more spam that violated Facebook's own terms of service. Internal company documents from 2019, later leaked to the public, confirmed that posts attracting these emoji reactions more often contained disinformation, harmful news, and content of questionable and low quality. Changing the algorithm, despite its public importance, proved impossible, since in the final calculation it would mean less use, fewer clicks on ads, and therefore lower earnings for the company.
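The reaction-weighted ranking described above can be sketched in a few lines. This is a minimal illustration, not Facebook's actual code: the five-to-one weighting of emoji reactions over a plain like follows the leaked reporting, while all function names, sample posts, and other numbers are hypothetical assumptions.

```python
# Toy sketch of an engagement-weighted feed ranker.
# Assumption: emoji reactions count 5x a plain like, per the
# leaked reporting; everything else here is illustrative.

REACTION_WEIGHTS = {
    "like": 1,
    "love": 5,
    "haha": 5,
    "wow": 5,
    "sad": 5,
    "angry": 5,
}

def engagement_score(reactions: dict) -> int:
    """Sum reaction counts multiplied by their weights."""
    return sum(REACTION_WEIGHTS.get(r, 1) * n for r, n in reactions.items())

def rank_feed(posts: list) -> list:
    """Order posts by weighted engagement, highest first."""
    return sorted(posts, key=lambda p: engagement_score(p["reactions"]), reverse=True)

posts = [
    {"id": "calm", "reactions": {"like": 100}},          # score 100
    {"id": "outrage", "reactions": {"angry": 40, "haha": 10}},  # score 250
]
ranked = rank_feed(posts)
# A post with 50 emoji reactions outranks one with 100 likes.
```

Under this kind of weighting, a smaller volume of provocative reactions beats a larger volume of mild approval, which is exactly the dynamic the leaked documents described.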

The personalization of content is also problematic from a transparency standpoint: combined with profiling and micro-targeting of users, it contributes to the creation of so-called filter bubbles, in which people are exposed to an excessive amount of news and views aligned with their existing beliefs. The result is that users become hermetically sealed into a circle of information personalized to their interests and beliefs. Exposure to alternative viewpoints is thereby limited, creating so-called echo chambers.
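The filter-bubble mechanism can be illustrated with a toy recommender that only ever serves content matching topics the user already clicked on. All data, names, and the scoring rule are hypothetical; real systems are far more elaborate, but the feedback loop is the same.

```python
# Toy filter bubble: recommendations are scored purely by how often
# the user previously engaged with the item's topic, so past clicks
# crowd out everything else. All data here is made up.
from collections import Counter

def update_profile(profile: Counter, clicked_topic: str) -> None:
    """Record one click on a topic in the user's interest profile."""
    profile[clicked_topic] += 1

def recommend(profile: Counter, inventory: list, k: int = 3) -> list:
    """Return the k items whose topics the user clicked most often."""
    return sorted(inventory, key=lambda item: profile[item["topic"]], reverse=True)[:k]

inventory = [
    {"title": "Rally coverage", "topic": "politics"},
    {"title": "New vaccine study", "topic": "science"},
    {"title": "Election polls", "topic": "politics"},
    {"title": "Art exhibit opens", "topic": "culture"},
]

profile = Counter()
for _ in range(5):          # the user clicks five political stories
    update_profile(profile, "politics")

top = recommend(profile, inventory, k=2)
# Both slots go to "politics"; science and culture never surface.
```

Each recommendation the user clicks reinforces the profile, so the bubble tightens with use rather than correcting itself.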

Users are often unaware that the information posted at the top of the news feed is suggested by algorithms. Few users understand that algorithms will present them with information filtered to provoke anger or resentment, emotions that go hand in hand with disinformation, conspiracy theories, and hate speech.

The analysis Social Networks and Journalism in Montenegro (Društvene mreže i novinarstvo u Crnoj Gori), conducted by the Media Council for Self-Regulation (MSS) with the support of UNESCO and the EU, showed that neither journalists nor editors were clear about the importance and role of algorithms. Of the 20 interlocutors interviewed, most did not understand how the networks operate, nor could they cite any guidelines or instructions on how to use them.

Achieving more transparent and less manipulative social networks is one of the key goals of the 21st century. This is confirmed by the recently adopted Digital Services Act and the strengthened Code of Practice on Disinformation under the auspices of the European Union. These documents commit the networks to faster removal of illegal content, transparency of political advertising, explaining to users and researchers how their algorithms work, taking stricter measures against the spread of disinformation, and curbing content that negatively affects fundamental human rights. Companies face fines of up to six percent of annual turnover for non-compliance.

Apart from legal solutions, the vulnerability of users (who use social networks as sources of news and information and who are potential targets of disinformation campaigns) can also be mitigated by improving media and digital literacy skills, which users need in order to critically understand the information they encounter and interact with on the Internet.

Media literacy helps us understand how, for example, algorithms track user habits and offer tailored content accordingly. We live in an environment saturated with information, whose distribution is simple and practically free, points out Olivera Nikolić, director of the Media Institute of Montenegro.

She says that relevant research indicates that today's users mostly access information and news via various digital platforms and through mobile phones. She therefore believes it is important to understand how new technologies and new media work, especially social networks, so that users know how and in what way information reaches them. As she states, the question is how aware media users are today that the content reaching them is matched to their own interests and habits, content that confirms their views and leads them toward bias.

In this sense, media literacy, which includes a set of skills that help users access, select, critically read, share, and create information, as well as communicate through new media, plays a major role, she says.

The conclusion that emerges from the above is that social networks are built on a fundamental, perhaps irreconcilable, contradiction: a mission to improve society by connecting people and democratizing information, pursued while simultaneously profiting from it. This is their conflict of interest, and at the same time our ugly truth.