POLITICAL MISINFORMATION AND DIGITAL COMMUNITIES: AN ETHNOGRAPHIC STUDY ON THE DISCORD PLATFORM

Berru Perdahcı
ASSOC. PROF. SUNCEM KOÇER

Executive Summary

In the current era of the information society, the dissemination and exchange of knowledge within networked communities is in a constant state of evolution. In the modern media landscape, where information exchange has taken on a more versatile dimension, every platform provides distinct communication prospects and environments owing to its technical features. However, the fast pace of information flow, now a fundamental aspect of users’ daily media consumption practices, also poses the challenge of rapidly spreading false information. This report adopts an ethnographic approach to investigate the problem of false political information within virtual communities, taking the community’s components into account.

 

Over a six-month period, we used the method of digital ethnography (Prior & Miller, 2012, p. 503) to study a community focused on political discourse and communication. This community was created on the Discord platform, which offers voice, text, and video communication features and was formed to facilitate the sharing of political news and debates. This report examines the factors that may affect the dissemination of news within a community.

 

Our observations indicate that social interaction, anonymity, troll culture, polarization, and control measures within virtual communities may influence the spread of false political information. Furthermore, this study offers insights into how community members make use of media and into their attitudes and practices when confronted with false political information, illuminating emerging user practices. We examine the intersection of relationships within the community and the features of the platform where political falsehoods circulate, underscoring the importance of media practices in the dissemination of false information. Within this context, we offer insights into communities that gather for political discussion and communication, and we highlight the production of false information and the potential risks linked to its circulation.

 

Our research has identified several factors that affect the spread of false information in virtual communities: 

 

Sociality

The size of the community and the structure of relationships among its members were found to affect the dissemination and visibility of information. For instance, news and updates from individuals occupying higher positions in the community, such as its founder, were more likely to gain traction, whereas messages from members of lower standing tended to be overlooked. Hence, the circulation of information was significantly shaped by the identity of its source and the social status of the individual transmitting it. Furthermore, we noted that positions within the social hierarchy also affected how the accuracy of information was verified.

 

Gatekeepers

We noted the existence of gatekeepers: those responsible for enforcing and maintaining the community’s established rules, or those who feel a personal obligation to do so. These gatekeepers, who may include ordinary members, the management team, or artificial intelligence tools, not only ensure members’ compliance with the rules but also exert control over all information exchange in the community. Information from any source within the community underwent a filtering process, and individuals who communicated information or messages contrary to the regulations received specific sanctions. We also noted that gatekeeping and control mechanisms within the community played a role in the conflicts examined in the “Conflicts” section.

 

Conflicts

Given that the community centred around political discourse and communication, conflicts were bound to arise. Our observations revealed instances of conflict among members pertaining not only to political ideologies but also to gender and religious beliefs. These conflicts, which are integral to the sustainability of the community, were also found to play a role in the process of information sharing and verification. For example, members tended to scrutinize news shared by a member whose ideology differed significantly from their own, which resulted in polarized filtering of false political information.

 

Anonymity and Troll Culture

The majority of community members hid their personal details. Nonetheless, identities could be inferred from personal profiles, communication patterns, and opinions shared. Our research reveals that anonymity and troll culture in the community contributed to the spread of falsehoods in two distinct ways: on the one hand, members operated with a sense of security afforded by their anonymity; on the other hand, they intentionally adopted trolling as an approach enabled by that anonymity. Our observations led us to conclude that suspicions of trolling, especially in the context of political debates and information sharing, were interwoven with the procedures of information verification and resistance against false information.

 

Practices

We noted that the community had several resilience mechanisms in place to counter false information. For instance, in group chats, members usually asked the person who shared the news for links, which enabled them to access the source; even when the information was presented in written form, members still referred to the links. Additionally, if inaccurate information was transmitted, other members who recognized the inaccuracies immediately corrected the message or alerted other members to the false information. During times of crisis, specifically election periods, the community’s leadership implemented various measures to combat the potential spread of false information. In-depth interviews with participants illuminated the process of filtering false information from the standpoint of media consumption; two participants, for example, linked their trust in news sources to those sources’ impartiality.

 

In this study, our objective was to conduct a thorough examination of a community created for political discussions, communication, and chatting, with a particular focus on the spread of false political information. For this purpose, we utilised the technique of digital ethnography (Prior & Miller, 2012, p. 503), which facilitates the investigation of online conversations and outputs and is commonly applied to online communities. Digital ethnography adapts standard ethnographic techniques to explore interactions and behaviors within the digital realm: researchers conduct participant observation and interviews while immersing themselves in online communities, gaining valuable insights into social dynamics, cultural norms, and online interactions.

 

In this study, we employed the digital ethnography technique to investigate the dissemination of false political information from the point of view of a digital community. For the ethnographic observation, we opted for a political community on the Discord platform, which serves as a space for political content, exchange of views, and communication. In the course of selecting the server, we began by browsing servers with a political leaning in Turkey through the disboard.org website, which facilitates the discovery of servers on the Discord platform. We noted that a large proportion of the 424 political-themed servers in Turkey had been established to align with a single political party or ideology. As a result, we selected a server with 3,640 members, anonymised for the purposes of this report as “Political Agenda.” The server invitation message stated, “Our server is a community that welcomes members with diverse perspectives and we encourage respectful and peaceful discussions. 
The server administration has taken steps to ensure complete ideological neutrality.” Between May 1, 2023, and September 17, 2023, we undertook the ethnographic observation of this server, examining topics such as news dissemination, ideologies, social interactions, attitudes and behavior towards false news, power dynamics, and conflicts. We also gave particular attention to the spread of false political information during the electoral period. To address the issue from the users’ perspective, we conducted in-depth interviews with four community members, focusing on the identification of false information and reliable sources.