The Role of Beliefs and Behavior on Facebook: A Semiotic Approach to Algorithms, Fake News, and Transmedia Journalism

From Digital Culture & Society


Borges, P. M., & Gambarato, R. R. (2019). The Role of Beliefs and Behavior on Facebook: A Semiotic Approach to Algorithms, Fake News, and Transmedia Journalism. International Journal of Communication, 13, 603-.

https://ocul-bu.primo.exlibrisgroup.com/permalink/01OCUL_BU/p5aakr/cdi_swepub_primary_oai_DiVA_org_hj_43225

Context

This article takes a Peircean semiotic perspective to explore two key questions: how the algorithms Facebook uses to drive audience engagement influence the dissemination of fake news in transmedia journalism, and how individuals' methods of fixing beliefs contribute to this process. The qualitative analysis, grounded in Peircean semiotics, focuses on concepts such as truth, reality, representation, the fixation of belief, and collateral experience, in order to show the intricate relationship between algorithms, fake news, and transmedia journalism. The findings point to a shared responsibility for the proliferation of fake news: accountability lies not only with social media networks such as Facebook but also with audiences, whose behaviors and beliefs significantly shape and reinforce the algorithms at play.

Overview

The purpose of this article is to show how algorithms are used to push narratives across social networks and other communication channels. Drawing on the semiotic approach, the article explains what it means to look for "signs." The research findings suggest that individuals tend to believe what aligns with their existing beliefs, regardless of whether it corresponds with experience. This inclination, akin to confirmation bias, contributes to the formation of filter bubbles and echo chambers on social media platforms such as Facebook. Facebook's algorithms, acting as filters, prioritize information that matches users' existing beliefs, creating an environment in which conflicting opinions are often hidden. This selective exposure reinforces users' beliefs, providing a sense of agreement and satisfaction.

Peirce's account of the fixation of belief introduces four methods: the method of tenacity, the method of authority, the a priori method, and the scientific method. The method of tenacity involves unwavering adherence to established beliefs, shielding individuals from doubt but lacking any commitment to truth. The method of authority relies on institutional beliefs, restricting individuals' freedom to think independently. The a priori method is frequently observed and often leads to confirmation bias. In contrast, the scientific method requires shared beliefs grounded in external events, emphasizing observation and reasoning.
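The filtering dynamic described above can be illustrated with a minimal sketch. This is a hypothetical toy model, not Facebook's actual ranking system: it simply scores posts by their overlap with a user's past interests, showing how belief-aligned content rises to the top of a feed while conflicting content sinks. All names here (`rank_feed`, the `posts` data) are invented for illustration.

```python
# Toy model (hypothetical): rank a feed so that posts matching a user's
# existing interests appear first, mimicking the filter-bubble effect.

def rank_feed(posts, user_interests):
    """Order posts by how many topics they share with the user's interests."""
    def score(post):
        # Count topic overlaps with what the user already engages with.
        return sum(1 for topic in post["topics"] if topic in user_interests)
    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": 1, "topics": {"politics", "economy"}},
    {"id": 2, "topics": {"sports"}},
    {"id": 3, "topics": {"politics", "media"}},
]
user_interests = {"politics", "media"}

ranked = rank_feed(posts, user_interests)
print([p["id"] for p in ranked])  # belief-aligned posts surface first
```

Because the score depends only on past engagement, content the user disagrees with (here, the sports post) is systematically demoted, which is the selective-exposure loop the article attributes to both the platform and its audience.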

Strengths and Weaknesses

Articles such as this one offer an eye-opening look at the impact of seemingly small technologies like algorithms. The article provides detailed examples of how algorithms work and operate in daily life, and of how fake news and journalism can affect what happens inside these algorithms, leading them to surface false or misleading news. One weakness is that the article draws on little recent data to support its claims: many of the supporting stories cited are several years old. Given how quickly society and algorithmic technology change, this limits the currency and accuracy of the evidence.

Assessment

This article demonstrates the impacts of algorithms and how they are used. Perspectives such as the semiotic approach help explain other techniques prominent in today's technology. Much of our online behaviour is also shaped by what algorithms promote: if many people interact with fake or misleading information, that content will be pushed to more individuals across the platform.

-zf19kb 16:11, 3 December 2023
