Friday, September 9, 2022

Psychology Experts Advise Social Media Giants To Boost Transparency Around Algorithms To Preserve Users’ Mental Health

In a new report published in the journal Body Image, a team of psychology experts presented a mountain of evidence linking social media use to body image difficulties. The researchers detailed how algorithms may be strengthening this link and encouraged social media companies to take action.

Appearance-based social media platforms like TikTok appear to be particularly detrimental to users’ body image. On these platforms, teenagers are regularly exposed to filtered and manipulated content that promotes unrealistic body ideals. According to recent studies, this misleading environment increases users’ risk of body dissatisfaction and of serious conditions like body dysmorphia and eating disorders.

"I am interested in risk and protective factors of body image, and some of my more recent research has focused on the effect of social media," noted lead author Jennifer A. Harriger, a professor of psychology at Pepperdine University. "I became interested in the use of algorithms by social media companies and the exposés by whistleblowers showing that companies were aware of the harm that their platforms were inflicting on young users. This paper was meant as a call to arms for social media corporations, researchers, influencers, parents, educators, and clinicians. We need to do a better job protecting our youth."

In their paper, Harriger and her team explain that these effects may be worsened by social media algorithms that tailor the content shown to each user. These algorithms can "rabbit hole" users into content that is progressively more extreme, less moderated, and designed to keep them on the platform.

Importantly, the harm caused by these algorithms is not unknown to social media companies, as recent whistleblower statements have revealed. Former Facebook employee Frances Haugen leaked documents suggesting that the company was aware of research tying its products to mental health and body image problems among young people. A TikTok whistleblower later disclosed evidence of an algorithm that carefully manipulates the content shown to users, prioritizing emotionally distressing material in order to sustain their engagement.

One such example, Harriger told PsyPost, is companies' use of algorithms that are designed to keep the user engaged for longer periods of time.

"Social media companies are aware of the harm caused by their platforms and their use of algorithms but have not taken measures to protect users. Until these companies become more transparent about the use of their algorithms and provide ways for users to opt out of content they do not wish to view, users are at risk. One way to limit that risk is to follow only accounts that have beneficial effects on mental and physical health and to block anything that is triggering or negative."

In their paper, Harriger and colleagues describe strategies for combating these algorithms and protecting the mental health of social media users. First, they emphasize that the fundamental responsibility lies with the social media companies themselves. The authors echo recommendations from the Academy for Eating Disorders (AED), noting that social media companies should increase the transparency of their algorithms, take steps to remove accounts promoting eating disorder content, and make their research data more accessible to the public.
The researchers say that social media platforms should disclose to users why the content in their feeds was chosen. Platforms should also minimize microtargeting, a marketing tactic that targets specific consumers based on their personal data. Further, these companies bear social responsibility for the well-being of their users and should take initiatives to raise awareness of weight stigma. This could be done by consulting body image and eating disorder professionals on how to foster a positive body image among users, perhaps through the promotion of body-positive content on the platform.

Next, influencers can also play a role in shaping their followers’ body image and well-being. Harriger and her colleagues recommend that influencers consult body image experts for guidance on body-positive messaging. Positive actions could include educating their audience about social media algorithms and urging them to counteract the algorithms' negative effects by following and engaging with body-positive content.

Researchers, educators, and clinicians can study approaches to prevent the detrimental impact of social media on body image. "It is difficult to empirically examine the influence of algorithms because every user’s experience is specifically geared towards their interests (e.g., what they’ve clicked on or viewed in the past)," Harriger remarked. Research can, however, examine the use of media literacy programs that highlight the impact of algorithms and provide young users with strategies to preserve their well-being while on social media.

Such studies can help inform social media literacy programs that teach teenagers about advertising on social media, encourage them to think critically while engaging with social media, and teach them techniques to increase the positive content appearing in their feeds.

Parents can teach their children positive social media habits by modeling healthy behavior with their own electronic devices and by setting rules and boundaries around their children's social media use. They can also hold discussions with their children about issues like image alteration on social media and algorithms.

Overall, the researchers argue that social media companies bear the ultimate responsibility for defending the well-being of their users. "We stress that system-level change needs to occur so that individual users may effectively do their part in preserving their own body image and well-being," the researchers conclude. "Social media companies need to be transparent about how content is delivered if algorithms continue to be used, and they need to provide users with clear options to easily opt out of content that they do not wish to see."

The paper, "The dangers of the rabbit hole: Reflections on social media as a portal into a distorted world of edited bodies and eating disorder risk and the role of algorithms," was authored by Jennifer A. Harriger, Joshua A. Evans, J. Kevin Thompson, and Tracy L. Tylka.
