Newswise — Researchers from the Australian Institute for Suicide Research and Prevention (AISRAP) have found that frequent YouTube users can experience elevated levels of loneliness, anxiety, and depression.

Dr Luke Balcombe and Emeritus Professor Diego De Leo, from Griffith University's School of Applied Psychology and AISRAP, set out to investigate the favorable and unfavorable effects on mental well-being of YouTube, the world's most widely used streaming platform.

Their research revealed that those most affected by adverse effects were people under 29 and those who frequently watched content about other people's lives.

Dr Luke Balcombe, the study's lead author, stated that the formation of parasocial relationships between content creators and their audiences could be worrying, although some neutral or positive connections between creators and their followers were also observed.

Dr Balcombe further explained, "For some individuals, these virtual 'relationships' can compensate for a lack of in-person social interactions, particularly for those who struggle with social anxiety. Nevertheless, it can amplify their problems when they fail to engage in face-to-face communication, which is especially crucial during developmental years."

Dr Balcombe suggests, "To combat loneliness and foster favorable mental health, we advise people to restrict their time spent on YouTube and explore alternative modes of social interaction."

Dr Balcombe also noted that parents are often worried about how much time their children spend on YouTube, and find it challenging to supervise their kids' use of the platform for educational or other legitimate purposes.

In the study, watching YouTube videos for more than two hours per day was categorized as high-frequency use, while viewing videos for over five hours a day was considered saturated use.

The research also concluded that more must be done to prevent suicide-related content from being recommended to users through algorithm-generated suggestions.

Although it would be ideal if individuals were unable to search for such topics and encounter suicide-related methods, the YouTube algorithm often suggests content based on users' previous searches, leading them down a distressing "rabbit hole."

When a video is flagged as potentially containing self-harm or suicide-related topics, YouTube displays a warning message and asks the user if they wish to play the video.

Dr Balcombe suggested that artificial intelligence could be utilized for monitoring and intervention in high-frequency YouTube users who are vulnerable, particularly children and adolescents.

"We have examined the concerns related to human-computer interaction and suggested a concept for a recommendation system that operates independently of YouTube. The system will guide users toward verified positive mental health content or campaigns," Dr. Balcombe explained.
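The study does not specify how such an independent recommendation system would be built, but the idea can be illustrated with a minimal sketch: intercept a candidate feed, drop items tagged with flagged topics, and substitute verified positive mental-health content. The tag names, resource list, and function names below are invented for illustration and are not taken from the study.

```python
# Hypothetical sketch of an off-platform recommendation filter of the kind
# described above. All names and tags here are illustrative assumptions.

# Clinician-verified positive mental-health content (placeholder entries).
VERIFIED_RESOURCES = [
    {"title": "Managing anxiety: clinician-reviewed guide", "verified": True},
    {"title": "Crisis support and helpline playlist", "verified": True},
]

# Topic tags that should never be recommended (illustrative labels).
FLAGGED_TOPICS = {"suicide_method", "self_harm"}

def filter_recommendations(candidates):
    """Drop candidates tagged with flagged topics; if anything was removed,
    append verified positive mental-health content in its place."""
    safe = [v for v in candidates
            if FLAGGED_TOPICS.isdisjoint(v.get("topics", set()))]
    if len(safe) < len(candidates):
        safe.extend(VERIFIED_RESOURCES)
    return safe

if __name__ == "__main__":
    feed = [
        {"title": "Cooking pasta at home", "topics": {"food"}},
        {"title": "Harmful video", "topics": {"suicide_method"}},
    ]
    for item in filter_recommendations(feed):
        print(item["title"])
```

A production system would of course need reliable topic classification and expert-curated resource lists; the sketch only shows the redirect step itself.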

Dr. Balcombe emphasized that YouTube is being increasingly used for mental health-related purposes, such as seeking or sharing information. Furthermore, many digital mental health interventions are being tested with varying levels of effectiveness. However, with over 10,000 mental health apps currently accessible, it can be challenging to determine which ones to use or even which ones to recommend from a practitioner's perspective.

Dr. Balcombe stated that there is a need for verified mental health or suicide-related tools that combine AI-based machine learning, risk modeling, and qualified human decisions. By collaborating with mental health and suicide experts to authenticate information from AI, digital mental health interventions could be a promising solution to address the increasing unmet mental health needs.
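The combination of machine-learned risk scores with qualified human decisions can be sketched as a simple human-in-the-loop triage step. The feature names, weights, and threshold below are invented for illustration; the study proposes the principle, not this model.

```python
# Hypothetical triage sketch for the "AI plus qualified human" idea above.
# Weights and thresholds are illustrative assumptions, not study values.

def risk_score(hours_per_day, watches_flagged_content):
    """Toy risk model: weight daily viewing hours (the study treats 5+ hours
    as saturated use) together with exposure to flagged content."""
    score = min(hours_per_day / 5.0, 1.0) * 0.6
    if watches_flagged_content:
        score += 0.4
    return score

def triage(score, threshold=0.7):
    """The model only flags cases for review; a qualified human
    makes the final decision, as the study recommends."""
    return "refer_to_human_reviewer" if score >= threshold else "no_action"
```

The key design point is that the automated score never triggers an intervention by itself; it only routes high-risk cases to human experts.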

The full study, ‘The Impact of YouTube on Loneliness and Mental Health’, can be accessed online at https://www.mdpi.com/2227-9709/10/2/39.

Journal Link: Informatics