AI TRAINED TO IDENTIFY DEPRESSION ON SOCIAL NETWORKS

A recent study published in the journal Proceedings of the National Academy of Sciences has shown that artificial intelligence (AI) can identify signs of depression in users' posts on social networks. However, the study also revealed a concerning bias: the AI model was markedly less effective at detecting depression in black Americans than in white Americans.

Researchers found that the AI model was more than three times worse at identifying depression among black Facebook users than among white users. The finding underscores the importance of accounting for race when assessing mental health through language analysis.

Prior research has linked frequent use of first-person pronouns and certain categories of self-abasing words to a higher risk of depression. To explore this further, the new study used a specialized AI tool to analyze social media posts from 868 volunteers, half of them white and half black, matched on age and gender.
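To make the idea of such language features concrete, here is a minimal sketch of how per-post rates of first-person pronouns and self-abasing words could be computed. The word lists, function name, and example text are invented for illustration; they are not the study's actual lexicon or model.

```python
# Illustrative sketch only: a toy version of the kind of language-feature
# extraction described above. Word lists are hypothetical, not the study's.
import re
from collections import Counter

FIRST_PERSON = {"i", "me", "my", "mine", "myself"}                 # "I-talk" markers
SELF_ABASEMENT = {"worthless", "useless", "failure", "hopeless"}   # toy lexicon

def language_features(post: str) -> dict:
    """Return simple per-post rates of first-person and self-abasing words."""
    tokens = re.findall(r"[a-z']+", post.lower())
    counts = Counter(tokens)
    total = max(len(tokens), 1)
    return {
        "first_person_rate": sum(counts[w] for w in FIRST_PERSON) / total,
        "self_abasement_rate": sum(counts[w] for w in SELF_ABASEMENT) / total,
    }

if __name__ == "__main__":
    example = "I feel like I am a failure and nothing I do matters."
    print(language_features(example))
```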

All participants also completed a validated questionnaire commonly used by healthcare professionals to screen for depression. Notably, the association between depression and language features such as first-person language ("I-talk"), self-criticism, and expressions of isolation was observed only in white participants.
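One way such a group-dependent association can be checked is to correlate screening scores with a language feature separately within each demographic group, as in the sketch below. All numbers, group labels, and thresholds are made up for illustration and are not data or code from the study; the `statistics.correlation` helper requires Python 3.10+.

```python
# Toy sketch of a group-wise association check: correlate a depression-screening
# score with a language feature within each group. Data are hypothetical.
from statistics import correlation  # Python 3.10+

# (group, screening_score, first_person_rate) -- invented records
records = [
    ("group_a", 4, 0.05), ("group_a", 9, 0.08), ("group_a", 15, 0.12), ("group_a", 20, 0.15),
    ("group_b", 4, 0.10), ("group_b", 9, 0.06), ("group_b", 15, 0.11), ("group_b", 20, 0.07),
]

for group in ("group_a", "group_b"):
    scores = [s for g, s, _ in records if g == group]
    rates = [r for g, _, r in records if g == group]
    r = correlation(scores, rates)
    print(f"{group}: Pearson r between score and first-person rate = {r:.2f}")
```

In this toy output, one group shows a strong positive correlation and the other shows almost none, mirroring the kind of divergence the researchers report between white and black participants.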

Study co-author Sharath Chandra Guntuku said he was surprised by the findings, noting that language associations previously identified for depression do not apply universally. While social media data cannot provide a definitive diagnosis, it can help assess the risk of depression in individuals or communities.

An earlier study by Guntuku's team, which analyzed the language of social media posts during the Covid-19 pandemic, underscored the value of such data for assessing mental health at a societal level. Brenda Curtis of the National Institute on Drug Abuse, who also took part in the study, noted that identifying depression indicators on social media can help predict treatment dropout and relapse in people with substance use disorders.
