Canada is exploring using AI to help prevent suicide

Image Credit: Pan Xunbin / Shutterstock

Suicide is the second most common cause of death in people between the ages of 10 and 19 in Canada. Despite the country’s preventative efforts, the number of suicides continues to grow year after year. Existing efforts include increased funding for suicide research, new mental wellness educational programs, and human-assisted monitoring of national suicide statistics. Though these efforts provide an important foundation for preventing suicide in Canada, it’s clear additional tactics are needed to save more lives. This is where the predictive and scalable capabilities of AI could offer assistance.

Canadian officials are in talks with Ottawa-based Advanced Symbolics to develop a program that will leverage the power of social media to forecast geographic spikes in suicidal behavior. According to reports, the terms of the agreement will include a three-month pilot program. During the pilot, researchers will analyze 160,000 social media accounts to identify trends that could indicate a pending rise in suicide-related deaths in communities across Canada. When the AI predicts a potential rise in suicides in a given area, officials will cue government health programs to take action. The Canadian government is set to finalize the contract for the pilot next month. After the three-month test, officials will determine whether future work is necessary.
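
The pilot’s public description implies a population-level pipeline: score public posts for suicide-related content, aggregate those scores by region and week, and flag regions whose recent trend rises well above their own baseline. The Python sketch below illustrates what that kind of regional alerting step could look like; the keyword scorer, thresholds, and data format are assumptions made for the example and are not drawn from Advanced Symbolics’ actual methodology.

```python
from collections import defaultdict
from statistics import mean, stdev

# Hypothetical keyword scorer -- a stand-in for whatever classifier the
# real system would use to decide whether a public post is suicide-related.
RISK_TERMS = {"hopeless", "can't go on", "end it all"}

def is_risk_related(post_text: str) -> bool:
    text = post_text.lower()
    return any(term in text for term in RISK_TERMS)

def weekly_risk_rates(posts):
    """posts: iterable of (region, week_index, text) tuples from public accounts.
    Returns {region: [fraction of flagged posts per week, in week order]}."""
    counts = defaultdict(lambda: defaultdict(lambda: [0, 0]))  # region -> week -> [flagged, total]
    for region, week, text in posts:
        bucket = counts[region][week]
        bucket[0] += int(is_risk_related(text))
        bucket[1] += 1
    return {
        region: [flagged / total for _, (flagged, total) in sorted(weeks.items())]
        for region, weeks in counts.items()
    }

def regions_to_alert(rates_by_region, z_threshold=2.0):
    """Flag regions whose latest weekly rate sits well above their own history."""
    alerts = []
    for region, rates in rates_by_region.items():
        if len(rates) < 4:
            continue  # too little history to estimate a baseline
        baseline, latest = rates[:-1], rates[-1]
        spread = stdev(baseline) or 1e-9  # avoid dividing by zero on a flat baseline
        if (latest - mean(baseline)) / spread > z_threshold:
            alerts.append(region)
    return alerts

# Toy usage: in practice, posts would come from the sampled public accounts.
sample_posts = [
    ("RegionA", w, "feeling hopeless tonight" if w == 3 else "nice weather")
    for w in range(4) for _ in range(50)
]
print(regions_to_alert(weekly_risk_rates(sample_posts)))  # ['RegionA']
```

A real system would rely on a trained classifier and far richer signals than keywords, but the aggregate-then-threshold shape of the problem would look broadly similar.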

A rep from the Public Health Agency of Canada (PHAC) said, “To help prevent suicide, develop effective prevention programs and recognize ways to intervene earlier, we must first understand the various patterns and characteristics of suicide-related behaviours. PHAC is exploring ways to pilot a new approach to assist in identifying patterns, based on online data, associated with users who discuss suicide-related behaviours.”

A promising solution

Advanced Symbolics uses AI to identify market research trends. The company’s CEO, Erin Kelly, says, “We’re the only research firm in the world that was able to accurately predict Brexit, the Hillary and Trump election, and the Canadian election of 2015.” The hope is that the same methods the company applies to tracking and analyzing public sentiment for political and commercial purposes will help reduce suicide rates.


If the program is successful, the AI could help Canadian health organizations determine where suicide spikes will happen next and deploy preventative measures months in advance.

Kenton White, chief scientist with Advanced Symbolics, said, “What we would like to try and understand is what are the signals … that would allow us to forecast where the next hot spots are so that we can help the government of Canada provide the resources that are going to be needed to help prevent suicide before the tragedies happen.”

Potential ethical implications

The idea is similar to Facebook’s AI that monitors users’ Messenger interactions, public posts, and livestreams to identify individuals at risk for suicide. One key difference between Advanced Symbolics’ proposed approach and Facebook’s system is privacy. While Facebook’s AI analyzes both private conversations and public content, this solution would stick to monitoring public posts to gauge overall sentiment in specific communities and national regions. As a government-backed program, Advanced Symbolics’ solution must respect the boundary between public and private content, but concerns about an AI Big Brother watching Canadians on social media still loom over the project.

In response to these potential concerns, Kelly says, “We’re not violating anybody’s privacy — it’s all public posts. We create representative samples of populations on social media, and we observe their behaviour without disturbing it.”

While limiting the program to public content will hopefully ease privacy concerns among Canadians, it will still be important for government entities to maintain transparency throughout the pilot. Few things are more off-putting than the idea of a government program monitoring social behavior, but researchers and officials hope Canadians will keep the initiative’s life-saving goal in mind as it plays out.

What’s ahead

Advanced Symbolics is currently determining what suicidal behavior looks like on social media and how AI can detect those red flags. The firm is expected to begin monitoring behavior on select social media accounts next month.

The following resources are available for residents of the U.S. and Canada who are experiencing suicidal thoughts: the Canadian Association for Suicide Prevention and the National Suicide Prevention Lifeline.