An Ottawa-based market research company has been tapped by the federal government to see if it can figure out how to use artificial intelligence to anticipate suicide risk in Canadian communities.
Advanced Symbolics Inc. has been awarded a contract to work with the government and psychiatrists to define "suicide-related behaviour" on social media and then "use that classifier to conduct market research on the general population of Canada."
The company says it was one of few firms around the world to predict Brexit, the election of Donald Trump, and Justin Trudeau’s 2015 election win. Now it wants to use the same technology to help the government better deploy mental health resources to areas at risk of suicide clusters, which have plagued some First Nations communities in Canada for decades.
In Canada, 11 people die by suicide every day. Another 210 attempt suicide, according to Statistics Canada.
Chief scientist Kenton White told VICE News the company can analyze up to 160,000 social media accounts — a more representative and accurate sample than can be obtained through online surveys and telephone polls, which have struggled in recent years as response rates decline.
“We take everyone from a particular region, and we look for patterns in how they’re talking,” said White, adding that the AI would be searching for behaviour and language it’s seen before in communities during crises involving multiple suicides.
He stressed that the research will be done on a community level, and that the company is taking every precaution to ensure people’s privacy. It only uses publicly available information and would be unable to address any individual cases that the AI might detect.
Over the next few months, the company hopes that, with the help of mental health professionals, the AI will be able to identify language related to: “ideation (i.e., thoughts), behaviors (i.e., suicide attempts, self-harm, suicide) and communications (i.e., suicidal threats, plans),” according to the tender document.
Advanced Symbolics has already been training the AI in academic settings using anonymized data sets from people who have agreed to be part of a study. Over the next three months, the AI will also read news articles about past events and documents that describe self-harm to learn to detect suicide risk, White told VICE News.
White said the AI has also learned from tweets using the #BellLetsTalk hashtag, part of Bell Canada’s annual effort to raise awareness about mental health on social media.
“We downloaded a bunch of tweets with the hashtag to see if we could differentiate between people who were sharing their stories and those who were raising awareness.”
For Indigenous Canadians younger than 44, suicide and self-harm are the leading cause of death, and suicides in First Nations communities across Canada have often been described as an epidemic. The issue is due in part to a chronic lack of mental health funding and resources on reserves, according to many First Nations leaders and advocates.
Three communities have declared states of emergency in the past two years in response to suicide clusters, bringing international attention to the issue. Attawapiskat First Nation, a community of about 2,000 people, saw about 100 suicide attempts over 10 months, including 11 attempts in a single night. In the Pimicikamak Cree Nation, six people died by suicide in three months.