How AI can improve 113 Suicide Prevention's counselling services

PhD student Salim Salmi studied how to use artificial intelligence (AI) to measure the quality of chat conversations with 113 Suicide Prevention. He then used the data to train an AI model that can support 113 counsellors in their work. For example, the model makes suggestions when conversations get stuck. Currently, Salmi is investigating whether and how the tool can be implemented at 113.

Publication date
5 Dec 2024

Language is his thing, especially language processing by computers. Salim Salmi did his master's at TU Delft, where he was looking for a graduation project in the field of language processing. 'I could see that there would be a big step forward in that area,' he says.

He ended up at the 113 Suicide Prevention Foundation, which was looking for students to help develop a tool to support counsellors. 113 is a helpline for people who are thinking about suicide. It's a safe place for them to express their feelings and concerns anonymously, with the goal of preventing suicide. In 2023, the helpline received almost 185,000 calls and chats, which is 21.5% more than the previous year.

Salmi shadowed counsellors and talked with them about their work to find out where he could be of help. The idea for a tool to support them eventually led to the offer of a PhD position. On 4 December he defended his thesis at Vrije Universiteit Amsterdam.

Quality of conversations

"One of the main things we wanted to figure out was how we could measure the effectiveness of the 113 helpline", says the researcher, who conducted his research at CWI. This was an immediate challenge "because we have a lot of text data, but we can't link it to outcomes. As people contact us anonymously, we don't know how they got on after the interview. So we asked them to fill in a questionnaire before and after the interview. That list looks at the presence of characteristics that might indicate suicidal behaviour. This gives us a score. 'Do we see a change in that after the interview?'

By gathering this data over time, we can see which conversations lead to a positive outcome and which don't. Salmi used this information to train an AI model that tries to identify which parts of a conversation are helpful and which aren't. "Normally, these types of models are a bit of a mystery. As a researcher, you have no idea how they come to a particular conclusion. By breaking the chat conversations down into fairly simple sentences, I can see in hindsight which sentences, according to the model, influenced the outcome of a conversation. This isn't a definitive measure of whether someone was helped, but it does give an idea."
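
The sentence-level attribution Salmi describes can be sketched, in spirit, as follows: train a simple classifier on whole conversations labelled by outcome, then score each sentence by how much the predicted outcome changes when that sentence is left out. The toy data, the TF-IDF features and the leave-one-out trick below are assumptions for illustration, not the model used in the thesis.

```python
# A minimal sketch of sentence-level attribution (not Salmi's actual model).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical toy data: counsellor sentences per conversation, labelled 1 if
# the pre/post questionnaire score improved, 0 otherwise.
conversations = [
    (["How are you feeling right now?", "You are not alone in this.",
      "Shall we make a safety plan together?"], 1),
    (["Please hold on.", "I don't know what to say."], 0),
    (["What has helped you get through moments like this before?",
      "That sounds like a real strength."], 1),
    (["Okay.", "I see."], 0),
]

texts = [" ".join(sentences) for sentences, _ in conversations]
labels = [label for _, label in conversations]

vectorizer = TfidfVectorizer()
model = LogisticRegression().fit(vectorizer.fit_transform(texts), labels)

def sentence_influence(sentences):
    """Leave-one-out: how much does dropping each sentence change the
    predicted probability of a positive outcome?"""
    full = model.predict_proba(vectorizer.transform([" ".join(sentences)]))[0, 1]
    influences = []
    for i, sentence in enumerate(sentences):
        rest = sentences[:i] + sentences[i + 1:]
        reduced = model.predict_proba(vectorizer.transform([" ".join(rest)]))[0, 1]
        influences.append((sentence, full - reduced))
    return influences

for sentence, delta in sentence_influence(
    ["How are you feeling right now?", "Okay.", "Shall we make a safety plan together?"]
):
    print(f"{delta:+.3f}  {sentence}")
```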

Digital assistant

Salmi used machine learning (ML) and natural language processing (NLP) in his PhD research. NLP is a subfield of AI that focuses on the interaction between computers and human language: building models and algorithms that help computers understand, interpret and generate it. ChatGPT is a well-known example.

Now that Salmi had a model that could assess counsellors' conversations, it was time to develop a digital assistant that could offer suggestions during a 113 chat conversation. The model looks for relevant examples in its database and suggests a text that it thinks might be useful for the chat in progress. Counsellors are thus shown examples of actual conversations.
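
That retrieval step can be pictured as a nearest-neighbour search: represent the chat in progress as a vector, compare it with a database of past conversations, and surface the closest matches. The fragments, the TF-IDF representation and the function names below are hypothetical stand-ins; the real assistant will differ.

```python
# A minimal sketch (assumed, not the production system) of example retrieval.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical database of anonymised summaries of earlier conversations.
past_conversations = [
    "Visitor feels trapped; counsellor explores reasons for living and agrees on a safety plan.",
    "Visitor goes quiet; counsellor acknowledges the silence and asks a gentle open question.",
    "Visitor is angry; counsellor validates the anger before exploring what is behind it.",
]

vectorizer = TfidfVectorizer()
index = vectorizer.fit_transform(past_conversations)

def suggest_examples(current_chat: str, top_k: int = 2):
    """Return the past conversations most similar to the ongoing chat."""
    query = vectorizer.transform([current_chat])
    scores = cosine_similarity(query, index)[0]
    ranked = scores.argsort()[::-1][:top_k]
    return [(past_conversations[i], float(scores[i])) for i in ranked]

for example, score in suggest_examples(
    "The visitor has stopped responding and I am not sure how to continue."
):
    print(f"{score:.2f}  {example}")
```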

"They also find that very important", Salmi knows. Before he developed his tool, he asked 113 staff members which they would prefer: tailored AI help where you don't know where the information comes from, or a less specific suggestion in the form of a previous conversation similar to the one you're having now. "The majority wanted to look at real conversations from real helpers", he says.

Tough conversations

The tool was put through its paces in extensive testing: first with a group of 24 counsellors, then in a randomised study in which 27 counsellors were given the tool and 21 were not. "They decided for themselves when to use the assistant and what to do with the suggestions", says Salmi. The results showed: "Conversations with this AI tool were slightly shorter on average. When it comes to self-efficacy – the belief in your own ability to do a task well – the counsellors said there wasn't much difference. They mostly picked the AI assistant for tough conversations where they didn't know how to get through to the other person properly."

Counsellors who had the option of no help, the AI tool or help from an experienced expert picked the expert first and the tool second. "Since you can't always count on an experienced colleague being around, the tool can be a great backup", Salmi says.

Always a human being

"Ten years ago, it would not have been possible to create such a tool", the PhD student knows. "Understanding and interpreting language was always the biggest challenge for AI models. They struggle to understand what you're trying to say. 'But there's been a lot of progress in this area recently."

Master's student Mathijs Pellemans demonstrated that the tool can also be used to apply specific conversation techniques. For instance, 113 counsellors are trained in a technique called motivational interviewing. The tool can give them a helping hand in getting started with this technique.

The next step is to expand the model from 113 chats to phone calls. Now that he's finished his PhD, Salmi is starting a new project: looking into whether he can add his tool to the platform people use to ask for help at 113.

His goal is to create a 'ChatGPT-like' model that can be used in real-time conversations to provide tailored advice. He believes this is the future, but he emphasises that there will always be a human between the AI and the person seeking help from 113.

Portrait of Salim Salmi