It has been found that people in crisis, whether due to domestic abuse, sexual assault, or suicidal intent, often turn to Google first for anonymous support.
But Google doesn't want to stop there: it is applying new artificial intelligence techniques to better direct people to the information they need.
It is using its latest machine learning model, MUM (Multitask Unified Model), in the search engine to accurately detect a wide range of personal crisis searches.
What does MUM detect in searches for people in crisis?
MUM can detect search queries about difficult situations that were not possible to identify before.
It helps Google understand longer or more complex text queries. For example, a phrase like "why did he attack me when I said I don't love him" points to domestic violence, but without an advanced AI system such queries are hard to interpret.
There are also certain words that have been defined as "hot spots".
In these cases, when Google detects a word like "suicide", whether on its own or in the middle of a sentence, it treats the query as a crisis search.
It immediately responds to the user with an information box that says "Help is available", along with the phone number or website of a charity where the person can seek real help.
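The "hot spot" keyword behavior described above can be sketched as a minimal illustration. Everything here is hypothetical: the function name, the keyword list, and the resource messages are invented for the example and do not reflect Google's actual implementation or data.

```python
import re
from typing import Optional

# Hypothetical "hot spot" keywords mapped to support messages.
# Illustrative only; not Google's actual keyword list or resources.
HOT_SPOT_RESOURCES = {
    "suicide": "Help is available. Speak with someone today: a crisis hotline.",
    "domestic violence": "Help is available. Contact a domestic violence charity.",
}


def crisis_info_box(query: str) -> Optional[str]:
    """Return an information-box message if the query contains a hot-spot
    keyword, either alone or in the middle of a sentence; otherwise None."""
    lowered = query.lower()
    for keyword, message in HOT_SPOT_RESOURCES.items():
        # Match the keyword as whole words anywhere in the query.
        if re.search(r"\b" + re.escape(keyword) + r"\b", lowered):
            return message
    return None


print(crisis_info_box("suicide"))              # single word triggers the box
print(crisis_info_box("thoughts of suicide"))  # mid-sentence also triggers it
print(crisis_info_box("best pizza near me"))   # None: no crisis detected
```

Note that this kind of exact keyword matching is precisely what fails on queries like "why did he attack me when I said I don't love him", which is why a semantic model such as MUM is needed for the harder cases.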
In addition to detecting personal crises, MUM can also be applied to identify searches for explicit content such as pornography. Google categorizes these topics as "impactful results", a label that flags results on sensitive subjects.