Google to use Artificial Intelligence to detect searches from people in danger


In times of crisis, many people turn with their questions to the one source they know will always have an answer: Google. Every day, the company’s search engine registers queries related to suicide, sexual abuse or domestic violence. Faced with this situation, Google has announced that it will begin applying techniques based on Artificial Intelligence to point these users toward support when they need it most.

Google will incorporate a machine learning system to detect searches related to suicide attempts, domestic abuse or sexual assault

The company will integrate into its search engine MUM (Multitask Unified Model), its latest machine learning model. The algorithm is designed to detect, from users’ searches, the signals that indicate a person is going through a serious crisis.

Google introduced MUM last year at its annual I/O developer conference. Since then, it has used the tool for various features related to augmented search, which try to infer what users need from their web searches.

In this case, MUM will be able to detect which searches may be related to a difficult personal moment. In this way, the search engine will be able to present the user with reliable health and safety information alongside the results.
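Google has not published MUM’s model or how it is wired into Search, so the following Python sketch is only a stand-in for the idea: a generic zero-shot text classifier (here, Hugging Face’s transformers pipeline with a publicly available NLI model, both assumptions of this example) scores a query against a “personal crisis” label and, when that label wins, attaches support resources to the results.

```python
# Illustrative sketch only: Google has not published MUM's model or API.
# An off-the-shelf zero-shot classifier stands in here to show the flow:
# score a query against a "personal crisis" label and, if it wins,
# attach a help resource to the search results.
from transformers import pipeline

# Hypothetical model choice; any NLI-based zero-shot model works here.
classifier = pipeline("zero-shot-classification",
                      model="facebook/bart-large-mnli")

CANDIDATE_LABELS = ["personal crisis", "ordinary information need"]

def annotate_results(query: str) -> str:
    """Return a help banner when a query looks like a personal crisis."""
    result = classifier(query, candidate_labels=CANDIDATE_LABELS)
    # Labels come back sorted by score, highest first.
    if result["labels"][0] == "personal crisis":
        return "Help is available: show hotline and support resources."
    return "Show the regular search results."

# The query quoted later in the article:
print(annotate_results("why did he attack me when I told him I don't love him"))
```

A real deployment would hinge on the classifier’s quality, which is precisely Google’s argument for using a large model like MUM rather than keyword matching: the quoted query below contains no obvious trigger words.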

Anne Merrit, Google’s product manager for health-related topics, explained that “MUM is able to understand complex searches such as ‘why did he attack me when I told him I don’t love him’.” According to Merrit, “to a human it is obvious that this is a search related to gender-based violence, but these types of questions are not so obvious to an AI system.”

However, it is not all good news in this Google announcement. The company’s increasing reliance on machine learning systems can introduce bias and misinformation into the results. Moreover, these Artificial Intelligence systems are opaque, and only the company’s engineers have access to their inner workings.
