3.23.2016

Should Siri offer rape and mental health advice – and should we take it?

Those who take calls use algorithms to figure out where to refer patients, and, as with Siri's shortfalls, it's the mechanisation of decision making which is apparently causing these problems. Specific physical conditions are one thing, but, as Collins tells me, “For a sensitive issue like mental health or suicidal thoughts it seems unlikely that a computer-based algorithm would be able to deal with the call in the same way as a human being.”
In the conversational agents paper, researchers note that “empathy matters – callers to a suicide hotline are 5 times more likely to hang up if the helper was independently rated as less empathetic.” As they point out, an unsuccessful intervention could leave you feeling worse, not better. Empathy is an incredibly hard thing to teach to a mechanical system which makes its decisions based on specific, measurable inputs.
There were reports last month, for example, that a machine-learning algorithm used by the US military in Pakistan may have selected bombing targets that contained innocent civilians. It “machine-learned” that human life, on balance, was not worth sparing in this particular situation.
Tech has a huge role to play in mental health, but that role is still largely imagined, rather than a reality. These virtual assistants, the study’s authors argue, are useful precisely because they aren’t dedicated mental health services – they aren’t apps that must be specially downloaded. The research notes that Siri or Cortana “might help overcome some of the barriers to effectively using smartphone-based applications for health, such as uncertainties about their accuracy, and security”. But why should consumers trust Siri when they don’t trust other mental health apps, often designed by professionals?
Our knowledge of the brain is still astonishingly rudimentary, which, ironically, holds back advances in AI and health alike. We're doing a disservice to people dealing with rape, domestic violence, illness and mental health problems if we imagine that what they really need is advice from a bot which can still barely answer questions about the weather.
