Are you having difficulty getting your head around the use of technology in mental health care? The idea that computer programs might interact with humans in a humanlike way has been around for decades, but implementations are becoming increasingly sophisticated.
Early work dating from the 1960s found that people were more likely to disclose suicidality when interacting with a computer-based assessment than in face-to-face sessions with clinicians.
This era also saw the launch of the famous ELIZA bot, designed to mimic a Rogerian therapist. It works by recognizing keywords in text and responding with pieces of text that simulate active listening. ELIZA was designed to show that robot-human ‘conversations’ had limited meaning but some have found ELIZA to be convincing – perhaps demonstrating that humans tend to seek meaning in uncertain interactions. ELIZA is still freely available online – http://psych.fullerton.edu/mbirnbaum/psych101/Eliza.htm
Bots are certainly becoming more sophisticated. It has been estimated that there are 161 terms to describe bots and related programs. Although there is overlap between these terms, there are some important differences.
Some programs are designed to recognize patterns in human input and then select and activate standardized responses, much like the ELIZA bot. They can’t create new responses to input that they do not recognize.
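To make this concrete, here is a minimal sketch in Python of how an ELIZA-style program works. The rules below are hypothetical examples, not the original ELIZA script: the program scans the input for keywords, fills a canned template, and falls back to a generic prompt when nothing matches.

```python
import re

# Hypothetical keyword rules in the spirit of ELIZA: a regular expression
# paired with a canned "active listening" response template.
RULES = [
    (re.compile(r"\bi feel (.+)", re.IGNORECASE),
     "Why do you feel {0}?"),
    (re.compile(r"\bmy (mother|father|family)\b", re.IGNORECASE),
     "Tell me more about your {0}."),
    (re.compile(r"\byes\b", re.IGNORECASE),
     "You seem quite sure."),
]

DEFAULT = "Please go on."  # generic prompt for unrecognized input


def respond(user_input: str) -> str:
    """Return the first matching canned response, or the default."""
    for pattern, template in RULES:
        match = pattern.search(user_input)
        if match:
            # Echo the matched fragment back inside the template.
            return template.format(*match.groups())
    # The program cannot invent a new response to input it does not
    # recognize; it can only fall back to a standard prompt.
    return DEFAULT
```

For example, `respond("I feel anxious about work")` echoes the user's words back as a question, while input with no recognized keyword gets the stock "Please go on." No understanding is involved, which is why such 'conversations' so easily break down.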
More sophisticated programs use Artificial Intelligence and natural language processing. They can learn from their interactions with humans and develop new responses. To see how this works in practice, have a play with Cleverbot, a well-known general conversation AI platform.
Program interfaces range from basic text ‘conversations’ to ‘embodied’ platforms that look like humans and draw on users’ cameras and microphones to interpret emotion and adjust their responses. Some ‘digital humans’ look quite realistic: https://www.soulmachines.com.
Researchers are exploring the potential for these tools to provide assessments, interventions and administrative support. See here for an example of assessments conducted by the SimSensei tool.
Virtual humans are also being used in service provider training – as in this example of a virtual patient in a psychiatry training program: https://www.chatbots.org/research/news/virtual_humans_virtual_reality_patient_mental_clinicians/
More research and development is needed before bots will be ready for routine use, but this field is developing rapidly. If you haven’t explored this technology before, you might like to grab a coffee, follow some of the links in this post and drop in to a bot for a chat!
Julia Reynolds MPsych(Clin), MAPS, is a Clinical Psychologist and e-hub Clinical Services Manager, Centre for Mental Health Research, ANU.