Are you having difficulty getting your head around the use of technology in mental health care? The idea that computer programs might interact with humans in a humanlike way has been around for decades, but implementations are becoming increasingly sophisticated.
Early work dating from the 1960s found that people were more likely to disclose suicidality when interacting with a computer-based assessment than in face-to-face sessions with clinicians.
This era also saw the launch of the famous ELIZA bot, designed to mimic a Rogerian therapist. It works by recognizing keywords in text and responding with pieces of text that simulate active listening. ELIZA was designed to show that robot-human ‘conversations’ had limited meaning but some have found ELIZA to be convincing – perhaps demonstrating that humans tend to seek meaning in uncertain interactions. ELIZA is still freely available online – http://psych.fullerton.edu/mbirnbaum/psych101/Eliza.htm
Bots are certainly becoming more sophisticated. It has been estimated that there are 161 terms to describe bots and related programs. Although there is overlap between these terms, there are some important differences.
Some programs are designed to recognize patterns in human input and then select and activate standardized responses, much like the ELIZA bot. They can’t create new responses to input that they do not recognize.
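The keyword-matching approach described above can be sketched in a few lines of Python. This is an illustrative toy, not ELIZA's actual script: the patterns and canned replies below are invented for the example.

```python
# A minimal sketch of ELIZA-style pattern matching.
# Rules pair a regular expression with scripted reply templates;
# unrecognized input falls through to a stock "active listening" response.
import random
import re

RULES = [
    (r"\bI feel (.+)", ["Why do you feel {0}?", "How long have you felt {0}?"]),
    (r"\bmy (\w+)", ["Tell me more about your {0}."]),
]
DEFAULT_REPLIES = ["Please go on.", "I see."]

def respond(text: str) -> str:
    for pattern, templates in RULES:
        match = re.search(pattern, text, re.IGNORECASE)
        if match:
            # Echo the captured fragment back inside a scripted template.
            return random.choice(templates).format(*match.groups())
    # No rule matched: the program cannot create a new response,
    # so it falls back to a generic prompt.
    return random.choice(DEFAULT_REPLIES)

print(respond("I feel anxious"))
print(respond("The weather is nice"))
```

Note how the program never generates novel language: every reply is a pre-written template, which is exactly the limitation described above.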
More sophisticated programs use artificial intelligence and natural language processing. They can learn from their interactions with humans and develop new responses. To see how this works in practice, have a play with Cleverbot, a well-known general conversation AI platform.
Program interfaces range from basic text ‘conversations’ to ‘embodied’ platforms that look like humans and receive information from users’ cameras and microphones to interpret emotion and adjust their responses. Some ‘digital humans’ look quite realistic: https://www.soulmachines.com.
Researchers are exploring the potential for these tools to provide assessments, interventions and administrative support. See here for an example of assessments conducted by the SimSensei tool.
Virtual humans are also being used in service provider training – as in this example of a virtual patient in a psychiatry training program: https://www.chatbots.org/research/news/virtual_humans_virtual_reality_patient_mental_clinicians/
More research and development is needed before bots will be ready for routine use, but this field is developing rapidly. If you haven’t explored this technology before, you might like to grab a coffee, follow some of the links in this post and drop in to a bot for a chat!
Julia Reynolds MPsych(Clin), MAPS, is Clinical Psychologist and e-hub Clinical Services Manager, Centre for Mental Health Research, ANU.