
Bots and ‘digital humans’ in mental healthcare – the next wave of technology?

05 February 2019 - Julia Reynolds

Are you having difficulty getting your head around the use of technology in mental health care? The idea that computer programs might interact with humans in a humanlike way has been around for decades, but ideas about implementation are getting increasingly sophisticated.


Early work dating from the 1960s found that people were more likely to disclose suicidality when interacting with a computer-based assessment than in face-to-face sessions with clinicians.


This era also saw the launch of the famous ELIZA bot, designed to mimic a Rogerian therapist. It works by recognizing keywords in text and responding with pieces of text that simulate active listening. ELIZA was designed to show that robot-human ‘conversations’ had limited meaning, but some have found ELIZA to be convincing – perhaps demonstrating that humans tend to seek meaning in uncertain interactions. ELIZA is still freely available online.


Bots are certainly becoming more sophisticated. It has been estimated that there are 161 terms to describe bots and related programs. Although there is overlap between these terms, there are some important differences.


What can they understand and produce?

Some programs are designed to recognize patterns in human input and then select and activate standardized responses, much like the ELIZA bot. They can’t create new responses to input that they do not recognize.
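To make this concrete, here is a minimal sketch of that keyword-matching approach in Python. The patterns, responses, and function name are illustrative assumptions, not the actual ELIZA script, but the mechanism is the same: match a keyword pattern, fill in a canned template, and fall back to a stock deflection when nothing matches.

```python
import re

# Illustrative ELIZA-style rules (hypothetical examples, not the real script):
# each rule pairs a keyword pattern with a canned, reflective response template.
RULES = [
    (re.compile(r"\bI feel (.+)", re.I), "Why do you feel {0}?"),
    (re.compile(r"\bI am (.+)", re.I), "How long have you been {0}?"),
    (re.compile(r"\bmy (\w+)", re.I), "Tell me more about your {0}."),
]

# Used when no keyword matches - the program cannot invent a new
# response to unrecognized input, it can only deflect.
FALLBACK = "Please, go on."

def respond(user_input: str) -> str:
    """Return the canned response for the first matching keyword rule."""
    for pattern, template in RULES:
        match = pattern.search(user_input)
        if match:
            return template.format(*match.groups())
    return FALLBACK
```

A statement like "I feel anxious" triggers the first rule and yields "Why do you feel anxious?", while anything unrecognized gets only the fallback, which is exactly the limitation described above.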


More sophisticated programs use Artificial Intelligence and natural language processing. They can learn from their interactions with humans and develop new responses. To see how this works in practice have a play at Cleverbot, a well-known general conversation AI platform.


What do they look like?

Program interfaces range from basic text ‘conversations’ to ‘embodied’ platforms that look like humans and receive information from users’ cameras and microphones to interpret emotion and adjust their responses. Some ‘digital humans’ look quite realistic.


How are they used in mental health care?

Researchers are exploring the potential for these tools to provide assessments, interventions and administrative support. See here for an example of assessments conducted by the SimSensei tool.

Virtual humans are also being used in service provider training, as in this example of a virtual patient in a psychiatry training program.


More research and development is needed before bots will be ready for routine use, but this field is developing rapidly. If you haven’t explored this technology before, you might like to grab a coffee, follow some of the links in this post and drop in to a bot for a chat!


Julia Reynolds MPsych(Clin), MAPS, is a Clinical Psychologist and e-hub Clinical Services Manager, Centre for Mental Health Research, ANU.


If you need help, please call

  • Lifeline – 13 11 14
  • BeyondBlue – 1300 22 4636
  • Suicide Call Back Service – 1300 659 467