Taylor, Debbie, Melvin, Clare ORCID: https://orcid.org/0000-0001-5621-3417, Aung, Min Hane and Asif, Rameez (2024) To Bot or Not to Bot? Analysing Mental Health Data Disclosures. In: International Conference on Human-Computer Interaction, 2024-06-29 - 2024-07-04, Washington.
PDF (To_Bot_of_Not_to_Bot___Analysing_Mental_Health_Data_Disclosures___HCII_2024_Submission) - Accepted Version. Restricted to Repository staff only until 1 June 2025.
Abstract
Disclosure of personal information about wellbeing and mental health is a nuanced situation requiring trust between agents. Current methods for initial mental health assessments are time- and resource-intensive. With increases in demand for mental health services and decreases in funding and staffing levels, this paper explores whether conversational agents can be sufficiently 'trusted' to collect the sensitive data disclosed in an initial mental health assessment, thereby reducing the workload for trained professionals. An initial study identified the desired characteristics of a conversational agent designed for mental health assessment purposes and produced a MoSCoW design framework of desirable features. A second study tested the framework by investigating whether a conversational agent, displaying these desirable human-like features, could establish sufficient trust to collect data, comparable to completing online forms sent from a mental health service provider or requested through social media. Participants (n = 236; female = 58%, non-binary = 5%, prefer not to say = 1%, age 18–80+ yrs) were recruited from a UK mental health service provider and through social media. Of the participants, 50% (n = 126) engaged with the bespoke conversational agent to disclose sensitive personal information in an initial mental health assessment; the remaining participants provided the information by completing the online forms through social media or from a mental health service provider. Results indicate a conversational agent can be used to collect sensitive mental health data for initial assessment. Whilst such a tool may not be appropriate for all individuals and demographics, the conversational agent shows promise for reducing the administrative workload of those in the mental health profession, thus increasing resources for treatment and therapy.