Published November 3, 2022 | Version 1.0
Report | Open Access

Trustworthy Assurance of Digital Mental Healthcare

  • Alan Turing Institute

Description

There is a culture of distrust surrounding the development and use of digital mental health technologies.

As many organisations continue to grapple with the long-term impacts of the COVID-19 pandemic on mental health and well-being, a growing number are turning to digital technologies to increase their capacity and meet the rising demand for mental health services.

In this report, we argue that clearer assurance of how ethical principles have been considered and implemented in the design, development, and deployment of digital mental health technologies is necessary to help build a more trustworthy and responsible ecosystem. To help address this need, we set out a positive proposal for a framework and methodology we call 'Trustworthy Assurance'.

To support the development and evaluation of Trustworthy Assurance, we conducted a series of participatory stakeholder engagement events with students, university administrators, regulators and policy-makers, developers, researchers, and users of digital mental health technologies. Our objectives were a) to identify and explore how stakeholders understood and interpreted relevant ethical objectives for digital mental health technologies, b) to evaluate and co-design the Trustworthy Assurance framework and methodology, and c) to solicit feedback on the possible reasons for distrust in digital mental health.


Notes

Research and production for this report were undertaken at the Alan Turing Institute and supported by funding from the UKRI's Trustworthy Autonomous Systems Hub, which was awarded to Dr Christopher Burr (Grant number: TAS_PP_00040).

Files

final-report.pdf (4.4 MB)
md5:01eda5deaa212e63ee035ed5c96f5f01