Framework to Assist Stakeholders in Technology Evaluation for Recovery (FASTER) to Mental Health and Wellness
BMC Health Services Research volume 25, Article number: 623 (2025)
Abstract
Mental health mobile applications (apps) have the potential to expand the provision of mental health and wellness services to underserved populations. However, there is little guidance on how to choose wisely from the thousands of mental health apps that lack clear evidence of safety, efficacy, and consumer protections. We propose the Framework to Assist Stakeholders in Technology Evaluation for Recovery (FASTER) to Mental Health and Wellness to support agencies and individuals working in mental health, as well as users of mental health apps, in making informed decisions about recommending or using mental health and wellness apps. The framework was developed through a systematic process that included a review of existing frameworks and the literature, interviews with key informants, public stakeholder feedback, and iterative application and refinement of the framework to 45 apps. It comprises three sections: Section 1, Risks and Mitigation Strategies, assesses the integrity and risk profile of the app; Section 2, Function, covers descriptive aspects related to accessibility, costs, developer credibility, evidence and clinical foundation, privacy/security, usability, functions for remote monitoring of the user, access to crisis services, and artificial intelligence (AI); and Section 3, Mental Health App Features, focuses on specific mental health app features, such as journaling and mood tracking. The framework facilitates an assessment of the level of risk posed by the app against the evidence on the effectiveness of the app and its safety features, recognizing that, given vast variations in health apps, a ‘one size fits all’ approach is likely to be insufficient. Future application, testing, and refinement may be required to determine the framework’s suitability and reliability across multiple mental health conditions.
Background
Among adults aged 18 or older in the United States, the prevalence of ‘mental illness in the past year’ increased from 17.7 percent (or 39.8 million people) in 2008 to 23 percent (or 58.7 million people) in 2023 [1]. Of this latter population, 27.1 million did not receive mental health treatment and, of these, 23.8% had an unmet need for mental health services in the past year. The demand for mental health services far outweighs existing resources and the capacity of the healthcare system to meet these needs [2]. Several factors drive limited access to mental healthcare services, including affordability, the availability of mental health care providers, acceptance of insurance by mental healthcare providers [3], and the stigma associated with mental health care [4]. Moreover, there are racial and socio-demographic disparities in access to mental health care: Black and Latino Americans are half as likely to access mental health care as non-Latino whites [5]. Isolation and anxiety related to the COVID-19 pandemic have amplified the challenges around access to mental health care. In December 2021, the U.S. Surgeon General issued an advisory outlining the unprecedented impacts of the pandemic on the population’s mental health, especially among children, adolescents, and young adults. The advisory also outlined the disproportionate impacts on racial and ethnic minorities, LGBTQ+ youth, low-income youth, immigrant households, those with disabilities, youth involved in the child welfare and juvenile justice systems, and homeless populations [6].
Given the numerous barriers to mental health care, including the shortage of mental health care providers in many settings, digital mental health interventions, accessible through the internet on tablets and mobile phones, have the potential to provide much-needed access to mental health services. Mental health apps can be used by individuals on their own or may complement treatment plans from health care providers, and they may especially help overcome barriers to access for marginalized and underserved populations [7]. Such technology-driven mental health interventions may offer a scalable and accessible augmentation of, or bridge to, traditional care. Research on the efficacy and effectiveness of mental health apps has been emerging across mental health conditions and contexts, including self-management of acute symptoms and well-being, as well as early evidence on the clinical management of chronic psychiatric conditions [8]. Recent meta-analyses of apps targeting anxiety and depression suggest moderate efficacy in reducing symptoms of these conditions [9,10,11]. Yet despite this emerging evidence, few patients receive a recommendation to use mental health apps from mental health professionals or other healthcare providers [12].
One factor that has contributed to the unrealized potential of mental health apps is their rapid and unregulated proliferation, which has eroded confidence in the safety and efficacy of these apps and, consequently, their uptake and sustained use among patients. The US Food and Drug Administration (FDA) announced a “precertification” program for mobile apps to supply information about app development organizations’ quality control processes for software [13]. However, this program does not support an evaluation of whether an app improves healthcare outcomes, leaving a critical gap in the safe and effective use of such apps in clinical practice. Without federal agencies requiring review and approval of mental health apps, or a validated assessment of mental health apps, clinicians may feel ill-equipped to recommend an app to a patient.
Patients, on the other hand, may decide to use a mental health app largely based on ratings and consumer reviews, with inadequate understanding of whether such apps are evidence-based. Research suggests that consumer ratings are a poor indication of an app’s effectiveness; one assessment found that several apps did not respond appropriately when a user provided potentially concerning health information that would warrant escalation in a traditional healthcare environment [14]. Apps that claim to provide a diagnosis, or that target individuals who may be vulnerable due to their mental health condition or age, can pose a serious risk outside of the bounds of clinical care [15]. Mental health apps should offer direct linkage to working crisis lines; concerningly, there have been more than 2 million downloads of mental health apps that either entirely lack or contain inaccurate suicide crisis phone numbers [16]. Moreover, users may consent to the use of an app without understanding its privacy agreements or data sharing arrangements and may unknowingly share financial or private health information [17]. One assessment of health and wellness apps suggested that 95 percent posed some threat to users’ privacy [18].
The demand for and scalability of mental health apps coupled with growing concerns over safety, privacy, and the effectiveness of mental health apps have created the need for an assessment framework that can be systematically applied to inform selection of mental health apps by organizations, clinicians and patients [13]. There are existing frameworks used to evaluate digital health apps, including some that focus on mental health apps [19,20,21,22,23].
Examples of frameworks for the assessment of general health apps include the Healthy Living Apps Guide, the Digital Technology Assessment Framework from the National Health Service (UK), and the Digital Therapeutics Alliance. Examples of frameworks focused on mental health app assessment include One Mind PsyberGuide [19], APA App Advisor (American Psychiatric Association) [20], Kaiser Permanente [21], Verywell Mind [22], and Health Navigator [23]. Other notable frameworks reviewed include the M-Health Index and Navigation Database (MIND) and the end-user version of the Mobile Application Rating Scale (uMARS). MIND is an operational and flexible framework based on the American Psychiatric Association App Assessment Framework and includes 105 questions harmonized from 79 frameworks. uMARS provides a comprehensive set of questions about app engagement and usability [33].
However, most existing frameworks are geared towards evaluating specific aspects of health apps (e.g., usability) and are not tailored towards an assessment of their potential risks and the evidence on their clinical benefits. To address this gap, the Agency for Healthcare Research and Quality (AHRQ) supported the development of the Framework to Assist Stakeholders in Technology Evaluation for Recovery (FASTER) to Mental Health and Wellness to guide mental health and wellness app selection based on safety, effectiveness, and features important to patients and providers. The goal of FASTER to Mental Health and Wellness is to support agencies and individuals working in mental health, as well as users of mental health apps, in selecting mental health and wellness apps. We expect that this framework will be applied by intermediary mental health advocate groups and agencies that have the capacity to train personnel to evaluate mental health apps, as well as by employers or insurance companies that might have an interest in reimbursing the use of certain health apps. The results and summary conclusions of app assessments using FASTER will be valuable to healthcare professionals before they recommend or prescribe apps to patients, and to patients, users, and caregivers in search of mental health and wellness apps. The framework might also inform and guide app developers in the development of apps. Here, we describe the development process, the framework’s components, and its intended use.
Methods
The FASTER to Mental Health and Wellness framework was developed following a four-step process outlined in a protocol available on AHRQ’s Effective Health Care website [24], and provided in more detail in the full report, and in Supplement Figure A. Here we briefly summarize the process.
1. Review and abstraction of existing frameworks: Existing health app frameworks were identified through a systematic search of PubMed, guided also by the authors’ knowledge of health app-related frameworks. Peer-reviewed literature that described frameworks to evaluate health apps, specific aspects of health apps (e.g., usability), or mental health apps was included. Documents from national agencies in the US (FDA), UK (National Institute for Health and Care Excellence [NICE]), and Germany (Federal Institute for Drugs and Medical Devices) provided additional information on the federal regulation of apps. Relevant items were abstracted from 11 frameworks and consolidated. Items that were clear to understand and could be applied by individuals without specialized mental health or technology training were retained, as were items that could be applied without significant time and research resources. For example, searching bibliographic research databases such as PubMed or MEDLINE for evidence was considered beyond the scope of how this framework is intended to be used, so items that would require such a search were omitted. This yielded a total of 300 questions.
2. Identification of critical needs to assess mental health and wellness apps: Four rounds of key informant (KI) interviews with a total of 12 stakeholders were conducted. The first two rounds of interviews included clinicians with backgrounds in mental health, primary care, and emergency medicine, as well as payors. A third round was conducted with app developers and mental health providers with app development expertise. The purpose of these interviews was to identify the essential components to include in a mental health app framework to best guide app selection. The fourth round included family members of individuals living with mental illness.
3. Development of a draft framework: We next developed additional items to address the omissions identified in the existing frameworks, based on our analyses and KI feedback. New thematic areas included risk assessment of apps, cultural appropriateness, use of machine learning/artificial intelligence (AI), informed consent, and inclusion of mental health app features and crisis resources to support mental health and wellness. Existing items abstracted from other frameworks were modified, where necessary, for clarity and ease of application. A training guide was developed in parallel to facilitate broad uptake of the framework. For the assessment of risks posed by apps, we used guidance provided by agencies such as the National Institute for Health and Care Excellence (NICE) [25] and various FDA documents, including the Clinical Decision Support and Software as a Medical Device guidance [26,27,28]. For the development of criteria on the use of AI, we reviewed the literature on safety issues in the use of AI for health apps and consulted an ethicist specializing in machine learning.
4. Refinement of the framework: The draft framework was tested and iteratively refined through one pre-pilot and six pilot rounds in which the framework was applied to mental health and wellness apps spanning 15 mental health conditions defined by the Diagnostic and Statistical Manual of Mental Disorders (Supplementary Table D). We used mental health symptoms and diagnostic categories from the DSM-5 to guide our search and selection of mental health apps, cross-checked the main diagnostic categories against conditions addressed by current mental health apps, and included those addressed by at least one app. In addition to the DSM-5 categories, we added categories for self-harm, mental wellness, and other mental disorders. A convenience sample of 10 apps was chosen for the pilot; a further 35 apps were then chosen in a serialized fashion from the randomized list of app categories. A team of 11 reviewers was trained over 2 hours. Reviewers included undergraduate public health majors and graduate public health students with an interest in mental health and health care technology. In total, 45 apps were reviewed between May and December 2021, each by at least 2 reviewers. In each round, items that did not have perfect agreement were discussed, and items were either modified or training guidance notes were added so that each item could be applied in a standardized way (a simple sketch of this agreement check follows).
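As an illustration of this agreement check, the sketch below flags framework items on which reviewers’ responses differed, so they can be queued for discussion. This is a minimal sketch only; the item names and responses are hypothetical and are not drawn from the published framework or pilot data.

```python
from collections import defaultdict

def items_without_perfect_agreement(reviews):
    """Given each reviewer's responses (item_id -> answer) for one app,
    return the item IDs on which reviewers did not all agree."""
    answers_by_item = defaultdict(set)
    for review in reviews:
        for item_id, answer in review.items():
            answers_by_item[item_id].add(answer)
    return [item for item, answers in answers_by_item.items() if len(answers) > 1]

# Hypothetical responses from two reviewers for a single app
reviewer_a = {"crisis_link_present": "yes", "privacy_policy": "yes", "updated_6_months": "no"}
reviewer_b = {"crisis_link_present": "yes", "privacy_policy": "no", "updated_6_months": "no"}

# Items flagged here would be discussed, then modified or given training notes
print(items_without_perfect_agreement([reviewer_a, reviewer_b]))  # ['privacy_policy']
```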
The development of the framework was not considered human subjects research.
Results
The Framework to Assist Stakeholders in Technology Evaluation for Recovery (FASTER) to Mental Health and Wellness comprises three sections (Fig. 1) with an initial and concluding set of administrative questions: “Section 1: Risks and Mitigation Strategies”, “Section 2: Function” and “Section 3: Mental Health App Features”. Within each of these sections, there are a series of items related to the assessment of specific categories considered critical based on the literature review and key informant interviews. Details about the specific items are in Supplementary materials C.
Intended use
Evaluators who use this framework should have some background in technology and mental health but do not need to be experts in either field. The results from applying the framework will be valuable to healthcare professionals as they recommend apps to patients, and to patients, users, and caregivers in search of a mental health or wellness app. The framework might also inform and guide app developers in the development of apps.
The framework is intended to be applied to apps whose primary function is to support mental health and wellness through content and resources within the app. It is not appropriate to use this framework to evaluate apps whose primary function is to facilitate telemedicine (e.g., linking users/patients to a mental health professional), or health apps that merely contain supplemental content to support wellness (e.g., a weight loss app that has resources for mindfulness). The approach for summarizing risk levels based on Section 1 of the framework is available in Supplementary section B.
Introduction to the Framework to Assist Stakeholders in Technology Evaluation for Recovery (FASTER) to Mental Health and Wellness
Section 1: Risks and mitigation strategies
This section facilitates an assessment of the app’s risk profile and serves to flag apps that do not meet basic safety, evidence, and security checks. Supplementary section C, Section 1 details the specific questions within each item.
a. App Integrity: Questions under this category assess whether an app uses personal health and financial information appropriately and has a legal commitment to user privacy and security. The questions also assess whether the app has been endorsed by, or is being used by, a trusted federal agency (e.g., National Institutes of Health) or non-government body (e.g., American Psychiatric Association), which would reinforce credibility, as these institutions exercise due diligence before endorsing or making an app available to their users. Based on the responses, one of two integrity levels is assigned (a simplified sketch of this rule follows the list below):
- High integrity: the app has been updated in the previous 6 months, ensures the privacy and security of the user’s data (and/or provides disclaimers and warnings), and/or has been endorsed by a trusted organization.
- Low integrity: the app has not been updated in the previous 6 months, and/or provides no privacy and security statement, and/or provides no disclaimers and warnings.
b. Risk: The level of risk posed by the app is determined based on a set of questions related to the goals of the app (e.g., standalone treatment, support through building coping skills), the target audience (e.g., minors), and the severity of the mental health condition.
c. Evidence: Questions in the Evidence category help determine whether the app has a clinical research foundation. The greater the risk posed by an app, the greater the burden of evidence: for apps that pose a higher level of risk, the framework requires robust studies assessing the app’s efficacy and risks.
d. Linkage to Care: Questions evaluate linkages to a healthcare provider who can monitor the patient’s app use to enhance care, either manually or through app data shared with their electronic health record (EHR) system. If the app could pose a higher level of risk, the framework requires that it also provide resources for linkage to care.
e. Access to Crisis Services: Questions in this category evaluate whether the app provides direct or indirect linkage to crisis services (e.g., dialing 988 to reach the National Suicide Prevention Lifeline in the United States).
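As a reading aid for item (a), the decision rule below shows one way the high/low integrity assignment could be operationalized. Because the framework’s criteria use “and/or,” the exact combination logic is left to the evaluator; this sketch assumes one plausible reading, and the parameter names are hypothetical.

```python
def integrity_level(updated_within_6_months: bool,
                    has_privacy_statement_or_disclaimers: bool,
                    endorsed_by_trusted_org: bool) -> str:
    """Assign an app integrity level following the high/low criteria in
    Section 1, item (a). Illustrative reading of the 'and/or' wording only."""
    if updated_within_6_months and (has_privacy_statement_or_disclaimers
                                    or endorsed_by_trusted_org):
        return "High integrity"
    return "Low integrity"

# Example: a recently updated app with a privacy statement but no endorsement
print(integrity_level(True, True, False))  # High integrity
```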
Section 1 summary
Questions in the Risk Assessment category assess the risks posed by a mental health app. The “level of risk” determined by responses to these questions (details of how the risk level is assessed are provided in Supplement B) is balanced against the mitigation strategies available to counter this risk, based on questions related to the evidence supporting the approach, clinical oversight and linkage to care, and access to crisis services. This risk-based approach is guided by the International Medical Device Regulators Forum’s “Software as a Medical Device (SaMD): Clinical Evaluation” guidance [29]. For example, if the app is intended for use by populations with a condition that results in severe cognitive impairment, or by children, or if the app intends to provide standalone treatment, it may pose a higher risk to the target population. To ensure the safety of target populations, this section assesses whether sufficient evidence exists to support the app’s stated goals, and whether the linkages to care and access to crisis services provided within the app are appropriate given the potential risks (Table 1).
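The balancing of risk against mitigation described in this summary can be sketched as a simple check. The risk tiers and required mitigations below are simplified assumptions for illustration only; the actual rules for summarizing risk levels are defined in Supplement B.

```python
def sufficient_mitigation(risk_level: str,
                          has_robust_efficacy_evidence: bool,
                          provides_linkage_to_care: bool,
                          provides_crisis_access: bool) -> bool:
    """Check whether an app's mitigation strategies match its assessed risk.
    Simplified illustration of the Section 1 logic, not the Supplement B scoring."""
    if risk_level == "higher":
        # Higher-risk apps (e.g., standalone treatment, vulnerable target users)
        # are expected to have robust evidence plus linkage to care and crisis access.
        return (has_robust_efficacy_evidence and provides_linkage_to_care
                and provides_crisis_access)
    # For lower-risk apps, this sketch only checks for access to crisis services.
    return provides_crisis_access

# Example: a standalone-treatment app without robust efficacy studies is flagged
print(sufficient_mitigation("higher", False, True, True))  # False
```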
Section 2: Function
This section is focused on descriptive aspects of an app. These criteria are intended to (1) describe the features offered by the app so that users can assess its fit for their therapeutic and wellness needs, (2) systematically catalogue the functions of the app so that users may choose an app based on its functionality, and (3) present users with additional details about the potential risks related to use of the app so that they can be informed consumers. Specific questions are outlined in Supplementary material C, Section 2.
a. Accessibility Features: The questions assess whether an app has features that facilitate easier use by individuals with disabilities. Accessibility features assessed include text adjustment, colorblind-friendly color schemes, text-to-speech, availability of transcriptions/captions, configurable keyboard shortcuts, and availability of a screen reader.
b. App Information: This category captures details about the platform required by the app (e.g., iOS, Android) and users’ reviews and ratings.
c. Costs: Apps increasingly have complex pricing models which, especially for a vulnerable user base with mental health impairments, may pose risks. The questions assess whether costs associated with the app are disclosed upfront and whether the pricing model (e.g., free, one-time cost, in-app purchases, subscription, reimbursable by health insurance) is clearly presented.
d. Organizational Credibility: The questions assess the reputation of the organization that developed the app, based on the type of organization (governmental, for-profit, not-for-profit, etc.) and whether there are any documented consumer complaints against the developer.
e. Evidence & Clinical Foundation: The evidence question in this category goes beyond what was assessed in Section 1 to examine whether the app addresses its stated goals.
f. Privacy/Security: The privacy and security questions focus on whether claims of compliance with the Health Insurance Portability and Accountability Act (HIPAA) or other analogous national standards for protected health information (PHI) have been made, whether the app is transparent about how user data are used, and whether the app uses industry standards to share data with EHRs.
g. Informed Consent: Informed consent is the process of obtaining permission before conducting research using health data or before sharing the user’s health and related information. Most apps have a disclosure list that is long and hard to understand, and there are best practices for ensuring that information is presented in a way that users can understand [30,31,32]. The questions in this category evaluate whether the app follows these best practices.
h. Cultural Competence: Cultural competence is defined as the ability to understand, appreciate, and account for different cultures or belief systems based on race, ethnicity, sexual orientation, income strata, religious beliefs, and related factors. The questions in this category assess whether the app is targeted at, or inclusive of, specific population groups and cultures. If the app is targeted at a specific cultural group, the criteria assess whether the app has been tested in that group. The criteria also assess the use of gender-inclusive language and evidence of effectiveness in non-white populations.
i. Usability: Usability can be described as the capacity of an app to allow its users to perform tasks safely, effectively, and efficiently. The usability criteria for this framework were adapted from the uMARS framework [33]. The usability assessment includes objective criteria (e.g., offline use, languages supported by the app) as well as criteria that may introduce some subjectivity on the part of the evaluator (e.g., design of the app layout, clarity of the content).
j. Functions for Remote Monitoring of the User: Remote patient monitoring uses technology to monitor patients outside of conventional clinical settings, such as in the home [34]. For mental health apps, the provider may receive an alert about their patient's health or may be able to access the patient’s health indicators from within the app. To enable remote monitoring, apps need to adhere to established data standards for interoperability so that health data can be exchanged safely, including with wearable devices that may be used to monitor vital parameters or behaviors (see the sketch after this list). Questions assess how data are shared for remote monitoring, the availability of two-way communication with providers, and data-sharing capabilities with wearable devices.
k. Access to Crisis Services: An additional question related to access to crisis services assesses whether the app has functionality to automatically link the user to a provider or to a crisis line in case of an emergency.
l. Artificial Intelligence: Increasingly, mental health apps incorporate, or claim to incorporate, artificial intelligence (AI) to customize feedback and interventions and to identify mental health risks [35]. This area is rapidly evolving, and the questions ask whether the app claims to use AI and, if so, how.
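To make the interoperability point in item (j) concrete, the sketch below shows how a self-reported symptom score from an app might be packaged as an HL7 FHIR Observation for sharing with an EHR or a provider dashboard. The field choices and the PHQ-9 example are illustrative assumptions, not requirements of the framework or of any particular EHR.

```python
import json
from datetime import datetime, timezone

def mood_observation(patient_id: str, score: int) -> dict:
    """Build a minimal FHIR R4 Observation carrying a self-reported symptom
    score, as one way an app could share data for remote monitoring.
    Illustrative only; a production app would follow its EHR's FHIR profile."""
    return {
        "resourceType": "Observation",
        "status": "final",
        "category": [{"text": "survey"}],
        "code": {"text": "Self-reported depression symptom score (PHQ-9)"},
        "subject": {"reference": f"Patient/{patient_id}"},
        "effectiveDateTime": datetime.now(timezone.utc).isoformat(),
        "valueInteger": score,
    }

# An app could POST this JSON to an EHR's FHIR endpoint for provider review
print(json.dumps(mood_observation("example-123", 11), indent=2))
```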
Section 3: Mental health app features
This is a specialized section of the framework focused on the availability of therapeutic features and skill-building approaches that are typically employed by mental health care providers to support their patients (Supplementary Material C, Section 3). Assessing apps for these functionalities facilitates cataloguing of functions from which users may benefit and find engaging. Questions pertain to the availability of two-way communication with therapists and coaches through text message, audio, or video features; the availability of group therapy services; live support by a coach; and concierge mental health services. Additional questions address the comprehensiveness of functionality related to mindfulness, journaling, psychoeducation, building coping skills, self-screening, safety planning, sleep regulation, chatbot support, family/caregiver support, peer group interaction, and related features.
Discussion
FASTER to Mental Health and Wellness is aimed at facilitating the selection of apps for mental health support through standardized evaluation, screening, and classification of apps. Several of the criteria have been extracted or adapted from existing frameworks in the app evaluation and mental health space, and new criteria have been developed and tested to address emerging concerns in the use of apps for mental healthcare. This framework makes two novel contributions. First, Section 1 of the framework facilitates an assessment of the level of risk posed by the app against the evidence on the effectiveness of the app and its safety features, recognizing that, given vast variations in mental health apps, a ‘one size fits all’ approach is unlikely to be sufficient. The framework provides a level of assessment that is tailored to the stated goals of the app, with the goal of empowering end users with critical information to support appropriate app selection. Second, the framework facilitates systematic cataloguing of a wide range of functionalities, such as sleep journaling and skill building, that are increasingly being embedded in apps to support patients. Such a catalogue of app functions can assist patients and providers in selecting the apps that best fit their individualized needs.
Assessment and standardization of mental health apps pose some unique challenges that we anticipate will continue to require attention. Many mental health symptoms are transdiagnostic, and apps typically aim to alleviate a symptom, such as anxiety or insomnia, that is common across several mental health conditions, rather than to treat a specific disorder. As it stands, the framework assesses the risks posed by an app based on the health condition it targets and the level of functional impairment an average patient might experience due to that condition. However, an individual’s mental health condition can deteriorate rapidly, which also changes the potential risks from using a mental health app. The framework proposes mitigating this risk through linkage to a healthcare provider and other caregivers. Further refinement of the framework may be needed to address applicability across apps that target transdiagnostic symptoms. Additional criteria may also be needed to account for potential harms or iatrogenic impacts of an app, based on the severity or other characteristics of specific mental health conditions or the culture and characteristics of the user.
We expect that this framework may also benefit from updates to reflect emerging areas in the use of health apps. As new governance and regulations for software as a medical device are formulated, the framework should be adapted to reflect them. Additionally, developments in our understanding of prerequisites for apps from a privacy/security perspective, as well as rapid innovation in the digital health and AI space, will need to be incorporated as additional criteria. Future versions of the framework would also benefit from greater input from commercial app developers, who can provide insight into app roadmaps and the challenges of commercializing health apps. It would also be critical to test the framework by applying it to apps classified as digital therapeutics that require prescriptions.
Ultimately, to facilitate the adoption and sustainability of this framework, a centralized system would be needed to update the framework as mental health app technology advances and to train personnel to apply it in screening apps. The results of app reviews using this framework would ideally be hosted as an interactive webpage that can be used by patients and mental health advocacy agencies. To further facilitate appropriate use of mental health apps in clinical and public health contexts, systematic education is needed across the healthcare ecosystem to convey to end users, licensed mental health professionals, and other clinicians the potential benefits and risks of such health apps as technology continues to advance. This framework provides foundational guidance towards that goal.
Data availability
The datasets used and/or analysed during the current study are available from the corresponding author on reasonable request.
References
Substance Abuse and Mental Health Services Administration. Key substance use and mental health indicators in the United States: Results from the 2023 National Survey on Drug Use and Health (HHS Publication No. PEP24–07–021, NSDUH Series H-59). Center for Behavioral Health Statistics and Quality, Substance Abuse and Mental Health Services Administration. 2024. Retrieved from https://www.samhsa.gov/data/report/2023-nsduh-annual-national-report.
Graham AK, Greene CJ, Kwasny MJ, Kaiser SM, Lieponis P, Powell T, et al. Coached Mobile App Platform for the Treatment of Depression and Anxiety Among Primary Care Patients: A Randomized Clinical Trial. JAMA Psychiat. 2020;77(9):906–14.
Bishop TF, Press MJ, Keyhani S, Pincus HA. Acceptance of insurance by psychiatrists and the implications for access to mental health care. JAMA Psychiat. 2014;71(2):176–81.
Carbonell Á, Navarro-Pérez JJ, Mestre MV. Challenges and barriers in mental healthcare systems and their impact on the family: A systematic integrative review. Health Soc Care Community. 2020;28(5):1366–79.
Cook BL, Zuvekas SH, Carson N, Wayne GF, Vesper A, McGuire TG. Assessing racial/ethnic disparities in treatment across episodes of mental health care. Health Serv Res. 2014;49(1):206–29.
U.S. Surgeon General Issues Advisory on Youth Mental Health Crisis Further Exposed by COVID-19 Pandemic. https://public3.pagefreezer.com/browse/HHS.gov/30-12-2021T15:27/https://www.hhs.gov/about/news/2021/12/07/us-surgeon-general-issues-advisory-on-youth-mental-health-crisis-further-exposed-by-covid-19-pandemic.html.
Schueller SM, Hunter JF, Figueroa C, Aguilera A. Use of Digital Mental Health for Marginalized and Underserved Populations. Current Treatment Options in Psychiatry. 2019;6(3):243–55.
Torous J, Bucci S, Bell IH, Kessing LV, Faurholt-Jepsen M, Whelan P, et al. The growing field of digital psychiatry: current evidence and the future of apps, social media, chatbots, and virtual reality. World Psychiatry. 2021;20(3):318–35.
Mantani A, Kato T, Furukawa TA, Horikoshi M, Imai H, Hiroe T, et al. Smartphone Cognitive Behavioral Therapy as an Adjunct to Pharmacotherapy for Refractory Depression: Randomized Controlled Trial. J Med Internet Res. 2017;19(11):e373.
Firth J, Torous J, Nicholas J, Carney R, Pratap A, Rosenbaum S, et al. The efficacy of smartphone-based mental health interventions for depressive symptoms: a meta-analysis of randomized controlled trials. World Psychiatry. 2017;16(3):287–98.
Firth J, Torous J, Nicholas J, Carney R, Rosenbaum S, Sarris J. Can smartphone mental health interventions reduce symptoms of anxiety? A meta-analysis of randomized controlled trials. J Affect Disord. 2017;218:15–22.
Kao CK, Liebovitz DM. Consumer Mobile Health Apps: Current State, Barriers, and Future Directions. PM R. 2017;9(5S):S106–15.
Bates DW, Landman A, Levine DM. Health Apps and Health Policy: What Is Needed? JAMA. 2018;320(19):1975–6.
Singh K, Drouin K, Newmark LP, Lee J, Faxvaag A, Rozenblum R, et al. Many Mobile Health Apps Target High-Need, High-Cost Populations, But Gaps Remain. Health Aff (Millwood). 2016;35(12):2310–8.
Bergin A, Davies EB. Technology Matters: Mental health apps - separating the wheat from the chaff. Child Adolesc Ment Health. 2020;25(1):51–3.
Martinengo L, Van Galen L, Lum E, Kowalski M, Subramaniam M, Car J. Suicide prevention and depression apps’ suicide risk assessment and management: a systematic assessment of adherence to clinical guidelines. BMC Med. 2019;17(1):231.
Huckvale K, Torous J, Larsen ME. Assessment of the Data Sharing and Privacy Practices of Smartphone Apps for Depression and Smoking Cessation. JAMA Netw Open. 2019;2(4):e192542.
Dehling T, Gao F, Schneider S, Sunyaev A. Exploring the Far Side of Mobile Health: Information Security and Privacy of Mobile Health Apps on iOS and Android. JMIR Mhealth Uhealth. 2015;3(1):e8.
One Mind PsyberGuide. App Guide. https://onemindpsyberguide.org/apps/. Accessed 11 May 2021.
APA App Advisor. https://www.psychiatry.org/psychiatrists/practice/mental-health-apps. Accessed 11 May 2021.
How to pick employee mental health apps | Kaiser Permanente. https://business.kaiserpermanente.org/insights/mental-health-workplace/mental-health-apps-workforce-wellness. Accessed 11 May 2021.
Verywell Mind. https://www.verywellmind.com/. Accessed 11 May 2021.
New Zealand health information | Health Navigator NZ. https://www.healthnavigator.org.nz/. Accessed 11 May 2021.
Research Protocol: Evaluation of Mental Health Applications. Content last reviewed August 2021. Effective Health Care Program, Agency for Healthcare Research and Quality, Rockville, MD.https://effectivehealthcare.ahrq.gov/products/mental-health-apps/protocol.
Evidence standards framework for digital health technologies. NICE. https://www.nice.org.uk/about/what-we-do/our-programmes/evidence-standards-framework-for-digital-health-technologies. Accessed 11 May 2021.
Software as a Medical Device (SaMD). FDA. Published September 9, 2020. Accessed 11 May 2021. https://www.fda.gov/medical-devices/digital-health-center-excellence/software-medical-device-samd.
U.S. Food and Drug Administration. Policy for Device Software Functions and Mobile Medical Applications: Guidance for Industry and Food and Drug Administration Staff.
U.S. Food and Drug Administration. Clinical Decision Support Software. https://www.fda.gov/regulatory-information/search-fda-guidance-documents/clinical-decision-support-software. Accessed 20 Feb 2025.
International Medical Device Regulators Forum. Software as a Medical Device Clinical Evaluation. https://www.imdrf.org/sites/default/files/docs/imdrf/final/technical/imdrf-tech-170921-samd-n41-clinical-evaluation_1.pdf. Accessed 20 Feb 2025.
Martinez-Martin N, Kreitmair K. Ethical Issues for Direct-to-Consumer Digital Psychotherapy Apps: Addressing Accountability, Data Protection, and Consent. JMIR Ment Health. 2018;5(2):e32.
Doerr M, Suver C, Wilbanks J. Developing a Transparent, Participant-Navigated Electronic Informed Consent for Mobile-Mediated Research (April 22, 2016). Available at SSRN: https://ssrn.com/abstract=2769129 or https://doi.org/10.2139/ssrn.2769129.
Schairer CE, Rubanovich CK, Bloss CS. How Could Commercial Terms of Use and Privacy Policies Undermine Informed Consent in the Age of Mobile Health? AMA J Ethics. 2018;20(9):E864–72.
Stoyanov SR, Hides L, Kavanagh DJ, Wilson H. Development and Validation of the User Version of the Mobile Application Rating Scale (uMARS). JMIR Mhealth Uhealth. 2016;4(2):e72.
Gandrup J, Ali SM, McBeth J, van der Veer SN, Dixon WG. Remote symptom monitoring integrated into electronic health records: A systematic review. J Am Med Inform Assoc. 2020;27(11):1752–63.
Khan ZF, Alotaibi SR. Applications of Artificial Intelligence and Big Data Analytics in m-Health: A Healthcare System Perspective. J Healthc Eng. 2020;2020:8894694.
Acknowledgements
The authors gratefully acknowledge the following individuals for their guidance: Hadi Kharrazi, M.D., Ph.D., Harold P. Lehmann, M.D., Ph.D., Alain B. Labrique, Ph.D., M.H.S., M.S., and Joseph Ali, J.D. We thank Jina Ryu, B.A., Mehek Bapna, B.A., Pritika Parmer, B.A., and Jacob C. Rainey, M.P.H., for their help with abstraction.
Funding
This publication is derived from work supported under a contract with the Agency for Healthcare Research and Quality (AHRQ) contract 75Q80120D00003/ Task Order 75Q80121F32006. However, this publication has not been approved by the Agency.
Author information
Authors and Affiliations
Contributions
SA, MJ, KR conceptualized the approach to developing the framework. RS, KR coordinated the stakeholder engagement. HW led key informant interviews. HW, JT contributed to the approach and development of the mental health criteria. RH, EP supported data extraction and testing. All authors applied and tested the criteria and contributed to the revisions. SA, MJ led the writing of the manuscript. All authors reviewed the manuscript and provided input.
Corresponding author
Ethics declarations
Ethics approval and consent to participate
Not applicable.
Consent for publication
Not applicable.
Competing interests
The authors declare no competing interests.
Additional information
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Supplementary Information
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Agarwal, S., Jalan, M., Hill, R. et al. Framework to Assist Stakeholders in Technology Evaluation for Recovery (FASTER) to Mental Health and Wellness. BMC Health Serv Res 25, 623 (2025). https://doi.org/10.1186/s12913-025-12418-0
Received:
Accepted:
Published:
DOI: https://doi.org/10.1186/s12913-025-12418-0