Rosanna Tarricone1,2, Francesco Petracca1, Maria Cucciniello2,3, Oriana Ciani1,4.
Abstract
Digital health and mobile medical apps (MMAs) have shown great promise in transforming health care, but their adoption in clinical care has been unsatisfactory, and regulatory guidance and coverage decisions have been lacking or incomplete. A multidimensional assessment framework for regulatory, policymaking, health technology assessment, and coverage purposes based on the MMA lifecycle is needed. A targeted review of relevant policy documents from international sources was conducted to map current MMA assessment frameworks and to formulate 10 recommendations, which were subsequently shared with an expert panel of key stakeholders. The recommendations go beyond economic dimensions such as cost and economic evaluation to include MMA development and update, classification and evidentiary requirements, performance and maintenance monitoring, usability testing, clinical evidence requirements, safety and security, equity considerations, organizational assessment, and additional outcome domains (patient empowerment and environmental impact). The COVID-19 pandemic greatly expanded the use of MMAs, but the temporary policies governing their use and oversight need to be consolidated into well-developed frameworks that support decision-makers, producers, and introduction into clinical care processes, especially in light of the strong international, cross-border character of MMAs, the new EU medical device and health technology assessment regulations, and the Next Generation EU funding earmarked for health digitalization.
Keywords: HTA; assessment; digital health; eHealth; lifecycle; mHealth; mobile medical apps; regulatory
Year: 2022 PMID: 35388585 PMCID: PMC9545972 DOI: 10.1002/hec.4505
Source DB: PubMed Journal: Health Econ ISSN: 1057-9230 Impact factor: 2.395
Target focus of regulatory and assessment approaches across jurisdictions
| Jurisdiction | IMDRF | WHO | EU | FRANCE | GERMANY | USA | UK | UK |
|---|---|---|---|---|---|---|---|---|
| Organization | IMDRF | WHO | EC | HAS | BfArM | FDA | MHRA | NICE, NHSX |
| Purpose | Regulatory approval | Assessment for Decision‐Making | Regulatory approval | Assessment for Coverage and Reimbursement Decisions | Assessment for Coverage and Reimbursement Decisions | Regulatory approval | Regulatory approval | Assessment for Coverage and Procurement Decisions |
| Target Object | Software as Medical Device (SaMD) | Digital Health (DH) | Medical Device Software (MDS) | Health Apps and Smart Devices (mHealth) | Digital Health Application (DIGA) | Device Software Functions (DSF), including Mobile Medical app (MMA) | Medical Device Standalone Software | Digital Health Technologies (DHT) |
| Definition | Software intended to be used for one or more medical purposes that perform these purposes without being part of a hardware medical device | The use of digital, mobile, and wireless technologies to support the achievement of health objectives. Digital health is inclusive of both mHealth and eHealth | Software that is intended to be used, alone or in combination, for a purpose as specified in the definition of a “medical device” in the Medical Device Regulation (MDR) or In Vitro Diagnostics Regulation (IVDR), regardless of whether the software is independent or driving or influencing the use of a device | Medical and public health practice supported by mobile devices, such as mobile phones, patient monitoring devices, personal digital assistants (PDAs), and other wireless devices. Smart devices are devices connected to the internet that can collect, store, process and send data, or that can take specific actions based on information received | EU MDR Class I or IIa medical devices achieving their medical purpose through core digital functions and used either by patients alone or by patients together with healthcare professionals | All SaMD and SiMD not used for administrative support of a healthcare facility, for maintaining or encouraging a healthy lifestyle, to serve as electronic patient records, or for transferring, storing, converting formats, or displaying data | Software not incorporated in a device, possessing its own medical purpose | Apps, programmes and software used in the health and care system that may be standalone or combined with other products such as medical devices or diagnostic tests |
| Main Reference Documents | Software as a Medical Device (SaMD): Key Definitions (IMDRF) | Monitoring and evaluating digital health interventions: a practical guide to conducting research and assessment (WHO) | Guidance on Qualification and Classification of Software in Regulation (EU) 2017/745 – MDR and Regulation (EU) 2017/746 – IVDR (European Union) | Good Practice Guidelines on Health Apps and Smart Devices (Mobile Health or mHealth) (Haute Autorité de Santé) | The Fast‐Track Process for Digital Health Applications (DiGA) according to Section 139e SGB V: A Guide for Manufacturers, Service Providers and Users (BfArM) | Policy for Device Software Functions and Mobile Medical Applications (FDA) | Guidance: Medical device stand‐alone software including apps (including IVDMDs) (Medicines and Healthcare products Regulatory Agency) | Evidence Standards Framework for Digital Health Technologies (NICE) |
Risk‐categorization models developed across jurisdictions
| Jurisdictions | IMDRF | WHO | EU | FRANCE | GERMANY | USA | UK |
|---|---|---|---|---|---|---|---|
| Main reference documents | “Software as a Medical Device”: Possible Framework for Risk Categorization and Corresponding Considerations (IMDRF) | Classification of digital health interventions v1.0. A shared language to describe the uses of digital technology for health (WHO) | Guidance on Qualification and Classification of Software in Regulation (EU) 2017/745 – MDR and Regulation (EU) 2017/746 – IVDR (European Union) | Good Practice Guidelines on Health Apps and Smart Devices (Mobile Health or mHealth) (Haute Autorité de Santé) | The Fast‐Track Process for Digital Health Applications (DiGA) according to Section 139e SGB V: A Guide for Manufacturers, Service Providers and Users (BfArM) | Developing a Software Pre‐Certification Program: A Working Model (FDA) | Evidence Standards Framework for Digital Health Technologies (NICE) |
| Risk categorization | Two‐dimensional: (1) significance of the information provided by the SaMD to the healthcare decision; (2) state of the healthcare situation or condition | DH is classified according to the different ways in which DH interventions are used to support health system needs, not according to risk level | Two‐dimensional: (1) significance of the information provided by the SaMD to the healthcare decision; (2) state of the healthcare situation or condition | Two‐dimensional: (1) main target user: general public; patients/carers/family/patient associations; healthcare professionals working directly with their patients; healthcare professionals working with their peers (e.g., teamwork); (2) main intended use: information, general advice; primary prevention, health promotion, manual data entry and acquisition without analysis; secondary and tertiary prevention, therapeutic patient education; analysis of data, medical evaluation contributing to assessment, diagnosis, monitoring throughout the care pathway, impact on treatment | Two‐dimensional: (1) significance of the information provided by the SaMD to the healthcare decision; (2) state of the healthcare situation or condition | Two‐dimensional: (1) significance of the information provided by the SaMD to the healthcare decision; (2) state of the healthcare situation or condition | One‐dimensional: detailed functional use |
| Risk classes | Categories I, II, III, IV | DH interventions are classified around the target audience: clients; healthcare providers; health system managers; data services | Classes I, IIa, IIb, III (and A, B, C, D for IVD). Rule 11 has been introduced to address the risks related to the information provided by Medical Device Software (MDSW) and is divided into three sub‐rules (analogous reasoning applies to IVDR MDSW with classes A, C, D): 11a: intended to provide information used to take decisions with diagnostic or therapeutic purposes (MDSW is classified as class IIa, except if such decisions have an impact that may cause death or an irreversible deterioration of a person's state of health, in which case it is class III, or a serious deterioration of a person's state of health or a surgical intervention, in which case it is class IIb); 11b: intended to monitor physiological processes or parameters (MDSW is classified as class IIa, except if it is intended for monitoring vital physiological parameters, where the nature of variations of those parameters is such that it could result in immediate danger to the patient, in which case it is class IIb); 11c: all other uses (class I) | Low criticality, medium criticality, high criticality | Classes I and IIa | Categories I, II, III, IV | 10 classes grouped in Evidence Tiers A, B, C: system services (DHTs with no measurable patient outcomes but which provide services to the health and social care system); inform (provides information, resources or activities to the public, patients or clinicians, including information about a condition or general health and lifestyle); health diaries (general health monitoring using fitness wearables and simple symptom diaries); communicate (allows two‐way communication between citizens, patients or healthcare professionals); preventative behavior change (addresses public health issues such as smoking, eating, alcohol, sexual health, sleeping and exercise); self‐manage (allows people to self‐manage a specified condition; may include behavior change techniques); treat (provides or guides treatment); active monitoring (tracks patient location; uses wearables to measure, record or transmit data, or both, about a specified condition; uses data to guide care); calculate (a calculator that impacts on treatment, diagnosis or care); diagnose (diagnoses a specified condition or guides diagnoses) |
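The MDR Rule 11 sub‐rules summarized in the EU column amount to a small decision tree. A minimal sketch in Python (function and parameter names are illustrative, not taken from the regulation; the IVD classes A–D are omitted):

```python
from enum import Enum


class MdrClass(Enum):
    """EU MDR risk classes relevant to Medical Device Software."""
    I = "I"
    IIA = "IIa"
    IIB = "IIb"
    III = "III"


def classify_mdsw_rule_11(
    informs_diagnostic_or_therapeutic_decisions: bool,
    monitors_physiological_processes: bool,
    decisions_may_cause_death_or_irreversible_deterioration: bool = False,
    decisions_may_cause_serious_deterioration_or_surgery: bool = False,
    monitors_vital_parameters_with_immediate_danger: bool = False,
) -> MdrClass:
    # Sub-rule 11a: software providing information used for diagnostic or
    # therapeutic decisions is class IIa, escalated according to the
    # severity of the consequences those decisions may have.
    if informs_diagnostic_or_therapeutic_decisions:
        if decisions_may_cause_death_or_irreversible_deterioration:
            return MdrClass.III
        if decisions_may_cause_serious_deterioration_or_surgery:
            return MdrClass.IIB
        return MdrClass.IIA
    # Sub-rule 11b: monitoring physiological processes is class IIa, or IIb
    # when vital parameters are monitored and their variations could put
    # the patient in immediate danger.
    if monitors_physiological_processes:
        if monitors_vital_parameters_with_immediate_danger:
            return MdrClass.IIB
        return MdrClass.IIA
    # Sub-rule 11c: all other medical device software is class I.
    return MdrClass.I
```

For example, an app informing therapeutic decisions whose consequences could require a surgical intervention would fall under sub‐rule 11a and land in class IIb.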
Assessment approaches across jurisdictions
| Jurisdictions | IMDRF | WHO | EU | FRANCE | GERMANY | USA | UK |
|---|---|---|---|---|---|---|---|
| Organization | IMDRF | WHO | EC | HAS | BfArM | FDA | NICE, NHSX |
| Main reference documents | Software as a Medical Device (SaMD): Clinical Evaluation (IMDRF); Software as a Medical Device (SaMD): Application of Quality Management System (IMDRF) | Classification of digital health interventions v1.0. A shared language to describe the uses of digital technology for health (WHO) | Guidance on Qualification and Classification of Software in Regulation (EU) 2017/745 – MDR and Regulation (EU) 2017/746 – IVDR (Medical Device Coordination Group); Guidance on Clinical Evaluation (MDR)/Performance Evaluation (IVDR) of Medical Device Software (Medical Device Coordination Group) | Good Practice Guidelines on Health Apps and Smart Devices (Mobile Health or mHealth) (Haute Autorité de Santé) | The Fast‐Track Process for Digital Health Applications (DiGA) according to Section 139e SGB V: A Guide for Manufacturers, Service Providers and Users (BfArM) | Developing a Software Pre‐Certification Program: A Working Model (FDA) | Evidence Standards Framework for Digital Health Technologies (ESFDHT) (NICE); Digital Technology Assessment Criteria for Health and Social Care (DTAC) (NHSX) |
| Target focus | Product | Product | Product | Product | Product | Product & Firm | Product |
| Assessment approach | Clinical evaluation model embedded in SaMD realization and use processes | Lifecycle approach | Continuous and iterative clinical (MDR)/performance (IVDR) evaluation embedded in the QMS (IMDRF‐inspired) | Step‐wise evaluation model based on the assessment domains below | Fast‐track process for the assessment of DiGA, focusing on product quality and evidence underpinning positive healthcare effects | Total Product Lifecycle (TPLC) approach for continued evaluation from pre‐market development to post‐market performance | Risk‐based model for evaluation, with cumulative evidence requirements for demonstrating effectiveness and economic impact |
| Assessment domains/steps | Clinical validity; analytical validation; clinical validation | Functionality, i.e., the ability to support the desired intervention; stability, i.e., the ability to remain functional both under normal and anticipated peak conditions for data loads; fidelity, i.e., whether the intervention is delivered as intended from both a technical and user perspective (monitoring fidelity comprises three categories: (a) monitoring technical fidelity in the implementation process; (b) monitoring external barriers that might cause the system not to function as expected; (c) monitoring the compliance of the digital health system users); quality, i.e., the overall quality of the intervention in terms of excellence, values, conformance, fitness for purpose and ability to meet or exceed expectations (two aspects should be addressed when monitoring quality: user capabilities, to guarantee users enter information accurately and use the system correctly, and the overall quality of the intervention as a prerequisite for its effectiveness) | Planning; evidence generation; demonstrating clinical association to the outcomes; updating clinical evaluation through post‐marketing clinical follow‐up | Informing the user: description, consent. Health content: design of initial content, standardization, generated content, interpreted content. Technical content: technical design, data flow. Security/reliability: cybersecurity, reliability, confidentiality. Usability/use: usability/design, acceptability, integration/import | Mandated requirements: security, functionality, quality, data protection, data security, interoperability. Positive care effects: medical benefits, patient‐relevant structural and procedural improvements | Excellence appraisal (product quality, patient safety, clinical responsibility, proactive culture); review pathway determination; streamlined pre‐market review; real‐world performance | ESF: clinical effectiveness; economic impact; clinical safety; data protection; technical assurance; interoperability; usability and accessibility |
| Type of recommendations | N/A | N/A | N/A | N/A | Permanent acceptance; preliminary acceptance | N/A | N/A |
| Economic domains | N/A | N/A | N/A | N/A | N/A | N/A | Economic impact relative to the financial risk: key economic information; appropriate economic analysis; economic analysis reporting standards |
| Study designs | N/A | Hybrid approaches | Prospective studies may be required for higher‐risk MDSW, whereas retrospective analyses may be sufficient for lower‐risk ones | N/A | Minimum requirement of one retrospective comparative study (although prospective studies are preferred), conducted at least partially in Germany; recognized relevance of alternative study designs (e.g., PCT, SMART, or MOST) | N/A | Tier C: high‐quality observational or quasi‐experimental studies (minimum); high‐quality (quasi‐)experimental intervention studies (best practice); high‐quality RCT or meta‐analysis of RCTs (best practice) |
| Real‐world data | Clinical evaluation should be intended as a dynamic summary that changes through continuous real‐world performance learning | Clinical evaluation must be generated as a blend of effectiveness and implementation trial elements | Manufacturers are asked to continuously monitor device safety, effectiveness and performance | N/A | Declared preference for RWD based on healthcare practice | Real‐world performance (RWP) analytics framework: real‐world health analytics; user experience analytics; product performance analytics | Request for ongoing data collection to show DHT usage and value |
| Reference to AI/ML technologies | IMDRF Artificial Intelligence Medical Devices (AIMD) Work Item (IMDRF) | N/A | The White Paper on Artificial Intelligence is a non‐industry‐specific document that discusses the adoption of AI across sectors and activities (EC); its most relevant aspect for SaMD is the suggestion of a conformity assessment for high‐risk AI applications | N/A | N/A | Artificial Intelligence and Machine Learning in Software as a Medical Device (FDA); two principles: SaMD Pre‐Specifications and Algorithm Change Protocol; document approach; focused FDA review pathway | An additional evidence framework is expected from the collaboration between NICE and MHRA (work in progress) |
Multi‐dimensional assessment framework of mobile medical apps (MMAs). The DECALOGUE
| Domain | Recommendation |
|---|---|
| MMA development and update | Shared decision‐making approaches to app development can enhance the replicability of apps and ultimately lead to improved outcomes |
| Classification and evidentiary requirements | An unambiguous classification of MMAs should be associated with corresponding evidence generation requirements and the possibility of flexibly reviewing the associated level of risk of each MMA |
| Performance and maintenance monitoring | MMA manufacturers must ensure that technical system implementation does not threaten the overall effectiveness. Analytical validity must be determined by the manufacturers and a core set of indicators should be developed that are coherent with the MMA classification |
| Usability testing | Usability should be continuously monitored, both in the development phase of the solution and after its implementation in the field |
| Additional outcome domains | Besides conventional outcomes similar to all healthcare technologies, patient empowerment associated with use and potential environmental impact are distinguishing outcome domains of MMAs that, where relevant, should be appropriately measured and valued |
| Clinical evidence requirements | Flexible study designs that account for the specific characteristics of MMAs and can generate fast and efficient results should be adopted and coupled with flexible policy arrangements, based on the level of risk and the position in the product lifecycle of the app |
| Safety and security | Assessing the risks related to data privacy, cybersecurity and misinformation, and their potential impact, is a particularly important issue for MMAs |
| Equity considerations | Given the context of escalating inequalities and a persistent technological divide, it is paramount to evaluate the net effect of MMAs on equity |
| Organizational assessment | The assessment of direct and indirect implications of the adoption of MMAs on the organizational level should focus on process, people, structure, culture or management impacts |
| Cost and economic evaluation | The evaluation of the cost and economic impacts of MMAs requires no novel forms or methodologies (reporting should follow established guidelines for economic evaluations), but rather new outcome metrics and different cost structures |