
The NASSS (Non-Adoption, Abandonment, Scale-Up, Spread and Sustainability) framework use over time: A scoping review

  • Hwayeon Danielle Shin,

    Roles Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Project administration, Validation, Visualization, Writing – original draft, Writing – review & editing

    hdanielle.shin@mail.utoronto.ca

    Affiliations Institute of Health Policy, Management, and Evaluation, University of Toronto, Toronto, Ontario, Canada, Krembil Centre for Neuroinformatics, Centre for Addiction and Mental Health, Toronto, Ontario, Canada

  • Emily Hamovitch,

    Roles Data curation, Formal analysis, Investigation, Methodology, Validation, Visualization, Writing – original draft, Writing – review & editing

    Affiliation Institute of Health Policy, Management, and Evaluation, University of Toronto, Toronto, Ontario, Canada

  • Evgenia Gatov,

    Roles Data curation, Formal analysis, Investigation, Methodology, Validation, Visualization, Writing – original draft, Writing – review & editing

    Affiliation Institute of Health Policy, Management, and Evaluation, University of Toronto, Toronto, Ontario, Canada

  • Madison MacKinnon,

    Roles Data curation, Formal analysis, Investigation, Methodology, Validation, Visualization, Writing – original draft, Writing – review & editing

    Affiliations Institute of Health Policy, Management, and Evaluation, University of Toronto, Toronto, Ontario, Canada, The Centre for Addiction and Mental Health, Toronto, Ontario, Canada

  • Luma Samawi,

    Roles Data curation, Formal analysis, Investigation, Methodology, Writing – review & editing

    Affiliations Institute of Health Policy, Management, and Evaluation, University of Toronto, Toronto, Ontario, Canada, Child Health Evaluative Sciences, The Peter Gilgan Centre for Research and Learning, The Hospital for Sick Children, Toronto, Ontario, Canada

  • Rhonda Boateng,

    Roles Data curation, Formal analysis, Investigation, Methodology, Writing – review & editing

    Affiliation Institute of Health Policy, Management, and Evaluation, University of Toronto, Toronto, Ontario, Canada

  • Kevin E. Thorpe,

    Roles Supervision, Writing – review & editing

    Affiliation Institute of Health Policy, Management, and Evaluation, University of Toronto, Toronto, Ontario, Canada

  • Melanie Barwick

    Roles Supervision, Writing – review & editing

    Affiliations Institute of Health Policy, Management, and Evaluation, University of Toronto, Toronto, Ontario, Canada, Child Health Evaluative Sciences, The Peter Gilgan Centre for Research and Learning, The Hospital for Sick Children, Toronto, Ontario, Canada, Department of Psychiatry, University of Toronto, Toronto, Ontario, Canada

Abstract

The Non-adoption, Abandonment, Scale-up, Spread, Sustainability (NASSS) framework (2017) was established as an evidence-based, theory-informed tool to predict and evaluate the success of implementing health and care technologies. While the NASSS is gaining popularity, its use has not been systematically described. Literature reviews on the applications of popular implementation frameworks, such as the RE-AIM and the CFIR, have enabled their advancement in implementation science. Similarly, we sought to advance the science of implementation and application of theories, models, and frameworks (TMFs) in research by exploring the application of the NASSS in the five years since its inception. We aimed to understand the characteristics of studies that used the NASSS, how it was used, and the lessons learned from its application. We conducted a scoping review following the JBI methodology. On December 20, 2022, we searched the following databases: Ovid MEDLINE, EMBASE, PsycINFO, CINAHL, Scopus, Web of Science, and LISTA. We used typologies and frameworks to characterize evidence to address our aim. This review included 57 studies that were qualitative (n=28), mixed/multi-methods (n=13), case studies (n=6), observational (n=3), experimental (n=3), and other designs (e.g., quality improvement) (n=4). The four most common types of digital applications being implemented were telemedicine/virtual care (n=24), personal health devices (n=10), digital interventions such as internet Cognitive Behavioural Therapies (n=10), and knowledge generation applications (n=9). Studies used the NASSS to inform study design (n=9), data collection (n=35), analysis (n=41), data presentation (n=33), and interpretation (n=39). Most studies applied the NASSS retrospectively to implementation (n=33). The remainder applied the NASSS prospectively (n=15) or concurrently (n=8) with implementation. We also collated reported barriers and enablers to implementation.
We found that the most frequently reported barriers fell within the Organization and Adopter System domains, and the most frequently reported enablers fell within the Value Proposition domain. Eighteen studies highlighted the NASSS as a valuable and practical resource, particularly for unravelling complexity, understanding the context in which health technology is implemented, and adapting flexibly to researchers’ needs. Most studies used the NASSS retrospectively, which may be attributed to the framework’s novelty. However, this finding highlights the need for prospective and concurrent application of the NASSS within the implementation process. In addition, almost all included studies reported multiple domains as barriers and enablers to implementation, indicating that implementation is a highly complex process that requires careful preparation to ensure success. Finally, we identified a need for better reporting when using the NASSS in implementation research to contribute to the collective knowledge in the field.

Author summary

In this review, we explored how the Non-adoption, Abandonment, Scale-up, Spread, and Sustainability (NASSS) framework has been used in health and care technology research since its introduction in 2017. The NASSS helps researchers and practitioners understand why some healthcare innovations succeed while others fail. We reviewed 57 studies to see how the framework was applied, what types of technologies it supported, and what lessons were learned from its use. Most studies used the NASSS after implementation to reflect on what worked and what did not, suggesting that researchers are still learning how to apply it during the planning stages. We found that common barriers to successful implementation often occurred within organizations and among the people expected to adopt new technologies, while success was more likely when the technology was perceived as valuable (for example, through benefits such as improved patient outcomes, increased access to care, or enhanced organizational processes). Our findings highlight the importance of using frameworks like the NASSS earlier in the process to anticipate challenges and improve outcomes. We also emphasize the need for clearer reporting when using the NASSS so that others can learn from these experiences and continue to improve how healthcare technologies are implemented in real-world settings.

Introduction

Healthcare technology innovations hold considerable promise for enhancing patient outcomes and service efficiency, but they frequently remain confined to small-scale demonstration initiatives [15]. Moreover, current evidence indicates a prevalent pattern of non-adoption and abandonment of healthcare technology innovations by their intended users, with limited success in integrating these innovations into regular practice or expanding their implementation to different contexts [6]. This challenge is especially evident in complex healthcare settings, where the multifaceted nature of the innovations and the environment can create barriers to successful implementation [7].

Healthcare is described as a complex adaptive system, discouraging simplistic linear cause-and-effect reasoning [8,9]. Instead, there is a growing recognition of the need to emphasize dynamic processes while implementing healthcare practices. This change in perspective reflects an understanding that healthcare is influenced by multifaceted interactions and feedback loops that cannot be adequately explained by linear models alone. In response to this evolving perspective, the Non-Adoption, Abandonment, Scale-up, Spread, and Sustainability (NASSS) framework was introduced in 2017 [10]. The NASSS was developed as an evidence-based and theory-informed approach to enhance the ability to predict and assess the success of implementing innovative technologies in healthcare [10]. Related complexity assessment tools (i.e., NASSS-CAT) were developed in 2020 to enhance understanding, guide monitoring, and facilitate research on technology projects in healthcare or social care settings through collaborations [11].

The NASSS encompasses seven distinct domains: 1) Illness/Condition; 2) Technology; 3) Value Proposition; 4) Adopter System; 5) Organization(s); 6) Wider Context; and 7) Embedding and Adaptation Over Time [10]. Each domain can be categorized as simple, complicated, or complex [10]. The greater the complexity observed within these domains, the more obstacles will likely arise, hindering the successful adoption, scale-up, spread, and sustainability of innovative health and care technologies [10]. The NASSS framework considers the intricate web of dynamic interactions that influence the adoption and outcomes of innovations and aims to provide a more comprehensive and accessible tool for evaluating and improving the implementation of healthcare innovations [10].

Although new, the NASSS framework has been well received. The seminal paper, published in the Journal of Medical Internet Research [10], had received nearly 1,060 citations at the time of writing. This surge in interest reflects the widespread adoption of the NASSS, which has been utilized prospectively and retrospectively to assess patient-oriented technologies and tools for decision-making purposes [12,13]. For example, Gremyr et al. [12] applied the NASSS and identified various implementation complexities across several domains for their point-of-care dashboard supporting schizophrenia care. They then used the NASSS-CAT to generate recommendations for the development and deployment of the dashboard [12]. Their analysis revealed the need for a clear value proposition that includes detailed information on costs, benefits, and risks, which can help guide decisions on allocating additional resources or discontinuing further development [12]. As such, the NASSS has gained popularity in implementation research. However, its use following its release has not been systematically documented, and a comprehensive analysis of the framework’s contributions and the insights derived from its application has not been conducted.

The applications of popular implementation theories, models, and frameworks (TMFs), such as the Reach, Effectiveness, Adoption, Implementation, and Maintenance (RE-AIM) and the Consolidated Framework for Implementation Research (CFIR), have been well documented in the literature. For example, several literature reviews have been written on the RE-AIM since its inception in 1999 [14,15]. These reviews have described and assessed the application of the RE-AIM, enabling its advancement (e.g., the enhanced RE-AIM/Pragmatic Robust Implementation and Sustainability Model (PRISM) 2019) and novel applications, such as combining the RE-AIM with the Pragmatic Explanatory Continuum Indicator Summary (PRECIS) model [14,15]. Similarly, we aimed to contribute to the field of implementation science by exploring the NASSS framework’s applications to date and identifying opportunities to advance the framework. A scoping review methodology was deemed most appropriate because our primary objective was to provide a breadth of literature currently available on the NASSS application [16]. A preliminary search of PROSPERO, MEDLINE, the Cochrane Database of Systematic Reviews, Open Science Framework, and JBI Evidence Synthesis was conducted in October 2022. No current or in-progress scoping or systematic reviews on the topic were identified.

Review questions

  1. What are the characteristics of studies that used the NASSS framework?
  2. How has the NASSS framework been used in the identified studies, including, but not limited to, timing within implementation, depth of application, and use in combination with other tools (e.g., the NASSS-CAT)?
  3. What are the author-reported lessons learned from applying the NASSS framework?

Inclusion and exclusion criteria

Concept.

This review included all studies that used the NASSS and/or NASSS-CAT framework to inform the overall research design, data collection, analysis, presentation or interpretation. Studies that referred to the NASSS framework but did not apply it were excluded. This included instances where the framework was mentioned only in the introduction or discussion sections of the paper rather than being actively used as a methodological or analytical tool.

Context and population.

There were no exclusion criteria for population and context. Any studies conducted in any context with any population were considered for inclusion. However, due to the available resources in our research team, only English-language publications were included.

Type of sources.

This review included all research designs (e.g., quantitative, observational, qualitative, and mixed methods). We also considered peer-reviewed and grey literature, including conference proceedings and dissertations, but we included only empirical studies. Non-empirical literature, such as commentaries, conceptual papers, books, and literature reviews, was excluded. Reference lists in non-empirical literature (e.g., reviews) were screened to identify relevant primary empirical studies. Only literature published since 2017, the year of the publication of the seminal NASSS framework paper, was included.

Methods

This scoping review was conducted following the JBI methodology for scoping reviews [17,18], and the manuscript was prepared in line with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR) [19]. Our a priori protocol [20] was registered on the Open Science Framework.

Search strategy

In collaboration with a health sciences librarian and following the Peer Review of Electronic Search Strategies (PRESS) guideline [21], a comprehensive search strategy was developed to locate relevant scholarly literature across multiple bibliographic databases. The strategy was refined iteratively before being validated by the librarian using the PRESS guideline. This scoping review followed the three-step search strategy outlined in the JBI methodology. First, an initial limited search of MEDLINE was undertaken to identify articles on the topic. Second, the text words in the titles and abstracts of relevant articles, and the index terms used to describe them, were used to develop a complete search strategy. The search string included terms related to the NASSS framework (“NASSS,” “NASSS-CAT”). We looked for articles containing related words (i.e., non-adoption, abandonment, scale-up, spread, sustainability, Greenhalgh, framework, model) in their titles, abstracts, or keywords. Additionally, we used proximity operators (e.g., Greenhalgh* NEAR/5 (framework* OR model*)). The entire search strategy, including all identified keywords and index terms, was then adapted for each database. Our search was undertaken on December 20, 2022, in the following databases: Ovid MEDLINE, EMBASE, PsycINFO, CINAHL, Scopus, Web of Science, and Library, Information Science and Technology Abstracts (LISTA). Third, reference lists of relevant reviews were screened to identify eligible empirical studies. The complete search strategies are provided in S1 Appendix. Since the NASSS framework was first published in 2017, databases were searched from 2017 onwards. In addition to the scholarly database search, a forward citation search [22] was conducted in Scopus and Web of Science on October 13 and 17, 2022, to complement our database searches.
The main steps in this forward citation search included using citation indexes to identify studies that cite the original NASSS framework paper published in 2017. This search strategy helped identify papers our database searches might have missed.

Study/source of evidence selection

All identified records were collated and uploaded into Covidence [23], and duplicates were automatically removed. Then, five random articles were selected for pilot testing, and all six reviewers on the team independently assessed their titles and abstracts against the inclusion criteria. After this calibration exercise, the remaining titles and abstracts were each screened by two independent reviewers (HDS, EG, MM, EH, LS, RB). Relevant papers were retrieved in full, and their citation details were imported into Covidence [23]. Two independent reviewers assessed the full texts (HDS, EG, MM, EH, LS, RB). Full-text studies that did not meet the inclusion criteria were excluded, and reasons for their exclusion were documented. Any reviewer disagreements were resolved through discussion or with a third reviewer. Scoping reviews typically do not necessitate methodological evaluation [18]; therefore, critical appraisal was omitted.

Data extraction

Teams of two independent reviewers (HDS, EH, EG, MM, RB, LS) extracted data using a data extraction tool developed in collaboration with the research team. We extracted the general characteristics of the paper, intervention characteristics, a description of the NASSS framework application, reported implementation barriers and enablers, study conclusions, and author-reported lessons learned from applying the framework. Any reviewer disagreements were resolved through discussion or with a third reviewer. See S2 Appendix for our data extraction tool.

Data analysis and presentation

A descriptive, analytical approach was used to generate summary statistics (e.g., frequency counts, percentages) in Microsoft Excel for the general characteristics of the included studies. Subsequently, a content analysis was conducted in Excel to characterize the narrative data. First, the digital applications implemented in the included studies were categorized by two reviewers (MM, HDS) by adapting the framework ‘Evolving Applications of Digital Technology in Health and Health Care’ [24]. The application categories [24] are as follows: 1) Virtual care; 2) Personal health devices; 3) Digital interventions; 4) Knowledge generation and/or integrators; 5) Health information; 6) Surgical/Radiographic interventions; 7) Diagnostic and imaging [24]. Each innovation could be mapped onto more than one category. Second, two reviewers (EH, HDS) categorized the health conditions examined in the included studies into disease types. Third, the description of the NASSS framework application in each article was assessed by teams of two independent reviewers (HDS, EH, EG, MM, RB, LS) in terms of its timing within the implementation (i.e., prospective, retrospective, concurrent) and study design aspects (e.g., overall design, data collection, data analysis). This process required some level of interpretation by the team, and any conflicts in interpretation were resolved through discussion with the remaining team members. Fourth, barriers and enablers, often reported against the primary NASSS domains, were collated; teams of two reviewers (HDS, EH, EG, MM, RB) then categorized these into subdomains of the NASSS framework. Fifth, the lessons learned reported by authors were narratively summarized. The charted results are accompanied by narrative summaries that describe how the results relate to our review objectives and questions.
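The first tabulation step (frequency counts where each innovation can map onto more than one category) can be sketched in a few lines. This is a minimal illustration only: the study records and labels below are hypothetical, and Python stands in for the Excel workflow the authors describe.

```python
from collections import Counter

# Hypothetical extraction records: each study lists one or more of the
# adapted application categories. An innovation can map onto more than
# one category, so category totals may exceed the number of studies.
studies = [
    {"id": "S01", "categories": ["Virtual care"]},
    {"id": "S02", "categories": ["Personal health devices", "Knowledge generation"]},
    {"id": "S03", "categories": ["Digital interventions"]},
    {"id": "S04", "categories": ["Virtual care", "Digital interventions"]},
]

# Flatten the multi-label assignments and count occurrences per category.
counts = Counter(cat for s in studies for cat in s["categories"])

# Summary table, most frequent category first.
for category, n in counts.most_common():
    print(f"{category}: n={n}")
```

Because the categories are not mutually exclusive, percentages computed from such counts should be reported against the number of studies, not against the sum of category assignments.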

Results

Our search strategy yielded 1,705 citations (Fig 1). Following the automatic removal of duplicates by Covidence, 823 articles underwent title and abstract screening. Then, 355 articles underwent full-text evaluation, culminating in 57 studies in this review. Most excluded studies cited the NASSS framework in the text (e.g., in the discussion) but did not use the framework to inform the study design, data collection, analyses, presentation, or interpretation. Other excluded studies were non-empirical (e.g., commentary), or their full text was unavailable.

RQ1. Characteristics of included studies

Individual study characteristics are presented in Table 1. As indicated in summary Table 2, among the 57 included studies, the majority were qualitative (n=28), followed by mixed/multi-methods (n=13), case studies (n=6), observational (n=3), experimental (n=3), and other designs (e.g., quality improvement, n=4). Many studies originated in the United Kingdom (n=15), Australia (n=13), and the United States (n=9), with a few other studies set elsewhere in Europe, Southeast Asia, and North America. Notably, the NASSS framework was developed in the United Kingdom, and several included studies were part of the initial empirical testing and refinement of the NASSS domains [25].

Since the NASSS framework was designed for health technology innovations, there were a variety of health conditions for which innovations were implemented, including cardiovascular (n=10), mental health (n=9), general health promotion (n=9), cancer (n=5), and women’s health (n=5), among others. Of the 57 included studies, 53 implemented digital applications, and the rest (n=4) implemented non-digital interventions, such as harm reduction services and COVID-19 testing strategies. Of the 53 digital applications, approximately half were virtual care, including telemedicine (n=24), followed by personal health devices (n=10), digital interventions (n=10), such as internet-based Cognitive Behavioural Therapy (iCBT), and knowledge generation applications (n=9). See Table 3 for a complete list of digital applications and examples.

RQ2. Application of the NASSS framework

As indicated in Table 4, the NASSS framework was used in various aspects of the study methodology. The NASSS framework was used to inform the overall study design (n=9), including conceptualization. Studies used the NASSS framework to inform data collection methods (n=35), for example, by adapting interview guides according to the NASSS domains (e.g., [47,72]). Studies also used the NASSS framework to inform data analysis (n=41), for example, using the NASSS framework for directed content analysis (e.g., [66]). The NASSS framework was also used to inform data presentation (n=33), such as organizing barriers and enablers by NASSS domain in a table (e.g., [54]). Finally, studies used the NASSS framework to interpret results (n=39), for example, by dedicating one paragraph of the discussion to each NASSS domain (e.g., [61]). Most papers (n=43) used the NASSS framework to inform multiple aspects of their study.

Regarding timing, most studies applied the NASSS framework retrospectively, after implementation, for example, to analyze why implementation succeeded or failed to support adoption, non-abandonment, scale-up, spread, and sustainability of the innovation within a given context (n=33). The rest applied the framework prospectively to inform future implementations (n=15) or concurrently with implementation (n=8). Approximately one-third (32%) of included studies reported implementation barriers and enablers related to all 7 NASSS domains, and 21% reported barriers and enablers related to 6 domains. The Embedding and Adaptation Over Time domain was often omitted, although studies incorporated this concept into other domains (e.g., whether the technology will require future iterations [27], whether the regulatory context is expected to change [41]). Another one-third (35%) of studies reported barriers and enablers related to four to five NASSS domains, while 12% reported barriers and enablers related to three or fewer domains. The latter relied on advisory committees to identify domains relevant to the study [43].

Studies often identified implementation determinants using the NASSS framework. The barriers and enablers of the successful implementation of innovations are presented by NASSS domain in Fig 2. The most common barriers across studies (n=47) were in the Organization domain, where organizations were cited as lacking the infrastructure, resources, or capacity to innovate and/or where the innovation substantially disrupted organizational routines. Specifically, organizational capacity, such as technical or human resources, was the most frequently reported barrier. Another common barrier within the Organization domain was the extent of change required in routine practice. The following are exemplar quotes of organizational barriers reported in studies:

Fig 2. Barriers and Enablers identified in included studies, organized according to NASSS domains.

https://doi.org/10.1371/journal.pdig.0000418.g002

“Technical infrastructure was sometimes poor, increasing the likelihood of technical crashes.” [26]

“Representatives from all three groups expressed that an impediment to engaging in the [Quality Improvement] teams was insufficient time and that meeting times conflicted with clinical engagements.” [73]

“Space and the need for dedicated and private telehealth rooms were also common concerns for clinicians. Such spaces need to be fitted with appropriate hardware, software, and peripheral devices.” [72]

“Therapists stated that the intervention was often not discussed in meetings and was not integrated in electronic patient records they used.” [53]

“Participants indicated they were concerned that administrative tasks would continue to be a significant time barrier with increased adoption and scale up.” [29]

The most reported enablers were within the Value Proposition domain. A total of 45 studies noted the technology as profitable (from the supply side) or cost-effective (from the demand side) and reported perceived advantages, including improved patient outcomes, increased access to care, enhancements in organizational processes or workflows, and overall effectiveness of the innovation. The following are exemplar quotes of enablers related to the Value Proposition domain reported in studies:

“With automated monitoring in the specialist hospital, the accuracy of recording and timely data transfer is reliable. Nurses are more aware of the need to accomplish this task when it’s automated.” [27]

“Clinicians valued telehealth for the benefits they felt it afforded patients such as convenience and improved access to care, more so than perceived advantages for themselves.” [55]

“Several practical advantages were mentioned, among which saving time for therapists and patients because of less traveling time and replacing part of in-person treatment with the intervention, an increase of patients’ access to care because they can individually work on their treatment at their own pace, and providing a new way of delivering treatment to patients.” [53]

Factors within the Adopter System domain were also commonly reported as barriers or enablers to implementation. A total of 46 studies reported Adopter System factors as barriers, and 41 studies reported them as enablers. These included the attitudes and acceptance of staff, patients, and carers towards the new technology and its ease of use. Notably, staff were more frequently reported than patients as both a barrier and an enabler. The following are exemplar quotes of barriers and enablers related to the Adopter System domain reported in studies:

“A few therapists were willing to try ICBT-i, but none were initially deeply interested in the new method, only a few were available to take on this extra task, and only a few had the appropriate competence.” [28]

“Lastly, providers described feelings of ‘Zoom fatigue’ and burnout and mentioned that video visits required more concentration, energy, and adaptations to interpret visual cues in comparison to in-person visits.” [32]

“Most patient participants were interested to see their readings and described the technology as well-designed. They used the tablet and the peripheral devices without too much difficulty and saw great value in monitoring their condition, especially in terms of gaining reassurance and legitimising help-seeking when they needed clinical care.” [13]

RQ3. Lessons learned from the application of the NASSS framework

Authors of 25 studies reported lessons learned from applying the NASSS framework, with varying levels of detail and often minimal elaboration. Eighteen studies [12,26,28,33,35,39,41,50,52,55–57,62,63,68,73,75,78] recognized the NASSS framework as a valuable and versatile tool, with researchers explicitly noting its utility in various aspects. These studies highlighted how the NASSS framework effectively aids in exploring the complexities of implementation processes, allowing for a deeper understanding of the intricate contexts in which health technologies are introduced. The framework was also praised for its applicability across different domains within health technology, proving particularly useful in navigating the multifaceted challenges of technology adoption and integration. Furthermore, many researchers appreciated its flexibility, which allowed them to adapt the framework to meet the specific needs of their studies, demonstrating its adaptability and relevance to diverse research objectives. For example, Uribe Guajardo et al. [75] highlighted the flexibility of the NASSS framework regarding the timing of application, noting its effectiveness in retrospective analyses and its ability to support conclusions about the implementation and long-term sustainability of portal and eHealth resources. The authors also suggested that future research and program design would benefit from using the NASSS framework prospectively, especially with new and revised e-resources [75]. A few studies mentioned the comprehensiveness of the tool for identifying implementation determinants and its value in providing a theoretical foundation [12,27,38]. Additionally, two studies [50,78] suggested future directions for the NASSS framework, highlighting the potential opportunity to use the NASSS-CAT tool over time and to explore its applicability in a broader healthcare context. Lastly, two studies [62,67] commented on the limitations of the NASSS framework, noting its lack of consideration for how research design can impact intervention implementation and the need for its expansion to include medical ethics.

Discussion

This scoping review identified 57 empirical studies that used the NASSS framework from its publication in August 2017 until our search in December 2022. Most included studies were qualitative or mixed/multi-methods designs, which can be attributed to the purpose of the NASSS framework in exploring determinants of implementation success. This exploration required substantial contextual information, which qualitative data could effectively provide. The NASSS framework was commonly used to inform data collection, data analysis, and the presentation of results. Almost all included studies focused on technological innovation, such as telemedicine, virtual care, health monitoring or decision support devices and applications, and targeted digital interventions. These innovations were designed for various health conditions, primarily cardiovascular and mental health, or supported general health promotion activities. While approximately one-third of studies reported barriers and enablers for implementation across all 7 NASSS domains, 20% did not report barriers or enablers related to the Embedding and Adaptation Over Time domain. The most reported barriers were found in the Organization and Adopter System domains, and the most frequently reported enablers were within the Value Proposition domain.

Most studies in this review used the NASSS framework retrospectively, primarily to evaluate why an innovation failed to be adopted by its intended users, was abandoned shortly after implementation, or did not scale to become routine within the organization, spread to other contexts, or sustain over time. Similar findings have been reported with the i-PARiHS (Integrated Promoting Action on Research Implementation in Health Services) application in research [80]. There is a need for prospective and concurrent applications of implementation TMFs to identify potential hurdles and areas of complexity ahead of implementation so that mitigation strategies can be applied [81,82]. Given the novelty of the NASSS framework, many innovations in this review were implemented in small-scale demonstration projects or as larger implementation studies not informed a priori by any theoretical framework and, therefore, required retrospective evaluation. Nevertheless, the NASSS framework does not offer solutions to identified areas of complexity. While some authors noted that the NASSS framework helped illuminate areas of focus, it remained unclear what actions they intended to take [26]. A companion document (i.e., NASSS-CAT) [11] explicitly recommends the next steps for each domain where complexity is identified; however, only four included studies had used any part of the NASSS-CAT tool [12,52,70,78].

The prevalent implementation determinants (i.e., barriers and enablers) identified in the Organization and Adopter System domains in this review are consistent with findings from previous reviews of other TMFs used in implementation science. The Exploration, Preparation, Implementation, Sustainment (EPIS) framework [83] is commonly used to highlight key phases guiding implementation, as well as factors related to the outer (system) context, the inner (organizational) context, and the innovation itself. A review of the EPIS framework showed that the Implementation phase was the most commonly examined in research; in this phase, organizational and individual adopter characteristics were the most frequently mentioned factors [84], similar to what we observed in the current NASSS framework review.

In the dynamic field of implementation science, various determinant frameworks serve a similar role in facilitating the understanding of complex factors, focusing on contextual elements that influence the successful implementation of healthcare innovations. The CFIR, a popular determinant framework that serves a similar purpose to the NASSS framework, identifies factors influencing implementation outcomes across the domains of Intervention Characteristics, Outer Setting, Inner Setting, Individual Characteristics, and Implementation Process [85]. A recent literature review of the CFIR indicated that the most commonly used constructs were "Knowledge and Beliefs about the Intervention," followed by "Self-Efficacy," both of which fall within the CFIR domain of Individual Characteristics [86]. This finding echoes the NASSS Adopter System and Value Proposition domains, in which barriers and enablers were most commonly reported in the current review.

The i-PARiHS is another implementation determinant framework, with four interacting core constructs: Evidence, Context, Recipients, and Facilitation [87,88]. The inner and outer Contexts in the i-PARiHS mirror the Organization and Wider Context domains of the NASSS. A review of research studies using the i-PARiHS [89] identified variations in how researchers conceptualized the outer Context, including specific influences from external organizations, such as guideline-producing entities, and broader political and economic characteristics attributed to "contextual trust" [89]. This conceptualization resonates with the Wider Context domain of the NASSS framework. Furthermore, leadership was suggested as another key sub-construct within the Context of the i-PARiHS [89], which corresponds to the Capacity subdomain (5A) within the Organization domain of the NASSS framework. Although the NASSS framework was initially created for the implementation of health and care technologies, it exhibits similarities with widely used implementation determinant frameworks designed for a broader range of health innovations, including health technologies and evidence-based practices. Accordingly, we found four studies in this review that used the NASSS framework for non-digital innovations [45,58,63,76], demonstrating the framework's adaptability and utility.

Our review found that, when used, the NASSS framework informed many design aspects, including data collection, analysis, presentation, and interpretation. The use of the NASSS framework in data collection and analysis was usually reported consistently and clearly; however, consistency and clarity were lacking when the framework was used to present and interpret results. Often, data were presented within the primary domains of the NASSS framework. We observed several overlaps as our team organized narrative descriptions of barriers and enablers by NASSS subdomains, and we identified the potential for these barriers to be mapped onto other primary NASSS domains. This observation may reflect the intricate nature of the implementation processes examined in the included studies and could be explained by the framework's underlying assumption that, in complex situations, the NASSS domains interact with one another and are interdependent [25]. In other words, when interdependencies among the domains exist, a single issue often cannot be addressed without inadvertently giving rise to new challenges in other NASSS domains [25].

For studies that did not present their results within the NASSS domains, even though the authors reported using the NASSS framework for data analysis, it was challenging to determine which domain(s) the results pertained to in predicting or explaining implementation success or failure. This unclear use of the NASSS framework for presenting results and interpreting findings represents a notable gap in the literature. Poor reporting in implementation studies has been documented previously [90,91]; in particular, many implementation studies have been criticized for providing inaccurate descriptions of context and lacking detailed information on the implementation process [91]. Poor reporting makes it difficult to synthesize evidence from relevant studies [90]. Therefore, enhancing reporting practices to facilitate more straightforward evidence synthesis is essential, aiding future empirical testing and refinement of the NASSS framework.

Additionally, some studies were unclear about how the NASSS framework was used to inform the overall study design. Clear reporting standards may increase the utility of the NASSS framework by guiding researchers on correctly applying and describing its use. The need for better reporting on how TMFs are used in implementation research is a previously documented gap in the literature [92]. For example, one review identified 159 different implementation TMFs, of which 87% were used in five or fewer studies [92]. Despite the substantial number of TMFs, there is limited evidence describing their use [92], which restricts opportunities for advancing the science and learning from other researchers. Implementation studies should report more clearly on how TMFs have been incorporated into the study design [93]. Better reporting allows for a coherent synthesis of evidence and for the application and scaling of TMFs to other contexts, thereby contributing to implementation science [93]. We also found that few authors shared their experience of using the NASSS framework or provided suggestions for its advancement. Two studies in this review mentioned shortcomings of the NASSS framework [62,67], including its omission of ethical principles, which were later addressed in the Planning and Evaluating Remote Consultation Services framework in 2021 [94]. It would be beneficial to conduct a review in five years to reassess the application of the NASSS framework, explore grey literature, and gather lessons learned for the ongoing advancement and refinement of the framework.

Reporting issues have led to the creation of reporting checklists in other fields, such as the Consolidated Standards of Reporting Trials (CONSORT) checklist for randomized controlled trials [95]. Some implementation reporting standards are available; one example is the Standards for Reporting Implementation Studies (StaRI) Statement and Checklist [91]. The StaRI checklist prompts authors to describe both the implementation method and the intervention [91], encouraging detailed reporting of contextual information. It also prompts authors to describe the theoretical underpinnings of the study. Therefore, its use in future implementation studies is encouraged and may improve reporting of TMF applications, including the NASSS framework.

Limitations

Several limitations of this review must be acknowledged. First, quality appraisal was not used to exclude studies, as scoping reviews generally do not require such assessment; moreover, our primary goal was to explore the breadth and depth of the literature and map what has been published on the application of the NASSS framework. Second, the field of mHealth is rapidly evolving, and our findings may require re-evaluation over time; nevertheless, our review remains relevant at the time of publication and contributes to the ongoing evolution of the NASSS framework. Third, this review excluded non-empirical papers, such as commentaries and opinion articles, which could offer authors' insights regarding their experiences with the NASSS framework. Future reviews aiming to reassess the application of the NASSS framework could include grey literature to enhance comprehensiveness. Fourth, we only included studies written in English. While we did include a small number of English-language studies conducted in non-English-speaking countries, our findings may not comprehensively represent the NASSS framework's application in those regions.

Conclusions

This review outlines the characteristics of studies using the NASSS framework and examines patterns in its application. Most included studies employed qualitative or mixed/multi-methods designs, which aligns with the NASSS framework's purpose of exploring determinants of implementation success, an exploration that often requires qualitative inquiry to assess context. Additionally, most studies applied the NASSS framework retrospectively, likely owing to the framework's novelty; this highlights a gap in the current literature and the need for prospective and concurrent use of the framework during the implementation phase. Furthermore, nearly all included studies identified various domains as containing both implementation barriers and enablers, which aligns with the current literature describing the intricate nature of the implementation process and underscores the importance of thorough preparation for achieving successful implementation outcomes [96]. Lastly, our findings highlight the need for improved reporting on how the NASSS framework is used in research, as well as greater consistency in presenting results and interpreting findings with the framework, to facilitate future evidence synthesis.

Acknowledgments

HDS conceptualized this review, and HDS, EH, EG, MM, LS, and RB designed the review protocol following the JBI methodology. LS and HDS wrote the review protocol and developed the search strategy with a librarian. HDS, EH, EG, MM, LS, and RB participated in screening titles and abstracts and assessing full texts against the inclusion criteria. HDS, EH, EG, MM, LS, and RB participated in data extraction. HDS, EH, EG, and MM designed the data analysis plan. HDS, EH, EG, MM, LS, and RB participated in data analysis. HDS, EH, EG, MM, KET, and MB participated in data interpretation. HDS, EH, EG, and MM developed tables and figures for data presentation. HDS, EH, EG, and MM wrote the first draft of the review report. All authors critically reviewed and provided feedback on the manuscript. HDS worked on manuscript revisions. KET and MB supervised this review.

References

  1. 1. Ayyoubzadeh SM, Niakan Kalhori SR, Shirkhoda M, Mohammadzadeh N, Esmaeili M. Supporting colorectal cancer survivors using eHealth: a systematic review and framework suggestion. Support Care Cancer Support Care Cancer. 2020;28:3543–55.
  2. 2. Begin HonM, Eggertson L, Macdonald N. A country of perpetual pilot projects. Canadian Medical Association Journal. 2009;180(12):1185–1185.
  3. 3. Christ C, Schouten MJ, Blankers M, van Schaik DJ, Beekman AT, Wisman MA, et al. Internet and Computer-Based Cognitive Behavioral Therapy for Anxiety and Depression in Adolescents and Young Adults: Systematic Review and Meta-Analysis. J Med Internet Res. 2020;22(9):e17831. pmid:32673212
  4. 4. Lv M, Wu T, Jiang S, Chen W, Zhang J. Effects of Telemedicine and mHealth on Systolic Blood Pressure Management in Stroke Patients: Systematic Review and Meta-Analysis of Randomized Controlled Trials. JMIR Mhealth Uhealth. 2021;9(6):e24116. pmid:34114961
  5. 5. Pouls BPH, Vriezekolk JE, Bekker CL, Linn AJ, van Onzenoort HAW, Vervloet M, et al. Effect of Interactive eHealth Interventions on Improving Medication Adherence in Adults With Long-Term Medication: Systematic Review. J Med Internet Res. 2021;23(1):e18901. pmid:33416501
  6. 6. Schreiweis B, Pobiruchin M, Strotbaum V, Suleder J, Wiesner M, Bergh B. Barriers and Facilitators to the Implementation of eHealth Services: Systematic Literature Analysis. J Med Internet Res. 2019;21(11):e14197. pmid:31755869
  7. 7. Cresswell K, Sheikh A. Organizational issues in the implementation and adoption of health information technology innovations: an interpretative review. Int J Med Inform. 2013;82(5):e73-86. pmid:23146626
  8. 8. Glover WJ, Nissinboim N, Naveh E. Examining innovation in hospital units: a complex adaptive systems approach. BMC Health Serv Res. 2020;20(1):554. pmid:32552869
  9. 9. Greenhalgh T, Papoutsi C. Studying complexity in health services research: desperately seeking an overdue paradigm shift. BMC Med. 2018;16(1):95. pmid:29921272
  10. 10. Greenhalgh T, Wherton J, Papoutsi C, Lynch J, Hughes G, A’Court C, et al. Beyond Adoption: A New Framework for Theorizing and Evaluating Nonadoption, Abandonment, and Challenges to the Scale-Up, Spread, and Sustainability of Health and Care Technologies. J Med Internet Res. 2017;19(11):e367. pmid:29092808
  11. 11. Greenhalgh T, Maylor H, Shaw S, Wherton J, Papoutsi C, Betton V, et al. The NASSS-CAT Tools for Understanding, Guiding, Monitoring, and Researching Technology Implementation Projects in Health and Social Care: Protocol for an Evaluation Study in Real-World Settings. JMIR Res Protoc. 2020;9(5):e16861. pmid:32401224
  12. 12. Gremyr A, Andersson Gäre B, Greenhalgh T, Malm U, Thor J, Andersson A-C. Using Complexity Assessment to Inform the Development and Deployment of a Digital Dashboard for Schizophrenia Care: Case Study. J Med Internet Res. 2020;22(4):e15521. pmid:32324143
  13. 13. Papoutsi C, A’Court C, Wherton J, Shaw S, Greenhalgh T. Explaining the mixed findings of a randomised controlled trial of telehealth with centralised remote support for heart failure: multi-site qualitative study using the NASSS framework. Trials. 2020;21(1):891. pmid:33109254
  14. 14. Gaglio B, Shoup JA, Glasgow RE. The RE-AIM framework: a systematic review of use over time. Am J Public Health. 2013;103(6):e38-46. pmid:23597377
  15. 15. Glasgow RE, Harden SM, Gaglio B, Rabin B, Smith ML, Porter GC, et al. RE-AIM Planning and Evaluation Framework: Adapting to New Science and Practice With a 20-Year Review. Front Public Health. 2019;7:64. pmid:30984733
  16. 16. Wickremasinghe D, Kuruvilla S, Mays N, Avan BI. Taking knowledge users’ knowledge needs into account in health: an evidence synthesis framework. Health Policy Plan. 2016;31(4):527–37. pmid:26324232
  17. 17. Peters M, Godfrey C, McInerney P, Baldini Soares C, Khalil H, Parker D. Chapter 11: Scoping Reviews. In: Aromataris E, Munn Z, editors. Joanna Briggs Institute Reviewer’s Manual. The Joanna Briggs Institute; 2017.
  18. 18. Peters MDJ, Marnie C, Tricco AC, Pollock D, Munn Z, Alexander L, et al. Updated methodological guidance for the conduct of scoping reviews. JBI Evid Synth. 2020;18(10):2119–26. pmid:33038124
  19. 19. Tricco AC, Lillie E, Zarin W, O’Brien KK, Colquhoun H, Levac D, et al. PRISMA Extension for Scoping Reviews (PRISMA-ScR): Checklist and Explanation. Ann Intern Med Ann Intern Med. 2018;169: 467–73.
  20. 20. Shin HD, Samawi L, Gatov J, Hamovitch E, MacKinnon M, Boateng R, et al. The NASSS (Non-Adoption, Abandonment, Scale-Up, Spread and Sustainability) framework use over time: A scoping review protocol. 2022. Available: https://osf.io/74csw/
  21. 21. McGowan J, Sampson M, Salzwedel DM, Cogo E, Foerster V, Lefebvre C. PRESS Peer Review of Electronic Search Strategies: 2015 Guideline Statement. J Clin Epidemiol J Clin Epidemiol. 2016;75:40–6.
  22. 22. Wright K, Golder S, Rodriguez-Lopez R. Citation searching: a systematic review case study of multiple risk behaviour interventions. BMC Med Res Methodol. 2014;14:73.
  23. 23. Covidence systematic review software. Melbourne, Australia.; 2019. Available: www.covidence.org
  24. 24. Abernethy A, Adams L, Barrett M, Bechtel C, Brennan P, Butte A, et al. The Promise of Digital Health: Then, Now, and the Future. NAM Perspect. 2022;2022:10.31478/202206e. pmid:36177208
  25. 25. Greenhalgh T, Wherton J, Papoutsi C, Lynch J, Hughes G, A’Court C, et al. Analysing the role of complexity in explaining the fortunes of technology programmes: empirical application of the NASSS framework. BMC Med. 2018;16(1):66. pmid:29754584
  26. 26. Abimbola S, Patel B, Peiris D, Patel A, Harris M, Usherwood T, et al. The NASSS framework for ex post theorisation of technology-supported change in healthcare: worked example of the TORPEDO programme. BMC Med. 2019;17(1):233. pmid:31888718
  27. 27. Alhmoud B, Banerjee A, Bonnici T, Patel R, Melley D, Hicks L. Implementation of a digital early warning score (NEWS2) in a cardiac specialist and general hospital settings in the COVID-19 pandemic. Intensive Care Med Exp. 2022;10.
  28. 28. Kadesjö Banck J, Bernhardsson S. Experiences from implementation of internet-delivered cognitive behaviour therapy for insomnia in psychiatric health care: a qualitative study applying the NASSS framework. BMC Health Serv Res. 2020;20(1):729. pmid:32771024
  29. 29. Barnett A, Kelly JT, Wright C, Campbell KL. Technology-supported models of nutrition care: Perspectives of health service providers. Digit Health. 2022;8:20552076221104670. pmid:35677784
  30. 30. Bezuidenhout L, Joseph C, Thurston C, Rhoda A, English C, Conradsson DM. Telerehabilitation during the COVID-19 pandemic in Sweden: a survey of use and perceptions among physiotherapists treating people with neurological diseases or older adults. BMC Health Serv Res. 2022;22(1):555. pmid:35473602
  31. 31. Brown P, Waite F, Lambe S, Jones J, Jenner L, Diamond R, et al. Automated Virtual Reality Cognitive Therapy (gameChange) in Inpatient Psychiatric Wards: Qualitative Study of Staff and Patient Views Using an Implementation Framework. JMIR Form Res. 2022;6(4):e34225. pmid:35412462
  32. 32. Budhwani S, Fujioka JK, Chu C, Baranek H, Pus L, Wasserman L, et al. Delivering Mental Health Care Virtually During the COVID-19 Pandemic: Qualitative Evaluation of Provider Experiences in a Scaled Context. JMIR Form Res. 2021;5(9):e30280. pmid:34406967
  33. 33. Cartledge S, Rawstorn JC, Tran M, Ryan P, Howden EJ, Jackson A. Telehealth is here to stay but not without challenges: a consultation of cardiac rehabilitation clinicians during COVID-19 in Victoria, Australia. Eur J Cardiovasc Nurs. 2022;21(6):548–58. pmid:34935940
  34. 34. Catapan S de C, Taylor A, Calvo MCM. Health professionals’ views of medical teleconsultation uptake in the Brazilian Unified Health System: A description using the NASSS framework. Int J Med Inform. 2022;168:104867. pmid:36228416
  35. 35. Clarkson P, Vassilev I, Rogers A, Brooks C, Wilson N, Lawson J, et al. Integrating a Web-Based Self-Management Tool (Managing Joint Pain on the Web and Through Resources) for People With Osteoarthritis-Related Joint Pain With a Web-Based Social Network Support Tool (Generating Engagement in Network Involvement): Design, Development, and Early Evaluation. JMIR Form Res. 2020;4(11):e18565. pmid:33242011
  36. 36. Davies SM, Jardine J, Gutridge K, Bernard Z, Park S, Dawson T, et al. Preventive Digital Mental Health for Children in Primary Schools: Acceptability and Feasibility Study. JMIR Form Res. 2021;5(12):e30668. pmid:34898446
  37. 37. Dijkstra A, Heida A, van Rheenen PF. Exploring the Challenges of Implementing a Web-Based Telemonitoring Strategy for Teenagers With Inflammatory Bowel Disease: Empirical Case Study. J Med Internet Res. 2019;21(3):e11761. pmid:30924785
  38. 38. Dyb K, Berntsen GR, Kvam L. Adopt, adapt, or abandon technology-supported person-centred care initiatives: healthcare providers’ beliefs matter. BMC Health Serv Res. 2021;21(1).
  39. 39. Edridge C, Deighton J, Wolpert M, Edbrooke-Childs J. The Implementation of an mHealth Intervention (ReZone) for the Self-Management of Overwhelming Feelings Among Young People. JMIR Form Res. 2019;3(2):e11958. pmid:31045499
  40. 40. Fox D, Coddington R, Scarf V, Bisits A, Lainchbury A, Woodworth R, et al. Harnessing technology to enable all women mobility in labour and birth: feasibility of implementing beltless non-invasive fetal ECG applying the NASSS framework. Pilot Feasibility Stud. 2021;7:214.
  41. 41. Franck LS, Kriz RM, Rego S, Garman K, Hobbs C, Dimmock D. Implementing Rapid Whole-Genome Sequencing in Critical Care: A Qualitative Study of Facilitators and Barriers to New Technology Adoption. J Pediatr. 2021;237:237-243.e2. pmid:34023348
  42. 42. Gorbenko K, Mohammed A, Ezenwafor E, Phlegar S, Healy P, Solly T, et al. Innovating in a crisis: a qualitative evaluation of a hospital and Google partnership to implement a COVID-19 inpatient video monitoring program. J Am Med Inform Assoc. 2022;29:1618–30.
  43. 43. Grady A, Barnes C, Wolfenden L, Lecathelinais C, Yoong SL. Barriers and Enablers to Adoption of Digital Health Interventions to Support the Implementation of Dietary Guidelines in Early Childhood Education and Care: Cross-Sectional Study. J Med Internet Res. 2020;22.
  44. 44. Greenhalgh T, Shaw S, Wherton J, Vijayaraghavan S, Morris J, Bhattacharya S, et al. Real-World Implementation of Video Outpatient Consultations at Macro, Meso, and Micro Levels: Mixed-Method Study. J Med Internet Res. 2018;20(4):e150. pmid:29625956
  45. 45. Hall A, Ewing G, Rowland C, Grande G. A drive for structure: A longitudinal qualitative study of the implementation of the Carer Support Needs Assessment Tool (CSNAT) intervention during hospital discharge at end of life. Palliat Med. 2020;34(8):1088–96. pmid:32491967
  46. 46. Hammerton M, Benson T, Sibley A. Readiness for five digital technologies in general practice: perceptions of staff in one part of southern England. BMJ Open Qual. 2022;11(2):e001865. pmid:35768171
  47. 47. Hehakaya C, Van der Voort van Zyp J, Lagendijk J, Grobbee D, Verkooijen H, Ellen M. Opportunities and challenges in the adoption and implementation of MR-Linac for prostate cancer. Radiother Oncol. 2020;152:S672.
  48. 48. Hehakaya C, Van der Voort van Zyp JR, Lagendijk JJW, Grobbee DE, Verkooijen HM, Moors EHM. Problems and Promises of Introducing the Magnetic Resonance Imaging Linear Accelerator Into Routine Care: The Case of Prostate Cancer. Front Oncol. 2020;10:1741. pmid:32984058
  49. 49. Hehakaya C, Sharma AM, van der Voort Van Zijp JRN, Grobbee DE, Verkooijen HM, Izaguirre EW, et al. Implementation of Magnetic Resonance Imaging-Guided Radiation Therapy in Routine Care: Opportunities and Challenges in the United States. Adv Radiat Oncol. 2022;7:1.
  50. 50. Hollick RJ, Black AJ, Reid DM, McKee L. Shaping innovation and coordination of healthcare delivery across boundaries and borders. JHOM. 2019;33(7/8):849–68.
  51. 51. Jacobs J, Ferguson JM, Van Campen J, Yefimova M, Greene L, Heyworth L, et al. Organizational and External Factors Associated with Video Telehealth Use in the Veterans Health Administration Before and During the COVID-19 Pandemic. Telemed J E Health. 2022;28(2):199–211. pmid:33887166
  52. 52. Jones NL, Read J, Field B, Fegan C, Simpson E, Revitt C, et al. Remote home visits: Exploring the concept and applications of remote home visits within health and social care settings. British Journal of Occupational Therapy. 2021;85(1):50–61.
  53. 53. Kip H, Sieverink F, Van Gemert-Pijnen L, Bouman Y, Kelders SM. Integrating People, Context, and Technology in the Implementation of a Web-Based Intervention in Forensic Mental Health Care: Mixed-Methods Study. J Med Internet Res. 2020;22.
  54. 54. Kozica-Olenski SL, Soldatos G, Marlow L, Cooray SD, Boyle JA. Exploring the acceptability and experience of receiving diabetes and pregnancy care via telehealth during the COVID-19 pandemic: a qualitative study. BMC Pregnancy Childbirth. 2022;22(1):932.
  55. 55. Kozica-Olenski SL, Garth B, Boyle JA, Vincent AJ. Menopause care delivery in the time of COVID-19: evaluating the acceptability of telehealth services for women with early and usual age menopause. Climacteric. 2022;26(1):34–46.
  56. 56. Liverani M, Ir P, Perel P, Khan M, Balabanova D, Wiseman V. Assessing the potential of wearable health monitors for health system strengthening in low- and middle-income countries: a prospective study of technology adoption in Cambodia. Health Policy Plan. 2022;37(8):943–51. pmid:35262172
  57. 57. Longacre ML, Keleher C, Chwistek M, Odelberg M, Siemon M, Collins M, et al. Developing an Integrated Caregiver Patient-Portal System. Healthcare (Basel). 2021;9(2):193. pmid:33578838
  58. 58. Martindale A-M, Pilbeam C, Mableson H, Tonkin-Crine S, Atkinson P, Borek A, et al. Perspectives on COVID-19 testing policies and practices: a qualitative study with scientific advisors and NHS health care workers in England. BMC Public Health. 2021;21(1):1216. pmid:34167491
  59. 59. Merolli M, Marshall CJ, Pranata A, Paay J, Sterling L. User-Centered Value Specifications for Technologies Supporting Chronic Low-Back Pain Management. Stud Health Technol Inform. 2019;264:1288–92. pmid:31438133
  60. 60. Miller C, Christian D, Spencer J, Watkins C. Healthcare professionals perceptions of the stroke-specific education framework; what are the factors influencing effective implementation and adoption in the national stroke workforce? Int J Stroke. 2021;16:35.
  61. 61. Neher M, Nygårdh A, Broström A, Lundgren J, Johansson P. Perspectives of Policy Makers and Service Users Concerning the Implementation of eHealth in Sweden: Interview Study. J Med Internet Res. 2022;24(1):e28870. pmid:35089139
  62. 62. Nguyen HQ, McMullen C, Haupt EC, Wang SE, Werch H, Edwards PE, et al. Findings and lessons learnt from early termination of a pragmatic comparative effectiveness trial of video consultations in home-based palliative care. BMJ Support Palliat Care. 2020;12:E432–40. pmid:33051309
  63. 63. Nimsakul K, Suwannaprom P, Suttajit S. Complexity of implementing harm reduction services in community hospitals: A two-phase qualitative study. The Thai Journal of Pharmaceutical Sciences. 2022;46(4):470–80.
  64. 64. Perdacher E, Kavanagh D, Sheffield J, Healy K, Dale P, Heffernan E. Using the Stay Strong App for the Well-being of Indigenous Australian Prisoners: Feasibility Study. JMIR Form Res. 2022;6(4):e32157. pmid:35394444
  65. 65. Przysucha M, Peters L, Büscher A, Schnellhammer M, Hübner U. What Went Wrong in eMedCare? Formative Evaluation of an IT Project in Primary Care in Two Rural Districts. Stud Health Technol Inform. 2022;296:81–9. pmid:36073492
  66. 66. Pumplun L, Fecho M, Islam N, Buxmann P. Machine learning systems in clinics - How mature is the adoption process in medical diagnostics? Proceedings of the Annual Hawaii International Conference on System Sciences. 2021. p. 6317–26. Available: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85108367707&partnerID=40&md5=c0eb4c92372f9a5f0f2831bbefd17957
  67. 67. Pumplun L, Fecho M, Wahl N, Peters F, Buxmann P. Adoption of Machine Learning Systems for Medical Diagnostics in Clinics: Qualitative Interview Study. J Med Internet Res. 2021;23(10):e29301. pmid:34652275
  68. 68. Rudin RS, Perez S, Rodriguez JA, Sousa J, Plombon S, Arcia A, et al. User-centered design of a scalable, electronic health record-integrated remote symptom monitoring intervention for patients with asthma and providers in primary care. J Am Med Inform Assoc. 2021;28(11):2433–44. pmid:34406413
  69. 69. Schougaard LMV, Mejdahl CT, Christensen J, Lomborg K, Maindal HT, de Thurah A, et al. Patient-initiated versus fixed-interval patient-reported outcome-based follow-up in outpatients with epilepsy: a pragmatic randomized controlled trial. J Patient Rep Outcomes. 2019;3(1):61. pmid:31520247
  70. 70. Schultz K, Vickery H, Campbell K, Wheeldon M, Barrett-Beck L, Rushbrook E. Implementation of a virtual ward as a response to the COVID-19 pandemic. Aust Health Rev. 2021;45(4):433–41. pmid:33840420
  71. 71. Strohm L, Hehakaya C, Ranschaert ER, Boon WPC, Moors EHM. Implementation of artificial intelligence (AI) applications in radiology: hindering and facilitating factors. Eur Radiol. 2020;30(10):5525–32. pmid:32458173
  72. 72. Thomas EE, Chambers R, Phillips S, Rawstorn JC, Cartledge S. Sustaining telehealth among cardiac and pulmonary rehabilitation services: a qualitative framework study. Eur J Cardiovasc Nurs. 2023;22(8):795–803. pmid:36468293
  73. 73. Tolf S, Mesterton J, Söderberg D, Amer-Wåhlin I, Mazzocato P. How can technology support quality improvement? Lessons learned from the adoption of an analytics tool for advanced performance measurement in a hospital unit. BMC Health Serv Res. 2020;20(1):816. pmid:32873286
  74. 74. Tompson A, Fleming S, Lee M-M, Monahan M, Jowett S, McCartney D, et al. Mixed-methods feasibility study of blood pressure self-screening for hypertension detection. BMJ Open. 2019;9(5):e027986. pmid:31147366
  75. 75. Uribe Guajardo MG, Baillie A, Louie E, Giannopoulos V, Wood K, Riordan B, et al. The evaluation of the role of technology in the pathways to comorbidity care implementation project to improve management of comorbid substance use and mental disorders. J Multimorb Comorb. 2022;12:26335565221096977. pmid:35586033
  76. 76. Vali Y, Eijk R, Hicks T, Jones WS, Suklan J, Holleboom AG, et al. Clinicians’ Perspectives on Barriers and Facilitators for the Adoption of Non-Invasive Liver Tests for NAFLD: A Mixed-Method Study. J Clin Med. 2022;11(10):2707. pmid:35628838
  77. 77. Weidner K, Lowman J, Fleischer A, Kosik K, Goodbread P, Chen B, et al. Twitter, Telepractice, and the COVID-19 Pandemic: A Social Media Content Analysis. Am J Speech Lang Pathol. 2021;30(6):2561–71. pmid:34499843
  78. 78. Yakovchenko V, McInnes DK, Petrakis BA, Gillespie C, Lipschitz JM, McCullough MB, et al. Implementing Automated Text Messaging for Patient Self-management in the Veterans Health Administration: Qualitative Study Applying the Nonadoption, Abandonment, Scale-up, Spread, and Sustainability Framework. JMIR Mhealth Uhealth. 2021;9(11):e31037. pmid:34779779
  79. 79. Thomas EE, Taylor ML, Ward EC, Hwang R, Cook R, Ross J-A, et al. Beyond forced telehealth adoption: A framework to sustain telehealth among allied health services. J Telemed Telecare. 2024;30(3):559–69. pmid:35130099
  80. 80. Duan Y, Iaconi A, Wang J, Perez JS, Song Y, Chamberlain SA, et al. Conceptual and relational advances of the PARIHS and i-PARIHS frameworks over the last decade: a critical interpretive synthesis. Implementation Sci. 2022;17(1):.
  81. Ahmed S, Zidarov D, Eilayyan O, Visca R. Prospective application of implementation science theories and frameworks to inform use of PROMs in routine clinical care within an integrated pain network. Qual Life Res. 2021;30(11):3035–47. pmid:32876812
  82. Moullin JC, Dickson KS, Stadnick NA, Albers B, Nilsen P, Broder-Fingert S, et al. Ten recommendations for using implementation frameworks in research and practice. Implement Sci Commun. 2020;1:42.
  83. Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Health. 2011;38(1):4–23. pmid:21197565
  84. Moullin JC, Dickson KS, Stadnick NA, Rabin B, Aarons GA. Systematic review of the Exploration, Preparation, Implementation, Sustainment (EPIS) framework. Implement Sci. 2019;14:1.
  85. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4:50.
  86. Kirk MA, Kelley C, Yankey N, Birken SA, Abadie B, Damschroder L. A systematic review of the use of the Consolidated Framework for Implementation Research. Implement Sci. 2016;11:72.
  87. Harvey G, Kitson A. PARIHS revisited: from heuristic to integrated framework for the successful implementation of knowledge into practice. Implement Sci. 2016;11:33. pmid:27013464
  88. Rycroft-Malone J. The PARIHS framework--a framework for guiding the implementation of evidence-based practice. J Nurs Care Qual. 2004;19(4):297–304. pmid:15535533
  89. Duan Y, Iaconi A, Wang J, Perez JS, Song Y, Chamberlain SA, et al. Conceptual and relational advances of the PARIHS and i-PARIHS frameworks over the last decade: a critical interpretive synthesis. Implement Sci. 2022;17(1):78. pmid:36476376
  90. Pinnock H, Epiphaniou E, Pearce G, Parke H, Greenhalgh T, Sheikh A, et al. Implementing supported self-management for asthma: a systematic review and suggested hierarchy of evidence of implementation studies. BMC Med. 2015;13:127. pmid:26032941
  91. Pinnock H, Barwick M, Carpenter CR, Eldridge S, Grandes G, Griffiths CJ, et al. Standards for Reporting Implementation Studies (StaRI) Statement. BMJ. 2017;356:i6795. pmid:28264797
  92. Strifler L, Cardoso R, McGowan J, Cogo E, Nincic V, Khan PA, et al. Scoping review identifies significant number of knowledge translation theories, models, and frameworks with limited use. J Clin Epidemiol. 2018;100:92–102. pmid:29660481
  93. Rycroft-Malone J, Burton CR. Is it time for standards for reporting on research about implementation? Worldviews Evid Based Nurs. 2011;8(4):189–90. pmid:22123028
  94. Greenhalgh T, Rosen R, Shaw SE, Byng R, Faulkner S, Finlay T, et al. Planning and Evaluating Remote Consultation Services: A New Conceptual Framework Incorporating Complexity and Practical Ethics. Front Digit Health. 2021;3:726095. pmid:34713199
  95. Schulz KF, Altman DG, Moher D. CONSORT 2010 statement: updated guidelines for reporting parallel group randomised trials. PLoS Med. 2010;7:e1000251.
  96. Alley Z, Chapman J, Schaper H, Saldana L. The relative value of pre-implementation stages for successful implementation of evidence-informed programs. Implement Sci. 2023;18(1):30.