Trevor Jamieson



Author of


    Plenary Panel (ID 57)

    • Event: e-Health 2019 Virtual Meeting
    • Type: Keynote Session
    • Track:
    • Presentations: 1
    • Now Available
      • Abstract
      • Presentation
      • Slides

      Purpose/Objectives:
      Artificial intelligence (AI) has become the buzzword bingo term in healthcare today. Everything from health care conferences to organizational strategic plans touts the benefits of AI for addressing the health outcome, efficiency, and cost barriers we face in health systems today. Exciting as it may be, how do we square this with the current state of health care in Canada? Do we have the quality and quantity of data required to optimize clinical care? Can we implement these tools into existing systems? What are some instances of AI being used in Canada today, and what impact are they having? And, importantly, where do our policy and regulatory frameworks stand in terms of readiness for AI tools in medical practice? Today's panel hopes to spark a conversation about what's required to make the most of AI, and to be realistic about the practical approaches we need to take to implement these tools.



    PS05 - How We Figured Out That It Really Worked! (ID 32)

    • Event: e-Health 2019 Virtual Meeting
    • Type: Panel Session
    • Track: Executive
    • Presentations: 1
    • Coordinates: 5/28/2019, 01:15 PM - 02:15 PM, Pod 4

      PS05.02 - The Digital Health Evaluation Technology Readiness Assessment: Development and Application (ID 174)

      Trevor Jamieson, Institute for Health System Solutions and Virtual Care (WIHV), Women’s College Hospital; Toronto/CA

      • Abstract
      • Slides

      Purpose/Objectives:
      Strategies for evaluation, implementation, spread, and scale of digital health technologies depend on their level of maturity. Technological maturity is intrinsically linked to contextual factors, including stakeholder interests, features of the implementation site(s), evidence of impact, and the alignment between the technology and the proposed problem. This panel outlines a process for assessing technological maturity and a roadmap for identifying evaluation needs. It adopts the NASA Technology Readiness Level (TRL) measurement system as a macro-level conceptual framework to assess technological maturity, outlining the evidence requirements and relevant guiding frameworks at each stage. The purpose of this panel is to help attendees (1) understand technology readiness; (2) identify relevant evaluation objectives; (3) identify appropriate micro-level (stage-specific) evaluation frameworks; and (4) recognize which stakeholders to engage across each stage.
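
      The sketch below is purely illustrative and not part of the abstract: it assumes hypothetical stage names, objectives, and evaluation domains to show, in Python, one way a TRL-style mapping from readiness stage to evaluation needs, of the kind the panel describes, could be represented.

      from dataclasses import dataclass

      @dataclass
      class ReadinessStage:
          """One macro-level readiness stage, loosely modeled on a NASA TRL band."""
          name: str                   # hypothetical stage label, e.g. "Early prototype"
          key_objective: str          # what evaluation should establish at this stage
          evaluation_domains: tuple   # e.g. ("usability", "feasibility")
          stakeholders: tuple         # who to engage at this stage

      # Hypothetical stages for illustration only; the actual DTA stages and
      # domains are summarized in Table 1 of the abstract.
      STAGES = (
          ReadinessStage(
              "Early prototype",
              "Confirm the technology targets a real, well-specified problem",
              ("problem fit", "usability"),
              ("innovators", "end users"),
          ),
          ReadinessStage(
              "Pilot in a live clinical setting",
              "Demonstrate feasibility, acceptability, and early signals of impact",
              ("feasibility", "acceptability"),
              ("clinicians", "implementation sites"),
          ),
          ReadinessStage(
              "Scale and procurement",
              "Show impact, scalability, and value for system-level adoption",
              ("effectiveness", "cost", "scalability"),
              ("health system leaders", "government"),
          ),
      )

      def evaluation_needs(stage_index: int) -> ReadinessStage:
          """Look up the evaluation objectives and domains for a given stage."""
          return STAGES[stage_index]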


      Methodology/Approach:
      This work reflects the thematic consolidation of learnings across evaluations involving 33 digital health vendors, over 20 clinical implementation sites, and more than 75 digital health stakeholders in Ontario. We outline the Digital Health Evaluation Technology Readiness Assessment (DTA) as a comprehensive tool to help stakeholders navigate the evaluation of digital health technologies. The DTA is focused on the evaluation of digital health technologies across the innovation continuum, from development to system procurement. A rapid review was used to identify prominent digital health evaluation frameworks, which we mapped to the corresponding evaluation domain within the DTA. This panel will provide an overview of the literature followed by a synthesis of field experience. We will then present the framework, highlighting how it can be used to address the needs of innovators, evaluators, and system stakeholders using real-world examples from past and present engagements.


      Finding/Results:
      The notion of readiness extends beyond the technology itself. Digital health technology readiness must consider the intersection of the technology, its user(s), their context/site, and the nature of the problem to be solved. The extent to which these factors align determines the stage of readiness. The DTA framework (Table 1) includes descriptions of key objectives, evaluation domains, and associated evaluation process(es). Ten pragmatic evaluation frameworks emerged from the rapid review, each mapping to an evaluation domain.

      [Table 1 - Digital Health Evaluation Technology Readiness Assessment; DH = digital health, HTA = health technology assessment.]


      Conclusion/Implications/Recommendations:
      Health technology readiness is a product of complex interactions between stakeholders (including government, administration, clinicians, and patients), system context, and setting-specific factors. The burden of evidence required to ensure uptake extends beyond basic functionality to include feasibility, acceptability, impact, and scalability. The DTA framework provides a roadmap to help innovators, evaluators, and system stakeholders navigate the evaluation requirements from technology development to adoption.


      140 Character Summary:
      The Digital Health Evaluation Technology Readiness Assessment provides a framework to guide evaluation from digital solution development to system procurement
