S. Whitehouse, Author of:
-
EP01 - e-Poster Session 1 (ID 52)
- Event: e-Health 2018 Virtual Meeting
- Type: e-Poster Session
- Track: Clinical Delivery
- Presentations: 2
- Coordinates: 5/28/2018, 10:00 AM - 11:30 AM, e-Poster Station 1, Parq Grand Ballroom, Conference Level
-
EP01.02 - Digital Quality Improvement Survey Improves Accessibility and Patient Engagement (ID 566)
- Abstract
Purpose/Objectives: Black Creek Community Health Centre (BCCHC) provides holistic care and program services to a diverse client population in Toronto, Ontario, Canada. Objective: Compare traditional paper-based data collection with the interactive mHealth platform Tickit® for the HQO Client Experience Survey and specific BCCHC questions, in order to evaluate and implement quality improvement initiatives more efficiently and in real time. Aim: Increase the response rate by 400% to achieve more equitable representation of the organization's population.
Methodology/Approach: Change management, Lean Six Sigma processes and continuous improvement were implemented via the Plan-Do-Study-Act model. Staff and patients were empowered to integrate the survey into clinic workflow by automating processes to collect the patient's voice in a responsive, value-added process: survey development; workflow and process mapping; review and revision of survey drafts; promotional materials; identification of staff champions; training; implementation; and check-ins and revision. As an added-value measure, patients completed the survey pre- and post-appointment.
Finding/Results: Section 1: Overall Experience. Section 2: HQO Quality Dimension, Access: use of an mHealth tool on a tablet supported advanced access, reducing the time to third next available appointment from 30 days to 2-3 days while engaging the patient population with Tickit. Section 3: HQO Quality Dimension, Client-Centred: as per HQO technical specifications, the numbers of responses for Always (96) and Often (33) were combined and divided by the total number of responses, excluding non-respondents. Average time to complete: 3 minutes. A louder patient's voice: a 429% survey response rate increase with Tickit (Tickit n=713 vs. paper n=167).
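As a rough illustration of the two calculations referenced above (the HQO rule combining Always and Often responses, and the relative size of the Tickit versus paper samples), here is a minimal Python sketch. The Always/Often counts and the two sample sizes come from the abstract; the valid-response denominator is a made-up placeholder, and the abstract's 429% figure is a response-rate increase whose denominators are not reproduced here.

```python
# Illustrative only: Always/Often counts and Tickit/paper sample sizes are from
# the abstract; the total of valid responses is a hypothetical placeholder.

always_count = 96      # "Always" responses (from the abstract)
often_count = 33       # "Often" responses (from the abstract)
valid_responses = 144  # hypothetical total of valid responses, excluding non-respondents

# HQO-style score: Always + Often divided by all valid responses.
client_centred_pct = (always_count + often_count) / valid_responses * 100

# Relative comparison of completed surveys (the reported 429% increase is based
# on response rates, which require denominators not given in the abstract).
tickit_n, paper_n = 713, 167
ratio_of_counts = tickit_n / paper_n  # roughly 4.3x as many completed surveys

print(f"Client-centred (Always + Often) share: {client_centred_pct:.1f}%")
print(f"Tickit completed surveys vs paper: {ratio_of_counts:.1f}x")
```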
Conclusion/Implications/Recommendations: Staff were instrumental in reminding and encouraging clients to participate in the survey, as demonstrated by the high rate of responses within the first 5 months. Regular reporting to staff at Sheridan and Yorkgate by QIP committee members on the number of surveys completed, along with a thermometer diagram at Sheridan to track progress and report to clients, also helped encourage clients to complete the survey. Success factors: staff champions; harnessing analytics for regular monitoring and reporting; check-ins and revisions throughout the process; reminders to staff and patients; mobility of the technology. Lessons learned: integration of a new client engagement tool; opportunity for intersectoral collaboration and a co-design process; identifying and improving on current workflows; increasing efficiency in data monitoring, reporting and analysis.
140 Character Summary: Using an mHealth tool validated quality improvement initiatives for staff satisfaction, patient-centric care and advocating for the patient's voice.
-
EP01.03 - Youth SBIRT Implementation Support: Tools, Training, and Technical Assistance (ID 595)
- Abstract
Purpose/Objectives: To support health systems in developing processes and procedures to provide high-quality SBIRT (Screening, Brief Intervention, and Referral to Treatment) in the context of comprehensive adolescent health screening; to improve providers' skill and comfort in addressing substance use with adolescents; and to provide comprehensive screening of adolescents and targeted intervention to reduce substance use through the use of technology.
Methodology/Approach: Phase 1: Customize Tool and Approach to Fit Community Needs. We met with leaders and stakeholders from the participating projects to conduct a needs assessment and benchmark usual care, including resources and characteristics that may affect the need for technical assistance and training. Phase 2: Implementation Planning and Training. We engaged clinic staff from each clinic to specify implementation processes for their site, including which team members would be sent Check Yourself results, protocols for follow-up, and tools for documentation in the electronic medical record (see the sketch following this paragraph). Phase 3: Roll-Out of Implementation with Quality Improvement. As the implementation plan rolled out, we conducted regular meetings with healthcare staff and thought leaders to monitor progress, evaluate, and debrief. Phase 4: Evaluation. We conducted interviews with participants at all levels from each site, potentially including administrative staff, program coordinators, thought leaders, providers, and project advisory board members, to understand the implementation outcomes (changes in practice thinking and practice behavior, changes in organizations/systems), implementation processes, and next steps.
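To make Phase 2 concrete, here is a hypothetical Python sketch of the kind of per-site plan such a planning meeting might produce: who receives Check Yourself results, what the follow-up protocol is, and where results are documented in the EMR. The field names and example values are illustrative assumptions, not the project's actual configuration.

```python
from dataclasses import dataclass, field

@dataclass
class SitePlan:
    """Hypothetical Phase 2 implementation plan for one clinic site."""
    site: str
    result_recipients: list[str] = field(default_factory=list)  # staff sent Check Yourself results
    follow_up_protocol: str = ""   # what happens after a positive screen
    emr_documentation: str = ""    # where results are recorded in the EMR

# Entirely illustrative example values:
plan = SitePlan(
    site="Example Adolescent Clinic",
    result_recipients=["primary care provider", "behavioral health consultant"],
    follow_up_protocol="positive screen prompts a brief intervention and referral as indicated",
    emr_documentation="structured screening note template",
)

if __name__ == "__main__":
    print(f"{plan.site}: results routed to {', '.join(plan.result_recipients)}")
```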
Finding/Results: In total, 26 participants were recruited from the waiting area of the clinic, 5 through word of mouth, and 1 via flyer. In total, 32 interviews were conducted, though 1 interview was not analyzed because the participant provided predominantly single-word responses that were not felt to enhance understanding of the youth perspective. In general, participants reported that the Check Yourself tool was easy to use and that colorful images and interactive content increased their interest in the health information presented. All participants indicated that they would prefer the Check Yourself tool to pencil-and-paper screening. Some adolescents particularly appreciated that questions were presented one at a time, so that responses to previous questions were not visible; they felt this feature would help conceal their responses from family members in a waiting area. Adolescents described distinct ways in which they felt the Check Yourself tool could enhance their interactions with doctors. Many participants found it easier and less awkward to disclose health risk behaviors on the tool than face-to-face, and perceived the tool as helpful in reducing providers' need to ask patients about sensitive topics during the appointment.
Conclusion/Implications/Recommendations: With the opportunity to create screening tools that go beyond collecting information from adolescent patients by providing education and feedback, it is useful to seek youth input in order to design content that is acceptable and effective for this population. Adolescents in this qualitative study were specific about their preferences for electronic, personalized feedback on their health behaviors. Additionally, participants valued feedback that enhances their ability for self-management by facilitating goal setting and offering ongoing technological supports. Future quantitative outcome research should test screening tools that incorporate these suggestions.
140 Character Summary: Check Yourself is a teen-friendly eHealth tool designed to promote motivational discussions between adolescents and their providers about substance use.
-
RF08 - Ideas for Leveraging Approaches and Technology Solutions (ID 35)
- Event: e-Health 2018 Virtual Meeting
- Type: Rapid Fire Session
- Track: Executive
- Presentations: 1
- Coordinates: 5/29/2018, 01:00 PM - 02:00 PM, Fairview II Room, Conference Level
-
RF08.01 - Creating Shorter, Lower-Literacy Patient Experience CAHPS on Digital Platform, Tickit® (ID 586)
- Abstract
Purpose/Objectives: Healthcare policy supports the inclusion of patient experience ratings in healthcare quality measurement and reporting. The Consumer Assessment of Healthcare Providers and Systems (CAHPS) survey is the gold standard for collecting patient perceptions of and experiences with care. Traditional mailed CAHPS surveys have low response rates (~30%), and racial/ethnic minorities and older adults are less likely to respond. Mobile technology can be used to capture perceptions of patient experience at the point of care.
Methodology/Approach: Survey adaptation and design: partnered with the start-up company Shift Health to create a tablet-based survey interface that was visually attractive, icon-based, and simple; adapted the Clinician and Group CAHPS (CG-CAHPS) survey to lower the literacy level of questions and shorten survey length. Patient feedback and validation: elicited feedback from 3 existing patient advisory councils to refine survey questions, interface, and workflow for administration; conducted 25 in-depth interviews with patients to elicit perspectives about care experiences and preferences for reporting feedback to providers/clinics, validate the tablet-based survey against the standard paper-based CG-CAHPS (patients received both surveys in randomized order), and collect feedback about the interface and content of the tablet-based survey to inform future iterations.
Finding/Results: Preferences for survey administration: patients valued collecting feedback at the point of care to avoid mail delay and to occupy time in the waiting room. Reasons to prefer tablet-based administration included novelty, color, interactivity, the fun factor, and familiarity with mobile platforms; reasons to prefer paper-based administration included tradition, lack of technology skills or interest, and privacy. Usability of the tablet-based survey: high ease of use, with the tablet described as quick, easy, and convenient; few technical barriers, mainly difficulty with key-in answers and lack of flexibility in answering questions. Importance of reporting patient experience: the survey did not capture positive feedback and experiences, though patients valued reporting good experiences to recognize staff and providers; the survey captured the need for clinic improvement but not personal negative experiences, though patients valued reporting bad experiences to get resolution or drive changes in quality of care. Perceptions of whether the survey captured true experience: quality of provider communication, staff respect, and access to care were prioritized concepts, and patients noted the need for an open-ended reporting format.
Conclusion/Implications/Recommendations: A majority of patients served in safety-net healthcare settings are interested in using tablets to provide timely feedback to their clinic. Engaging patients in the design process produced a tablet-based survey with high usability and appropriate content. Clinics can integrate quality improvement and patient experience efforts with enhanced data collection. Current CAHPS questionnaires capture some core concepts of diverse patient care experience, but not the full range of responses. Federal policy should support improved content and format for collecting patient experience data from diverse populations.
140 Character Summary: Using the Tickit platform improved the completion rate and cultural inclusivity of San Francisco Department of Public Health's CAHPS survey.