IIR 12-102 – HSR&D Study
Improving Consultation Management between Primary Care and Sub-Specialty Clinics
Michael Weiner, MD MPH
Richard L. Roudebush VA Medical Center, Indianapolis, IN
Shakaib Rehman, MD BS
Phoenix VA Health Care System, Phoenix, AZ
Funding Period: July 2014 - June 2016
More than a third of patients in the US are referred to a specialist each year, and most outpatient visits occur with specialists. Although referral between primary and specialty care is a core clinical process, it remains a source of difficulty for both patients and their providers. About a third (36%) of all referrals from Veterans Affairs (VA) primary care providers (PCPs) are cancelled or discontinued by consultants, and about half lack follow-up actions within 30 days. Furthermore, workarounds, communication breakdowns, and redundancies in VA consultation management are common. Barriers to effective consultations may delay clinical care and, in turn, adversely affect quality and outcomes. To date, the VA has generated only limited characterizations of referral and has disseminated no results of rigorous testing of processes that incorporate human-factors principles.
Objectives of this study are to identify barriers, facilitators, and suggested improvements to the referral process and the mechanisms that consultants use to communicate findings to PCPs; characterize providers' teamwork and communication exchanges between PCPs and specialists, including communication breakdowns and communication strategies; and develop and test new processes and electronic health record (EHR) design changes for managing consultation, comparing those changes to the current process.
This multi-method study, conducted at two VA institutions, included semi-structured in-depth interviews, ethnographic observation, assessment of consultation templates, information-flow diagrams, and work sampling techniques. Interview data were analyzed using inductive qualitative methods to identify and synthesize emerging themes. Observation data were reviewed to create vignettes and workflow diagrams, which were analyzed in conjunction with a review of key VA documents related to managing consultations. The study team triangulated the themes identified through semi-structured interviews and observations with the teamwork and communication patterns identified through work sampling. Based on this triangulation of data sources, design features for improved consultation management were derived for testing during simulation. Referral templates were selected for heuristic analysis through a review of consultation cancellation rates at each site and purposive sampling.
A usability simulation study was performed at a third VA medical center to test changes to technology designed to reduce barriers and improve performance with referrals and consultations. Given the emergence of the Enterprise Health Management Platform (eHMP), comparison prototype templates were designed to incorporate both existing features common to the Computerized Patient Record System (CPRS) and eHMP features currently in development. The simulation consisted of a within-subjects "A vs. B" assessment comparing workload, participants' preferences, and time on task, combined with a think-aloud protocol to identify emerging differences and contextualize the quantitative results.
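Because the design is within-subjects, each participant provides a score under both interfaces, and the analysis compares the paired differences rather than two independent groups. As a minimal sketch of this kind of paired comparison (not the study's actual analysis code; the workload scores below are invented for illustration), using only Python's standard library:

```python
import math
from statistics import mean, stdev

def paired_t(a, b):
    """Paired t-statistic for a within-subjects comparison.

    a, b: per-participant scores under conditions A and B (same order).
    Returns (mean difference, t statistic, degrees of freedom).
    """
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    d_bar = mean(diffs)
    se = stdev(diffs) / math.sqrt(n)  # standard error of the mean difference
    return d_bar, d_bar / se, n - 1

# Hypothetical workload-style scores (lower = less workload), one pair per participant
prototype = [18, 22, 25, 19, 21, 20]
cprs      = [30, 35, 33, 29, 36, 34]

d_bar, t, df = paired_t(prototype, cprs)
print(f"mean difference = {d_bar:.2f}, t({df}) = {t:.2f}")
# A large negative t here would indicate consistently lower scores for the prototype.
```

Pairing each participant with themselves removes between-person variability from the comparison, which is why within-subjects designs can detect interface differences with relatively few participants.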
Analysis of interview and observational data identified a set of themes associated with staffing, continuing education, VA inter-facility consultations, non-VA consultations, e-consultations without clinical encounters, development of referral templates, templates' use and characteristics, CPRS functionality, primary-specialty care communication, and patients' roles, along with a set of key cognitive tasks in referral and consultation: decisions to request consultation; triage; scheduling; tracking, cancellations, and discontinuations; and "time goals." Analysis identified barriers, facilitators, and suggested improvements. Inconsistencies were identified between "ideal" consultation workflow and actual workflow.
The heuristic analysis identified 201 violations, with a mean of 8 per template. The three most frequently violated heuristics were aesthetic and minimalist design, error prevention, and consistency and standards. The violations had potential negative impacts on five usability goals: efficiency, effectiveness, safety, learnability, and utility. Frequent causes of heuristic violations included inconsistent organization of information, missing or misplaced instructions, instructions requiring work outside the template, and selections or actions with unclear outcomes.
Simulation analyses revealed significant differences between interface versions in participants' self-reported satisfaction, mental effort, and workload. On these measures, lower scores indicate more favorable ratings: the prototype interface was significantly more satisfactory than CPRS, both overall (2.19 vs. 4.36, respectively; p<0.0001) and in all three sub-factors: system usefulness (2.1 vs. 4.17, p<0.0001), information quality (2.37 vs. 4.48, p<0.0001), and interface quality (2.06 vs. 4.57, p<0.0001). These results are largely consistent with findings from the simulation debriefing interviews. The odds ratio for the prototype receiving a more positive evaluation than CPRS overall, per participant, was 4.6 (95% CI 3.7, 5.8; p<0.0001), adjusting for category (p = 0.002). The prototype also required less workload (20.72 vs. 32.83) and significantly less mental effort (36.64 vs. 48.52; p<0.0001 for both). Two-thirds of scenarios were conducted with the participant narrating their thought process aloud (voiced); the remaining third were conducted without narration (unvoiced). In unvoiced scenarios, the prototype required significantly less time (6.77 vs. 15.32 seconds, respectively; p<0.01); these results, however, must be considered in the context of fewer clinics displayed in the prototype version and site-specific details of CPRS. The two approaches also differed in the frequency of task-related mouse clicks: across all scenarios, the prototype required approximately 40% fewer clicks (48.5 vs. 81; p<0.0001). Although the two approaches differed in the number of observed errors, the incidence (n=4) was too low to be considered meaningful.
Sharing our findings with relevant VA program offices is important, so that local, regional, and national leaders can stay current with important results, provide any needed feedback, and apply the findings as appropriate. Therefore, we plan to share the results with leaders of the following three programs.
- Enterprise Health Management Platform (eHMP)
- Veterans Engineering Resource Center (VHA Office of Strategic Integration)
- Human Factors Engineering (VHA Office of Informatics and Information Governance)
Individual VA medical centers have a stake in this work, which has local implications for how referrals and consultations are handled. Therefore, in addition to sharing findings with leaders of the VA medical center in Indianapolis, we will share findings with leaders of the following two medical centers, which were also involved in the study.
- Phoenix VA Health Care System
- VA Palo Alto Health Care System
This multi-method study identified key characteristics of the consultation process, which will provide a starting point for modifying the VA consultation management process to enhance efficiency, reduce providers' workload, and improve safety and access to specialty care. The study has identified design strategies and recommendations for a decision support tool and suggested policies for improved consultation management. The research team is working to ensure that findings are relevant and informative to both CPRS and eHMP.
Heuristic results support guidelines for designing consultation templates: communicate consultants' requirements, limit the need to find external information, and support communication of non-routine information. By addressing usability early in the design of consultation templates, we expect a decrease in discontinued or cancelled referrals.
This study has informed a Rapid Process Improvement Workshop at the Richard L. Roudebush VA Medical Center (Indianapolis) and two quality improvement initiatives across the Indianapolis, Phoenix, and Palo Alto VA health systems. Specifically, we have stressed, via communication with both referrers and consultants, the importance of the following four recommendations.
- Referrers and consultants should develop, regularly review, and disseminate service agreements that specify guidelines for referral and consultation.
- Each referral should include a specific clinical question and level of urgency, specified by the referrer.
- Referral templates should be complete but concise, reflect service agreements, require only essential information, and avoid a requirement for manual duplication of data available elsewhere in the medical record.
- Questions, uncertainty, disagreement, potential cancellation of a consultation, change to a consultation's type (standard vs. e-consultation), clarifications, and urgent issues should prompt direct, synchronous communication between referrer and consultant.
External Links for this Project
NIH Reporter Grant Number: I01HX000892-01A2
Dimensions for VA is a web-based tool available to VA staff that enables detailed searches of published research and research projects.
Users with VA-intranet access can find more information at vaww.hsrd.research.va.gov/dimensions/
VA staff not currently on the VA network can access Dimensions by registering for an account using their VA email address.
DRA: Health Systems
DRE: Diagnosis, Technology Development and Assessment
MeSH Terms: none