Advancing Clinicians' Skills in Using an EMR System
Mr. Rami Alkhleitit, Dr. Yousef Alshrari, Mr. Khalid AlZahrani, Ms. Sulafa Rasheed, Ms. Sara Alenezi, Ms. Fatima AlAwad, Ms. Layla Alahmed
Unified Nursing Research, Midwifery & Women’s Health Journal
Category: Original research
Country: Saudi Arabia
Journals Short Code: UNRMWHJ
Unified Citation Journals, 4(1) 11-45; https://doi.org/10.52402/Nursing2024
ISSN 2754-0944
Volume 2, Issue 3, March 15 2025
Received: December 9, 2024; Reviewed: November 26, 2024; Accepted: December 26, 2024; Published: March 20, 2025
Biography:
Health Informatics Specialist, Clinical System Training Department Chairperson, and PMI-certified Project Management Professional. His scope includes project and program management, digital transformation, system analysis, product management, and training.
ABSTRACT:
BACKGROUND:
The Ministry of Health in the Kingdom of Saudi Arabia deployed one of the leading electronic health record systems, EPIC, at King Fahad Medical City, a tertiary care facility, with the aim of improving the quality of the medical services provided. Despite four years of system use, preceptors are still struggling to attain the required quality and to establish the standards that would place the facility at the forefront of tertiary health care hospitals. In this article, we report on the importance of training end users on the system before implementing it and of confirming their knowledge of the organization's aspired objectives.
OBJECTIVES:
Measuring the quality of data entered into the system by physicians, and comparing system inputs between two groups of trainees: those who attended basic and additional training, and those who attended basic training only.
METHODS:
An analytical research design was used; an observational questionnaire evaluated clinicians' actions during patient clinic visits.
RESULTS:
40% of the resident physicians in this study attended basic and extra training sessions. The quality of their entries by section was as follows: 1- Reviewing Patient's History = 68.5% of the required entries, 2- Completing Documentation = 53.3% of the required documentation, 3- Placing Orders = 80.3% of the required clinical orders, 4- Wrapping Up the Visit = 79.6% of the required tasks to close the clinic visit, 5- Reviewing In Basket = 36.5% of total incoming messages.
60% of the resident physicians in the study attended basic training only. The quality of their entries by section was as follows: 1- Reviewing Patient's History = 39.5% of the required entries, 2- Completing Documentation = 34.4% of the required documentation, 3- Placing Orders = 44.4% of the required clinical orders, 4- Wrapping Up the Visit = 46.9% of the required tasks to close the clinic visit, 5- Reviewing In Basket = 13.9% of total incoming messages.
CONCLUSION:
Providing extra training on the system's tools and activities enables physicians to achieve the highest levels of quality in their entries and compliance with best practice advisories, which is reflected in the quality of medical services provided to patients in health facilities.
1. INTRODUCTION:
Training management for Electronic Health Records provides elementary training to any healthcare provider seeking an EHR user account. This training enables the trainee to pass the assessment and gain access to practice the required role in the EHR. Although EHR trainers provide several ways to improve users' knowledge and skills in practicing the system, the key indicator reports issued from the system show a gap between what users learned and how they actually practice in the system.
EHR usability is known to be a major barrier to optimizing the quality of the provided healthcare. Improving it is a major challenge, given the lack of usability training in the work environment of EHR designers/developers in the vendor community and of the EHR analysts who make significant configurations in healthcare organizations. Our EHR contains reporting tools designed to track every single entry and to present the analyzed data as productivity figures or visualizations. These reports clearly expose areas of strength and weakness: Chart Correction Requests, the patient chart deficiency rate, the delinquency rate, monthly clinical audit reports, and clinician query calls are the tools that identify the level of clinician usage. (1)
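As an illustration of how such usage metrics could be derived, the sketch below computes a chart deficiency rate and a delinquency rate from simple chart records. It is a minimal sketch: the field names, the record structure, and the 30-day grace period are assumptions for illustration, not the EHR vendor's actual reporting schema.

```python
from dataclasses import dataclass

# Hypothetical chart record; field names are illustrative, not the vendor's schema.
@dataclass
class ChartRecord:
    chart_id: str
    complete: bool   # all required documentation present
    days_open: int   # days the chart has remained open since the visit

def deficiency_rate(charts):
    """Share of charts with missing required documentation."""
    deficient = sum(1 for c in charts if not c.complete)
    return deficient / len(charts) if charts else 0.0

def delinquency_rate(charts, grace_days=30):
    """Share of charts still incomplete past the documentation grace period."""
    delinquent = sum(1 for c in charts if not c.complete and c.days_open > grace_days)
    return delinquent / len(charts) if charts else 0.0

charts = [
    ChartRecord("A1", complete=True, days_open=2),
    ChartRecord("A2", complete=False, days_open=45),
    ChartRecord("A3", complete=False, days_open=10),
]
print(f"Deficiency rate:  {deficiency_rate(charts):.1%}")   # 66.7%
print(f"Delinquency rate: {delinquency_rate(charts):.1%}")  # 33.3%
```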
EHRs have been proven to improve patient care generally; providing appropriate training for EHR users can also improve their documentation, which is reflected in the quality of patient participation, diagnostics, and patient outcomes. Accurate medical information helps healthcare providers make appropriate decisions, evaluate risks, and review treatment plans and statistical information. Some other systems, however, replaced paper documentation with data transcribed after each clinic visit or hospital discharge. This process, called archiving, is vulnerable to errors such as mixed patient information or data omission. Instead of keeping paper documentation and transcribed data, healthcare providers can receive basic training that allows them to enter information into the system directly, with a wide range of saved data guiding the user through each field that needs to be filled out. (2)
Chronological, specific, and accurate information about end users' practice in the EHR is difficult to gather electronically through the system itself and requires a detailed assessment in the form of a survey. This survey represents the end user's ability to complete the assigned tasks through the HIS in accordance with organizational standards and governmental regulations accredited by the public health association. Assessing optimal system usage and user practice requires a training playground that estimates users' proficiency and usability in the clinics. The playground HIS mimics the actual version of the HIS that will be implemented in the institution. Users are trained before the system is implemented; sessions held after implementation are known as reinforcement training. We therefore adopted the survey to measure end users' abilities based on the training scope provided in the two types of sessions. Accordingly, the survey would determine the differences between end users who attended the regular training sessions only and those who attended both regular and reinforcement sessions. (3), (4)
The training program is established to improve clinicians' performance in EHR usability. We also advocate defining the minimum requirements for users' EHR practice; this supports the user and defines clinicians' usability capability. Improving usability would positively impact not only users' performance but also clinicians' productivity. Moreover, the organization gains a tangible improvement in patient visits, in orders for ancillary services such as lab or radiology, and in surgeries performed. This improvement could not be achieved without a training program that determines the minimum requirements for clinicians' competency in EHR usability. (5)
2. MATERIALS AND METHODS
2.1. Materials
This section presents the research methodology used to conduct this study, including research design, population, setting, sampling and sampling method, measurement, data collection and analysis procedures, and ethical considerations.
The training management prepared a web-based survey to help the surveyors fill in answers from their mobile phones while walking among the clinicians' rooms in the outpatient department. The survey contains questions on basic activities that clinicians learned in their basic training class; the surveyors observed these activities and answered the questions according to how the clinician performed them. The surveyors also asked each clinician whether they had attended any additional training after the basic training.
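For illustration, one observation in such a survey could be captured as a simple record like the one below. The field names, question identifiers, and values are hypothetical; the actual instrument's content is described in the sections that follow.

```python
# Illustrative record for one observed clinic visit. Field names, question IDs,
# and values are assumptions for this sketch, not the actual survey instrument.
observation = {
    "clinician_id": "R1-07",          # hypothetical identifier
    "visit_type": "follow-up",        # "new" or "follow-up"
    "extra_training": True,           # attended reinforcement sessions after basic training
    "answers": {                      # each observed activity scored yes / no / na
        "history_reviewed_labs": "yes",
        "history_reviewed_allergies": "yes",
        "documentation_used_smart_text": "no",
        "orders_renewed_prescription": "na",
    },
}

print(f"Clinician {observation['clinician_id']}: "
      f"{sum(1 for a in observation['answers'].values() if a == 'yes')} of "
      f"{len(observation['answers'])} observed steps completed")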
The basic training sessions were conducted in October 2022, and the reinforcement sessions ended in December 2022. The data collection started in February 2023 and finished in July 2023.
The first section contained nine questions focused on physician knowledge of reviewing the patient's history and the patient information entered at the beginning of the visit, extending back over the last four visits and covering all types such as labs, imaging, medications, and allergies.
The second section contained eight questions focused on physician knowledge of documenting and writing notes using system features such as shared or unshared notes with patients, marking a note as sensitive, using smart text or phrases, and reviewing the history and labs.
The third section contained ten questions focused on physician knowledge of placing orders in the system to enable the treatment team to provide all the medical services needed to the patient, such as lab orders, imaging orders, medication orders or renewed prescriptions, and the smoking status.
The fourth section contained ten questions focused on physician knowledge of closing and wrapping up the visit as an additional check that the provided services met the visit standards, such as reviewing the reason for the visit, vital signs, allergies/contraindications, and medications, providing sick leave and/or watcher leave, and issuing the after-visit summary.
The last section contained eight questions focused on physician skills in managing and communicating as part of patient chart completion, indicating the level of care provided to the patient, such as reviewing lab/imaging result messages, reviewing an open patient's chart, and reviewing orders for cosign and canceled orders.
The questions are specific to the tasks assigned to a clinician during the visit; therefore, the answers were yes, no, or not applicable, depending on the visit type (new or follow-up).
2.2. Population and Samples
The accessible population is resident physicians (R1) who joined the organization recently and received basic training and/or attended extra training sessions. The inclusion criteria were clinicians working in outpatient clinics, particularly family medicine clinics, and new or follow-up visit types.
Between October and December 2022 there were 321 physician training requests, of which 71 were from resident physicians (R1). The sample contained 15 resident physicians; 6 of the 15 received reinforcement sessions.
2.3. Data Management and Statistical Analysis
The data were analyzed using the total score for each answer, with not-applicable answers discussed separately. The difference between the groups' mean cumulative percentage scores was measured with the Independent-Samples Mann-Whitney U test.
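A minimal sketch of this analysis is shown below, assuming the answers are coded as "yes", "no", and "na" and that, as in the results tables, the completion percentage counts all observed steps (including not-applicable ones) in the denominator. The toy data are illustrative only; scipy's mannwhitneyu implements the Independent-Samples Mann-Whitney U test named above.

```python
from scipy.stats import mannwhitneyu

def completion_percentages(answers_by_clinician):
    """Per-clinician completion score: 'yes' answers over all observed steps
    (yes + no + not applicable), matching how the results tables report quality."""
    scores = []
    for answers in answers_by_clinician:
        completed = sum(1 for a in answers if a == "yes")
        scores.append(100.0 * completed / len(answers))
    return scores

# Toy data, one answer list per clinician (values are illustrative only).
group_a = completion_percentages([["yes"] * 7 + ["no", "na"]] * 6)        # basic + extra, n=6
group_b = completion_percentages([["yes"] * 4 + ["no"] * 4 + ["na"]] * 9)  # basic only, n=9

# Independent-Samples Mann-Whitney U test, as used for the group comparison.
stat, p = mannwhitneyu(group_a, group_b, alternative="two-sided")
print(f"Group A mean: {sum(group_a) / len(group_a):.1f}%")
print(f"Group B mean: {sum(group_b) / len(group_b):.1f}%")
print(f"Mann-Whitney U = {stat:.1f}, p = {p:.3f}")
```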
3. RESULTS
3.1.1. Reviewing Patient’s History
This section contains nine questions for two groups of physicians. Group A, which received extra training, comprised six physicians; their answers showed 37 completed steps, putting the quality of their entries at 68.5%. Group B, which received basic training only, comprised nine physicians; their answers showed 32 completed steps, putting the quality of their entries at 39.5%, as shown in the table below:
| Outcome | Group A (basic + extra), n | Group A, % | Group B (basic only), n | Group B, % |
| Reviewed Patient's Info | 37 | 68.5% | 32 | 39.5% |
| Unviewed Patient's Info | 8 | 14.8% | 36 | 44.4% |
| Not Applicable | 9 | 16.7% | 13 | 16.0% |
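These percentages appear to take all observed steps as the denominator, i.e. completed steps divided by (number of physicians × number of questions), with not-applicable answers counted in the total. For group A and group B here, for example:

$$\frac{37}{6 \times 9} = \frac{37}{54} \approx 68.5\%, \qquad \frac{32}{9 \times 9} = \frac{32}{81} \approx 39.5\%.$$

The same construction reproduces most of the row percentages in the tables that follow.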
3.2.1. Completing Documentation
This section contains 10 questions for two groups of physicians. Group A (extra training, six physicians) completed 32 steps, putting the quality of their entries at 53.3%; group B (basic training only, nine physicians) completed 31 steps, putting the quality of their entries at 34.4%, as shown in the table below:
| Outcome | Group A (basic + extra), n | Group A, % | Group B (basic only), n | Group B, % |
| Completed Documentation | 32 | 53.3% | 31 | 34.4% |
| Incomplete Documentation | 9 | 15.0% | 45 | 50.0% |
| Not Applicable | 19 | 31.7% | 14 | 15.6% |
3.3.1 Placing Orders
This section contains 11 questions for two groups of physicians. Group A (extra training, six physicians) completed 62 steps, putting the quality of their entries at 80.3%; group B (basic training only, nine physicians) completed 53 steps, putting the quality of their entries at 44.4%, as shown in the table below:
| Outcome | Group A (basic + extra), n | Group A, % | Group B (basic only), n | Group B, % |
| Completed Orders Steps | 62 | 80.3% | 53 | 44.4% |
| Incomplete Orders Steps | 12 | 4.55% | 53 | 44.4% |
| Not Applicable | 19 | 15.2% | 20 | 11.1% |
3.4.1. Wrapping up the Visit
This section contains 9 questions for two groups of physicians. Group A (extra training, six physicians) completed 43 steps, putting the quality of their entries at 79.6%; group B (basic training only, nine physicians) completed 29 steps, putting the quality of their entries at 46.9%, as shown in the table below:
| Outcome | Group A (basic + extra), n | Group A, % | Group B (basic only), n | Group B, % |
| Completed Steps | 43 | 79.6% | 29 | 46.9% |
| Incomplete Steps | 8 | 14.8% | 39 | 48.1% |
| Not Applicable | 3 | 5.6% | 4 | 4.9% |
3.5.1. Reviewing In Basket
This section contains 16 questions for two groups of physicians. Group A (extra training, six physicians) completed 35 steps, putting the quality of their entries at 36.5%; group B (basic training only, nine physicians) completed 20 steps, putting the quality of their entries at 13.9%, as shown in the table below:
| Outcome | Group A (basic + extra), n | Group A, % | Group B (basic only), n | Group B, % |
| Completed In Basket Steps | 35 | 36.5% | 20 | 13.9% |
| Incomplete In Basket Steps | 29 | 30.2% | 122 | 84.7% |
| Not Applicable | 32 | 33.3% | 2 | 1.4% |
3.6.1. Evaluate Skill and Knowledge of the Clinician
This section contains 5 questions for two groups of physicians. For group A (extra training, six physicians), 19 of the evaluators' ratings (63.3%) indicated advanced skills; for group B (basic training only, nine physicians), 29 ratings (64.4%) indicated advanced skills, as shown in the table below:
| Rating | Group A (basic + extra), n | Group A, % | Group B (basic only), n | Group B, % |
| Advanced Skills | 19 | 63.3% | 29 | 64.4% |
| Unsatisfied | 1 | 3.3% | 8 | 17.8% |
| Neutral | 10 | 33.3% | 8 | 17.8% |
4. OBJECTIVES:
- Measuring the quality of data entered into the system by the physicians.
- Comparing inputs to the system between trainees who received basic and reinforcement training and those who received basic training only.
5. PARTICIPANTS AND SURVEY INSTRUMENT:
Clinicians’ in-basket messages show patient chart deficiencies, patient chart correction requests, and the rate of clinician compliance with patient chart regulations. This survey includes all physicians who joined the organization recently. This study targeted resident physicians in the family medicine clinic to compare physicians who attended basic training and extra training sessions with physicians who attended basic training only.
6. TIME-MOTION STUDY:
The observation period extended from the start of the clinic visit until the After Visit Summary was printed at the end of the visit.
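As a sketch of how this observed window could be timed, assuming simple start and end timestamps (the values below are illustrative):

```python
from datetime import datetime

# Illustrative timestamps for one observed visit; the study window ran from
# the start of the clinic visit to the printed After Visit Summary.
visit_start = datetime(2023, 2, 14, 9, 0)
avs_printed = datetime(2023, 2, 14, 9, 23)

elapsed = avs_printed - visit_start
print(f"Visit duration: {elapsed.total_seconds() / 60:.0f} minutes")  # 23 minutes
```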
7. ETHICS APPROVAL AND CONSENT TO PARTICIPATE:
Ethical approval for conducting this study was obtained from the Research Center (IRB: 22-593) at King Fahad Medical City.
8. DISCUSSION:
This study compared the outcomes of basic training alone with basic training plus booster sessions in terms of the quality of user inputs. The 40% of physicians who received basic and additional training showed better results than their colleagues, who represented 60% of the study. Although the quality level was similar between the two groups in some respects, the impact of the additional training was reflected in other aspects, such as the physician's confidence in providing services and making decisions and the speed of finding them in the system, as reflected in the observed impression of each physician's performance in the clinic. The group of physicians who received only basic training, by contrast, were slower in reviewing patient information, took longer to access medical orders and to search for the appropriate diagnosis, and more often had to ask colleagues about some inputs and their locations in the system. This effect may not be visible in the table below because of the size of each group:
Group A = attended basic and extra training (yes, n=6); Group B = attended basic training only (no, n=9). Values are percentages.

| Topic | Agree (A) | Agree (B) | Disagree (A) | Disagree (B) | Neutral (A) | Neutral (B) |
| Overview of the workspace | 83.3 | 100.0 | 0.0 | 0.0 | 16.7 | 0.0 |
| Overview of the schedule information | 100.0 | 100.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| Overview of using Storyboard | 100.0 | 100.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| Learning Home Dashboard and F1 Menu | 33.3 | 22.2 | 0.0 | 55.6 | 66.7 | 22.2 |
| Help Desk | 0.0 | 0.0 | 16.7 | 33.3 | 83.3 | 66.7 |
These questions capture the evaluator's opinion of how the physician responded to the tasks assigned to them in the system ("yes" means the physician received both basic and extra training; "no" means the physician received basic training only). The first question asked about knowledge of using the tools and activities within the physician interface; the evaluator could choose among three options: 1- Agree, 2- Neutral, 3- Disagree. The second question asked about the physician's knowledge of completing the scheduling steps for the next visit, including ordering labs or radiology. The next question concerned viewing the patient's information in the chart. The following question asked about the physician's ability to find useful tips or guides for performing any workflow required in the system during the visit without asking a colleague. The last question asked about the physician's knowledge of how to notify the system about incorrect or invalid patient information. The results for these questions are shown in the table below:
| Rating | Group A (basic and extra training), n | Group A, % | Group B (basic training only), n | Group B, % |
| Agree | 19 | 63.3% | 29 | 64.4% |
| Disagree | 1 | 3.3% | 8 | 17.8% |
| Neutral | 10 | 33.3% | 8 | 17.8% |
The results also depend on how well physicians comply with the best practice guides. Some physicians know how to do a particular task but do not do it, and operational management never forces physicians to comply with best practices. The best practice advisory is a well-defined platform that measures the system's usability and maintains patient safety when physicians comply with the guides provided by the system administration.
9. CONCLUSION:
Although the percentage of trainees who received basic and reinforcement training was smaller than the percentage who received basic training only, the quality of the inputs of the first group, which represents 40% of the participants in this study, was better than that of the second group, which represents 60% of the participants. This indicates that providing reinforcement training to trainees after the system is in use enhances the quality of input and increases trainees' confidence in using the system.
The effect of attending extra training classes did not show as strongly as expected, for several reasons:
- Clinicians learned more from their colleagues' practices than from the training class.
- The mandatory sections in the system bring similar results for both groups, which means both groups will need to do a particular task to complete the clinic visit.
- The primary reason for weak compliance is operational management's lack of auditing of clinician practices.
However, the five sections in the questionnaire present the quality of physicians' documentation and the healthcare requirements the health organization provides. Although few resources exist for understanding the usability processes employed by EHR vendors during product design and development (6), the platform the EHR system provided to this health organization could be developed and measured against international practices and the local policies and procedures controlled by local accrediting health organizations.
BACKGROUND:
The Ministry of Health in the Kingdom of Saudi Arabia applied electronic health records in several governmental hospitals. King Fahad Medical City is one of the well-known medical cities that provides tertiary medical care and applies one of the best systems, which aims to improve the quality of the medical services provided. Despite four years of using the system, the facility has been unable to reach the required quality and standard figures that would place it at the forefront of medical facilities providing tertiary health care. In this article, we report on the importance of training end users on the system before implementing it and of confirming their knowledge of the objectives the organization aspires to reach, in order to contribute to achieving those objectives.
OBJECTIVES:
To measure the quality of data entered into the system by two groups of physician trainees: those who attended basic and additional training versus those who attended basic training only.
METHODS:
An observational cross-sectional analytical research design was used, and a questionnaire consisting of four sections evaluated the clinician's actions during a patient clinic visit.
RESULTS:
'Basic and Extra training (Group A)' sessions were attended by 6 (40%) of the resident physicians, while 'Basic training (Group B)' sessions were attended by the remaining 9 (60%). The mean cumulative percentage score depicts the quality of work, and the difference between those mean proportions was measured by the Independent-Samples Mann-Whitney U test. The comparative assessments of 'Basic and Extra training' versus 'Basic training' resident physicians were: 1- Reviewing Patient's History, 68.5% : 39.5% (p=0.113); 2- Completing Documentation, 53.3% : 34.4% (p=0.315); 3- Placing required clinical orders, 80.3% : 44.4% (p=0.023); 4- Wrapping Up the visit, 79.6% : 46.9% (p=0.040); and 5- Reviewing In Basket of total incoming messages, 36.5% : 13.9% (p=0.008).
CONCLUSION:
Providing extra training on the system's tools and activities for physicians enables them to achieve the highest levels of efficiency in their entries and compliance with best practice advisories, which reflects the quality of medical services provided to patients in health facilities.
KEYWORDS:
Health Informatics, Health Information Management, Electronic Medical Records, Electronic Health Records, Health Care Systems, Health System Management, Training, Students, Education, Usability, User Interface, User Experience.
REFERENCES:
1. Zhang, Z., Franklin, A., Walji, M., Zhang, J., & Gong, Y. (2014). Developing analytical inspection criteria for health IT personnel with minimum training in cognitive ergonomics: A practical solution to improving EHR usability. In AMIA Annual Symposium Proceedings (Vol. 2014, p. 1277). American Medical Informatics Association.
2. Subbiah, N. K. (2018). Improving usability and adoption of tablet-based electronic health record (EHR) applications (Doctoral dissertation, Arizona State University).
3. Clarke, M. A., Belden, J. L., & Kim, M. S. (2014). Determining differences in user performance between expert and novice primary care doctors when using an electronic health record (EHR). Journal of Evaluation in Clinical Practice, 20(6), 1153-1161.
4. Hoyt, R., Adler, K., Ziesemer, B., & Palombo, G. (2013). Evaluating the usability of a free electronic health record for training. Perspectives in Health Information Management, 10(Spring).
5. Hollister-Meadows, L., Richesson, R. L., Gagne, J. D., & Rawlins, N. (2021). Association between evidence-based training and clinician proficiency in electronic health record use. Journal of the American Medical Informatics Association, 28(4), 824-831.
6. Ratwani, R. M., Hettinger, A. Z., Kosydar, A., Fairbanks, R. J., & Hodgkins, M. L. (2017). A framework for evaluating electronic health record vendor user-centered design and usability testing processes. Journal of the American Medical Informatics Association, 24(e1), e35-e39. https://doi.org/10.1093/jamia/ocw092