
Advancing Clinician’s Skills on Using EMR System

Mr. Rami Alkhleitit

Global Journal of Pathology & Laboratory Medicine
Volume 2, Issue 3, January 2024
Received: December 09, 2024; Reviewed: November 26, 2024; Accepted: December 26, 2024; Published: January 02, 2025

Unified Citation Journals, Pathology 2023, 2(3) 1-7; https://doi.org/10.52402/Pathology222
ISSN 2754-0952

Presented at the 14th Emirates Pathology, Digital Pathology & Cancer Conference, Holiday Inn Dubai, UAE & Virtual
Author: Mr. Rami Alkhleitit

IRB Approval Letter:

December 15, 2022 

IRB Log Number: 22-593 

Department: Assistant Executive Administration of Medical Information, Epic Training Management
Category of Approval: EXEMPT

Dear Rami Mohammad Alkhleitit, Yousef S. Alshrari, Sulafa Asad M. Rasheed, Khalid A. AlZahrani and Sarah Mabrouk G. AlEnizi

I am pleased to inform you that your submission dated December 12, 2022 for the study titled “Advancing Clinician’s Skills on Using EMR System” was reviewed and approved according to ICH GCP guidelines. Please note that this approval is from the research ethics perspective only. You will still need permission from the head of the department or unit in KFMC or an external institution to commence data collection.

We wish you well as you proceed with the study and request that you keep the IRB informed of progress on a regular basis, using the IRB log number shown above.

Please be advised that the IRB, for administrative purposes, requires you to submit a progress report on your research every 6 months. You are also required to submit any manuscript resulting from this research for IRB approval before submission to journals for publication.

As a researcher, you are required to hold current and valid certification on the protection of human research subjects, which can be obtained by taking a short online course at the US NIH site or the Saudi NCBE site, followed by a multiple-choice test. Please submit your current and valid certificate for our records. Failure to submit this certificate shall be a reason for suspension of your research project.

 

ABSTRACT:

BACKGROUND:

The Ministry of Health in the Kingdom of Saudi Arabia has applied electronic health records in several governmental hospitals. King Fahad Medical City is one of the well-known medical cities providing tertiary medical care, and it applies one of the best systems aimed at improving the quality of the medical services provided. Despite four years of using the system, the facility has been unable to reach the required quality and standard figures that would place it at the forefront of medical facilities providing tertiary health care. In this article, we report on the importance of training end users on the system before implementing it, and of confirming their knowledge of the objectives the organization aspires to reach, so that they can contribute to achieving those objectives.

OBJECTIVES:

Measuring the quality of data entered into the system by physicians, and comparing system inputs between two groups of trainees: those who attended basic and additional training, and those who attended basic training only.

METHODS: 

An analytical research design was used, and an observational questionnaire evaluated the clinician’s actions during a patient clinic visit.

RESULTS:

40% of the resident physicians in this study attended both basic and extra training sessions. The quality of their entries by section was as follows: 1- Reviewing Patient’s History: 68.5% of the required entries; 2- Completing Documentation: 53.3% of the required documentation; 3- Placing Orders: 80.3% of the required clinical orders; 4- Wrapping Up the Visit: 79.6% of the tasks required to close the clinic visit; 5- Reviewing In Basket: 36.5% of total incoming messages.

The remaining 60% of the resident physicians in the study attended basic training only. The quality of their entries by section was as follows: 1- Reviewing Patient’s History: 39.5% of the required entries; 2- Completing Documentation: 34.4% of the required documentation; 3- Placing Orders: 44.4% of the required clinical orders; 4- Wrapping Up the Visit: 46.9% of the tasks required to close the clinic visit; 5- Reviewing In Basket: 13.9% of total incoming messages.

CONCLUSION:

Providing physicians with extra training on the system’s tools and activities enables them to achieve the highest levels of quality in their entries and compliance with best practice advisories, which is reflected in the quality of medical services provided to patients in health facilities.

1. INTRODUCTION:

The training management of Electronic Health Records provides elementary training for any healthcare provider to obtain an EHR user account. This training enables the trainee to pass the assessment and gain access to practice the required role in the EHR. Although EHR trainers provide several ways to improve users’ knowledge and skills in practicing the system, the key-indicator reports issued from the system show a gap between what the users learned and their practice in the system.

EHR usability is known to be a major barrier to optimizing the quality of the healthcare provided. Improving EHR usability is a major challenge, given the lack of usability training among EHR designers/developers in the vendor community and among EHR analysts making significant configurations in healthcare organizations. Our EHR contains reporting tools designed to track each and every entry and to present the analyzed data as productivity figures or visualized data. These reports can clearly show areas of strength and weakness: Chart Correction Requests, the patient chart deficiency rate, the delinquency rate, monthly clinical audits on reports, and clinician calls for a query are the tools that identify the level of clinician usage. (1)

EHRs have been proven to improve patient care generally; providing appropriate training for EHR users can improve their documentation too, which is reflected in the quality of patient participation, diagnostics, and patient outcomes. The accuracy of medical information helps healthcare providers make appropriate decisions, evaluate risks, and review treatment plans and statistical information. However, some other systems replaced paper documentation with data transcribed after each clinic visit or discharge from a hospital stay. This process, called archiving, has been found to be vulnerable to errors such as mixed patient information or data omission. Instead of keeping paper documentation and transcribed data, healthcare providers can receive basic training that allows them to enter information into the system directly, with a wide range of saved data that guides the user through each and every field that needs to be filled out. (2)

Chronological, specific, and accurate information about end users’ practice in the EHR is difficult to gather electronically through the system, and requires the adoption of a detailed assessment in the form of a survey. This survey represents the end user’s ability to complete the assigned tasks through the HIS in concordance with organizational standards and the governmental regulations accredited by the public health association. Optimal use of the system requires a playground system in training that estimates users’ proficiency and usability in the clinics. The playground HIS mimics the actual version of the HIS that is going to be implemented in the institution. User training is promoted before implementation of the system, and the sessions held after the system is implemented are known as reinforcement training. Therefore, we adopted the survey to measure end users’ abilities based on the training provided in these two kinds of sessions. Accordingly, the survey would determine the differences between end users who attended the regular training sessions only and those who attended both regular and reinforcement sessions. (3), (4)

The training program is established to improve clinicians’ performance in EHR usability. However, we propose defining the minimum requirements for users in their practice of the EHR. This action supports the user and defines clinicians’ usability capability. Improving usability would positively impact not only users’ performance but also clinicians’ productivity. Moreover, the organization obtains a tangible improvement in patient visits, orders for ancillary services (lab or radiology), and surgeries performed. This improvement could not be achieved without a training program that determines the minimum requirements for clinicians’ competency in EHR usability. (5)

2. MATERIALS AND METHODS

2.1. Materials

This section presents the research methodology used to conduct this study, which includes research design, population, setting, sampling and sampling method, measurement, data collection and analytics procedures, and ethical considerations.

The training management prepared a web-based survey so that the surveyors could fill in the answers from their mobile phones while walking among the clinicians’ rooms in the outpatient department. This survey contains questions related to basic activities that clinicians learned in their basic training class. The surveyors observed these activities and answered the questions according to how the clinician performed them, and they asked each clinician whether they had attended any additional training after the basic training.

The basic training sessions were conducted in October 2022, and the reinforcement sessions ended in December 2022. The data collection started in February 2023 and finished in July 2023.

The first section contained nine questions focused on the physician’s knowledge of reviewing the patient’s history and the patient information entered at the beginning of the visit, extending back to the last four visits, of all types such as lab, imaging, medications, and allergies.

The second section contained eight questions focused on the physician’s knowledge of documenting and writing notes using system features such as notes shared or unshared with patients, marking a note as sensitive, using smart text or phrases, and reviewing the history and labs.

The third section contained ten questions focused on the physician’s knowledge of placing orders in the system to enable the treatment team to provide all the medical services the patient needs, such as lab orders, imaging orders, medication orders or renewed prescriptions, and the smoking status.

The next section contained ten questions focused on the physician’s knowledge of closing and wrapping up the visit as an additional feature to ensure that the quality of the provided services met the visit standards, such as reviewing the reason for the visit, reviewing vital signs, allergies/contraindications, reviewing medications, providing sick leave and/or watcher leave, and the after-visit summary.

The last section contained eight questions focused on the physician’s skills in managing and communicating as part of patient chart completion, indicating the level of care provided to the patient, such as reviewing lab/imaging result messages, reviewing an open patient’s chart, and reviewing orders for cosign and canceled orders.

The questions are specific to the tasks assigned to a clinician during the visit; therefore, the answers were yes, no, or not applicable, depending on whether the visit was a new visit or a follow-up visit.
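To make the data structure concrete, the following is a minimal sketch (in Python) of how one surveyor observation could be modeled. The field names and types are illustrative assumptions, not the study’s actual survey schema:

    # Hypothetical model of a single surveyor observation; field names are
    # illustrative assumptions, not the study's actual schema.
    from dataclasses import dataclass
    from enum import Enum

    class Answer(Enum):
        YES = "yes"                 # the clinician performed the observed task
        NO = "no"                   # the clinician did not perform the task
        NOT_APPLICABLE = "n/a"      # the task is not relevant to this visit type

    @dataclass
    class Observation:
        physician_id: str           # anonymized resident identifier
        section: str                # e.g. "Reviewing Patient's History"
        question_id: int            # question number within the section
        visit_type: str             # "new" or "follow-up"
        answer: Answer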

2.2. Population and Samples

The accessible population is resident physicians (R1) who joined the organization recently and received basic training and/or attended extra training sessions. The inclusion criteria were clinicians working in outpatient clinics, particularly family medicine clinics, and new or follow-up visit types.

The total number of physician training requests between October and December 2022 was 321, of which 71 were for resident physicians (R1). The sample contained 15 resident physicians, 6 of whom received reinforcement sessions (6/15 = 40% of the sample, matching the 40% reported in the abstract; the remaining 9/15 = 60% received basic training only).

2.3. Data Management and Statistical Analysis

The data were analyzed using the total score for each answer, with the cases in which the answer was not applicable discussed separately.
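As a minimal sketch of this analysis (the helper below is hypothetical, not the study’s actual code), each section’s answers can be tallied and expressed as percentages of all observed items, with not-applicable responses counted separately:

    # Hypothetical scoring sketch: tally "yes"/"no"/"n/a" answers for one
    # section and express each tally as a percentage of all observations.
    from collections import Counter

    def score_section(answers):
        # answers: list of "yes", "no", or "n/a" strings for one section
        tally = Counter(answers)
        total = sum(tally.values())          # physicians x questions observed
        return {a: round(100 * n / total, 1) for a, n in tally.items()}

    # Example: 54 observed items (6 physicians x 9 questions), of which 37
    # were completed, 8 were not, and 9 were not applicable.
    answers = ["yes"] * 37 + ["no"] * 8 + ["n/a"] * 9
    print(score_section(answers))            # {'yes': 68.5, 'no': 14.8, 'n/a': 16.7}

Run on the group A history-review tallies, this reproduces the 68.5%, 14.8%, and 16.7% figures reported in section 3.1.1 below.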

3. RESULTS

3.1.1. Reviewing Patient’s History

This section contains nine questions for two groups of physicians. Group A, the group that received extra training (six physicians), completed 37 steps (68.5% of the observed items), while group B, which received basic training only (nine physicians), completed 32 steps (39.5% of the observed items), as shown in the table below:

 

Group A (basic + extra training)            Group B (basic training only)
Reviewed Patient’s Info:   37 (68.5%)       Reviewed Patient’s Info:   32 (39.5%)
Unviewed Patient’s Info:    8 (14.8%)       Unviewed Patient’s Info:   36 (44.4%)
Not Applicable:             9 (16.7%)       Not Applicable:            13 (16.0%)
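As a consistency check (our reconstruction; the denominators are not stated explicitly in the source): if each of the six group A physicians was observed on all nine items, there are 6 × 9 = 54 observations, so 37/54 ≈ 68.5%, 8/54 ≈ 14.8%, and 9/54 ≈ 16.7%; likewise, the nine group B physicians give 9 × 9 = 81 observations, so 32/81 ≈ 39.5%, 36/81 ≈ 44.4%, and 13/81 ≈ 16.0%, matching the table.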

3.1.2. The table in detail:

3.2.1. Completing Documentation

This section contains ten questions for two groups of physicians. Group A, the group that received extra training (six physicians), completed 32 documentation steps (53.3% of the observed items), while group B, which received basic training only (nine physicians), completed 31 steps (34.4% of the observed items), as shown in the table below:

Group A (basic + extra training)            Group B (basic training only)
Completed Documentation:   32 (53.3%)       Completed Documentation:   31 (34.4%)
Incomplete Documentation:   9 (15.0%)       Incomplete Documentation:  45 (50.0%)
Not Applicable:            19 (31.7%)       Not Applicable:            14 (15.6%)

3.2.2. The table in detail:

3.3.1. Placing Orders

This section contains eleven questions for two groups of physicians. Group A, the group that received extra training (six physicians), completed 62 order steps (80.3% of the observed items), while group B, which received basic training only (nine physicians), completed 53 steps (44.4% of the observed items), as shown in the table below:

Group A (basic + extra training)            Group B (basic training only)
Completed Orders Steps:    62 (80.3%)       Completed Orders Steps:    53 (44.4%)
Incomplete Orders Steps:   12 (4.55%)       Incomplete Orders Steps:   53 (44.4%)
Not Applicable:            19 (15.2%)       Not Applicable:            20 (11.1%)

 

3.3.2. The table in detail:

3.4.1. Wrapping up the Visit

This section contains nine questions for two groups of physicians. Group A, the group that received extra training (six physicians), completed 43 steps (79.6% of the observed items), while group B, which received basic training only (nine physicians), completed 29 steps (46.9% of the observed items), as shown in the table below:

Group A (basic + extra training)            Group B (basic training only)
Completed Steps:           43 (79.6%)       Completed Steps:           29 (46.9%)
Incomplete Steps:           8 (14.8%)       Incomplete Steps:          39 (48.1%)
Not Applicable:             3 (5.6%)        Not Applicable:             4 (4.9%)

 

3.4.2. The table in detail:

3.5.1. Reviewing In Basket

This section contains 16 questions for two groups of physicians. Group A, the group that received extra training (six physicians), completed 35 In Basket steps (36.5% of the observed items), while group B, which received basic training only (nine physicians), completed 20 steps (13.9% of the observed items), as shown in the table below:

 

Group A (basic + extra training)             Group B (basic training only)
Completed In Basket Steps:   35 (36.5%)      Completed In Basket Steps:   20 (13.9%)
Incomplete In Basket Steps:  29 (30.2%)      Incomplete In Basket Steps: 122 (84.7%)
Not Applicable:              32 (33.3%)      Not Applicable:               2 (1.4%)

 

3.5.2. The table in detail:

3.6.1. Evaluate Skill and Knowledge of the Clinician

This section contains five questions for two groups of physicians. For group A, the group that received extra training (six physicians), the evaluators rated 19 responses (63.3%) as showing advanced skills, while for group B, which received basic training only (nine physicians), 29 responses (64.4%) were rated as showing advanced skills, as shown in the table below:

Group A (basic + extra training)            Group B (basic training only)
Advanced Skills:           19 (63.3%)       Advanced Skills:           29 (64.4%)
Unsatisfied:                1 (3.3%)        Unsatisfied:                8 (17.8%)
Neutral:                   10 (33.3%)       Neutral:                    8 (17.8%)

 

3.6.2. The table in detail:

4. OBJECTIVES:

  1.  Measuring the quality of data entered into the system by the physicians.
  2.  Comparing inputs to the system between trainees who received basic and reinforcement training and those who received basic training only.
5. PARTICIPANTS AND SURVEY INSTRUMENT:

Clinicians’ in-basket messages show patient chart deficiencies, patient chart correction requests, and the rate of clinician compliance with patient chart regulations. This survey includes all physicians who joined the organization recently. This study targeted resident physicians in the family medicine clinic to compare physicians who attended basic training and extra training sessions with physicians who attended basic training only. 

6. TIME-MOTION STUDY:

The observation covered the period from the start of the clinic visit until the After Visit Summary was printed at the end of the visit.

7. ETHICS APPROVAL AND CONSENT TO PARTICIPATE:

Ethical approval for conducting this study was obtained from the Research Center (IRB: 22-593) at King Fahad Medical City.

8. DISCUSSION:

This study compared the outcomes of basic training alone with the outcomes of basic training plus booster sessions with respect to the quality of user inputs. The 40% of physicians who received basic training plus additional training showed better results than their colleagues, who represented 60% of the study. Although the quality level was similar between the two groups on some items, the impact on trainees who received additional training was reflected in other aspects, such as the physician’s confidence in providing services and making decisions and the speed of finding them in the system, as captured in the evaluator’s impression of the physician’s performance in the clinic. On the other hand, the group of physicians who received only basic training was slower in reviewing patient information, took longer to access medical orders and search for the appropriate diagnosis, and more often had to ask colleagues about some inputs and their locations in the system. This effect may not be visible in the table below because of the size of each group:

(Values are percentages within each group; “yes” = attended basic and extra training, n=6; “no” = attended basic training only, n=9.)

Question                                Agree (yes / no)    Disagree (yes / no)   Neutral (yes / no)
Overview of the workspace                83.3 / 100.0         0.0 / 0.0            16.7 / 0.0
Overview of the schedule information    100.0 / 100.0         0.0 / 0.0             0.0 / 0.0
Overview of using Storyboard            100.0 / 100.0         0.0 / 0.0             0.0 / 0.0
Learning Home Dashboard and F1 Menu      33.3 / 22.2          0.0 / 55.6           66.7 / 22.2
Help Desk                                 0.0 / 0.0          16.7 / 33.3           83.3 / 66.7

These questions capture the evaluator’s opinion of how the physician responded to the tasks assigned to them in the system (“yes” means the physician attended both basic and extra training; “no” means the physician attended basic training only). The first question asked about knowledge of using the tools and activities within the physician interface; the evaluator could choose among three options: 1- Agree, 2- Neutral, 3- Disagree. The second question asked about the physician’s knowledge of completing the scheduling steps for the next visit, including ordering labs or radiology. The next question was about viewing the patient’s information in the chart. The following question asked about the physician’s ability to find useful tips or guides for performing any workflow required in the system during the visit without asking a colleague. The last question asked about the physician’s knowledge of how to notify the system of incorrect or invalid patient information. The results for these questions are shown in the table as follows:

            Group A (basic and extra training)   Group B (basic training only)
Agree       19 (63.3%)                           29 (64.4%)
Disagree     1 (3.3%)                             8 (17.8%)
Neutral     10 (33.3%)                            8 (17.8%)

 

The results also depend on how closely physicians comply with the best practice guides. Some physicians know how to perform a particular task but do not do it, and operational management never forces physicians to comply with best practices. The system treats the best practice advisory as a well-defined platform that measures the system’s usability and maintains patient safety when physicians comply with the guides provided by the system administration.

9. CONCLUSION:

Although the percentage of trainees who received basic and reinforcement training is lower than the percentage of trainees who received basic training only, the quality of the inputs of the first group, which represents 40% of the participants in this study, was better than the quality of the inputs of the second group, which represents 60% of the participants. This indicates that providing trainees with reinforcement training after they have started using the system enhances the quality of their input and increases their confidence in using the system.

Attending extra training classes did not show the expected impact in every section, for several reasons:

  • Clinicians learned more from their colleagues’ practices than from the training class.
  • The mandatory sections in the system produce similar results for both groups, since both groups must perform certain tasks to complete the clinic visit.
  • The primary reason for weak compliance is operational management’s lack of auditing of clinician practices.

However, the five sections in the questionnaire present the quality of the physicians’ documentation and the healthcare requirements the health organization provides. Although few resources exist for understanding the usability processes employed by EHR vendors during product design and development (6), the platform that the EHR system provided to this health organization could be developed and measured based on international practices and the local policies and procedures controlled by local credentialing health organizations.

REFERENCES:

  1. Zhang, Z., Franklin, A., Walji, M., Zhang, J., & Gong, Y. (2014). Developing analytical inspection criteria for health IT personnel with minimum training in cognitive ergonomics: A practical solution to improving EHR usability. In AMIA Annual Symposium Proceedings (Vol. 2014, p. 1277). American Medical Informatics Association.
  2. Subbiah, N. K. (2018). Improving usability and adoption of tablet-based electronic health record (EHR) applications (Doctoral dissertation, Arizona State University).
  3. Clarke, M. A., Belden, J. L., & Kim, M. S. (2014). Determining differences in user performance between expert and novice primary care doctors when using an electronic health record (EHR). Journal of Evaluation in Clinical Practice, 20(6), 1153-1161.
  4. Hoyt, R., Adler, K., Ziesemer, B., & Palombo, G. (2013). Evaluating the usability of a free electronic health record for training. Perspectives in Health Information Management/AHIMA, American Health Information Management Association, 10(Spring).
  5. Hollister-Meadows, L., Richesson, R. L., Gagne, J. D., & Rawlins, N. (2021). Association between evidence-based training and clinician proficiency in electronic health record use. Journal of the American Medical Informatics Association, 28(4), 824-831.
  6. Ratwani, R. M., Zachary Hettinger, A., Kosydar, A., Fairbanks, R. J., & Hodgkins, M. L. (2017). A framework for evaluating electronic health record vendor user-centered design and usability testing processes. Journal of the American Medical Informatics Association: JAMIA, 24(e1), e35–e39. https://doi.org/10.1093/jamia/ocw092


This article is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
