Journal of Clinical Images and Medical Case Reports

ISSN 2766-7820
Research Article - Open Access, Volume 4

Application of workplace-based assessment for family medicine postgraduate students in the Faculty of Medicine, Menoufia University

Alkalash SH 1,2; Farag NA2*

1Department of Community Medicine and Healthcare, Faculty of Medicine, Umm Al-Qura University, Al-Qunfudah, KSA.

2Department of Family Medicine, Faculty of Medicine, Menoufia University, Shebin ElKom, Menoufia, Egypt.

*Corresponding Author : Farag NA
Department of Family Medicine, Faculty of Medicine, Menoufia University, Shebin ElKom, Menoufia, Egypt.
Email: [email protected]

Received : Nov 21, 2022

Accepted : Dec 29, 2022

Published : Jan 05, 2023


Copyright : © Farag NA (2023).


Background: Formative assessment is one of the most valuable approaches for helping medical students learn and improve their clinical knowledge and skills. Its value comes from the informative and constructive feedback given after careful observation of the candidates by their medical educators.

Aim of work: To assess changes in the clinical performance of family medicine postgraduates after application of three consecutive Mini Clinical Evaluation Exercises (Mini-CEX) and two successive Direct Observations of Procedural Skills (DOPS) as a formative assessment, and to evaluate the students' satisfaction with them.

Participants and methods: A prospective study was conducted in the Family Medicine Department, Faculty of Medicine, Menoufia University, Egypt, and passed through two phases.

The first phase was an awareness lecture for family medicine staff members (N = 11) and postgraduates (N = 21) about WBAs.

The second phase was the implementation of monthly Mini-CEX and DOPS sessions for family medicine postgraduates (N = 21) from January to March 2020. A total of 105 feedback sheets were collected and analyzed: 63 from Mini-CEX and 42 from DOPS of adult CPR. Finally, students' satisfaction with the application of WBAs was assessed.

Results: A significant improvement was detected in the postgraduates' Mini-CEX and DOPS feedback scores throughout the consecutive sessions: (9.5 ± 2.7, 24.9 ± 2.5 and 27.29 ± 1.5; P < 0.001) for Mini-CEX and (6.1 ± 1.8 versus 9.0 ± 1.2; P < 0.001) for DOPS. About 93% of students recommended its application for other medical students, and 86% of them requested to perform it again for other cases and procedures.

Conclusion: Mini-CEX and DOPS proved their ability to improve clinical knowledge and skills among family medicine postgraduates. Additionally, the postgraduates were satisfied with their application as a formative assessment to improve their clinical performance and alleviate their stress related to final clinical exams.

Keywords: Clinical knowledge; DOPs; Family medicine; MINICEX; Postgraduate education; Skills; Workplace-based assessment.

Abbreviations: CPR: Cardiopulmonary Resuscitation; DOPS: Direct Observation of Procedural Skills; MINICEX: Mini Clinical Evaluation Exercise; N: Number; WBA: Workplace-Based Assessment.

Citation: Alkalash SH, Farag NA. Application of workplace-based assessment for family medicine postgraduates students in faculty of medicine Menoufia university. J Clin Images Med Case Rep. 2023; 4(1): 2230.


Family doctors have a fundamental role to play in achieving health reforms and policy goals. Both access to care and its quality are essential components of health insurance and universal health coverage [1]. Family physicians with additional postgraduate clinical training to become specialists can improve the quality of primary and district-level care when they are part of healthcare teams [2,3]; they accomplish this through their roles as clinicians, consultants, capacity builders, clinical trainers, leaders of clinical governance and supporters of community-orientated primary care [4]. For just over two decades, leading educationists, including medical educators, have highlighted the intimate relationship between learning and assessment. Indeed, in an educational context it is now argued that learning is the key purpose of assessment (van der Vleuten 1996; Gronlund 1998; Shepard 2000). At the same time as this important connection was being stressed in the education literature, there were increasing concerns about the workplace-based training of doctors.

A study by Day et al. (1990) in the United States documented that the vast majority of first-year trainees in internal medicine were not observed more than once by a faculty member in a patient encounter where they were taking a history or doing a physical examination. Without this observation, there was no opportunity for the assessment of basic clinical skills and, more importantly, the provision of feedback to improve performance.

As one step in encouraging the observation of performance by faculty, the American Board of Internal Medicine proposed the use of the mini-Clinical Evaluation Exercise (mini-CEX) (Norcini et al. 1995). In the mini-CEX, a faculty member observes a trainee as he/she interacts with a patient around a focused clinical task. Afterwards, the faculty member assesses the performance and provides the trainee with feedback. It was expected that trainees would be assessed several times throughout the year of training, with different faculty and in different clinical situations.

In medicine, evaluating clinical performance is essential but difficult. In the past, evaluations were implicit, unreliable, and reliant on individual or subjective judgments (the apprenticeship model) [5]. However, recent changes to postgraduate medical education have introduced new methods for evaluating students' performance and competency [6]. One of these is workplace-based assessment (WPBA), the term given to a collection of assessment methods that measure trainees' performance in clinical settings. The hallmark of WPBA is the component that involves observing the trainee's performance in a real professional situation and providing pertinent feedback, encouraging reflective practice [7].

Workplace based evaluation or simply workplace-based assessment is an examination of what doctors really do in practice as defined by the assessment of day-to-day procedures conducted in the working environment. Although a doctor’s knowledge or competence can be demonstrated in a variety of ways, there is evidence that competency does not consistently predict performance in clinical practice. The capacity to measure performance at the workplace is a significant benefit [8].

There are several WPBA techniques, each of which evaluates different aspects of trainee performance; two of the most widely used are Direct Observation of Procedural Skills (DOPS) and the Mini-Clinical Evaluation Exercise (Mini-CEX). During a Mini-CEX, an assessor evaluates a trainee-patient encounter at a healthcare institution. These clinical interactions are anticipated to take approximately 15 minutes, during which the student is expected to complete a focused history and/or physical examination [9]. At the conclusion, the student offers a diagnosis and a treatment strategy. The performance is then rated using a systematic evaluation form, and constructive criticism is offered. Assessors use a nine-point Likert scale from "unsatisfactory" to "outstanding" [10].

Direct observation of procedural skills (DOPS) was introduced by the Royal College of Physicians and now forms an integral component of WPBA for doctors in the foundation year and those in specialist training. It was specifically designed to assess procedural skills involving real patients in a single encounter [11].

During a DOPS, an assessor observes a trainee performing a procedure as part of his or her usual practical training, and a face-to-face feedback session is then held [12]. DOPS is a scoring system for clinical examinations and practical activities. In the UK, such an approach to evaluation has been demonstrated to be valid, reliable, and practicable when evaluating postgraduate medical registrars [13].

For trainees looking to improve their performance in a skill, DOPS is regarded as a valuable learning opportunity. Its timely and efficient operation requires close collaboration between the assessor and the learner. DOPS assessments are tailor-made to be conveniently integrated into trainees' and assessors' daily routines and are hence considered highly feasible [14]. Therefore, this work was done to detect changes in the clinical performance of family medicine postgraduates after application of three consecutive Mini Clinical Evaluation Exercises (Mini-CEX) and two successive Direct Observations of Procedural Skills (DOPS) as a formative assessment, and to evaluate the postgraduates' satisfaction with them.


Study design and setting: A quasi-experimental prospective study was conducted in the Family Medicine Department, Faculty of Medicine, Menoufia University, Egypt.

Participants: All family medicine staff members (N = 11) and postgraduate students (N = 21) were invited to participate in the study over a 12-week period (from January to March 2020). Procedure: This study passed through two phases.

The first phase was an awareness lecture for family medicine staff members and postgraduate students about WBAs, which are evaluation methods that focus on the top of the pyramid in Miller's framework for evaluating clinical competence and gather data on how well doctors perform in their regular practice. Some of the most popular techniques for workplace-based evaluation include Direct Observation of Procedural Skills (DOPS), Mini-Clinical Evaluation Exercise (mini-CEX), and Case-Based Discussion (CBD). The lecture described the benefits of the mini-CEX and DOPS evaluation techniques and how they might be used to enhance active, learner-centered learning and the provision of developmental feedback for medical students and postgraduates. One lecture of about 120 minutes was given; its objectives involved the definition of workplace-based assessment, its different approaches, how to conduct it, and the privileges and challenges of its implementation.

The second phase involved rolling out monthly Mini-CEX and DOPS sessions for postgraduate students in family medicine (N = 21) from January to March 2020. The Mini-CEX was implemented monthly for three consecutive months. It was conducted in the family medicine clinic on a case selected from their curriculum (chronic obstructive pulmonary disease), using a checklist designed according to the Royal College of General Practitioners.

Each student had 15 minutes to conduct his/her Mini-CEX followed by five minutes to receive an informative feedback from his/her observer through using a predesigned checklist. This checklist involved items of history taking, physical examination skills, communication skills, clinical judgment, professionalism, organization/efficiency, and overall clinical care (Table 1).

Rating Scale: A six-point rating scale is used in the Mini-CEX: a score of 2 or less was considered unsatisfactory, a score of 3 borderline, while a score of 4-6 was considered to meet or exceed expectations.
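As an illustration only (the function name and structure are not part of the study protocol), the six-point Mini-CEX scale described above can be sketched as a simple mapping:

```python
def mini_cex_category(score: int) -> str:
    """Map a Mini-CEX item score (1-6) to its performance category.

    Illustrative sketch of the six-point scale described in the text;
    this helper is hypothetical and not taken from the study materials.
    """
    if not 1 <= score <= 6:
        raise ValueError("Mini-CEX item scores range from 1 to 6")
    if score <= 2:
        return "unsatisfactory"   # score of 2 or less
    if score == 3:
        return "borderline"       # score of exactly 3
    return "meets or exceeds expectations"  # scores 4-6
```

For example, a trainee scoring 3 on history taking would be categorized as borderline for that item.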

DOPS is a formative assessment method for workplace-based evaluation of postgraduate students' procedural skills. It is the counterpart of the mini-CEX for practical skills as part of the quality-control process. For maximum educational impact, the trainee is observed during their regular daily performance at work and is given constructive feedback immediately afterwards. The postgraduates performed DOPS in the skills lab on adult cardiopulmonary resuscitation (CPR) for two consecutive months, using a checklist designed according to American Heart Association guidelines; 42 DOPS feedback sheets on adult CPR were obtained.

Rating Scale: Each student's performance during DOPS was scored per item as (1) if the procedure step was done and (0) if it was not done.

Each student received feedback about his/her performance at the end of each session (Mini-CEX or DOPS). Finally, ten questions were used to evaluate how satisfied the students were with the application of WBAs. Three questions concerned the Mini-CEX (whether it helped them understand the required material, whether it provided useful feedback on how they performed, and whether the feedback about their Mini-CEX performance was given in a friendly manner), and four concerned DOPS (whether it was useful for understanding the CPR and choking content, whether it provided effective feedback about students' DOPS performance, and whether that feedback was given in a friendly manner). The postgraduates were also asked about their willingness to undertake this type of formative exam again, whether they would recommend it to other medical students, and whether it offered an opportunity for mistakes to be picked up easily and knowledge gaps to be addressed earlier. Responses on the satisfaction items were recorded on a five-point scale: strongly agree, agree, neutral, disagree, and strongly disagree.

Data management: Data were analyzed using Excel 2013 and the Statistical Package for the Social Sciences (SPSS) version 23 on an IBM personal computer. Quantitative data were expressed as mean ± standard deviation (X ± SD) and analyzed by Student's t-test. A one-way ANOVA test was used to detect statistical differences among the means of the Mini-CEX change scores, and the Chi-square test to detect statistical differences among the frequency change scores of DOPS.
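The analyses named above (one-way ANOVA across the three Mini-CEX sessions, and a paired t-test between the two DOPS sessions) can be sketched as follows. This is a minimal illustration using made-up score vectors; the numbers below are NOT the study data, and SciPy is assumed to be available rather than SPSS:

```python
# Illustrative sketch of the statistical tests described in the text.
# The score vectors are hypothetical demonstration data, not the
# actual study data reported in Tables 2 and 3.
import numpy as np
from scipy import stats

# Hypothetical total Mini-CEX scores for the same trainees across
# three consecutive sessions: compared with one-way ANOVA.
mini_cex_1 = np.array([8, 10, 9, 12, 7, 11])
mini_cex_2 = np.array([23, 26, 24, 27, 22, 25])
mini_cex_3 = np.array([27, 28, 26, 29, 27, 28])
f_stat, p_anova = stats.f_oneway(mini_cex_1, mini_cex_2, mini_cex_3)

# Hypothetical total DOPS scores for the same trainees in the first
# and second sessions: compared with a paired Student's t-test.
dops_1 = np.array([5, 7, 6, 8, 4, 6])
dops_2 = np.array([8, 10, 9, 10, 8, 9])
t_stat, p_paired = stats.ttest_rel(dops_1, dops_2)

print(f"One-way ANOVA: F = {f_stat:.2f}, p = {p_anova:.4g}")
print(f"Paired t-test: t = {t_stat:.2f}, p = {p_paired:.4g}")
```

A p-value below 0.05 (here, well below 0.001 for both tests on the demonstration data) would be reported as a statistically significant change across sessions, matching the reporting convention used in the Results tables.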

Figure 1: Changes in MINICEX student’s feedback scores.

Figure 2: Satisfaction of postgraduates with the application of workplace-based assessment (WBAs).


A total of 21 family medicine postgraduates were recruited in this study. Their ages ranged from 26 to 32 years (mean ± SD: 27.7 ± 1.68 years), with female predominance (19 females, 90.9%).

There was a significant improvement in the postgraduate students' Mini-CEX feedback scores over three consecutive months regarding medical history, physical examination, communication skills, management, organization and professionalism (9.5 ± 2.7, 24.9 ± 2.5 & 27.29 ± 1.5) (P < 0.001) (Table 2 & Figure 1).

There was a significant improvement in the postgraduate students' DOPS feedback scores over two consecutive months in the skills lab (6.1 ± 1.8 versus 9.0 ± 1.2) (P < 0.001) (Table 3).

About 93% of students recommended its application for other medical students and 86% of them agreed to perform it again for other different cases and procedures (Table 4 & Figure 2).

Table 1: Specific competencies assessed on mini-CEX. Adapted from AMEE Guide, 2007 [15].
History taking Facilitates the accurate collection of a patient's medical history and effectively applies questions to get the necessary information. Creates trust and confidentiality and demonstrates respect, compassion, and empathy.
Physical Examination Follows an effective, logical sequence; balances problem diagnosis and screening; cares about the safety and respect of the patient.
Communication skills Communicates effectively with patients and their relatives. Explains the purpose of the test or treatment and gets the patient's permission. Teaches/advises on disease management.
Management Makes appropriate diagnosis and management decisions, considering the risks and benefits of prescribed treatment.
Professionalism Demonstrates respectful and professional behavior when interacting with patients, their attendants, and other professionals (e.g., peers, consultants, nursing professionals and support personnel); accepts and completes responsibilities while displaying ethics.
Organization/efficiency Prioritizes; is timely and succinct; summarizes
Overall Clinical Competence Demonstrates judgment, synthesis, caring, effectiveness and efficiency in patient care

Table 2: Changes in MINICEX students' feedback scores.
Items of assessment First MINICEX Second MINICEX Third MINICEX F P value
History taking 1.19 ± 0.40 3.42 ± 0.67 4.0 ± 0.55 151.14 < 0.001*
Physical examination 1.19 ± 0.40 3.38 ± 0.74 3.6 ± 0.49 117.96 < 0.001*
Communication skills 2.00 ± 1.34 4.04 ± 0.59 4.1 ± 0.57 37.29 < 0.001*
Management 1.61 ± 0.92 3.48 ± 0.51 3.9 ± 0.36 72.96 < 0.001*
Professionalism 1.19 ± 0.60 3.57 ± 0.51 3.9 ± 0.54 151.78 < 0.001*
Organization 1.09 ± 0.30 3.57 ± 0.59 4.0 ± 0.63 182.75 < 0.001*
Overall clinical care 1.19 ± 0.51 3.38 ± 0.67 4.0 ± 0.63 110.30 < 0.001*
Total score 9.5 ± 2.7 24.9 ± 2.5 27.29 ± 1.5 379.00 < 0.001*

F = One-way ANOVA test; * highly statistically significant difference.

Table 3: Changes in Direct Observation of Procedures (DOPS of Adult CPR) feedback scores of the postgraduates.
Items of assessment First DOPS Second DOPS Paired T test P value
-Assessment of scene safety/PPE 0.62 ± 0.49 0.86 ± 0.36 2.02 0.06
-Check victim response 0.52 ± 0.51 1.00 ± 0.00 4.26 < 0.001**
-Check carotid pulse and breathing for not more than 10 seconds 0.67 ± 0.48 1.00 ± 0.00 3.16 0.005*
-Call for help (ambulance/AED) 0.478 ± 0.51 0.81 ± 0.40 2.32 0.03*
-Remove victim clothes 0.71 ± 0.46 0.90 ± 0.30 1.45 0.16
-Start effective compressions:
Proper hand placement at lower 1/3 of sternum.
Compress at least 2 inches.
Conduct 30 compressions.
Allow complete chest recoil.
0.62 ± 0.49 0.95 ± 0.22 2.65 0.02*
-Open airway (head tilt chin lift) 0.67 ± 0.48 1.00 ± 0.00 3.16 0.005*
-Effective breathing (2 breaths every 30 compressions),
noticing chest rise, by pocket mask in one-rescuer CPR.
0.81 ± 0.40 0.86 ± 0.36 0.44 0.67
-Effective breathing (2 breaths every 30 compressions),
noticing chest rise, by bag mask in two-rescuer CPR.
0.52 ± 0.51 0.76 ± 0.43 2.02 0.06
-Automated External Defibrillator (AED) use:
Turn it on.
Follow AED prompts.
Give shock when advised
0.48 ± 0.51 0.86 ± 0.36 2.96 0.008*
-Total DOPS score 6.09 ± 1.84 9.00 ± 1.22 5.79 < 0.001**

Table 4: Postgraduate students' satisfaction with the application of workplace-based assessment (WBAs).
Strongly disagree Disagree Neutral Agree Strongly agree Items
0.0% 0.0% 3.1% 45.5% 48.5% -MINICEX useful for understanding required contents.
0.0% 0.0% 3.1% 62.5% 34.4% -Direct observation of procedure (DOPS) [CPR &Chocking] useful for understanding required contents.
3.1% 6.3% 43.8% 9.4% 37.5% -Direct observation of procedure (DOPS) [IUD insertion] useful for understanding required contents.
0.0% 0.0% 6.3% 50.0% 43.8% -DOPS (of CPR &Chocking) is useful for understanding required contents.
0.0% 0.0% 3.1% 46.9% 50.0% -Effective feedback was received about student’s performance in MINI-CEX.
0.0% 0.0% 3.1% 53.1% 43.8% -Effective feedback was received about student’s performance in DOPS.
0.0% 3.1% 6.3% 56.2% 34.4% -The instructor gave his feedback about student’s performance in MINICEX friendly.
0.0% 0.0% 12.5% 46.9% 40.6% -The instructor gave his feedback about student’s performance in DOPS friendly.
1.7% 8.1% 22.1% 23.1% 45.0% -Opportunity for mistakes to be picked (up) easily and knowledge gaps (to be) addressed earlier.
0.0% 2.3% 11.1% 20.2% 66.4% -Student’s request to perform this type of assessment again for different cases and procedures.
0.0% 0.0% 7.0% 26% 67.0% -Students’ recommendation of this form of assessment for other medical students.


Assessment acts as the steering wheel for medical education at all levels of training. Medical school graduates will face a variety of healthcare scenarios, all of which require professional, competent, and skillful responses, because the field of medicine is highly complex [16]. During the training period, evaluation plays a significant role in helping trainees identify their strengths and the areas that need improvement. It also assists them in developing the skills necessary to handle a variety of situations. The safety of patients must also be taken very seriously. Medical graduates who participate in assessments have a special opportunity to develop professionally based on input from Workplace-Based Assessment assessors [17].

The purpose of this study was to conduct a lecture on WBAs for family medicine staff members and postgraduate students, apply three Mini Clinical Evaluation Exercises (MINICEX) and two Direct Observations of Procedural Skills (DOPS) for family medicine postgraduates, and then assess postgraduate satisfaction with WBA.

The postgraduate students’ Mini-CEX feedback scores for medical history, physical examination, communication skills, management, organization, and professionalism showed a significant improvement over three consecutive months, as did the overall clinical care score (9.5 ± 2.7, 24.9 ± 2.5, and 27.29 ± 1.5) (P < 0.001) (Table 2 & Figure 1). Various previous researchers found that the mini-CEX positively affects the learning process: it is a useful assessment tool with a positive influence on learning [18,19]. By doing the mini-CEX repeatedly, students spend more time practicing history taking and physical examinations, increasing their learning time. The mini-CEX provides a positive experience, both in terms of knowledge and clinical skills [20]. This finding coincides with that of Vaughan et al., who studied students’ learning response to feedback during the mini-CEX in 24 participants (9 males and 15 females) undergoing clerkship in the internal medicine rotation, in order to prevent recall bias [21]. Feedback content is useful for describing students’ performance in terms of achieved competence and the performance gap, and it is important for improving students’ learning response. Also, Khalil et al., who studied the mini-CEX in 20 postgraduate trainees (12 males, 8 females) assessed for clinical competence in pediatrics, reported that feedback has a positive impact on students’ future performance [22]. This could be one of the main advantages of using the Mini-CEX for formative assessment of postgraduates.

There was a significant improvement in the postgraduate students’ total DOPS feedback scores over two consecutive months in the skills lab (6.1 ± 1.8 versus 9.0 ± 1.2) (P < 0.001) (Table 3). This is consistent with numerous results, including those from studies by Hengameh H et al. and Nazari R et al., which compared the effects of DOPS with the standard evaluation method on Iranian nursing students and found that DOPS significantly increased clinical skills (P = 0.000) [23,24]. Additionally, Shahgheibi Sh et al. evaluated the effects of DOPS on students’ learning levels during clinical externships in the obstetrics department; their study, conducted in Iran with 73 medical students (42 control and 31 intervention), showed that DOPS was more effective at enhancing students’ skills than the control condition (P = 0.001) [25].

Also, 193 students in a surgical skills lab course participated in a prospective randomized experiment conducted in Austria in 2015 by Profanter C and Perathoner A. The DOPS group demonstrated a higher degree of clinical skill than the control group; the DOPS dimensions appear to enhance tutoring and performance rates [26]. In 2014, in India, Dabhadkar S et al. examined how DOPS affected second-year postgraduate OBGY students’ learning: five out of six students with unsatisfactory performance in the first round of DOPS improved to a satisfactory level in the second round [27]. Bagheri M et al. examined the effects of DOPS on the learning of postgraduate emergency medicine students in Iran (25 in the experimental and 21 in the control group); in comparison to the control group, the experimental group’s mean scores were significantly higher (P = 0.0001, t = 4.9) [28].

In this study, about 93% of students recommended its application for other medical students, and 86% of them agreed to perform it again for other cases and procedures (Table 4 & Figure 2). This is consistent with a study of surgical residents at a government medical college and tertiary care teaching hospital in India in 2017, who readily accepted the WBA tool; their high level of satisfaction with it provided evidence of this. WBAs enable residents to continuously improve in identified weak areas, with an improvement in performance over the WBA period [29]. According to the Tenzin et al. study, OBGYN residents were the most satisfied with workplace assessments at 90%, followed by pediatricians at 80% [30].


Workplace-based assessment proved its ability to improve clinical knowledge and skills among family medicine postgraduates, who showed great satisfaction with it. This initiative’s main goals were to develop highly qualified specialists and competency- and outcome-based yet learner-centered programs.


Conflict of Interest: None declared.

Funding Sources: No funding sources.

Acknowledgements: All authors acknowledge and appreciate the considerable efforts of all the respectable participants in this current work.

Ethical approval: The Institutional Ethical Committee Board of the Faculty of Medicine, Menoufia University, Egypt, approved the study and ensured the confidentiality of data (IRB 3/2020FAML 2-C).

Authors’ contributions:

SHA: Conceived and designed this study, designed the data collection instruments, provided research materials, conducted the awareness lecture, Mini-CEX and DOPS sessions, collected and organized data, analyzed and interpreted the obtained data, and edited, reviewed and critically revised the article for important intellectual content.

NAF: Conceived and designed the study, designed the data collection instruments, provided research materials and organized data, wrote the initial and final drafts of the article, edited and reviewed the article, and submitted it on the journal website.


  1. Government of South Africa. Chapter 10: Promoting health. In: National Development Plan 2030, 1st ed. Pretoria: Government of South Africa; 2013: 330–351.
  2. Von Pressentin KB, Mash RJ, Baldwin-Ragaven L, Botha RPG, Govender I & Steinberg WJ: The bird’s-eye perspective: how do district health managers experience the impact of family physicians within the South African district health system? A qualitative study, South African Family Practice, 2018: 60:1, 13-20, DOI: 10.1080/20786190.2017.1348047
  3. Von Pressentin KB, Mash RJ, Baldwin-Ragaven L, Botha RPG, Govender I & Steinberg WJ, et al. The perceived impact of family physicians on the district health system in South Africa: a cross-sectional survey. BMC Fam Pract 19, 24 (2018).
  4. Mash R, Ogunbanjo G, Naidoo SS, Hellenberg D. The contribution of family physicians to district health services: a national position paper for South Africa. South African Fam Pract. 2015; 57: 54–61.
  5. Scheele F, Teunissen P, Van Luijk S, Heineman E, Fluit L, Mulder H, et al. Introducing competency-based postgraduate medical education in the Netherlands. Med Teach. 2008; 30: 248-53.DOI: 10.1080/01421590801993022
  6. Rethans JJ, Norcini JJ, Barón-Maldonado M, Blackmore D, Jolly BC, LaDuca T, et al. The relationship between competence and performance: implications for assessing practice performance. Med Educ. 2002; 36(10): 901-9. doi: 10.1046/j.1365-2923.2002.01316.x.
  7. Franche R-L, Cullen K, Clarke J, Irvin E, Sinclair S, Frank J. Workplace-based return-to-work interventions: a systematic review of the quantitative literature. J Occup Rehabil. 2015;15(4):607–631. doi: 10.1007/s10926-005-8038-8.
  8. Swanwick T and Chana N. Postgraduate Medical Education and Training Board Workplace Based Assessment Subcommittee. Workplace based assessment. Postgraduate Medical Education and Training Board. Br J Hosp Med (Lond) 2009; 70:290-3. doi: 10.12968/hmed.2009.70.5.42235.
  9. Augustine K, McCoubrie P, Wilkinson J, McKnight L.Workplace-based assessment in radiology—where to now? Clin Radiol. 2010;65(4):325–332. doi: 10.1016/j.crad.2009.12.004. Epub 2010 Feb 4.
  10. Pelgrim E, Kramer A, Mokkink H, Van den Elsen L, Grol R, Van der Vleuten C. In-training assessment using direct observation of single-patient encounters: a literature review. Adv Health Sci Educ. 2011;16(1):131–142. doi: 10.1007/s10459-010-9235-6. Epub 2010 Jun 18.
  11. Bindal N, Goodyear H, Bindal T, Wall D.DOPS assessment: A study to evaluate the experience and opinions of trainees and assessors. Med Teach.2013; 35(6): e1230–e1234. doi: 10.3109/0142159X.2012.746447.
  12. Kogan JR and Holmboe E. Realizing the promise and importance of performance-based assessment. Teach Learn Med. 2013;25(Suppl1): S68-74.DOI: 10.1080/10401334.2013.842912.
  13. Heeneman S, Oudkerk Pool A, Schuwirth LW, van der Vleuten CP, Driessen EW. The impact of programmatic assessment on student learning: theory versus practice. Med Educ. 2015;49(5):487-498. DOI: 10.1111/medu.12645.
  14. Burford B, Illing J, Kergon C, Morrow G, Livingston M. User perceptions of multi-source feedback tools for junior doctors. Med Educ 2010; 44:165-76.
  15. Norcini J, Burch V. Workplace-based assessment as an educational tool: AMEE Guide No. 31. Medical Teacher. 2007; 29(9):855–71. doi: 10.1080/01421590701775453.
  16. Henry D and West DC. The clinical learning environment and workplace-based assessment: frameworks, strategies, and implementation. Pediatr Clin North Am. 2019;66(4):839- 54. doi: 10.1016/j.pcl.2019.03.010.
  17. Prins SH, Brøndt SG, Malling B. Implementation of workplace-based assessment in general practice. Educ Prim Care. 2019;30(3):133-44. doi: 10.1080/14739879.2019.1588788.
  18. Malhotra S, Hatala R and Courneya CA. Internal medicine residents’ perceptions of the Mini-Clinical Evaluation Exercise. Med Teach. 2008; 30: 414-419. doi: 10.1080/01421590801946962.
  19. Weller JM, Jones A, Merry AF, Jolly B and Saunders D. Investigation of trainee and specialist reactions to the mini-Clinical Evaluation Exercise in anesthesia: implications for implementation. Br J Anaesth. 2009; 103: 5. doi: 10.1093/bja/aep211. Epub 2009 Aug 17.
  20. Suhoyo Y, van Hell EA, Prihatiningsih TS, Kuks JB and Cohen-Schotanus J. Exploring cultural differences in feedback processes and perceived instructiveness during clerkships: replicating a Dutch study in Indonesia. Med Teach. 2014; 36: 223-229. doi: 10.3109/0142159X.2013.853117. Epub 2013 Dec 2.
  21. Vaughan, Brett and Moore K. The mini–Clinical Evaluation Exercise (mini-CEX) in a pre-registration osteopathy program: Exploring aspects of its validity. International Journal of Osteopathic Medicine, 2017; 19. 61 - 72. ISSN 1746- 0689. DOI:10.1016/J.IJOSM.2015.07.002
  22. Khalil S, Aggarwal A, Mishra D. Implementation of a Mini-Clinical Evaluation Exercise (Mini-CEX) Program to Assess the Clinical Competence of Postgraduate Trainees in Pediatrics.2017; 15;54(4):284-287. doi: 10.1007/s13312-017-1089-z. Epub 2017 Feb 2.
  23. Hengameh H, Afsaneh R, Morteza K, Hosein M, Marjan SM, Abbas E.The Effect of Applying Direct Observation of Procedural Skills (DOPS) on Nursing Students’ Clinical Skills: A Randomized Clinical Trial. Glob. J. Health Sci. 2015;7(7 Spec No):17-21. doi: 10.5539/gjhs.v7n7p17.
  24. Nazari R, Hajihosseini F, Sharifnia H, Hojjati H.The effect of formative evaluation using “direct observation of procedural skills” (DOPS) method on the extent of learning practical skills among nursing students in the ICU. IJNMR.2013;18(4):290-3.
  25. Shahgheibi Sh, Pooladi A, Bahram Rezaie M, Farhadifar F, Khatibi R. Evaluation of the Effects of Direct Observation of Procedural Skills (DOPS) on Clinical Externship Students’ Learning Level in Obstetrics Ward of Kurdistan University of Medical Sciences. Journal of Medicine Education. 2009; 13 (1, 2): 29-33.
  26. Profanter C and Perathoner A. DOPS (Direct Observation of Procedural Skills) in undergraduate skills-lab: Does it work? Analysis of skills performance and curricular side effects. GMS J. Med. Educ. 2015;32 (4). doi: 10.3205/zma000987.
  27. Dabhadkar S, Wagh G, Panchanadikar T, Mehendale S, Saoji V. To evaluate Direct Observation of Procedural Skills in OBGY. NJIRM. 2014;5(3):92-7.
  28. Bagheri M, Sadeghnezhad M, Sayyadee T, Hajiabadi F. The Effect of Direct Observation of Procedural Skills (DOPS) Evaluation Method on Learning Clinical Skills among Emergency Medicine Students Iran J Med Edu. 2014;13(12): 1073-81.
  29. Joshi MK, Singh T, Badyal DK. Acceptability and feasibility of mini-clinical evaluation exercise as a formative assessment tool for workplace-based assessment for surgical postgraduate students. 2017; 63(2): 100-105. doi: 10.4103/0022-3859.201411.
  30. Tenzin K, Gyamtsho S, Wangdon T, Buttia PC, Chandan L, Rege N. Effect of use of direct observation of procedural skills for assessment for learning in Obstetrics and Gynecology postgraduate students at Medical University, Bhutan: a prospective study. Bhutan Health Journal. 2019; 5(1): 10-12.