
Workplace-based assessments of entrustable professional activities in a psychiatry core clerkship: an observational study

Abstract

Background

Entrustable professional activities (EPAs) in competency-based, undergraduate medical education (UME) have led to new formative workplace-based assessments (WBA) using entrustment-supervision scales in clerkships. We conducted an observational, prospective cohort study to explore the usefulness of a WBA designed to assess core EPAs in a psychiatry clerkship.

Methods

We analyzed changes in self-entrustment ratings of students and the supervisors’ ratings per EPA. Timing and frequencies of learner-initiated WBAs based on a prospective entrustment-supervision scale and resultant narrative feedback were analyzed quantitatively and qualitatively. Predictors for indirect supervision levels were explored via regression analysis, and narrative feedback was coded using thematic content analysis. Students evaluated the WBA after each clerkship rotation.

Results

EPA 1 (“Take a patient’s history”), EPA 2 (“Assess physical & mental status”) and EPA 8 (“Document & present a clinical encounter”) were most frequently used for learner-initiated WBAs throughout the clerkship rotations in a sample of 83 students. Clinical residents signed off on the majority of the WBAs (71%). EPAs 1, 2, and 8 showed the largest increases in self-entrustment and received most of the indirect supervision level ratings. We found a moderate, positive correlation between self-entrusted supervision levels at the end of the clerkship and the number of documented entrustment-supervision ratings per EPA (p < 0.0001). The number of entrustment ratings explained 6.5% of the variance in the supervisors’ ratings for EPA 1. Narrative feedback was documented for 79% (n = 214) of the WBAs. Most narratives addressed the Medical Expert role (77%, n = 208) and used reinforcement (59%, n = 161) as a feedback strategy. Students perceived the feedback as beneficial.

Conclusions

Using formative WBAs with an entrustment-supervision scale and prompts for written feedback facilitated targeted, high-quality feedback and effectively supported students’ development toward self-entrusted, indirect supervision levels.


Background

Introducing entrustable professional activities (EPAs) to competency-based undergraduate medical education (UME) has led to new approaches for the design of workplace-based assessments (WBAs). EPAs are observable clinical tasks and serve as units of assessment that are often based on entrustment-supervision scales [1,2,3,4,5,6]. WBAs (i.e., any type of structured assessment done in the workplace, such as the Mini-Clinical Evaluation Exercise or clinical work sampling) serve multiple purposes [7,8]. In a low-stakes context, they are intended to create opportunities for structured observation and feedback and to support the achievement of competency-based learning goals (assessment for learning) [7,9,10,11]. In the context of linking clinical UME and graduate medical education (GME) curricula [12,13], and given short clerkship rotations, the emphasis should be on maximizing the value of formative WBAs, as described in GME [14]. This value depends on the context, content, and quality of the feedback resulting from the WBAs [15,16,17,18,19,20,21]. While a number of studies have explored the potential of WBAs for generating high-quality narrative feedback in GME [22,23,24,25], little is known about the relationship between WBAs based on an entrustment scale, their narrative feedback output, and the perceived need for supervision (i.e., self-entrustment) in early-stage clinical students. In particular, changes in self-entrustment can be used as an indicator of self-efficacy [26,27,28,29]. Thus, developing higher levels of self-entrustment is relevant for self-regulated learning in clinical workplaces.

WBAs have become a central part of many graduate training programs [20,30,31,32,33] and are increasingly used to assess EPAs and competencies in undergraduate clinical training programs as well [2,4,34,35,36,37]. Typically, they are used to support the direct or indirect observation of trainees’ clinical activities and to provide assessment information for both low- and high-stakes purposes. Despite the potential of WBAs to provide formative feedback and their key role within assessment programs [8,11], major feasibility issues have been identified in workplaces. These include a lack of understanding of the purpose of WBAs among both trainers and trainees, time constraints, and a lack of training within faculties [21,32]. Duijn et al. [21] identified specific criteria for meaningful feedback on EPAs from the students’ perspective. These corresponded to the general feedback quality criteria described by Lefroy et al. [19], which included reinforcement, key point identification, strategy development, and whether feedback is actionable.

The requirements and complexity involved in aligning valid learning goals such as EPAs, rating scales, and feedback narratives from WBAs have been explored in GME [22,38] and UME [21,34,39]. However, we are unaware of any studies that have addressed the content and quality of narrative feedback resulting from WBAs based on entrustment-supervision scales in psychiatry clerkships. Prospective entrustment-supervision scales differ from traditionally abstract WBA scores in that each supervision level directly reflects the degree of supervision required when subsequently performing a clinical activity [4,6,40]. The level of supervision in ad hoc entrustment decisions (i.e., instant entrustment decisions in a real working context) is set by a clinical supervisor.

In contrast, self-entrustment (that is, evaluating one’s ability to perform a clinical activity under a given level of supervision) [27], as related to self-efficacy, is a student’s judgment concerning their ability to face a potentially challenging situation [41]. Hence, higher self-entrustment has been identified as an important factor for students’ active engagement with clinical work and self-regulated learning [27,41]. Because a number of different (self-)entrustment-supervision scales are in use [4,35,42], the relationship between self-entrustment and formative WBAs with documented ad hoc entrustment ratings and narrative feedback remains unclear. A further criticism of learner-initiated WBAs is that medical students may selectively pick favorable clinical activities for graded WBAs, resulting in selection bias [43]. To our knowledge, the question of how clerkship students use a mandatory, but purely formative and self-initiated, WBA with entrustment-supervision scales has not been studied in a core clerkship setting.

Therefore, we conducted an observational study to explore the usefulness of a developmental (that is, formative) assessment tool (a structured, paper-based observation format) designed to assess Swiss core EPAs in a psychiatry clerkship, based on a prospective entrustment-supervision scale. Formative assessment in our study included aspects that supported learning in a clinical workplace (assessment for learning), in contrast to summative assessment of learning. Our primary outcome parameter was the change in students’ self-entrusted supervision level per core EPA following a core clerkship rotation in psychiatry. Secondary outcome parameters were the timing and frequencies of learner-initiated WBAs; potential predictors for reaching self-entrusted indirect supervision levels per core EPA; the frequency, quality, and content of narrative feedback resulting from this formative assessment tool; and students’ evaluations of the WBA tool.

Methods

Study design

We chose an observational, prospective cohort study design.

Context and participants

Students were prepared for their core clerkship rotations (in the 4th year of medical school) with didactic lectures organized by specialty; clinical-skills training for taking patient histories as well as physical and mental examinations; and communication training with standardized patients. During the core clerkship year, students rotated through nine different specialties, lasting two to four weeks per rotation, in teaching hospitals affiliated with the University of Bern. The majority of students remained on the same ward as members of the clinical team during their four-week psychiatry rotation. Clinical supervision, which often included signing off on WBAs, was typically provided by residents, psychologists, and attendings (see Table 1). A national competency-based learning catalogue was introduced for the core clerkship year in UME in 2019 at the University of Bern, Switzerland [44]. The Principal Relevant Objectives and Framework for Integrated Learning and Education in Switzerland (PROFILES) was based on the CanMEDS roles and nine core EPAs (see Table 2 for descriptions) [44]. The CanMEDS roles have been extensively described elsewhere [44,45]. We slightly adapted the core EPA titles to our clinical context (e.g., instead of “Take a patient’s history” we used “Take a patient’s psychiatric history”, and instead of “Contribute to a culture of safety” we used “Identify and report opportunities to improve patient safety in a psychiatric hospital”).

Table 1 Characteristics of the study participants
Table 2 Self-entrustment ratings and workplace-based assessments (WBAs) per entrustable professional activity (EPA)

Clerkship students were required to submit at least four documented entrustment-supervision ratings on one or several WBA forms per rotation, of which one had to be for EPA 1 (“Take a patient’s psychiatric history”) and one for EPA 2 (“Assess physical & mental status”). It was the students’ responsibility to collect these WBAs during their clerkship rotations. Clinical supervisors at our teaching hospital have been working with WBAs in clerkship rotations since 2010 and are, therefore, familiar with the general WBA format [34]. Furthermore, clerkship directors attended a two-hour seminar on the nature and purpose of the novel WBA tool for EPAs prior to the start of the clerkship year. Supervising residents at our teaching hospital were also instructed during departmental meetings and via email. All assessments were formative and mandatory, and students did not receive any grades. The WBAs did not include the students’ self-entrustment ratings. To successfully complete the clerkship rotation, students needed to hand in all assessments signed by their supervisors.

To support the observation and assessment of EPAs, a form listing the nine adapted core EPAs, together with an entrustment-supervision rating scale, was developed (Fig. S1). The scale was adapted to our context from a published, prospective entrustment-supervision scale for UME [1]. Our scale included six levels of supervision, ranging from “Observe only” (level 1, the minimal expectation for students entering the clerkship year) to “The student can do this if he/she can ask for help when needed” (level 6, the target level for graduation from medical school and the highest level allowed at our institution) (Fig. S1). Clinical supervisors could rate the supervision level for one or more observed EPAs per form. We also included dedicated space on the assessment form for narrative feedback. To prompt specific and actionable feedback, we added guiding questions (“What was done well?”, “What can be improved?”, “Next steps?”) to the WBA form.
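For readers who prefer a concrete representation, the structure of one WBA form as described above can be sketched as a small data model: one or more EPA ratings on the six-level scale plus the three narrative-feedback prompts. This is a minimal illustration only; the field names and example values are ours, and only the scale endpoints, the prompts, and the rater roles are taken from the text.

```python
# Minimal sketch of one WBA form as described above (not the authors' implementation).
from dataclasses import dataclass, field
from typing import Dict

# Only the endpoints of the six-level scale are quoted in the text;
# levels 2-5 are intermediate supervision levels not spelled out here.
SUPERVISION_SCALE = {
    1: "Observe only (minimal expectation when entering the clerkship year)",
    6: "Can do this if he/she can ask for help when needed (graduation target)",
}

@dataclass
class WBAForm:
    student_id: str
    rater_role: str                                            # e.g., resident, psychologist, nurse, attending
    epa_ratings: Dict[int, int] = field(default_factory=dict)  # core EPA number -> supervision level (1-6)
    done_well: str = ""                                        # "What was done well?"
    to_improve: str = ""                                       # "What can be improved?"
    next_steps: str = ""                                       # "Next steps?"

# Hypothetical example mirroring WBA form 176 (EPAs 1 and 2 rated at level 5 by a resident)
form = WBAForm(
    student_id="anonymized-001",
    rater_role="resident",
    epa_ratings={1: 5, 2: 5},
    done_well="Great structure of clinical interview despite challenging situation",
    to_improve="Address important information such as suicidality directly",
)
```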

We collected demographic, assessment, and evaluation data from the clerkship students (n = 83) during their psychiatry rotation at our academic teaching hospital between March and November 2019. We analyzed the self-entrusted levels of supervision per student per core EPA (see Table 2) at three time points: one on the first day and two on the last day of the clerkship (the current level and a retrospective level for the first day). This was done to explore changes after the one-month clerkship rotation and retrospective adjustments of the perceived need for supervision.

Data analysis

All WBAs with supervisors’ ratings were entered into Microsoft Excel (version 16.37) for analysis. For descriptive analyses per EPA, we included the proportion of students who received an indirect supervision rating (level 6 on the entrustment scale) and comparisons of post-clerkship self-entrustment ratings with supervisors’ entrustment ratings. We ran a Wilcoxon signed-rank test to compare students’ pre- and post-clerkship self-entrustment ratings, a Spearman’s rho correlation to determine the relationship between the post-clerkship self-assessed need for supervision per EPA and the number of WBAs per EPA, and a Mann-Whitney U test to compare the proportion of students’ self-entrustment ratings (post clerkship) with the proportion of WBA ratings per EPA at the indirect supervision level. A stepwise regression analysis in SPSS (version 25; IBM Corp, Armonk, NY) was used to identify predictors of WBA entrustment ratings and self-entrustment ratings at the end of the clerkship rotation.
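As an illustration of this analysis pipeline, the sketch below runs the three non-parametric tests named above with a standard statistics library. It assumes the ratings have been exported to a table with one row per student and EPA; the file name and column names (self_pre, self_post, n_wbas, supervisor_max) are hypothetical and not taken from the study dataset.

```python
# Illustrative sketch of the non-parametric tests described above (assumed data layout).
import pandas as pd
from scipy import stats

# Hypothetical export: one row per student x EPA with pre/post self-entrustment levels,
# the number of WBAs for that EPA, and the highest supervisor rating received.
ratings = pd.read_csv("wba_ratings.csv")

# Change in self-entrustment (paired pre/post ratings): Wilcoxon signed-rank test
w_stat, w_p = stats.wilcoxon(ratings["self_pre"], ratings["self_post"])

# Post-clerkship self-entrustment vs. number of WBAs per EPA: Spearman's rho
rho, rho_p = stats.spearmanr(ratings["self_post"], ratings["n_wbas"])

# Proportions at the indirect supervision level (level 6): students' self-ratings
# vs. supervisors' WBA ratings, compared with a Mann-Whitney U test
self_indirect = (ratings["self_post"] == 6).astype(int)
supervisor_indirect = (ratings["supervisor_max"] == 6).astype(int)
u_stat, u_p = stats.mannwhitneyu(self_indirect, supervisor_indirect)

print(f"Wilcoxon W = {w_stat:.1f}, p = {w_p:.4f}")
print(f"Spearman rho = {rho:.2f}, p = {rho_p:.4f}")
print(f"Mann-Whitney U = {u_stat:.1f}, p = {u_p:.4f}")
```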

The feedback narratives from all of the WBAs were imported into the qualitative research software MAXQDA (version 20.0.8; VERBI GmbH, Berlin, Germany) for thematic content analysis [46]. We used a published coding framework for the CanMEDS roles [22] to code the individual narrative feedback. No further codes emerged from our data analysis. In addition, we defined codes for high-quality feedback based on published guidelines with strong evidence for feedback effectiveness in clinical education [19] (i.e., reinforcement, key point identification, strategy development, self-awareness, EPA-specific, and actionable feedback). We added one code for narrative feedback that explicitly stated entrustment. The first ten feedback narratives were fully and independently coded by two researchers. Discrepancies were resolved through discussion, and anchoring examples were used for further coding. One researcher coded the remainder of the full data set. Samples from the full data set were used to regularly check that there were no new coding discrepancies. All of the students from all of the psychiatry teaching hospitals were invited to evaluate the novel WBAs after each clerkship. The other psychiatry teaching hospitals used a general WBA form for all core clerkships (i.e., not adjusted to the psychiatric clinical context), which was also based on a prospective entrustment-supervision scale and prompts for narrative feedback.

Since these evaluations were conducted externally, the data included students from the pilot month and were only available in a summary report. We compared the students’ WBA evaluations with those from the previous year (where data were available), in which Mini-Clinical Evaluation Exercises (Mini-CEX) were used as the WBA format [34].

Ethics

The ethics committee of the canton of Bern reviewed the research design and exempted this study from additional ethical approval. Confidentiality and anonymity regarding electronic data was maintained throughout the study. Any names or potentially identifying information were removed before analyzing the data. All direct quotes were translated from German to English.

Results

Clerkship students in our sample (n = 83) submitted 628 distinct entrustment-supervision ratings from clinical supervisors (each rating corresponded to one EPA) on 271 WBA forms containing one or more ratings. This corresponds to an average of three forms per student and two ratings per form. The average number of observed and documented EPAs per student was 7.5 (SD = 1.2), which exceeded the minimum requirement of four per student. In addition, each student filled out three self-entrustment ratings per core EPA (Table 2). Except for the pilot data from the first month, the full clerkship-year data (nine months) were included in the final analysis.

Participant characteristics

Students in our sample represented one third of the 2019 clerkship cohort and 53% originated from the canton of Bern. The remainder of the students originated from other cantons in Switzerland, including three different language regions. Females were marginally over-represented in the sample (female-to-male ratio = 3:2), and the average student age was 24 years. Interest in psychiatry as a specialty was low at the beginning of the clerkship rotation, with only 10% planning further training. This increased to 17% after the rotation.

A total of 66 different raters signed off on the WBA forms during the study period. Most students (81%) had their WBAs signed off by one or two different raters (Table 2). The gender ratio of the raters was equal. The majority of WBAs were signed off by clinical residents (71%), followed by psychologists (15%), nurses (5%), and attending physicians (5%). The majority of resident raters (76%) had undertaken their undergraduate medical training at a different medical school. Descriptive statistics of the study participants are summarized in Table 1.

Change in self-entrustment ratings and clinical supervisors’ ratings per EPA

With regard to self-entrustment ratings, 87% (n = 72) of students did not consider themselves ready for indirect supervision for any of the nine EPAs on the first day of their clerkship rotation. Only a small proportion of students self-entrusted at the indirect supervision level at the beginning of the clerkship rotation, ranging from 0% (EPA 5) to 10% (EPA 8). Retrospective self-entrustment ratings showed a similar distribution per EPA, except for EPA 8 (“Document & present a clinical encounter”), for which more students felt ready for indirect supervision at the beginning of the clerkship than in the retrospective rating (10% versus 2%). The proportion of students who self-assessed as ready for indirect supervision was higher for all of the EPAs at the end of the clerkship (with an average increase of 15% per EPA, range: 3–44%) and showed a distribution per EPA similar to that of the supervisors’ entrustment ratings on the WBAs. The overall increase in self-entrusted indirect supervision levels was highest for EPAs 1, 2 and 8 (increases of 44, 35 and 26% of students, respectively). Most students received at least one WBA entrustment rating indicating readiness for indirect supervision (level 6 on the supervision scale, Fig. S1) for EPA 1 (72%) and EPA 2 (67%). For EPAs 3–8, the percentage of students receiving at least one indirect supervision rating ranged between 4 and 24% (Table 2). Compared with students’ end-of-clerkship self-entrustment ratings, supervisors rated a higher proportion of students at the indirect supervision level for EPAs 1, 2, 3, 4, 5, and 7 and a similar or smaller proportion for EPAs 6, 8 and 9. We found a moderate, positive, monotonic correlation between the self-entrusted supervision level and the number of WBAs per EPA (Spearman rho = 0.46, n = 684, p < 0.0001). Thus, more WBAs were associated with a higher self-entrusted supervision level (i.e., more independence).

Predictors for achieving indirect supervision levels per EPA

A regression analysis showed that the month of the clerkship rotation, age, gender, the number of entrustment-supervision ratings, and the self-assessed need for supervision at the beginning of the clerkship did not predict the level of self-assessed need for supervision per EPA at the end of the clerkship. However, the number of entrustment ratings explained 6.5% of the variance in the supervisors’ ratings for EPA 1 (F(1,63) = 4.408, p < 0.05), with an R² of 0.065. For EPA 2, interest in the specialty explained 7.5% of the variance in supervisors’ ratings (F(1,63) = 5.103, p < 0.05), with an R² of 0.075, and the number of ratings correlated positively with the level of the ratings (r = 0.25, p < 0.05). We found no predictors for supervisors’ ratings for EPA 8. Interest in the specialty explained 24% of the variance in the change in perceived need for supervision for EPA 1 (F(1,59) = 18.753, p < 0.05), with an R² of 0.241, and 13% for EPA 2 (F(1,59) = 8.715, p < 0.05), with an R² of 0.129. Interest in the specialty correlated negatively with the change in perceived need for supervision, indicating greater increases in perceived independence for students who were less interested in the specialty (Pearson r = − 0.491, p < 0.001 for EPA 1 and r = − 0.359, p < 0.01 for EPA 2). For all other EPAs, we could not analyze predictors because too few WBAs were completed.
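To make the reported variance estimates concrete, the sketch below fits a simple linear regression of the kind summarized above, using synthetic data rather than the study data. With 65 observations and one predictor it yields the F(1, 63) degrees of freedom reported for EPA 1, and the model’s R-squared is the proportion of explained variance (e.g., 0.065 corresponds to 6.5%). Variable names are ours.

```python
# Illustrative regression on synthetic data (not the study data): supervisor ratings
# regressed on the number of entrustment ratings collected for one EPA.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_ratings = rng.integers(1, 8, size=65)                        # predictor: WBAs per student for EPA 1
supervisor_level = 4 + 0.2 * n_ratings + rng.normal(0, 1, 65)  # outcome: supervision level awarded

X = sm.add_constant(n_ratings)          # adds the intercept term
model = sm.OLS(supervisor_level, X).fit()

# model.rsquared is the share of explained variance (reported as, e.g., R² = 0.065);
# model.fvalue and model.f_pvalue give the F(1, 63) statistic and its p-value.
print(model.rsquared, model.fvalue, model.f_pvalue)
```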

Timing and frequencies of learner-initiated WBAs

The average student collected one entrustment-supervision rating in the first clerkship week and two per week during the remainder of the clerkship rotation (Fig. 1a). Most entrustment-supervision ratings were collected for EPA 1 (“Take a patient’s psychiatric history”, n = 189), EPA 2 (“Assess physical & mental status”, n = 280) and EPA 8 (“Document & present a clinical encounter”, n = 71). Except for EPA 9 (“Contribute to a culture of safety”), all other EPAs were used for WBAs, but less frequently (Fig. 1b).

Fig. 1 Use pattern of formative workplace-based assessments (WBAs) in clerkship rotations

Narrative feedback resulting from the WBA tool

Students received narrative feedback on 79% (n = 214) of all WBA forms (Table 3). On two WBA forms without narrative feedback, we found statements indicating that the students had received oral feedback. The average length of the feedback narratives was 12.3 words (SD = 6.1). In terms of content, most narratives addressed the Medical Expert role (77%, n = 208), followed by the Professional (17%, n = 44) and Communicator roles (15%, n = 37). The application of knowledge for structuring a clinical interview, taking a patient history, examining mental status, and performing a physical exam was most frequently mentioned (46%), followed by communication skills with patients and showing compassion and empathy (25%) (Table 3). The following quotes illustrate the narrative feedback extracted from the WBA forms. The first one was signed off by a resident who used reinforcement in their feedback and related it back to a previous clinical observation; it provides EPA-specific feedback on the mental status exam and communication tactics:

Table 3 Narrative feedback on all submitted workplace-based assessment (WBA) forms during psychiatry clerkship rotations between March and November 2019

“Great structure of clinical interview despite challenging situation, improved focus compared to admission of first patient. Important information [suicidality marked on WBA form] must be addressed directly; sometimes you need to insist a little bit!”

(WBA form 176: EPAs 1 and 2, signed-off by resident. Supervision scale: level 5 for both)

Another salient quote by a supervising nurse illustrates reinforcement and key aspect identification in the context of performing a venipuncture:

“Correct execution [of venipuncture]. Great instruction for patient. [Clerkship student] shows friendly and empathetic interaction with patient. Needs more routine, otherwise well performed. Don’t forget hand disinfection!”

(WBA form 204: EPA 5, signed-off by nurse. Supervision scale: level 2)

Only 8% of the WBAs contained narrative elements that explicitly commented on entrustment (n = 21), such as “Independent patient admission, including mental status and documentation” (WBA form 153). In terms of feedback quality, we found different frequencies of high-quality feedback indicators. The most frequent feedback strategy was reinforcement (59%, n = 161), followed by specific comments on observed EPAs (42%, n = 113) and actionable feedback (32%, n = 87). Some clinical supervisors structured their narratives using the headings “positive” and “negative”. EPAs 1, 2 and 8 were the most frequently addressed in the narrative feedback.

Student evaluations of the WBA tool

All students agreed that the required number of WBAs had been documented (agreement rate at our institution was 100%). With regard to receiving verbal feedback for clinical competencies after WBAs, 92% of students stated that they always or mostly received verbal feedback, and 8% answered that they rarely or never received verbal feedback.

Concerning narrative feedback, 75% of all students stated that they always or mostly received narrative feedback, while 25% said they rarely or never received narrative feedback. On a 6-point Likert scale (1 = I completely disagree, 6 = I fully agree), the overall average rating for “I benefited from the feedback after workplace-based assessments” was 4.8 (4.4 in the previous clerkship year), and for “The learning goals defined after the workplace-based assessments were actionable” it was 5.1 (4.6 in the previous clerkship year).

Discussion

The aim of our study was to explore changes in students’ self-entrusted supervision level per Swiss core EPA after introducing a novel WBA format based on a prospective entrustment-supervision scale during a core clerkship rotation in psychiatry. Our results suggest that self-entrustment ratings changed per core EPA over the course of a clerkship rotation. The use of the novel WBA format correlated with an increase in self-entrustment per core EPA. That is, when used in a clerkship rotation, this WBA format was associated with progress towards higher levels of self-entrustment. Students predominantly chose three core EPAs (“Take a patient’s history”, “Assess physical & mental status”, and “Document & present a clinical encounter”) for WBAs. The narrative feedback generated with this WBA format centered on aspects of the CanMEDS roles Medical Expert, Professional, and Communicator. We also found that clerkship students were entrusted with, and observed in, clinical activities by different health professionals on the ward team. Our findings could inform future reforms of national EPA-based frameworks and competency-based curriculum designs in psychiatry for UME.

Self-entrustment ratings and predictors of achieving indirect supervision levels

Changes in self-entrustment ratings showed a distribution pattern across EPAs similar to that of the WBAs and a moderate, yet significant, correlation between self-entrustment level and number of WBAs. The number of observed and documented ratings per EPA could explain some of the variance in self-entrustment ratings at the end of the clerkship. The strength of the correlation was comparable to findings from other types of WBAs [15]. Differences in the nature of the assessed clinical activities (e.g., taking a history from a psychotic patient versus conducting a motivational interview) and in their complexity (e.g., a mental status assessment of a patient with mild depressive symptoms versus an acutely suicidal patient) could have moderated the true association between the number of WBAs and the perceived need for supervision [6,9]. Clinician educators in UME could use these types of formative WBAs to help students achieve the next supervision level. In our study, students who were initially less interested in the specialty appeared to benefit more from these WBAs in terms of an increase in self-entrustment. This finding points to the potential of these WBAs as effective assessment tools for supporting self-efficacy-related learning in workplaces, as has been described in the GME context [47].

An important finding of this study was that less than half of the students felt prepared for indirect supervision in any of the core EPAs at the end of their clerkship. As we found no significant retrospective adjustment of self-entrustment ratings per EPA, this type of self-assessment appears to be more stable than self-assessments of competence [27,48]. Reflecting on the ability to perform a concrete clinical task might differ from reflecting on a more abstract concept of competence and might be less prone to social desirability bias [41]. Considering that, for most students at our institution, this was their only clinical exposure to psychiatry, it is questionable whether a target of indirect supervision across all core EPAs is realistically achievable in this relatively short time period. Factors that might influence the achievement of indirect supervision levels include the length of the clerkship rotation and the UME curriculum structure both before and during the clerkship. A gap between the envisioned and actual entrustment-supervision levels of graduating medical students has been described similarly in the context of pediatrics [40]. Longitudinal clerkships might provide advantages in this respect [12].

Timing and frequencies of learner-initiated WBAs

Students predominantly used EPAs 1, 2 and 8 for WBAs. On the one hand, this might be a result of our specific curricular structure, with at least one WBA required for EPAs 1 and 2. On the other hand, students completed more WBAs than required and covered all other EPAs, except for EPA 9. Our interpretation of this pattern is that it most likely reflects those core EPAs that were most appropriate for the training level of early clinical students in the workplace and, thus, the activities that were predominantly entrusted to students. Other studies in UME on EPA progression and assessment have shown similar patterns, with students achieving higher supervision levels at an early phase for EPAs related to history taking, physical examination, presentation, and documentation [34,49].

However, the observed pattern might also reflect the discrepancy between core EPAs designed for UME in general and specialty-specific clinical learning environments, as described in other contexts [49]. There might be unique aspects to the practice and learning of clinical psychiatry [50], such as initially managing acutely suicidal or agitated patients (core EPA 6: “Recognize & treat an emergency”), that would make WBAs of certain (nested) EPAs especially important for achieving competency-based learning goals. Our findings support the need for the systematic development of EPAs for UME through a construct validity lens that takes into account the specific clinical context and informs clerkship curriculum design [39].

The observed distribution of WBAs also indicates that students used formative WBAs throughout their clerkship rotation without cherry-picking clinical situations at the end of the rotation. One potential reason for this might be the formative nature of our WBAs as opposed to graded WBAs, as has been described in the context of a comparable clerkship rotation program [43].

Narrative feedback from the WBA tool and student evaluations

We found that, despite the time constraints of clinical staff and the challenges of WBAs described in the literature [15,30,43], most WBA forms contained high-quality narrative feedback, which supports findings from GME [25,51]. We did not find any red-flag elements in the feedback narratives. This finding was also supported by the external student evaluations of the novel WBA format, which showed a trend towards higher perceived benefit and more actionable feedback. Therefore, we conclude that using an entrustment-supervision scale with prompts for written feedback facilitates targeted feedback. A recent review revealed that WBAs with entrustment-supervision scales and narrative feedback might be under-utilized in the context of EPAs [2].

In addition, we found that students approached different health professionals, including psychologists and nurses, to solicit feedback on clinical activities. This suggests that the potential of an inter-professional educational design for clerkship curricula should be explored more systematically. We are unaware of any studies on formalized inter-professional entrustment processes in clerkships. The narratives also addressed different CanMEDS roles. Given the workplace-based portfolios used in clerkships, our data indicate that narrative feedback from WBAs with entrustment-supervision scales might be a valuable source of information for the Medical Expert role, which was less represented (in comparison to the Professional and Communicator roles) in learning portfolios examined in a multi-center study [52]. A better understanding of how students and residents perceive the narrative-based feedback process is necessary to identify the reasons why a fifth of the WBA forms did not contain any narratives [18].

Limitations

Due to the observational design of our study, we cannot make any causal inferences about the relationship between WBAs based on entrustment-supervision scales and changes in the perceived or actual need for supervision. Furthermore, the context of psychiatry as a clinical specialty might have influenced the relative focus on clinical interviewing and communication skills for the WBAs. However, we collected data longitudinally, with both quantitative and qualitative methods, using a representative sample of the clerkship cohort. This allowed us to gain a multidimensional perspective on WBAs with a prospective entrustment-supervision scale and their formative value based on changes in self-entrustment ratings and resultant narrative feedback content and quality.

Conclusions

Using a WBA tool with an entrustment-supervision scale and prompts for written narratives appeared to facilitate targeted feedback. WBA use was correlated with students’ development towards self-entrusted, indirect supervision levels that are a prerequisite to achieving competency-based learning goals. This WBA format should, therefore, be considered for inclusion in core clerkships to support self-regulated learning. Factors influencing the achievement of indirect supervision levels, and how to leverage inter-professional clinical supervision, need further exploration.

Availability of data and materials

Anonymized quantitative data are available on request.

Abbreviations

EPA:

Entrustable professional activity

UME:

Undergraduate medical education

GME:

Graduate medical education

WBA:

Workplace-based assessment

References

  1. Chen CH, van den Broek SWE, ten Cate O. The case for use of entrustable professional activities in undergraduate medical education. Acad Med. 2015;90(4):431–6. https://doi.org/10.1097/ACM.0000000000000586.

  2. Duijn CC, et al. Assessment tools for feedback and entrustment decisions in the clinical workplace: a systematic review. J Vet Med Educ. 2019;46(3):340–52. https://doi.org/10.3138/jvme.0917-123r.

  3. Meyer EG, Chen HC, Uijtdehaage S, Durning SJ, Maggio LA. Scoping review of entrustable professional activities in undergraduate medical education. Acad Med. 2019;94(7):1040–9. https://doi.org/10.1097/ACM.0000000000002735.

  4. Peters H, Holzhausen Y, Maaz A, Driessen E, Czeskleba A. Introducing an assessment tool based on a full set of end-of-training EPAs to capture the workplace performance of final-year medical students. BMC Med Educ. 2019;19(1):207. https://doi.org/10.1186/s12909-019-1600-4.

  5. Shorey S, Lau TC, Lau ST, Ang E. Entrustable professional activities in health care education: a scoping review. Med Educ. 2019;53(8):766–77. https://doi.org/10.1111/medu.13879.

  6. Ten Cate O, Schwartz A, Chen HC. Assessing trainees and making entrustment decisions: on the nature and use of entrustment-supervision scales. Acad Med. 2020. https://doi.org/10.1097/acm.0000000000003427.

  7. van der Vleuten CP, Schuwirth LW. Assessing professional competence: from methods to programmes. Med Educ. 2005;39(3):309–17. https://doi.org/10.1111/j.1365-2929.2005.02094.x.

  8. van der Vleuten CP, et al. A model for programmatic assessment fit for purpose. Med Teach. 2012;34(3):205–14. https://doi.org/10.3109/0142159X.2012.652239.

  9. Ten Cate TJO, Snell L, Carraccio C. Medical competence: the interplay between individual ability and the health care environment. Med Teach. 2010;32(8):669–75. https://doi.org/10.3109/0142159X.2010.500897.

  10. Harris P, Bhanji F, Topps M, Ross S, Lieberman S, Frank JR, et al. Evolving concepts of assessment in a competency-based world. Med Teach. 2017;39(6):603–8. https://doi.org/10.1080/0142159X.2017.1315071.

  11. Schut S, Heeneman S, Bierer B, Driessen E, Tartwijk J, Vleuten C. Between trust and control: teachers' assessment conceptualisations within programmatic assessment. Med Educ. 2020;54(6):528–37. https://doi.org/10.1111/medu.14075.

  12. Hirsh D, Holmboe E, ten Cate O. Time to trust: longitudinal integrated clerkships and entrustable professional activities. Acad Med. 2014;89(2):201–4. https://doi.org/10.1097/ACM.0000000000000111.

  13. Murray KE, et al. Crossing the gap: using competency-based assessment to determine whether learners are ready for the undergraduate-to-graduate transition. Acad Med. 2019;94(3):338–45. https://doi.org/10.1097/ACM.0000000000002535.

  14. Govaerts M. Workplace-based assessment and assessment for learning: threats to validity. J Grad Med Educ. 2015;7(2):265–7. https://doi.org/10.4300/JGME-D-15-00101.1.

  15. Norcini J, Burch V. Workplace-based assessment as an educational tool: AMEE Guide No. 31. Med Teach. 2007;29(9-10):855–71. https://doi.org/10.1080/01421590701775453.

  16. Crossley J, Jolly B. Making sense of work-based assessment: ask the right questions, in the right way, about the right things, of the right people. Med Educ. 2012;46(1):28–37. https://doi.org/10.1111/j.1365-2923.2011.04166.x.

  17. Pelgrim EA, Kramer AWM, Mokkink HGA, van der Vleuten CPM. Quality of written narrative feedback and reflection in a modified mini-clinical evaluation exercise: an observational study. BMC Med Educ. 2012;12(1):97. https://doi.org/10.1186/1472-6920-12-97.

  18. Watling CJ, Lingard L. Toward meaningful evaluation of medical trainees: the influence of participants' perceptions of the process. Adv Health Sci Educ. 2012;17(2):183–94. https://doi.org/10.1007/s10459-010-9223-x.

  19. Lefroy J, Watling C, Teunissen PW, Brand P. Guidelines: the do's, don'ts and don't knows of feedback for clinical education. Perspect Med Educ. 2015;4(6):284–99. https://doi.org/10.1007/s40037-015-0231-7.

  20. Barrett A, et al. A BEME (Best Evidence in Medical Education) review of the use of workplace-based assessment in identifying and remediating underperformance among postgraduate medical trainees: BEME Guide No. 43. Med Teach. 2016;38:1188–98.

  21. Duijn CC, et al. Am I ready for it? Students' perceptions of meaningful feedback on entrustable professional activities. Perspect Med Educ. 2017;6(4):256–64. https://doi.org/10.1007/s40037-017-0361-1.

  22. Ginsburg S, Gold W, Cavalcanti RB, Kurabi B, McDonald-Blumer H. Competencies "plus": the nature of written comments on internal medicine residents' evaluation forms. Acad Med. 2011;86(10 Suppl):S30–S34. https://doi.org/10.1097/ACM.0b013e31822a6d92.

  23. Ginsburg S, Eva K, Regehr G. Do in-training evaluation reports deserve their bad reputations? A study of the reliability and predictive ability of ITER scores and narrative comments. Acad Med. 2013;88(10):1539–44. https://doi.org/10.1097/ACM.0b013e3182a36c3d.

  24. Cook DA, Kuper A, Hatala R, Ginsburg S. When assessment data are words: validity evidence for qualitative educational assessments. Acad Med. 2016;91(10):1359–69. https://doi.org/10.1097/ACM.0000000000001175.

  25. Young JQ, Sugarman R, Holmboe E, O'Sullivan PS. Advancing our understanding of narrative comments generated by direct observation tools: lessons from the psychopharmacotherapy-structured clinical observation. J Grad Med Educ. 2019;11(5):570–9. https://doi.org/10.4300/JGME-D-19-00207.1.

  26. Brydges R, Butler D. A reflective analysis of medical education research on self-regulation in learning and practice. Med Educ. 2012;46(1):71–9. https://doi.org/10.1111/j.1365-2923.2011.04100.x.

  27. Sagasser MH, Kramer AWM, Fluit CRMG, van Weel C, van der Vleuten CPM. Self-entrustment: how trainees' self-regulated learning supports participation in the workplace. Adv Health Sci Educ. 2017;22(4):931–49. https://doi.org/10.1007/s10459-016-9723-4.

  28. Butler DL, Brydges R. Learning in the health professions: what does self-regulation have to do with it? Med Educ. 2013;11:1057–9.

  29. Sagasser MH, Kramer AW, van der Vleuten CP. How do postgraduate GP trainees regulate their learning and what helps and hinders them? A qualitative study. BMC Med Educ. 2012;12(1):67. https://doi.org/10.1186/1472-6920-12-67.

  30. Brittlebank A, Archer J, Longson D, Malik A, Bhugra DK. Workplace-based assessments in psychiatry: evaluation of a whole assessment system. Acad Psychiatry. 2013;37(5):301–7. https://doi.org/10.1176/appi.ap.11110198.

  31. Shalhoub J, Vesey AT, Fitzgerald JEF. What evidence is there for the use of workplace-based assessment in surgical training? J Surg Educ. 2014;71(6):906–15. https://doi.org/10.1016/j.jsurg.2014.03.013.

  32. Massie J, Ali JM. Workplace-based assessment: a review of user perceptions and strategies to address the identified shortcomings. Adv Health Sci Educ. 2016;21(2):455–73. https://doi.org/10.1007/s10459-015-9614-0.

  33. Hodwitz K, Tays W, Reardon R. Redeveloping a workplace-based assessment program for physicians using Kane's validity framework. Can Med Educ J. 2018;9(3):e14–24. https://doi.org/10.36834/cmej.42286.

  34. Montagne S, Rogausch A, Gemperli A, Berendonk C, Jucker-Kupper P, Beyeler C. The mini-clinical evaluation exercise during medical clerkships: are learning needs and learning goals aligned? Med Educ. 2014;48(10):1008–19. https://doi.org/10.1111/medu.12513.

  35. Klapheke M, Johnson T, Cubero M. Assessing entrustable professional activities during the psychiatry clerkship. Acad Psychiatry. 2017;41(3):345–9. https://doi.org/10.1007/s40596-017-0665-9.

  36. Curran VR, Deacon D, Schulz H, Stringer K, Stone CN, Duggan N, et al. Evaluation of the characteristics of a workplace assessment form to assess entrustable professional activities (EPAs) in an undergraduate surgery core clerkship. J Surg Educ. 2018;75(5):1211–22. https://doi.org/10.1016/j.jsurg.2018.02.013.

  37. Dory V, Gomez-Garibello C, Cruess R, Cruess S, Cummings BA, Young M. The challenges of detecting progress in generic competencies in the clinical setting. Med Educ. 2018;52(12):1259–70. https://doi.org/10.1111/medu.13749.

  38. Ginsburg S, van der Vleuten C, Eva KW, Lingard L. Hedging to save face: a linguistic analysis of written comments on in-training evaluation reports. Adv Health Sci Educ. 2016;21(1):175–88. https://doi.org/10.1007/s10459-015-9622-0.

  39. Taylor D, et al. Constructing approaches to entrustable professional activity development that deliver valid descriptions of professional practice. Teach Learn Med. 2021;33(1):89–97. https://doi.org/10.1080/10401334.2020.1784740. Epub 2020 Jul 7.

  40. Schumacher DJ, Schwartz A, Zenel JA, Paradise Black N, Ponitz K, Blair R, et al. Narrative performance level assignments at initial entrustment and graduation: integrating EPAs and milestones to improve learner assessment. Acad Med. 2020;95(11):1736–44. https://doi.org/10.1097/ACM.0000000000003300.

  41. Eva KW, Regehr G. Self-assessment in the health professions: a reformulation and research agenda. Acad Med. 2005;80(Supplement):S46–54. https://doi.org/10.1097/00001888-200510001-00015.

  42. Schatte D, Gavero G, Thomas L, Kovach J. Field guide to boot camp curriculum development. Acad Psychiatry. 2019;43(2):224–9. https://doi.org/10.1007/s40596-018-0933-3.

  43. Daelmans HE, et al. What difficulties do faculty members face when conducting workplace-based assessments in undergraduate clerkships? Int J Med Educ. 2016;7:19–24. https://doi.org/10.5116/ijme.5689.3c7f.

  44. Michaud P-A, Jucker-Kupper P. PROFILES: Principal Relevant Objectives and Framework for Integrated Learning and Education in Switzerland, in The Profiles Working Group. Bern: Joint Commission of the Swiss Medical Schools; 2017.

  45. Frank JR, Danoff D. The CanMEDS initiative: implementing an outcomes-based framework of physician competencies. Med Teach. 2007;29(7):642–7. https://doi.org/10.1080/01421590701746983.

  46. Pope C, Ziebland S, Mays N. Analysing qualitative data. BMJ. 2006:63–81.

  47. Prentice S, Benson J, Kirkpatrick E, Schuwirth L. Workplace-based assessments in postgraduate medical education: a hermeneutic review. Med Educ. 2020;54(11):981–92. https://doi.org/10.1111/medu.14221.

  48. Nagler M, Feller S, Beyeler C. Retrospective adjustment of self-assessed medical competencies: noteworthy in the evaluation of postgraduate practical training courses. GMS Z Med Ausbild. 2012;29(3):Doc45. https://doi.org/10.3205/zma000815. Epub 2012 May 15.

  49. Colbert-Getz JM, et al. To what degree are the 13 entrustable professional activities already incorporated into physicians' performance schemas for medical students? Teach Learn Med. 2019;31(4):361–9. https://doi.org/10.1080/10401334.2019.1573146. Epub 2019 Mar 15.

  50. Pinilla S, Lenouvel E, Strik W, Klöppel S, Nissen C, Huwendiek S. Entrustable professional activities in psychiatry: a systematic review. Acad Psychiatry. 2020;44(1):37–45. https://doi.org/10.1007/s40596-019-01142-7.

  51. Young JQ, McClure M. Fast, easy, and good: assessing entrustable professional activities in psychiatry residents with a mobile app. Acad Med. 2020. https://doi.org/10.1097/acm.0000000000003390.

  52. Michels NR, et al. Content validity of workplace-based portfolios: a multi-centre study. Med Teach. 2016;38(9):936–45. https://doi.org/10.3109/0142159X.2015.1132407.


Acknowledgements

We are grateful for the statistical support and advice of Daniel Stricker from the Institute for Medical Education (IML) at the University of Bern, Switzerland.

Funding

Not applicable

Author information

Contributions

SP and SH conceptualized the study. SP and AK were responsible for data collection and extraction. SK, WS and CN were responsible for curriculum design and assessment. All authors were involved in the final data synthesis and in drafting and revising the final manuscript.

Corresponding author

Correspondence to Severin Pinilla.

Ethics declarations

Ethics approval and consent to participate

The cantonal ethics committee of Bern (Kantonale Ethikkommission Bern, Gesundheits- und Fürsorgedirektion des Kantons Bern, member of the Swiss Association of Research Ethics Committees) reviewed the research design and exempted this study from additional ethical approval (05.11.2018, project ID: 2018–01966).

Consent for publication

Not applicable.

Competing interests

All authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1: Figure S1.

Translated WBA-tool based on a prospective entrustment-supervision scale.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Pinilla, S., Kyrou, A., Klöppel, S. et al. Workplace-based assessments of entrustable professional activities in a psychiatry core clerkship: an observational study. BMC Med Educ 21, 223 (2021). https://doi.org/10.1186/s12909-021-02637-4


Keywords

  • Entrustable professional activities
  • Entrustment
  • Workplace-based assessment
  • Undergraduate medical education
  • Clerkship