
Evaluating the effectiveness of a single-day simulation-based program in psychiatry for medical students: a controlled study

Abstract

Background

Training in psychiatry requires specific knowledge, attitudes, and skills that can be acquired through simulation, whose use in the field is relatively recent and still needs further development. Evidence of its effectiveness is accumulating but requires further validation for medical students. We aimed to evaluate the effectiveness of an optional single-day simulation-based teaching program in psychiatry for medical students and, as part of the assessment, to validate a scale measuring Confidence in Psychiatric Clinical Skills (CPCQ).

Methods

This was a controlled study in a French university that compared knowledge and attitudes (university grades and CPCQ scores) before, just after teaching with simulated patients, and 2 months later, using paired-sample Student t-tests. Satisfaction with the program (including the quality of the debriefing) was also investigated. The CPCQ scale was validated by assessing its factor structure, internal consistency, and test-retest reliability. Finally, analyses of covariance were used to compare the simulation group with a control group that received the usual psychiatric instruction.

Results

Twenty-four medical students were included in the simulation group and 76 in the control group. Just after the simulation, knowledge and attitudes increased significantly in the simulation group. Satisfaction with the training and debriefing was very high. The CPCQ scale showed good psychometric properties: a single-factor structure, acceptable internal consistency (α = 0.73 [0.65–0.85]), and good test-retest reliability (ICC = 0.71 [0.35–0.88]). Two months after the simulation, knowledge and attitudes were significantly higher in the simulation group than the control group, despite a lack of difference in knowledge before the simulation.

Conclusions

Adding a simulation program in psychiatry to the usual teaching improved the knowledge and confidence of medical students. The CPCQ scale could be used for the evaluation of educational programs.


Background

In medical education, the use of simulation as a pedagogical method has increased greatly since the introduction of "Harvey", the cardiology patient simulator, in the 1970s, followed by experiments in surgery, pediatrics, obstetrics, and anesthesia; it now concerns all medical fields [37]. It includes technology-enhanced simulation (virtual-reality simulators, mannequin-based simulation, or computer simulation with virtual patients [19]) as well as standardized patients (SPs), notably in psychiatry [2].

Simulation promotes learning through experience, which facilitates the production of knowledge. According to Kolb’s experiential learning theory, “learning is the process whereby knowledge is created through the transformation of experience”. It involves four stages, starting with concrete experience (doing/having an experience/feeling), followed by reflective observation (watching/reflecting on the experience), which leads to abstract conceptualisation and generalization (thinking, concluding, and making sense of what has happened). The last stage is active experimentation (considering how to put what has been learnt into practice) [47]. The debriefing that follows the simulated or real experience consists of a facilitated conversation, organised in three stages, in which participants analyze their actions, thought processes, and emotional states. It is a major component that emphasises reflective observation, conceptualization, and active experimentation [1,13,48,68]. Two assessment instruments are well established: the Debriefing Assessment for Simulation in Healthcare (DASH) [13] and the Objective Structured Assessment of Debriefing (OSAD) [9].

In psychiatry, simulation is a relatively new field [53]. Despite evolving dramatically over the last few years, it still requires further development [2,50]. Training in psychiatry requires particularly specific knowledge, attitudes, and skills that cannot simply be learned theoretically, without experiential learning. Simulation provides an excellent opportunity to develop the communication [55], psychotherapeutic, clinical, technical [46,64], and teamwork skills [10] needed to assess and manage various psychiatric disorders. Simulation is also effective in helping practitioners from fields other than psychiatry acquire mental health skills. For instance, mental health simulation programs have been developed for general practitioners [51], emergency physicians [17], and paediatricians [18]. A recent meta-analysis on simulation training in psychiatry for medical students, post-graduate trainees, and qualified doctors reported a threefold increase in research over the past 10 years [62]. Several universities have introduced such training as compulsory [39] because medical students are demanding it, for several reasons. All students, not only those who want to become psychiatrists, may benefit from such learning because psychiatry is de facto practiced by many physicians, starting with GPs [65]. A significant proportion of medical students never participate in a clerkship in psychiatry due to the limited number of places, and even for those who do, training may be insufficient [3,50]. Simulation could reduce fear and stigmatization. It also avoids the potential inconvenience of inexperienced students interacting with vulnerable individuals and, above all, of exposing such individuals to large groups of students [12,24].

The standard reference for evaluating a learning intervention is Kirkpatrick’s Training Evaluation Model [45], which measures impact at five levels: 1) reaction effect: satisfaction/dissatisfaction of participants; 2) learning effect: participants improve their knowledge; 3) behavioral effect: changes in attitudes, skills, or learners’ confidence in their own psychiatric clinical skills and anticipatory anxiety [43,61,70]; 4) patient results (i.e. whether the intervention improves the diagnosis or management of disorders), to approach operational effectiveness and reach patient-reported outcomes, which are essential in the evaluation of an intervention [56]; and 5) return on investment. The meta-analysis showed the global effectiveness of simulation training in psychiatry on attitudes, skills, knowledge, and satisfaction [62]. We analyzed the results for medical students: 48 studies (16 RCTs and 32 controlled studies), 10 with follow-up. Most interventions (N = 44) consisted of a single session; 4 were repeated. A positive impact on satisfaction (level 1) was reported in seven studies. However, structured investigations of medical students’ satisfaction with the debriefing in psychiatry simulation-based teaching are scarce, and studies investigating medical students’ satisfaction with simulation have mainly used non-validated tools [8,28,32,58]. The present study addresses this gap in the literature by evaluating satisfaction with the debriefing using the DASH.

For level 2, an improvement in students’ objective knowledge after simulation, relative to other pedagogical methods, was reported in 14 studies (and in none in 6). However, this improvement was usually assessed immediately after the intervention. Only four studies evaluated this outcome with a delay of 1 to 4 months, and none of them strictly focused on general psychiatry: two investigated alcohol abuse [7,41], one opioid abuse [54], and one geriatric medicine [30]. The present study addresses this gap by investigating the long-term effectiveness of simulation-based education across the whole national psychiatry program for medical students.

For level 3, 28 studies assessed attitudes (such as empathy) and skills, and 10 reported an improvement in self-confidence [26,29,61], but without assessment over time.

The review did not report any study evaluating level 4. However, a study conducted among 3rd-year medical students in Israel showed the effectiveness of a single-day simulation-based training in improving communication skills with real patients [8].

We developed and evaluated an intervention for medical students with three features: 1) a single-day program, 2) coordination between teachers of psychiatry for adults, children, and adolescents and teachers of general medicine, and 3) presentation of common situations encountered in daily practice in primary care or in the emergency room. We also developed a scale to measure learners’ confidence in their psychiatric clinical skills. Examining one’s own practice is a fundamental dimension for characterizing skills and clinical performance [49,73]. No existing tool was suitable for our study. Confidence scales from nursing education with excellent psychometric properties [27,31,36] were not adapted for medical students. General self-assessment scales of psychiatric competence have not had their psychometric properties studied [6,72]. Finally, confidence scales with satisfactory internal consistency but specific to certain clinical situations (such as suicidal risk [52,57,63] and depression [61]) could not be used for our clinical situations.

The present study thus had two objectives:

1) to evaluate the effectiveness of a single-day simulation program for medical students in terms of satisfaction (including satisfaction with the debriefing), knowledge (after the intervention and at long-term follow-up: exam grades 2 months later), and attitudes (self-confidence in clinical skills and changes in professional practices); and

2) to explore the psychometric properties of a new self-report questionnaire on confidence in one’s clinical skills in psychiatry.

Methods

Design

The study had a mixed design, comparing outcomes before and after the intervention and between the intervention and control groups (Fig. 1).

Fig. 1: Design of the study

Population

The population was recruited during the 2019–2020 academic year among the 131 fifth-year undergraduate students at the University of Versailles Saint-Quentin-en-Yvelines-Paris Saclay (the year of compulsory psychiatric training).

Intervention

The intervention consisted of a single day (8 h) of psychiatry teaching by simulation with a simulated patient, according to the definition proposed by Adamo of a “medical encounter conducted for purely educational purposes” [4]. The scenarios, decided within a group of 10 hospital-university teachers (from university fellow to professor in general medicine, child and adolescent psychiatry, or adult psychiatry), had to 1) be addressed in the curriculum of the official national program [5], 2) involve a pathology frequently encountered in general practice, and 3) be realistically performable by a team of psychiatry teachers without a specific background in dramatic arts (eliminating the scenario of schizophrenia, which is challenging to play [62]). A complete example of a scenario is presented in the Supplementary Material. The scenarios had the following general structure:

  • Information about the learners (their expected level of training, the prerequisites, and their role in the simulation)

  • Information about the trainers (their level of training and their function in the simulation: simulated patient, briefing, re-briefing, facilitator, debriefing, etc.)

  • Information about how the room should be prepared

  • Pedagogical objectives (medical & technical / non-technical), with references to the French national program for advanced medical students

  • The critical points of the briefing and debriefing

  • A summary of the clinical situation for the trainers

  • The content of the pre-briefing, with the description of the beginning of the situation that has to be communicated to the learner

  • A precise description of the role of the simulated patient (biography, social context, medical history, personality)

  • The description of the expected progress of the simulation (how the simulated patient should behave emotionally, what she/he should say, and when she/he should move from one stage to another according to the learner's behaviour)

As in several previous investigations, simulated patients were portrayed by mental health professionals [34,58,59,61,71,74]. They were trained to portray the simulated patient by two academic teachers (PR & MR), themselves trained as health simulation trainers in a short (one-week) university course. Moreover, 6 academic teachers in the team had previous experience of teaching psychiatry with simulation. The choice to use a teacher rather than a professional actor was based on a concern for realism, as psychiatry teachers should have a better knowledge of the symptoms of psychiatric disorders than a professional actor. Some authors have indeed suggested that actors sometimes tend to overplay their role or go off script [25], or even caricature mental disorders [21]. Besides, the additional cost of actors could not be supported in the present study, estimated at up to £250 per scenario for initial training and £75 for each new simulation session [21].

The four scenarios presented: a drug suicide attempt in the context of borderline personality disorder and alcohol addiction, assessed in the emergency room by a psychiatrist (Supplementary Information); and bereavement associated with post-traumatic stress disorder, hypomania, and a refusal to go to school by a 14-year-old adolescent, all assessed in a general practice setting.

Each simulation session included a briefing (10 min), the simulation (10–20 min), a structured debriefing (45 min), and a theoretical synthesis slide presentation (20 min). We used Karlsen’s RUST (Reaction – Understanding – Summarise – Take-home message) model of debriefing [44]. During the understanding phase, we used “debriefing with good judgment” [67]. This approach relies first on the detection of performance deficits in students. The teacher then tries to uncover the underlying knowledge, assumptions, and feelings that drove the students’ actions. Finally, the teacher gives feedback about the performance deficits and tries to bring the students’ underlying intentions and cognitive frames to light using a conversational technique pairing advocacy and inquiry.

Learners were divided into groups of eight (2 actively participating, 6 watching the live video broadcast in an adjacent room). Active participation was voluntary: students were asked to play a role in at least one scenario, which was the condition for validating the optional teaching unit. Three teachers were involved, one playing the patient’s role, one acting as a potential facilitator, and one staying with the learners. No randomisation was used to assign intervention or control group status. Students in the intervention group accepted an optional teaching unit on the condition that they actively participated in one scenario. The control group received the same usual psychiatric instruction as the simulation group, in the form of a compulsory two-day interactive seminar using the flipped classroom technique [38]. All included participants were provided with the written pedagogical content of the simulation sessions to ensure that any differences between the groups were related to the simulation teaching technique itself.

Measures

Knowledge

Theoretical knowledge was measured using multiple-choice questions (MCQs) three times: for all students, 2 months before the simulation teaching (45 questions before the compulsory psychiatry seminar) and 2 months after (50 MCQs during the psychiatry examination, covering the entire national program of psychiatry, which is a challenging test); and, for the simulation group only, immediately before and after the teaching (the same 28 questions asked in a different order). All scores were scaled from 0 to 20.

Attitudes (Supplementary Information 1)

Confidence was assessed with the specifically created Confidence in Psychiatric Clinical Skills Questionnaire (CPCQ): 12 items, rated on a four-point Likert scale, explored confidence in theoretical knowledge, clinical skills (clinical reasoning and psychiatric interviewing), communication and interpersonal skills (with the patient, the patient’s proxies, and other professionals), and the management of psychiatric disorders. The individual mean score was used in the analyses. The change in professional practice was evaluated with one question.

Satisfaction (Supplementary Information 2 and 3)

General satisfaction was rated out of 10. A 10-item questionnaire, rated on a four-level Likert scale, explored various aspects of satisfaction, such as the preference for simulation over another pedagogical modality, the perceived realism of the situation, the importance of being actively involved, etc. In addition, learners who underwent a clerkship in psychiatry were asked to compare it to the simulation. Questions about the scenarios and free comments were collected.

Satisfaction with the briefing and debriefing was assessed using the student version of the DASH [69]. This scale, with excellent internal consistency (0.82–0.95) [14,23,66], explores the climate and structure of the debriefing, the ability to engage in exchange, and strengths and areas for improvement. The mean across all items (6 overall assessments, 23 behavioral assessments) was used.

Statistical analysis

Pre/post-simulation comparisons

The average CPCQ and knowledge test scores just before and after simulation were compared using paired sample Student t-tests. Satisfaction was measured post-simulation.
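For illustration, the paired pre/post comparison can be reproduced with a few lines of code. This is a minimal sketch, assuming the pre- and post-simulation CPCQ means are available as two paired arrays; the variable names and values are hypothetical placeholders, not the study data.

```python
# Minimal sketch of the pre/post comparison (hypothetical data and variable names).
import numpy as np
from scipy import stats

# Paired measurements for the same students, just before and just after the simulation day.
cpcq_pre = np.array([2.1, 2.3, 2.0, 2.4, 2.2, 2.5])    # placeholder values
cpcq_post = np.array([2.6, 2.8, 2.5, 2.9, 2.7, 2.8])   # placeholder values

# Paired-sample Student t-test: each student serves as their own control.
t_stat, p_value = stats.ttest_rel(cpcq_post, cpcq_pre)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```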

Psychometric characteristics of the CPCQ scale

Construct validity was explored by exploratory factor analysis using oblimin rotation and maximum likelihood as the factorization method. Two criteria were used to determine the number of significant factors: first, Cattell’s scree test, i.e. the factors lying to the left of the deflection point of the eigenvalue curve [22], and second, Kaiser’s criterion, i.e. the factors with an eigenvalue > 1 [42]. The internal consistency of each identified factor was evaluated using Cronbach’s α coefficient [20], with the acceptability threshold set at 0.7 [11]. These analyses were carried out on the largest sample available for a single time of measurement (the final exam), bringing the two groups together.
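As a hedged illustration of this step, the eigenvalue screening (Kaiser and scree criteria) and Cronbach’s α can be computed directly from the item matrix; the sketch below uses randomly generated responses purely as a stand-in for the real CPCQ data, and the factor extraction itself (oblimin rotation, maximum likelihood) would in practice be delegated to a dedicated package.

```python
# Sketch of the eigenvalue screening and internal-consistency check (simulated stand-in data).
import numpy as np

def correlation_eigenvalues(items: np.ndarray) -> np.ndarray:
    """Eigenvalues of the item correlation matrix (rows = respondents, columns = items).
    Factors with an eigenvalue > 1 satisfy Kaiser's criterion; plotting the sorted
    eigenvalues against their rank gives the scree curve used for Cattell's test."""
    corr = np.corrcoef(items, rowvar=False)
    return np.sort(np.linalg.eigvalsh(corr))[::-1]

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances.sum() / total_variance)

rng = np.random.default_rng(0)
responses = rng.integers(1, 5, size=(100, 12)).astype(float)  # 12 CPCQ items rated 1-4
print(correlation_eigenvalues(responses)[:4])  # leading eigenvalues for the Kaiser/scree check
print(round(cronbach_alpha(responses), 2))     # internal consistency
```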

Test-retest reliability was assessed with the intra-class correlation coefficient (ICC), calculated using a two-way random-effects mixed model. It was defined as poor for an ICC < 0.4, acceptable between 0.4 and 0.59, good between 0.6 and 0.74, and excellent between 0.75 and 1 [15]. The two measurement times chosen to calculate it were those between which the least change was expected, i.e. just after the simulation and 2 months later.
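An ICC computation of this kind could look like the sketch below, which assumes long-format data (one row per student and measurement time) and uses the pingouin package; the column names and values are hypothetical, and the ICC2 row corresponds to a two-way random-effects model for single measurements.

```python
# Sketch of the test-retest ICC with a two-way random-effects model (hypothetical data).
import pandas as pd
import pingouin as pg

df = pd.DataFrame({
    "student": [1, 1, 2, 2, 3, 3, 4, 4, 5, 5],
    "time": ["post", "2months"] * 5,
    "cpcq": [2.7, 2.6, 2.5, 2.6, 2.9, 2.8, 2.4, 2.5, 2.8, 2.7],
})

icc = pg.intraclass_corr(data=df, targets="student", raters="time", ratings="cpcq")
# Keep the two-way random-effects, single-measurement estimate and its 95% CI.
print(icc.loc[icc["Type"] == "ICC2", ["ICC", "CI95%"]])
```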

Comparisons between simulation and control groups

First, age and participation in a clerkship in psychiatry (a potential confounding factor for confidence [57]) were compared between the two groups using chi-square tests, and scores on the prerequisite exam using Student’s t-test. Analyses of covariance (ANCOVA) were then carried out with the mean CPCQ score and the psychiatry final exam score as dependent variables, the group as the independent variable, and, as covariates, the variables that differed between the two groups (clerkship in psychiatry).
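A minimal sketch of such an ANCOVA, assuming a simple data frame with hypothetical column names and values, is shown below; the group effect is tested after adjusting for the clerkship covariate.

```python
# Sketch of the ANCOVA: exam score ~ group, adjusted for prior clerkship (hypothetical data).
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "exam_score": [14.5, 12.0, 15.2, 13.1, 16.0, 12.8, 13.9, 12.4],
    "group": ["simulation", "control"] * 4,
    "clerkship": [1, 0, 1, 0, 0, 1, 0, 0],  # prior clerkship in psychiatry (covariate)
})

model = smf.ols("exam_score ~ C(group) + C(clerkship)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))  # F-test of the group effect, adjusted for the covariate
```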

Number of required subjects

The number of required subjects was calculated for knowledge (score out of 20 on the usual psychiatric examination). According to the results of the previous year, the average score was 13.3, with a standard deviation of 1.9. Showing a mean difference of 2 points with an alpha risk of 5% and a statistical power of 90% required at least 19 subjects per group.
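This calculation can be checked with a standard power routine: converting the 2-point difference and the 1.9-point standard deviation into a standardized effect size and solving for the sample size gives roughly 19 to 20 students per group, consistent with the figure reported above. The snippet below is an illustrative check, not the original computation.

```python
# Illustrative sample-size check for a two-sided two-sample t-test.
from statsmodels.stats.power import TTestIndPower

effect_size = 2 / 1.9  # expected mean difference divided by the standard deviation
n_per_group = TTestIndPower().solve_power(effect_size=effect_size, alpha=0.05, power=0.90)
print(round(n_per_group, 1))  # approximately 19-20 subjects per group
```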

Ethics statement

The research was authorised on 20/12/2019 by the Ethics Committee of the University of Paris-Saclay (CER-Paris-Saclay-2019-061). All participants gave written informed consent.

Results

Participants

The study was proposed to all 131 medical students in their fifth year at the University of Versailles Saint-Quentin-en-Yvelines-Paris Saclay through an announcement by the educational department. Among these students, 100 accepted to participate: N = 24 received the intervention and N = 76 did not. The first 24 students who accepted to participate in the intervention were assigned to the intervention arm. As the teaching was optional, it was not possible to randomise participants between the two arms, since inclusion in the intervention arm had to be entirely based on student preferences. The simulation and control groups did not differ in terms of either the sex ratio (χ² = 0, p = 0.93) or initial knowledge (t(97) = 1.2, p = 0.24). There were more students with a clerkship in psychiatry in the intervention group (χ² = 3.7, p = 0.056) (Table 1).

Table 1 Comparison of the Simulation group (N = 24) and Control group (N = 76) among the 131 fifth-year medical students at the University of Versailles Saint-Quentin-En-Yvelines for characteristics, knowledge, attitudes (confidence, change in professional practice) and satisfaction in the simulation group

Knowledge (Table 1)

Theoretical knowledge improved after teaching (t(23) = 2.6, p = 0.01). The ANCOVA showed better theoretical knowledge on the psychiatry exam (F(1, 96) = 6, p = 0.016) in the simulation group than in the control group.

Attitudes (Table 1)

Confidence measured with the CPCQ scale improved after teaching (t(23) = 8.2, p < 0.001). Learners shifted from an average low confidence level (2.2 ± SD 0.3) before instruction to an average high confidence level (2.7 ± 0.2) afterwards. This gain was maintained for 2 months, insofar as confidence reassessed with the CPCQ scale did not differ significantly between immediately after the simulation and 2 months later (t(15) = 0.7, p = 0.52). The ANCOVA showed higher confidence on the CPCQ scale (F(1, 89) = 6, p = 0.003) for the simulation group than the control group.

Satisfaction (Table 1)

The overall satisfaction score was excellent (9.3 ± SD 0.6). The average score on the satisfaction questionnaire (3.5 ± SD 0.2) showed that learners were satisfied to very satisfied with the teaching. The lowest score was obtained on the question about whether the teaching should be optional or compulsory (2.7 ± SD 0.8), suggesting a neutral position for the group of learners. The highest scores were obtained for the questions on the preference for simulation over lectures (3.9 ± SD 0.3) and on the realism of the simulations (3.8 ± SD 0.4). All learners who had a clerkship in psychiatry felt that the simulation was more (4/5) or much more (1/5) informative than the clerkship.

The level of difficulty was found to be appropriate on average (3.1 ± SD 0.2). The scenarios were judged to be informative or very informative (3.5 ± SD 0.4).

Free comments were positive and suggested improvement areas (summarising an ideal psychiatric interview, furthering theoretical reminders, including other pathologies, such as eating disorders and schizophrenia).

The average total DASH score showed the briefing and debriefing of the simulation sessions to be rated as very good (6.5 ± SD 0.4). Behavioural scores suggested a good sense of security for the learners.

Psychometric characteristics of the CPCQ scale

Factor structure

The scree diagram (Fig. 2) showed a single-factor structure for the CPCQ scale according to the Cattell criterion, as the deflection of the curve occurred at two factors. The first factor was the only one with an eigenvalue > 1 (Kaiser criterion) and accounted for 20.5% of the variance.

Fig. 2: Scree diagram of the Confidence in Psychiatric Clinical Skills scale

Internal consistency

With a Cronbach’s alpha coefficient of 0.73 [0.65–0.85], the internal consistency was satisfactory.

Test-retest reliability

The ICC, equal to 0.71 [0.35–0.88], suggests good test-retest reliability.

Discussion

We evaluated the effectiveness of a single-day simulation program with simulated patients for medical students as a complement to usual teaching. Knowledge and attitudes improved after the simulation. Satisfaction, including satisfaction with the debriefing, was high.

The effectiveness of our simulation training in psychiatry in improving knowledge is consistent with previous results in medical students [62]. We confirmed a positive impact maintained at 2 months, showing for the first time an improvement in academic performance across the whole national program of psychiatry, i.e. a challenging test, 2 months later. Simulation in psychiatry, an essential addition to other pedagogical tools, allows the sustainable acquisition of knowledge that cannot simply be learned theoretically and memorized without experiential learning [16]. All participants who had both a clerkship in psychiatry and the simulation found the latter more informative, which questions the clerkship’s pedagogical benefit. Simulation with a simulated patient provides an opportunity for real-time feedback and reflection on performance, which is rarely the case in interactions between medical students and people with psychiatric disorders [21].

The improvement in attitudes is also consistent with previous studies on medical students [62]. This is the first study to report that the improvement in self-confidence is maintained 2 months after the training. Among attitudes, self-confidence is the one that has been the most explored, relative to empathy or stigmatisation [62], and it is associated with better skills, for example in assessing suicide risk [49]. It is important to note that the improvement in knowledge and attitudes occurred even though the simulated patients were played by teachers and not professional actors. Our results thus suggest that using academic teachers could be an efficient alternative in settings where actors are unavailable.

Simulation is popular among students. Our study confirmed a very high level of satisfaction with the content of the teaching and its usefulness for practice. The average DASH score, well above the usual acceptability threshold of four [14], suggests effective briefings and debriefings in a safe educational framework. Despite the lack of teachers trained to act, the simulations appeared realistic to the learners. Future studies could use validated scales, such as the Maastricht Assessment of Simulated Patients [75], to assess the quality of simulated patient roleplay reliably. Compared with other evaluated simulation programs, ours has the particular format of a single-day training, whereas most others are either shorter (one session) or longer (repeated sessions). Only one other program with a format similar to ours in its general approach to different psychiatric disorders reported a positive impact on communication and psychiatric skills for medical students [8].

The CPCQ scale measuring medical students’ confidence in their clinical skills in psychiatry showed satisfactory psychometric properties (acceptable internal consistency, good test-retest reliability, and a unifactorial structure) and proved to be an easy and rapid evaluation tool. It is an important addition to tools for which the psychometric properties are not known [40]. Given the small sample size for measuring test-retest reliability, the confidence interval obtained was large and the results should be replicated in a larger sample.

Our study had several limitations, despite a relatively high median quality as assessed by the MERSQI (Medical Education Research Study Quality Instrument): 12 (Supplementary Information 4) vs 10.8 for studies reported in the meta-analysis [62]. As in most previous studies [62], the main limitations were the absence of random assignment and a control condition consisting of teaching as usual rather than a control pedagogical intervention of the same duration as the simulation. Randomisation was not possible since this teaching was optional and therefore based on student preferences. The higher proportion of learners with a clerkship in psychiatry in the simulation group may suggest a selection bias toward individuals with a high level of interest and motivation for the discipline. Second, some measures were missing: prior exposure to simulation for both groups and a measure of pre-intervention confidence for the control group (the simulation group may have had a higher level of confidence than the control group before the intervention, in connection with participation in a clerkship in psychiatry). Third, the generalizability of our results is limited by the small sample size and the single teaching site. Fourth, our study lacked a hetero-evaluation of psychiatric clinical skills. We did not find a validated scale to assess skills in psychiatry, despite the efforts of certain authors to develop objective measures of the efficiency of a psychiatric interview [59], and an assessment by teachers was not possible, as the students participated in the simulation only once. Moreover, we did not explore level 4 of Kirkpatrick’s model, i.e. the outcome on the management of individuals with a mental disorder. This would be important to allow wider dissemination of this pedagogical technique in the mental health field.

The fact that teachers played the simulated patients has its limitations. It could have affected the training’s realism, because these non-professional actors lacked a specific background in the dramatic arts. This concern is not supported by the students’ reports that they found the scenarios very realistic (3.8/4 ± 0.4), suggesting that the teachers’ acting was satisfactory. A social desirability bias might have explained the high level of satisfaction reported in this study, as the teachers played the simulated patients’ roles. However, all questionnaires were filled in anonymously, which should have limited this bias. Moreover, while this bias might have led to a global tendency to overestimate satisfaction, it would not explain why the item on the realism of the simulation reached the second highest satisfaction score. If there had been a realism issue, this item would have been scored lower than the others.

One might also argue that the present study’s simulation format was more a structured roleplay [35] than a simulated interview. The main difference between a simulated interview and a roleplay is the predetermination of the role and the scenario [33]. In a roleplay, there is symmetry between the two players, who can perform either the patient’s or the doctor’s role. The clinical situation is partly unpredictable, with possible improvisation and weak determination of the fictional role, and it is based on the personal and professional experience of each player. In contrast, a simulated patient relies strongly on a structured scenario, with prewritten dialogues and a precise emotional, biographical, and personality portrayal. Within a simulated interview, it is impossible to switch roles, as there is a strong asymmetry between the simulated patient’s and the doctor’s backgrounds. Our study used detailed scenarios for the four simulated patients, with prewritten dialogues, a fixed progression, and facilitators to move the scenario forward in case of a stalled situation. We chose the simulated patient format rather than the roleplay format because, in our experience, roleplaying led to low student satisfaction. This experience was confirmed by the responses to the satisfaction question “Would you have preferred roleplaying (with some students playing the role of patients) rather than simulation teaching to complement your psychiatric training?”, which were between “disagree” and “strongly disagree” (3.5/4 ± 0.5).

Future studies should compare the pedagogical efficiency of simulation programmes in psychiatry using simulated patients played by health professionals against simulated patients played by professional actors. They should also focus on level 4 of Kirkpatrick’s model, for instance by investigating the impact of simulation on students’ psychiatric competencies during interviews with the real patients they encounter during their clerkship in psychiatry. It would also be interesting to compare the pedagogical efficiency, and the satisfaction in terms of feeling secure, between simulations involving one active student and those involving two simultaneous active students, as in this study. Finally, we did not use video recording for debriefing in this study; it would be important to establish how this tool, which reduces recall bias, could be integrated into a debriefing with medical students without hampering its fluidity.

Conclusion

Our study shows the effectiveness, in terms of knowledge gained, attitudes, and satisfaction, of a single-day program of teaching psychiatry through simulation with simulated patients as a complement to usual teaching for fifth-year medical students in France. The teaching has the disadvantage of being resource-intensive [60], especially in terms of human resources, with a teacher/learner ratio of 3/8. The Confidence in Psychiatric Clinical Skills Questionnaire shows acceptable psychometric properties and may be used by other educational teams involved in teaching psychiatry to medical students.

Availability of data and materials

The datasets used and/or analysed during the current study are available from the corresponding author on reasonable request.

Abbreviations

MCQ:

Multiple-choice questions

CPCQ:

Confidence in Psychiatric Clinical Skills Questionnaire

DASH:

Debriefing Assessment for Simulation in Healthcare

ICC:

Intra-class correlation coefficient

ANCOVA:

Analyses of covariance

References

  1. Abatzis VT, Littlewood KE. Debriefing in simulation and beyond. Int Anesthesiol Clin. 2015;53(4):151–62.https://doi.org/10.1097/AIA.0000000000000070.


  2. Abdool PS, Nirula L, Bonato S, Rajji TK, Silver IL. Simulation in undergraduate psychiatry: exploring the depth of learner engagement. Acad Psychiatry J Am Assoc Dir Psychiatr Resid Train Assoc Acad Psychiatry. 2017;41(2):251–61.


  3. Abed R, Teodorczuk A. Danger ahead: challenges in undergraduate psychiatry teaching and implications for community psychiatry. Br J Psychiatry J Ment Sci. 2015;206(2):89–90.https://doi.org/10.1192/bjp.bp.114.146852.


  4. Adamo G. Simulated and standardized patients in OSCEs: achievements and challenges 1992-2003. Med Teach. 2003;25(3):262–70.https://doi.org/10.1080/0142159031000100300.


  5. AESP, CNUP, CUNEA. Référentiel de Psychiatrie et d’Addictologie Psychiatrie de l’adulte. Psychiatrie de l’enfant et de l’adolescent. Addictologie: Presse Universitaire François-Rabelais; 2016.

  6. Ajaz A, David R, Bhat M. The PsychSimCentre: teaching out-of-hours psychiatry to non-psychiatrists. Clin Teach. 2016;13(1):13–7.https://doi.org/10.1111/tct.12382.


  7. Albright G, Adam C. 2016. Simulated Conversations with Virtual Patients to Prepare Health Professionals to Conduct Screening & Brief Intervention (SBI) for Substance Use and Mental Health. Accessed 18 Jul 2020.https://kognito.com/wp-content/uploads/At-Risk_in_Primary_Care_KognitoResearch_2016.pdf


  8. Amsalem D, Gothelf D, Soul O, Dorman A, Ziv A, Gross R. Single-day simulation-based training improves communication and psychiatric skills of medical students. Front Psychiatry. 2020;11:221.https://doi.org/10.3389/fpsyt.2020.00221.


  9. Arora A, Ahmed M, Sevdalis N. 2015. Evidence-based performance debriefing for surgeons and surgical teams: the Observational Structured Assessment of Debriefing tool (OSAD). Accessed 17 Jul 2020.https://www.imperial.ac.uk/media/imperial-college/medicine/surgery-cancer/pstrc/debriefingosadtool.pdf


  10. Attoe C, Kowalski C, Fernando A, Cross S. Integrating mental health simulation into routine health-care education. Lancet Psychiatry. 2016;3(8):702–3.https://doi.org/10.1016/S2215-0366(16)30100-6.

  11. Bland JM, Altman DG. Statistics notes: Cronbach’s alpha. BMJ. 1997;314(7080):572.https://doi.org/10.1136/bmj.314.7080.572.


  12. Brenner AM. Uses and limitations of simulated patients in psychiatric education. Acad Psychiatry J Am Assoc Dir Psychiatr Resid Train Assoc Acad Psychiatry. 2009;33(2):112–9.


  13. Brett-Fleegler M, Rudolph J, Eppich W, Monuteaux M, Fleegler E, Cheng A, et al. Debriefing assessment for simulation in healthcare: development and psychometric properties. Simul Healthc J Soc Simul Healthc. 2012;7(5):288–94.https://doi.org/10.1097/SIH.0b013e3182620228.


  14. Brown DK, Wong AH, Ahmed RA. Evaluation of simulation debriefing methods with interprofessional learning. J Interprof Care. 2018b;32(6):779–81.https://doi.org/10.1080/13561820.2018.1500451.

  15. Cicchetti DV. Guidelines, criteria, and rules of thumb for evaluating normed and standardized assessment instruments in psychology. Psychol Assess. 1994;6(4):284–90.https://doi.org/10.1037/1040-3590.6.4.284.


  16. Clapper TC. Beyond Knowles: what those conducting simulation need to know about adult learning theory. Clin Simul Nurs. 2010;6(1):e7–e14.https://doi.org/10.1016/j.ecns.2009.07.003.


  17. Coggins A, Marchant D, Bartels J, Cliff B, Warburton S, Murphy M, Mitra T, Ryan CJ. Simulation-based medical education can be used to improve the mental health competency of emergency physicians. Australas Psychiatry Bull R Aust N Z Coll Psychiatr. 2020;28(3):354–8.https://doi.org/10.1177/1039856220901480.

  18. Colburn MD, Harris E, Lehmann C, Widdice LE, Klein MD. Adolescent Depression Curriculum Impact on Pediatric Residents’ Knowledge and Confidence to Diagnose and Manage Depression. J Adolesc Health Off Publ Soc Adolesc Med. 2020;66(2):240–6.https://doi.org/10.1016/j.jadohealth.2019.08.022.

  19. Cook DA, Hatala R, Brydges R, Zendejas B, Szostek JH, Wang AT, et al. Technology-enhanced simulation for health professions education: a systematic review and meta-analysis. JAMA. 2011;306(9):978–88.https://doi.org/10.1001/jama.2011.1234.


  20. Cronbach LJ. Coefficient alpha and the internal structure of tests. Psychometrika. 1951;16(3):297–334.https://doi.org/10.1007/BF02310555.


  21. Dave S. Simulation in psychiatric teaching. Adv Psychiatr Treat R Coll Psychiatr J Contin Prof Dev. 2012;18(4):292–8.


  22. Dmitrienko A, Chuang-Stein C, D’Agostino Sr RB. Pharmaceutical statistics using SAS: a practical guide. Cary: SAS Institute; 2007.

  23. Dreifuerst KT. Using debriefing for meaningful learning to foster development of clinical reasoning in simulation. J Nurs Educ. 2012;51(6):326–33.https://doi.org/10.3928/01484834-20120409-02.


  24. Eagles JM, Calder SA, Nicoll KS, Walker LG. A comparison of real patients, simulated patients and videotaped interview in teaching medical students about alcohol misuse. Med Teach. 2001;23(5):490–3.https://doi.org/10.3109/01421590109177949.


  25. Eagles JM, Calder SA, Wilson S, Murdoch JM, Sclare PD. Simulated patients in undergraduate education in psychiatry. Psychiatr Bull. 2007;31(5):187–90.https://doi.org/10.1192/pb.bp.106.010793.

  26. Elley CR, Clinick T, Wong C, Arroll B, Kennelly J, Doerr H, et al. Effectiveness of simulated clinical teaching in general practice: randomised controlled trial. J Prim Health Care. 2012;4(4):281–7.https://doi.org/10.1071/HC12281.


  27. Ewalds-Kvist B, Algotsson M, Bergstrom A, Lutzen K. Psychiatric nurses’ self-rated competence. Issues Ment Health Nurs. 2012;33(7):469–79.https://doi.org/10.3109/01612840.2012.663460.


  28. Fabrizio J, DeNardi K, Boland M, Suffoletto J-A. Use of a standardized patient in teaching medical students to assess for PTSD in military veteran patients. MedEdPORTAL. 2017;13(1):10608 mep_2374-8265.10608.


  29. Fiedorowicz JG, Tate J, Miller AC, Franklin EM, Gourley R, Rosenbaum M. A medical interviewing curriculum intervention for medical students’ assessment of suicide risk. Acad Psychiatry J Am Assoc Dir Psychiatr Resid Train Assoc Acad Psychiatry. 2013;37(6):398–401.


  30. Fisher JM, Walker RW. A new age approach to an age old problem: using simulation to teach geriatric medicine to medical students. Age Ageing. 2014;43(3):424–8.https://doi.org/10.1093/ageing/aft200.


  31. Flinkman M, Leino-Kilpi H, Numminen O, Jeon Y, Kuokkanen L, Meretoja R. Nurse competence scale: a systematic and psychometric review. J Adv Nurs. 2017;73(5):1035–50.https://doi.org/10.1111/jan.13183.


  32. Foster A, Johnson T, Liu H, Cluver J, Johnson S, Neumann C, et al. Student assessment of psychiatry clinical simulation teaching modules. Med Teach. 2015;37(3):300.https://doi.org/10.3109/0142159X.2014.948834.


  33. French High Authority for Health. 2012. Guide to good practice in health simulation. Eval Improv Pract St Denis Plaine Fr.


  34. Geoffroy PA, Delyon J, Strullu M, Dinh AT, Duboc H, Zafrani L, et al. Standardized patients or conventional lecture for teaching communication skills to undergraduate medical students: a randomized controlled study. Psychiatry Investig. 2020;17(4):299–305.https://doi.org/10.30773/pi.2019.0258.


  35. Gliva-McConvey G, Nicholas CF, Clark L. Comprehensive healthcare simulation: implementing best practices in standardized patient methodology: Springer; 2020.

  36. Grundy SE. The confidence scale: development and psychometric characteristics. Nurse Educ. 1993;18(1):6–9.https://doi.org/10.1097/00006223-199301000-00004.


  37. Harden RM. Trends and the future of postgraduate medical education. Emerg Med J EMJ. 2006;23(10):798–802.https://doi.org/10.1136/emj.2005.033738.


  38. Hew KF, Lo CK. Flipped classroom improves student learning in health professions education: a meta-analysis. BMC Med Educ. 2018;18(1):38.https://doi.org/10.1186/s12909-018-1144-z.


  39. Himmelbauer M,塞茨T,塞德曼C, Loffler-StastkaH. Standardized patients in psychiatry - the best way to learn clinical skills? BMC Med Educ. 2018;18(1):72.https://doi.org/10.1186/s12909-018-1184-4.


  40. Hodges B, Hanson M, McNaughton N, Regehr G. Creating, monitoring, and improving a psychiatry OSCE: a guide for faculty. Acad Psychiatry J Am Assoc Dir Psychiatr Resid Train Assoc Acad Psychiatry. 2002;26(3):134–61.


  41. Kahan M, Wilson L, Midmer D, Borsoi D, Martin D. Randomized controlled trial on the effects of a skills-based workshop on medical students’ management of problem drinking and alcohol dependence. Subst Abuse. 2003;24(1):5–16.https://doi.org/10.1080/08897070309511529.


  42. Kaiser HF. The application of electronic computers to factor analysis. Educ Psychol Meas. 1960;20(1):141–51.https://doi.org/10.1177/001316446002000116.


  43. Kameg K, Howard VM, Mitchell AM, Clochesy J, Suresky JM. The impact of high fidelity human simulation on self-efficacy of communication skills. Issues Ment Health Nurs. 2010;31(5):315–23.https://doi.org/10.3109/01612840903420331.


  44. Karlsen R. Stable Program. Adaptation of the RUS model. Cambridge: Original work from the Center for Medical Simulation (D.R.); 2013.


  45. Kirkpatrick D, Kirkpatrick J. Evaluating training programs: The four levels. San Francisco: Berrett-Koehler Publishers; 2006.

  46. Kitay B, Martin A, Chilton J, Amsalem D, Duvivier R, Goldenberg M. Electroconvulsive therapy: a video-based educational resource using standardized patients. Acad Psychiatry J Am Assoc Dir Psychiatr Resid Train Assoc Acad Psychiatry. 2020;44(5):531–7.


  47. Kolb D. Experiential learning: experience as the source of learning and development. Englewood Cliffs: Prentice-Hall; 1984.


  48. Levett-Jones T, Lapkin S. A systematic review of the effectiveness of simulation debriefing in health professional education. Nurse Educ Today. 2014;34(6):e58–63.https://doi.org/10.1016/j.nedt.2013.09.020.


  49. Mackelprang JL, Karle J, Reihl KM, Cash REG. Suicide intervention skills: graduate training and exposure to suicide among psychology trainees. Train Educ Prof Psychol. 2014;8(2):136–42.


  50. McNaughton N, Ravitz P, Wadell A, Hodges BD. Psychiatric education and simulation: a review of the literature. Can J Psychiatry Rev Can Psychiatr. 2008;53(2):85–93.https://doi.org/10.1177/070674370805300203.


  51. Miller-Cribbs J, Bragg J, Wen F, Jelley M, Coon KA, Hanks H, Howell D, Randall K, Isaacson M, Rodriguez K, Sutton G. An evaluation of a simulation and video-based training program to address adverse childhood experiences. Int J Psychiatry Med. 2020;55(5):366–75.https://doi.org/10.1177/0091217420951064.

  52. Mitchell SM, Taylor NJ, Jahn DR, Roush JF, Brown SL, Ries R, Quinnett P. Suicide-Related Training, Self-Efficacy, and Mental Health Care Providers’ Reactions Toward Suicidal Individuals. Crisis. 2020;41(5):1–8.https://doi.org/10.1027/0227-5910/a000647.

  53. Mitra P, Fluyau D. The current role of medical simulation in psychiatry. In: StatPearls. Treasure Island: StatPearls Publishing; 2021. Accessed 10 Apr 2021.http://www.ncbi.nlm.nih.gov/books/NBK551665/.


  54. Monteiro K, Dumenco L, Collins S, Bratberg J, MacDonnell C, Jacobson A, et al. An interprofessional education workshop to develop health professional student opioid misuse knowledge, attitudes, and skills. J Am Pharm Assoc JAPhA. 2017;57(2S):S113–7.https://doi.org/10.1016/j.japh.2016.12.069.


  55. Neale J. What is the evidence for the use of simulation training to teach communication skills in psychiatry? Evid Based Ment Health. 2019;22(1):23–5.https://doi.org/10.1136/ebmental-2018-300075.

  56. Nelson EC, Eftimovska E, Lind C, Hager A, Wasson JH, Lindblad S. Patient reported outcome measures in practice. BMJ. 2015;350:g7818.


  57. Batterham PJ, Patel S, Calear AL, et al. Predictors of comfort and confidence among medical students in providing care to patients at risk of suicide. Acad Psychiatry J Am Assoc Dir Psychiatr Resid Train Assoc Acad Psychiatry. 2016;40(6):919–22.


  58. Peyre H, Geoffroy PA, Tebeka S, Ceccaldi P-F, Plaisance P. Teaching healthcare students to assess suicide risk with a standardized patient module. Ann Méd-Psychol Rev Psychiatr. 2021;179(1):27–32.


  59. Pham-Dinh C, Laprevote V, Schwan R, Pichene C, Kabuth B, Braun M, Ligier F. Quantifying efficacy of investigation during a simulated psychiatric interview. L'Encephale. 2019.


  60. Pheister M, Stagno S, Cotes R, Prabhakar D, Mahr F, Crowell A, et al. Simulated patients and scenarios to assess and teach psychiatry residents. Acad Psychiatry J Am Assoc Dir Psychiatr Resid Train Assoc Acad Psychiatry. 2017;41(1):114–7.


  61. Piette A, Muchirahondo F, Mangezi W, Iversen A, Cowan F, Dube M, et al. Simulation-based learning in psychiatry for undergraduates at the University of Zimbabwe medical school. BMC Med Educ. 2015;15(1):23.https://doi.org/10.1186/s12909-015-0291-8.


  62. Piot M-A, Dechartres A, Attoe C, Jollant F, Lemogne C, Layat Burn C, et al. Simulation in psychiatry for medical doctors: a systematic review and meta-analysis. Med Educ. 2020;54(8):696–708.https://doi.org/10.1111/medu.14166.


  63. Pisani AR, Cross WF, Watts A, Conner K. Evaluation of the commitment to living (CTL) curriculum: a 3-hour training for mental health professionals to address suicide risk. Crisis. 2012;33(1):30–8.https://doi.org/10.1027/0227-5910/a000099.


  64. Rabheru K, Wiens A, Ramprasad B, Bourgon L, Antochi R, Hamstra SJ. Comparison of traditional didactic seminar to high-fidelity simulation for teaching electroconvulsive therapy technique to psychiatry trainees. J ECT. 2013;29(4):291–6.https://doi.org/10.1097/YCT.0b013e318290f9fb.


  65. Regier DA, Goldberg ID, Taube CA. The de facto US mental health services system: a public health perspective. Arch Gen Psychiatry. 1978;35(6):685–93.https://doi.org/10.1001/archpsyc.1978.01770300027002.


  66. Roh YS, Jang KI. Survey of factors influencing learner engagement with simulation debriefing among nursing students. Nurs Health Sci. 2017;19(4):485–91.https://doi.org/10.1111/nhs.12371.


  67. Rudolph JW, Simon R, Dufresne RL, Raemer DB. There’s no such thing as “nonjudgmental” debriefing: a theory and method for debriefing with good judgment. Simul Healthc J Soc Simul Healthc. 2006;1(1):49–55.https://doi.org/10.1097/01266021-200600110-00006.


  68. Rudolph JW, Simon R, Raemer DB, Eppich WJ. Debriefing as formative assessment: closing performance gaps in medical education. Acad Emerg Med Off J Soc Acad Emerg Med. 2008;15(11):1010–6.https://doi.org/10.1111/j.1553-2712.2008.00248.x.


  69. Simon R, Raemer DB, Rudolph JW. Debriefing Assessment for Simulation in Healthcare (DASH)© – Student Version, Long Form. Center for Medical Simulation, Boston, Massachusetts. 2010.https://harvardmedsim.org/wp-content/uploads/2017/01/DASH.SV.Long.2010.Final.pdf.

  70. Szpak JL, Kameg KM. Simulation decreases nursing student anxiety prior to communication with mentally ill patients. Clin Simul Nurs. 2013;9(1):e13–9.https://doi.org/10.1016/j.ecns.2011.07.003.


  71. Vandyk AD, Lalonde M, Merali S, Wright E, Bajnok I, Davies B. The use of psychiatry-focused simulation in undergraduate nursing education: a systematic search and review. Int J Ment Health Nurs. 2018;27(2):514–35.https://doi.org/10.1111/inm.12419.


  72. Wand A, Maheshwari R, Holton M. Mindful of the gaps: enhancing psychiatry training through a trainee workshop. Australas Psychiatry Bull R Aust N Z Coll Psychiatr. 2012;20(3):231–6.


  73. Way R. Assessing clinical competence. Emerg Nurse J RCN Accid Emerg Nurs Assoc. 2002;9(9):30–4.


  74. Williams JC, Balasuriya L, Alexander-Bloch A, Qayyum Z. Comparing the effectiveness of a guide booklet to simulation-based training for Management of Acute Agitation. Psychiatr Q. 2019;90(4):861–9.https://doi.org/10.1007/s11126-019-09670-z.


  75. Wind LA, Van Dalen J, Muijtjens AMM, Rethans J-J. Assessing simulated patients in an educational setting: the MaSP (Maastricht assessment of simulated patients). Med Educ. 2004;38(1):39–44.https://doi.org/10.1111/j.1365-2923.2004.01686.x.



Acknowledgements

We thank Franck Abitbol, a simulation technician, and Valéria Martinez, president of the Simulation Committee of the University of Versailles Saint-Quentin-En-Yvelines.

Funding

This research received no specific grant from any funding agency in the public, commercial, or not-for-profit sectors. We thank the Centre Hospitalier de Versailles for editorial assistance.

Author information

Authors and Affiliations

Authors

Contributions

PR designed the study and performed the data analysis. NY and PR drafted the manuscript. NY, ALD, MR, PS, FH, FU, NG, MS, CP, and PR made critical revisions and edited the manuscript. All authors contributed to and approved the final manuscript.

Corresponding author

Correspondence to Nadia Younes.

Ethics declarations

Ethics approval and consent to participate

The research was authorized on 12/20/2019 by the Ethics Committee of the University of Paris-Saclay under the reference CER-Paris-Saclay-2019-061. All participants signed a written informed consent form before being included in the study.

All methods were carried out in accordance with relevant national/international/institutional guidelines and regulations.

Consent for publication

Not applicable.

Competing interests

The authors have no conflict of interest to declare.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Younes, N., Delaunay, A., Roger, M. et al. Evaluating the effectiveness of a single-day simulation-based program in psychiatry for medical students: a controlled study. BMC Med Educ 21, 348 (2021). https://doi.org/10.1186/s12909-021-02708-6


  • Received:

  • Accepted:

  • Published:

  • DOI:https://doi.org/10.1186/s12909-021-02708-6

Keywords

  • Simulation
  • Simulated patient
  • Psychiatry
  • Psychometrics
  • Confidence
  • Clinical skills
  • Internal consistency
  • Factor analysis
  • Test-retest reliability