Effectiveness versus efficiency in a medical skills laboratory


Effectiveness versus efficiency in a medical skills laboratory

Dan Sebastian Dîrzu1, Sanda Maria Copotoiu2

1 University of Medicine and Pharmacy Iuliu Haţieganu, Anaesthesia and Intensive Care Department, Cluj-Napoca, Romania
2 University of Medicine and Pharmacy, Anaesthesia and Intensive Care Department, Târgu Mureş, Romania

Abstract

Medical educators are facing the new challenge of using medical simulation for teaching purposes. The use of simulators seems attractive for both trainers and trainees, but the prices of simulators may be prohibitive. In an era of limited resources it is mandatory, when using a tool as expensive as simulation, to prove its benefits. Although simulation provides the opportunity for training, additional advantages are far from established. The supposed benefits of using medical simulators in teaching and examination need to be proven with regard to two aspects: effectiveness and efficiency.

Key words: medical education, clinical skills, cost effectiveness, training

Rom J Anaesth Int Care 2015; 22: 35-39

Introduction

Contemporary medical education covers a broad teaching spectrum, starting with knowledge acquisition and culminating in complex decision making. Regardless of the invasiveness of the speciality, developing practical skills is the next step after knowledge acquisition. From internal medicine physicians who need to master auscultation with a stethoscope, to surgeons who use sophisticated techniques, all medical specialities require a Medical Skills Laboratory to train novices. Using simulation in medical teaching has already been assumed to be an ethical imperative, simulation being accepted as both an education and an evaluation standard [1]. The advantages of using a Medical Skills Laboratory have been extensively studied and published in the contemporary medical literature [1-4], but no clear distinction between effectiveness and efficiency has been made.

Address for correspondence:

Dan Dîrzu, MD
ICU Department, SCJU Cluj-Napoca
Str. Clinicilor 3-5
Cluj-Napoca, Romania
E-mail: [email protected]

Effectiveness and efficiency are two terms frequently used interchangeably. The two terms are different: while effectiveness is defined as producing a result that is wanted, having an intended effect [5], efficiency is the ability to do or produce something without wasting materials, time, or energy [6]. The authors of this paper consider that a clear distinction between effectiveness and efficiency needs to be made. Studies of teaching activities are required to answer questions about both effectiveness and efficiency, addressing the need for quality training and the proper use of limited resources.

Method

The authors of this review searched the EMBASE and PubMed databases using the terms: medical skill laboratory, efficiency, effectiveness, medical training, and medical simulation. Relevant papers published between 1999 and 2015 were selected and reviewed.

Results and Discussion

Studies on medical teaching are conducted and published by an increasing number of researchers. Because simulation is a new acquisition in the medical trainer's arsenal, its proper place still needs to be established. It also carries the burden of being an expensive tool, so to justify its use we need to prove that it really works.


From the point of view of this paper, we distinguished two different questions that need to be answered. The first concerns proof that using simulation produces the desired results. The second concerns the price of using simulation. The price needs to be analysed in a wide sense, including not only the costs but also the time, manpower, instructor training and all other resources needed to make a simulation facility work. The price must be shown to be worthwhile, and it should be possible to optimise it by appropriate measures. Unfortunately, few papers are structured from this point of view, and a systematisation seems impossible.

It is obvious that there is an interest in studying both aspects, and we quote as an example this phrase from the conclusions of a recent study: "Laboratory training can effectively improve residents' ability to perform anastomoses, which may result in increased efficiency of teaching in the operating room" [7]. The authors demonstrate that using simulation increased the confidence level of twelve of the fourteen residents included in the study group. The time necessary to perform a surgical procedure was also decreased by a few minutes with simulation use. The authors conclude that increased effectiveness may increase efficiency, but they use the terms without distinguishing between them.

The main advantage of using simulation is that it offers the opportunity to practice a skill in an environment where no harm can be done to a patient [8, 9]. Today, the general perception is that skill acquisition is better when simulation is used as a training tool compared with standard training [10]. In spite of this, a recent study centred on the acquisition of an examination skill, colonoscopy, showed that the use of simulation alone for training is inferior to classic, patient-centred training [11]. Virtual reality did not improve the surgical skills of novices trained to perform laparoscopic appendectomy on pigs [12]. Three years later, different authors concluded that using virtual reality improved surgical skills in different laparoscopic manoeuvres [13]. These two studies used different virtual reality simulators, and no study comparing the two models was found in the literature. Different training approaches were also used. Reviewing these papers, we are forced to admit that we are far from drawing definitive conclusions regarding simulation effectiveness in the acquisition of new skills. We can predict, based on these study designs and their conclusions, that if simulation is found useful, the number of repetitions needed for maximum skill should be standardised for every procedure. We can go even further by admitting that students are different, and that perhaps we should adopt different training approaches for students with different personalities.

While it seems obvious that a single exposure to simulation is beneficial in the training of nontechnical skills, it is not clear whether an increased number of repetitions brings supplementary benefits [14]. It is not clear whether we can set up universal teaching "recipes", as every trainee is a unique entity with a unique training path, a different background and a different personality. We need to admit that studies trying to establish a "one size fits all" way of teaching may be doomed to failure.

When compared with problem-based learning, simulation was found to be superior when the evaluation was performed on simulated patients [15]. It is not yet clear whether this better performance in a simulated environment also means better performance in clinical practice. We may expect that science will show that, in a certain context, simulation may be used alongside clinical practice for best results [9].

Trainee satisfaction when trained with simulation has been used as an undisputed argument for simulation use in the training of both technical and nontechnical skills [4]. The stage fright of performing a manoeuvre on a human subject for the first time is considerably reduced by prior simulation use. One research team showed that confidence increased from 23.4 to 70.3, on a scale from 0 to 100, in nephrologists trained to perform kidney biopsy on a simulator [16]. The same team showed a reduction in post-biopsy bleeding in the simulation training group, with major implications for patient safety. While studying errors committed by anaesthesia nurses, another team demonstrated that nurses trained without simulation made more frequent and more dangerous errors [17].

Skill retention over time is another subject frequently approached in papers on simulation use in medical training. The authors of this review strongly believe that a newly acquired skill will improve or degrade regardless of how it was first acquired. The frequency with which the newly acquired skill is used in current practice seems to have a major impact: "use it or lose it", as they say when you are trained in a new manoeuvre. When we are talking about the basic skills of every medical speciality, this is easy to accept. Training anaesthesia residents to perform tracheal intubation in the skills laboratory is one example: after the first basic training they will perform this skill daily until it becomes automatic. The real problem awaiting an answer concerns the skills that we may not perform for years but which, when needed, have a major impact on patient safety, such as emergency cricothyroidotomy. Science will need to answer questions about the length of skill retention, the required frequency of training repetition and the extent of further training. A good research question may be whether it is efficient to train the entire practitioner group to develop this kind of skill, or whether more efficient solutions may be found, such as telemedicine. While waiting for these answers, we are forced to admit the obvious: only simulation can offer training for these rare situations.


Even without proven efficiency, simulation is the only real alternative we have for training in such situations. The feedback from trainees is encouraging: a recent paper evaluating a training program for board recertification, the Maintenance of Certification in Anesthesiology (MOCA) exam, showed that students attending these courses were very pleased with the airway module focused on tracheostomy catastrophes, which they considered relevant and helpful [18].

A recent study showed better skill retention at three and six months when a training method using simulation was compared with a classical "see one, do one, teach one" approach [2]. Reading that paper, we have to admit that it is the educational program including simulation that may be considered superior, and that it would be wrong to conclude that "simulation is better" than the classical approach. We think that the reader should see the simulation used in this research as a simple teaching tool, not as the only variable that makes the difference in the equation.

Transferability is probably the most important feature of simulation that needs to be proven. Transfer from simulation to reality, when we talk about training, and from reality to simulation, when we talk about examination, are both worth studying. In bronchoscopy training, using a simulator to help novices acquire this complex skill improved outcomes for total procedure time, bronchoscopy quality score, qualitative assessment by a blinded bronchoscopy nurse, and the amount of meperidine used [19]. The same study showed that experts could be differentiated from novices when a simulator was used to evaluate the trainee. However, it was not shown how fine-grained this differentiation was, so we cannot yet define additional levels of competency. Even though multiple repetitions did not improve the skills, the aptitude of surgeons to perform a laparoscopic appendectomy was correctly predicted when they were evaluated in virtual reality [12].

Although we practice evidence-based medicine, daily medical practice involves different styles and preferences. This part must be left out of the examination process, because the examination needs to address generally accepted goals. At the same time, actions from protocols supported by evidence-based arguments need to be measured and corrected when they are performed incorrectly [20]. Because simulation is increasingly used as an examination tool for candidates from the same workplace, we consider that the relevance of this approach needs to be supported by solid scientific evidence. Using these tools empirically may have ethical and legal implications.

We give the example of the American Board of Anesthesiology, which instituted a new MOCA Part IV activity requiring certified physicians to attend and self-reflect on a simulation-based course in an American Society of Anesthesiologists-endorsed program [18]. Even though attending a simulation-based course is mandatory, the evaluation process does not contain a simulation exam. One criticism of using complex simulation scenarios in the examination process is that the examinee may demonstrate hypervigilance, anticipating that something will go wrong, which is not always the case in clinical practice [21]. Some authors even acknowledge that simulation may generate a Hawthorne effect, changing the natural reaction of the examined subject [22]. We fail to see why this effect should be more pronounced in simulation-based examination than in any other kind of examination. What is obvious is that simulation is useful for evaluation and is better than other methods when automatic behaviour is evaluated. This makes simulation a good tool for evaluating the routine use of generally accepted standards and protocols [22]. Standardised patients have been used and evaluated, with encouraging results, for mock oral board exams in neurology [23]. For certainty of results, it is still recommended that simulation not be used as the sole evaluation tool, but rather combined with classical examination methods.

The efficiency of simulation use in medical teaching has started to attract researchers, but publications are still few. Probably one of the most important aspects of the efficiency of a medical skills laboratory is the number of students who effectively practice a given skill per unit of time. The "simple" skill of peripheral vascular access, a basic skill in which students need to be trained, is one example. If the anaesthesia and intensive care lecturers of the University of Medicine and Pharmacy "Iuliu Haţieganu" Cluj-Napoca are asked to teach this skill to third-year students, they can do so on the patients they are treating in the Surgery 1 Clinic and the Intensive Care Department of the Emergency County Hospital Cluj, or in the skills laboratory. For these patients an average of 450 peripheral venous catheters are used every month, a percentage of which are for failed procedures. There is no set schedule for these procedures; they are performed on an "as required" basis. So if we want to offer training to students using real patients, they must spend a whole month in the hospital waiting for a procedure to become available, with an instructor available at exactly the required time. This, of course, assumes that the patient agrees that the student can perform the procedure on them for the first time.


When using a skills laboratory, one group of ten students requires one hour and one instructor to practice the manoeuvre as often as the instructor considers appropriate, until they fully master the skill (a simple back-of-the-envelope comparison of the two settings is sketched below). In the studied literature, the authors of this article found no description of the training capacities of different teaching units.

Few studies report the costs of simulation use. The general perception is that simulation is expensive, and this may well be correct. A cost analysis of competency evaluation showed that the cost of a single evaluation may vary between 6,000 and 10,000 USD, whether the participant practices within the standards of care or in a simulated environment [24]. No one can estimate how much the lack of training costs. However, we would like to point out that any cost of simulation training may easily be matched by a single lost malpractice claim generated by the lack of training.

The competency level of the trainer using simulation is another theme with few reports. A single study was found comparing training offered by lay persons and by health care providers on the use of an automated external defibrillator [25]. The study found no difference between the groups, but unfortunately there was no information about the medical or teaching expertise of the trainers. The authors also did not evaluate the trainees' knowledge before training, so definitive conclusions cannot be drawn. Other authors evaluated technical skill training delivered by senior students to junior students and reported good results; yet they did not compare these results with those of the same training offered by trainers with superior expertise [26]. We consider that these results, although encouraging, cannot be used as arguments for using lay persons as trainers, but studies on these aspects are still worth conducting. Solid proof on this question might allow training facilities to hire trainers with lower competency levels, who are more available and more affordable.

The time required for skill acquisition, the number of repetitions and the optimum interval between repetitions are all elements which need solid arguments and should attract researchers. One group of researchers showed that a wider interval between training sessions is more efficient than a shorter one, and also demonstrated that this interval matters more when the trained skills are more complex [27]. This kind of information could drive major changes in the way universities design postgraduate teaching programmes, for instance; it is well known that today's standard is to organise training over several consecutive hours and days.
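To make the throughput argument above concrete, the following short Python sketch compares the two training settings described for peripheral venous cannulation. The figures taken from the text are the roughly 450 catheters placed per month on the wards and the ten students trained per instructor-hour in the laboratory; the fraction of ward procedures that could realistically be supervised student attempts and the instructor availability are purely illustrative assumptions, not measured data.

# Back-of-the-envelope comparison of training throughput for peripheral
# venous cannulation: opportunistic ward-based teaching vs. a skills lab.
# Values marked "from the text" come from the article; everything else
# (supervised fraction, instructor hours) is an illustrative assumption.

def ward_throughput(procedures_per_month: float,
                    supervised_fraction: float,
                    instructor_hours_per_month: float) -> float:
    """Supervised student attempts per instructor-hour on real patients."""
    return (procedures_per_month * supervised_fraction) / instructor_hours_per_month

def lab_throughput(students_per_group: int, session_hours: float) -> float:
    """Students trained per instructor-hour in the skills laboratory."""
    return students_per_group / session_hours

if __name__ == "__main__":
    ward = ward_throughput(procedures_per_month=450,        # from the text
                           supervised_fraction=0.10,        # assumption
                           instructor_hours_per_month=160)  # assumption
    lab = lab_throughput(students_per_group=10, session_hours=1.0)  # from the text
    print(f"Ward-based teaching: {ward:.2f} supervised attempts per instructor-hour")
    print(f"Skills laboratory:   {lab:.2f} students trained per instructor-hour")

Even under generous assumptions for the ward setting, the laboratory trains far more students per instructor-hour; the sketch only illustrates the bookkeeping behind the argument, not any measured efficiency.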

Simulation is increasingly being used as a valuable teaching tool in medical education programs. Medical universities around the world either wish for, or already have, their own simulation department within their educational programs. Important worldwide organisations such as the American Society of Anesthesiologists offer certification for these simulation centres. There is no doubt that using simulation to gain a minimum competency level before performing the first manoeuvre on a human subject is beneficial. Perhaps society is ready to pay any price to improve patient safety. Studies on the effectiveness of simulation are now being published or are ongoing, but they rarely provide solid support for simulation use in medical education. Questions about efficiency are being raised, and in the future we should find out how to obtain the maximum results from our simulation departments.

Conflict of interest

Nothing to declare

References

1. Ziv A, Wolpe PR, Small SD, Glick S. Simulation-based medical education: an ethical imperative. Simul Healthc 2006; 1: 252-256
2. Herrmann-Werner A, Nikendei C, Keifenheim K, Bosse HM, Lund F, Wagner R, et al. "Best practice" skills lab training vs. a "see one, do one" approach in undergraduate medical education: an RCT on students' long-term ability to perform procedural clinical skills. PLoS One 2013; 8: e76354
3. Spruit EN, Band GPH, Hamming JF, Ridderinkhof KR. Optimal training design for procedural motor skills: a review and application to laparoscopic surgery. Psychol Res 2014; 78: 878-891
4. Weller JM. Simulation in undergraduate medical education: bridging the gap between theory and practice. Med Educ 2004; 38: 32-38
5. Effectiveness – Definition and More from the Free Merriam-Webster Dictionary [Internet]. Available from: http://www.merriam-webster.com/dictionary/effectiveness
6. Efficiency – Definition and More from the Free Merriam-Webster Dictionary [Internet]. Available from: http://www.merriam-webster.com/dictionary/efficiency
7. Egle JP, Malladi SVS, Gopinath N, Mittal VK. Simulation training improves resident performance in hand-sewn vascular and bowel anastomoses. J Surg Educ 2015; 72: 291-296
8. Michelson JD, Manning L. Competency assessment in simulation-based procedural education. Am J Surg 2008; 196: 609-615
9. Kneebone RL, Scott W, Darzi A, Horrocks M. Simulation and clinical practice: strengthening the relationship. Med Educ 2004; 38: 1095-1102
10. Lynagh M, Burton R, Sanson-Fisher R. A systematic review of medical skills laboratory training: where to from here? Med Educ 2007; 41: 879-887
11. Gerson LB, Van Dam J. A prospective randomized trial comparing a virtual reality simulator to bedside teaching for training in sigmoidoscopy. Endoscopy 2003; 35: 569-575
12. Ahlberg G, Heikkinen T, Iselius L, Leijonmarck CE, Rutqvist J, Arvidsson D. Does training in a virtual reality simulator improve surgical performance? Surg Endosc 2002; 16: 126-129
13. Youngblood PL, Srivastava S, Curet M, Heinrichs WL, Dev P, Wren SM. Comparison of training on two laparoscopic simulators and assessment of skills transfer to surgical performance. J Am Coll Surg 2005; 200: 546-551
14. Yee B, Naik VN, Joo HS, Savoldelli GL, Chung DY, Houston PL, et al. Nontechnical skills in anesthesia crisis management with repeated exposure to simulation-based education. Anesthesiology 2005; 103: 241-248
15. Steadman RH, Coates WC, Huang YM, Matevosian R, Larmon BR, McCullough L, et al. Simulation-based training is superior to problem-based learning for the acquisition of critical assessment and management skills. Crit Care Med 2006; 34: 151-157
16. Dawoud D, Lyndon W, Mrug S, Bissler JJ, Mrug M. Impact of ultrasound-guided kidney biopsy simulation on trainee confidence and biopsy outcomes. Am J Nephrol 2012; 36: 570-574
17. Weller JM, Merry AF, Robinson BJ, Warman GR, Janssen A. The impact of trained assistance on error rates in anaesthesia: a simulation-based randomised controlled trial. Anaesthesia 2009; 64: 126-130
18. Levine AI, Flynn BC, Bryson EO, Demaria S Jr. Simulation-based Maintenance of Certification in Anesthesiology (MOCA) course optimization: use of multi-modality educational activities. J Clin Anesth 2012; 24: 68-74
19. Ost D, DeRosiers A, Britt EJ, Fein AM, Lesser ML, Mehta AC. Assessment of a bronchoscopy simulator. Am J Respir Crit Care Med 2001; 164: 2248-2255
20. Rosen MA, Salas E, Silvestri S, Wu TS, Lazzara EH. A measurement tool for simulation-based training in emergency medicine: the simulation module for assessment of resident targeted event responses (SMARTER) approach. Simul Healthc 2008; 3: 170-179
21. Perkins GD. Simulation in resuscitation training. Resuscitation 2007; 73: 202-211
22. Zausig YA, Bayer Y, Hacke N, Sinner B, Zink W, Grube C, et al. Simulation as an additional tool for investigating the performance of standard operating procedures in anaesthesia. Br J Anaesth 2007; 99: 673-678
23. Kissela B, Harris S, Kleindorfer D, Lindsell C, Pascuzzi R, Woo D, et al. The use of standardized patients for mock oral board exams in neurology: a pilot study. BMC Med Educ 2006; 6: 22
24. Levine AI, Bryson EO. The use of multimodality simulation in the evaluation of physicians with suspected lapsed competence. J Crit Care 2008; 23: 197-202
25. Castrén M, Nurmi J, Laakso JP, Kinnunen A, Backman R, Niemi-Murola L. Teaching public access defibrillation to lay volunteers – a professional health care provider is not a more effective instructor than a trained lay person. Resuscitation 2004; 63: 305-310
26. Weyrich P, Schrauth M, Kraus B, Habermehl D, Netzhammer N, Zipfel S, et al. Undergraduate technical skills training guided by student tutors – analysis of tutors' attitudes, tutees' acceptance and learning progress in an innovative teaching model. BMC Med Educ 2008; 8: 18
27. Spruit EN, Band GPH, Hamming JF. Increasing efficiency of surgical training: effects of spacing practice on skill acquisition and retention in laparoscopy training. Surg Endosc 2014; [Epub ahead of print] DOI 10.1007/s00464-014-3931-x

Effectiveness versus efficiency in the practical skills laboratory (Romanian abstract)

Summary

Medical educators must face a new challenge: the use of medical simulators for educational purposes. The use of medical simulators seems attractive both for the student and for the trainer, but the prices of simulators may be prohibitive. In an era of limited resources, it is mandatory that the use of such expensive tools demonstrates its benefits. Although simulation offers the opportunity for training, the additional advantages are far from established. The supposed benefits of using medical simulators need to be demonstrated with regard to two aspects: effectiveness and efficiency.

Key words: medical education, clinical skills, cost-effectiveness, training

