Education

Assessing practical skills in obstetrics and gynaecology: educational issues and practical implications

The Obstetrician & Gynaecologist

10.1576/toag.8.2.107.27230 www.rcog.org.uk/togonline

2006;8:107–112

Authors: Dina L Bisson / Judith P Hyde / Jane E Mears

Shorter training programmes and a reduction in working hours have caused concern about the competency levels of trainees completing training programmes in all specialties. Competency based assessment has been introduced into foundation programmes and there is an urgent need to develop similar assessment tools for specialist training programmes. This article describes the current situation in obstetrics and gynaecology and the challenges ahead for the trainers who will be using the tools that are being developed. All doctors involved in obstetrics and gynaecology training need to embrace this important initiative and, as a specialty, we need to ensure that the tools introduced are reliable, valid and easy to use.

Keywords: assessment / competency / OSATS / surgical skills / training

Please cite this article as: Bisson DL, Hyde JP, Mears JE. Assessing practical skills in obstetrics and gynaecology: educational issues and practical implications. The Obstetrician & Gynaecologist 2006;8:107–112.

Author details Dina L Bisson FRCOG Consultant Obstetrician and Gynaecologist Department of Women’s Health, Southmead Hospital, Bristol, BS10 5NB, UK E-mail: [email protected] (corresponding author)

Judith P Hyde MRCOG Consultant Obstetrician and Gynaecologist Gloucestershire Royal Hospital, Great Western Road, Gloucester, GL1 3NN, UK


Jane E Mears MRCOG Specialist Registrar in Education and Obstetrics and Gynaecology St Michael’s Hospital, Southwell Street, Bristol, BS2 8EG, UK


Introduction

Historically, postgraduate education, particularly in the surgical specialties, has been based solidly on apprenticeship. This method has produced excellent clinicians and surgeons whose attainment of clinical skills has been based on long hours and long training programmes. The introduction of the European Working Time Directive (EWTD) has seen a significant reduction in junior doctors' hours, which has resulted in a reduction in the number of procedures performed by trainees. This is documented in the RCOG and Royal College of Surgeons (RCS) trainee surveys.1,2 It has limited the effectiveness of a purely apprenticeship approach to training in surgical specialties, including obstetrics and gynaecology. The problem has been further fuelled by financial and political drives to reduce waiting lists, thus limiting training opportunities.3

The Modernising Medical Careers (MMC)4 project is set to shorten specialist training programmes, which has led to concerns that some doctors may reach the end of their training without all the skills required to practise as a specialist. Coupled with this is increasing pressure from the public and media for robust evidence of individual clinicians' abilities, to ensure care of the highest quality and to minimise potential errors. Some of the methods of assessment currently in use have poor validity and reliability ('see one, do one, teach one', for example), making it difficult to justify their use in the objective feedback required in appraisal. These factors illustrate the importance of developing tools for the assessment of clinical skills that are well validated, robust and fit for their purpose.

Work based assessment

Training in all specialties is evolving and there are moves towards competency based outcomes rather than those based solely on the length of the training. This allows maximum flexibility, so that doctors can progress through training at their own pace. The curriculum for the foundation years was recently published by the Department of Health. It describes the core competencies to be achieved during the programme and the methods of assessment to be used, such as mini clinical evaluation exercises (mini CEX), direct observation of procedural skills (DOPS), mini PAT (mini peer assessment tool, previously known as 360° assessment) and case-based discussions.5 These competencies are generic and are based on the principles described by the General Medical Council in Good Medical Practice.6 In August 2006 the first F2 trainees will undertake the women's health module and in the following year they will enter specialist training programmes.

The RCOG is preparing for these changes with the development of a core curriculum which defines the knowledge, skills and attitudes trainees are expected to acquire for obstetrics and gynaecology. Currently, all trainees complete mandatory basic and core log books before progressing to the next stage of training. All trainees and trainers are aware that this method of assessment is, at best, very crude. Every box in the log book requires a signature and experience has shown that most signatures are not based on direct observation of the trainee but on an assumption of competence based on the length of time in training and hearsay (for example, 'Have you done a forceps yet?'…'Yes, I did one last week'…'Oh, that's good, I can sign this box then'). Clearly, this is no longer robust enough to assume competence for all trainees in all areas.7 The RCOG is currently developing a new core log book with more clearly defined competencies expected at each stage of training. The development of new tools for the assessment of these competencies and technical skills is the current challenge for trainers in obstetrics and gynaecology.

The National Health Service Litigation Authority (NHSLA) has also highlighted the need for assessment and training. The assessment of the competencies and technical skills of all junior medical staff has now become a maternity standard for the Clinical Negligence Scheme for Trusts (CNST), requiring 100% demonstrable compliance to achieve level two.8 The large financial impact of obtaining increasingly higher CNST levels has meant that this has also become an NHS Trust priority. The competencies need to be defined for each specialty and this is currently taking place in an ad hoc fashion around the UK, using differing criteria without structure.

Clearly, there is an urgent need to define national core competencies for obstetrics and gynaecology trainees at each level of training, in addition to validated methods of objective assessment that are fit for their purpose. There are many assessment tools available. However, before introducing them and basing assessment of competency in practical skills, and therefore career progression, on them, they require assessment themselves to confirm that they are discerning, accurate and reproducible. It is also vitally important that the limitations of these methods as assessment tools are appreciated.

Methods of assessment of technical skill

Assessment of technical competencies involves comparing an individual's performance against a criterion: an absolute standard that must be clearly defined. It is important to realise that training and assessment are complementary: without assessment, any

deficiencies in the individual or the training system could be ignored. Any assessment or examination has to be valid and reliable: the desired skill is tested and the test is reproducible. A test must also be feasible or practical. The validity of an assessment can be looked at in different ways; Box 1 lists the different forms of validity that should be considered.9,10 The most commonly described one in surgical skills assessment is construct validity11 and this is often inferred from the ability of a test to distinguish between trainees of different levels. However, it is important to remember that this is only an inference and assumes that senior surgeons will be more skilled than juniors. The most accurate way of assessing the validity of a surgical assessment may be patient outcomes or complications, although these may be unreliable12 and the practical implications make this unlikely to be achieved.

Objective Structured Assessment of Technical Skill (OSATS)

The OSATS assessment tool was developed at the University of Toronto by Martin et al.13 in 1997 to evaluate surgical residents. It followed the widespread and successful introduction of the OSCE (Objective Structured Clinical Examination). The assessment tools tested were:

• an operation-specific checklist
• a seven-item global rating scale, with a five-point scale with descriptive anchors at points one, three and five
• a pass/fail judgement.

Twenty surgical residents were observed performing tasks at six stations, ranging from abdominal wall closure to stapled bowel anastomosis. Tasks were tested on both live anaesthetised animals and bench models; equivalent results were shown for both. The content validity was established in the development phase and the construct validity by the discrimination of residency level. Models were used to give the assessment face validity (Box 1). The global rating scale had promising reliability in both the live and bench settings (Cronbach's alpha 0.66 and 0.74, respectively) and the checklist performed less well in the bench setting compared with the live (Cronbach's alpha 0.33 and 0.61, respectively). The inter-rater reliability was 0.64–0.72, with the global rating scale performing better than the checklist. It is recognised that a reliability below 0.5 indicates an imprecise measurement tool, 0.5–0.8 is moderately reliable and 0.8 or above is ideal, especially if the results of the assessment will form part of a 'high stakes' examination. The

Box 1: Assessment considerations

Forms of validity
• Construct: Does it measure the trait that it is designed to measure?
• Content: Does it cover the syllabus fully?
• Concurrent: Do the results compare favourably with the current gold standard?
• Face: Does it resemble a real-life situation?
• Predictive: Does it predict future performance?

Reliability
• Inter-rater: Is it consistent between examiners?
• Test–retest: Is it robust if repeated by the same student?

Practicalities
• Cost: Financial and time costs of performing the examination
• Feasibility: Equipment and resources required
• Acceptability: Is the assessment valued by the examiners and students?

pass/fail judgement was not found to be discriminatory. The calculated costs were £270 per candidate for the live setting and £70 for the bench model, plus 18 man-hours per six candidates. This pioneering study highlights the importance of evaluating tools before introduction into the curriculum and demonstrates the feasibility of this form of assessment. Further work has shown improved inter-station reliability and discrimination between senior residents.14,15

Applications of OSATS in obstetrics and gynaecology

This early work in the field of general surgery led to an increasing interest in the application of OSATS in obstetrics and gynaecology. Barbara Goff at the University of Washington in Seattle first looked at the use of OSATS assessment in live animal models in 2000.16 This study of 24 residents on a seven-station assessment, including laparoscopic and open laparotomy procedures, evaluated a task-specific checklist, a global rating scale and a pass/fail judgement. The results showed that the global rating scale was the best discriminator of residency level, thus displaying construct validity; the checklist also showed construct validity, but the pass/fail judgement was not discriminatory. The internal consistency was high for both the checklist and the global rating scale (0.84–0.97). There were only four tasks where the inter-rater reliability was assessed and this was high for both the checklist and global rating scale (0.78–0.98). The total cost was significant, calculated at £3245; this did not include faculty time, which was estimated at five hours per member for six students.
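To make the internal-consistency figures quoted in these studies concrete, the following is a minimal sketch, not taken from any of the cited papers, of how Cronbach's alpha is computed from a matrix of rating-scale scores. The ratings shown are invented purely for illustration.

```python
from statistics import pvariance

def cronbach_alpha(scores):
    """Cronbach's alpha for a set of candidates' item scores.

    `scores` is a list of rows, one per candidate; each row holds that
    candidate's score on each item of the rating scale.
    """
    k = len(scores[0])                          # number of items in the scale
    items = list(zip(*scores))                  # transpose: one tuple per item
    item_vars = sum(pvariance(item) for item in items)
    total_var = pvariance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Invented example: 5 candidates rated on 4 items of a five-point scale
ratings = [
    [3, 4, 3, 4],
    [2, 2, 3, 2],
    [5, 4, 5, 5],
    [4, 4, 4, 3],
    [1, 2, 2, 1],
]
alpha = cronbach_alpha(ratings)  # close to 1.0 because these items agree closely
```

An alpha near the 0.84–0.97 range reported by Goff et al. indicates that the items of a checklist or global rating scale rise and fall together across candidates, i.e. they are measuring a common underlying skill.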
Other studies in the same centre showed that the use of a bench model to test skills has construct validity and is feasible, at a much reduced cost.17,18 The question of examiner bias was addressed by the use of 'blinded' faculty members and by using non-identifying videos.19,20

Simulation versus real life

The majority of work that looks at objective assessments of surgical skill has used simulation of


some description. The benefits of simulation are evident: most importantly, it provides a reproducible tool for assessment. It enables the assessment to be distanced from the operative setting and reduces intrusion into operating lists, also allowing for simultaneous examination of students, thus reducing time costs. It is also possible to 'blind' examiners by using examiners from outside a unit. It has been shown that it is feasible to deliver assessment developed centrally at remote sites in a reproducible fashion.18,21 (Box 2)

Box 2: The use of simulation in surgical skills assessment

Advantages
• Reproducible
• Distanced from clinical environment
• Blinded examiners possible
• Patient safety not compromised

Disadvantages
• Skills-only test
• Time costs
• Financial costs
• Artificial situation

The development of simulators has been rapid and ranges from low-fidelity model-based simulators to virtual reality simulators with haptic feedback.22 The improvements allow for more realistic scenarios and should improve the face validity. There is continued debate about whether a good performance on a simulated exercise equates with real-life clinical skill. A small study23 of surgical trainees using ADEPT (Advanced Dundee Endoscopic Psychomotor Testing) showed good correlation with independent clinical assessment (concurrent validity).

The use of real-life settings for assessment of surgical skills has obvious problems: potential for variation in patient anatomy, cancellations to lists, unexpected findings or complications and the inability to examine more than one trainee at any session. It is not possible to 'blind' the examiner unless videos of procedures are

used. The problem of a lack of suitable cases and cancellations of patients led to 8 out of 26 participants being unable to complete the assessment in a study by Coleman et al.24 It has also been shown that, because of preceptor variation in the live setting, it may require up to 12 preceptor ratings to achieve an acceptable level of reliability (0.8).25

The current RCOG situation

The RCOG Specialist Training Committee has advised that an experienced obstetrician should be resident in a maternity unit 24 hours a day and that such an individual should have mastered a range of appropriate skills. These have been outlined in the document The European Working Time Directive and Maternity Services,26 but the method of assessment of these skills has not yet been developed. The new core log book currently under development will further define these skills. In May 2005 a small RCOG working party further defined six core competencies and it is currently piloting a method of assessment of these competencies. The six core competencies are:

• caesarean section
• operative vaginal delivery
• perineal repair
• manual removal of placenta
• evacuation of the uterus
• diagnostic laparoscopy.

It is acknowledged that, to date, OSATS have only been used to assess technical skills in bench and animal models and they have not been extended to assessment of clinical competence. It is clear that it

Figure 1: Procedure-specific checklist for caesarean section (image not reproduced)


is much more difficult to decide whether to perform a rotational instrumental delivery or an emergency caesarean section than it is actually to perform either procedure, and the assessment of this particular skill remains an additional challenge. Attempts are being made to ensure that the assessment tools currently being piloted assess a broader range of skills than simply technical skills. Operative vaginal delivery involves interaction with the mother, appropriate use of other team members and accurate record keeping; this will be included in the assessment process. The assessment forms currently under pilot comprise two components: a procedure-specific checklist and a generic assessment (similar to the global assessment previously described for OSATS). The procedure-specific checklist allows each component to be assessed as having been performed independently or with help (Figure 1).


The generic skills assessment is identical for all procedures and includes general surgical ability, knowledge of instruments and respect for tissue, in addition to insight, attitude and documentation of the procedure, assessed on a three-point scale (Figure 2). Both aspects are used by the assessor to confirm competence to perform the procedure without the need for supervision. An additional component of the pilot is to obtain the views of the trainers and trainees. After each assessment both parties are asked to complete a feedback form, which includes the time taken to undertake the assessment. It is hoped that this pilot will result in the development of a valid, reliable and easy to use assessment tool of competency in practical skills for trainees in obstetrics and gynaecology. However, this is only the beginning: further work looking at

Figure 2: Generic technical skills assessment (image not reproduced)


broader aspects of obstetrics and gynaecology training and the development of appropriate assessment tools will need to follow.

Conclusion

There can be no doubt that competency based assessment of all doctors is essential in the light of reduced working hours, shorter training programmes and the need to maintain public confidence in the medical profession. The development of tools to undertake this vital role is one of the current challenges to the specialty of obstetrics and gynaecology. Lessons can be learnt from studies in allied surgical specialties and from obstetrics and gynaecology training programmes in other countries. However, any tools introduced into UK practice must be valid, reproducible and easy to use within our own training hospitals. Knowledge and experience in this area worldwide are sparse and the results should be interpreted with caution. All obstetricians and gynaecologists will be involved in assessment and should embrace the changes. This is not a matter for a few enthusiasts: it is the responsibility of us all to ensure that the obstetric and gynaecology workforce of the future is competent to perform the tasks expected. We cannot afford not to be involved.

References
1 Royal College of Obstetricians and Gynaecologists Trainees Committee. Survey of Training 2002. London: RCOG Press; 2003. [www.rcog.org.uk/resources/Public/pdf/Survey_of_Training.pdf].
2 Royal College of Surgeons of England. Results of the online EWTD trainee survey. [www.rcseng.ac.uk/service_delivery/ewtd/online_EWTD_survey_results.pdf].
3 Blanchard MH, Amini SB, Frank TM. Impact of work hour restrictions on resident case experience in an obstetrics and gynaecology residency program. Am J Obstet Gynecol 2004;191:1746–51.
4 Modernising Medical Careers. [www.mmc.nhs.uk].
5 The Foundation Programme Committee of the Academy of Medical Royal Colleges, in co-operation with Modernising Medical Careers in the Departments of Health. Curriculum for the foundation years in postgraduate education and training. Department of Health, Social Services and Public Safety; 2005. [www.dh.gov.uk/assetRoot/04/10/76/96/04107696.pdf].
6 General Medical Council. Good Medical Practice. London: GMC; 2001. [www.gmc-uk.org/guidance/library/GMP.pdf].
7 Cuschieri A, Francis N, Crosby J, Hanna GB. What do master surgeons think of surgical competence and revalidation? Am J Surg 2001;182:110–6.
8 NHS Litigation Authority. CNST standards and assessments. [www.nhsla.com/RiskManagement/CnstStandards].
9 Moorthy K, Munz Y, Sarker SK, Darzi A. Objective assessment of technical skills in surgery. BMJ 2003;327:1032–7.
10 Sidhu RS, Grober ED, Musselman LJ, Reznick RK. Assessing competency in surgery: where to begin? Surgery 2004;135:6–20.
11 Downing SM. Validity: on the meaningful interpretation of assessment data. Med Educ 2003;37:830–7.
12 Bridgewater B, Grayson AD, Jackson M, Brooks N, Grotte GJ, Keenan DJ, et al. Surgeon specific mortality in adult cardiac surgery: comparison between crude and risk stratified data. BMJ 2003;327:13–7.
13 Martin JA, Regehr G, Reznick R, MacRae H, Murnaghan J, Hutchinson C, et al. Objective structured assessment of technical skill (OSATS) for surgical residents. Br J Surg 1997;84:273–8.
14 MacRae H, Regehr G, Leadbetter W, Reznick RK. A comprehensive examination for senior surgical residents. Am J Surg 2000;179:190–3.
15 Reznick R, Regehr G, MacRae H, Martin J, McCulloch W. Testing technical skill via an innovative 'bench station' examination. Am J Surg 1997;173:226–30.
16 Goff BA, Lentz GM, Lee D, Houmard B, Mandel LS. Development of an objective structured assessment of technical skills for obstetric and gynecology residents. Obstet Gynecol 2000;96:146–50.
17 Lentz GM, Mandel LS, Lee D, Gardella C, Melville J, Goff BA. Testing surgical skills of obstetric and gynecologic residents in a bench laboratory setting: validity and reliability. Am J Obstet Gynecol 2001;184:1462–8.
18 Goff B, Mandel L, Lentz G, Vanblaricom A, Oelschlager AM, Lee D, et al. Assessment of resident surgical skills: is testing feasible? Am J Obstet Gynecol 2005;192:1331–8.
19 Goff BA, Nielsen PE, Lentz GM, Chow GE, Chalmers RW, Fenner D, et al. Surgical skills assessment: a blinded examination of obstetrics and gynecology residents. Am J Obstet Gynecol 2002;186:613–7.
20 Vogt VY, Givens VM, Keathley CA, Lipscomb GH, Summitt RL Jr. Is a resident's score on a videotaped objective structured assessment of technical skills affected by revealing the resident's identity? Am J Obstet Gynecol 2003;189:688–91.
21 Ault G, Reznick R, MacRae H, Leadbetter W, DaRosa D, Joehl R, et al. Exporting a technical skills evaluation technology to other sites. Am J Surg 2001;182:254–6.
22 Kneebone R. Simulation in surgical training: educational issues and practical implications. Med Educ 2003;37:267–77.
23 Macmillan AI, Cuschieri A. Assessment of innate ability and skills for endoscopic manipulation by the Advanced Dundee Endoscopic Psychomotor Tester: predictive and concurrent validity. Am J Surg 1999;177:274–7.
24 Coleman RL, Muller CY. Effects of a laboratory-based skills curriculum on laparoscopic proficiency: a randomised trial. Am J Obstet Gynecol 2002;186:836–42.
25 Fung Kee Fung K, Fung Kee Fung M, Bordage G, Norman G. Interactive voice response to assess residents' laparoscopic skills: an instrument validation study. Am J Obstet Gynecol 2003;189:674–8.
26 Royal College of Obstetricians and Gynaecologists. The European Working Time Directive and Maternity Services. London: RCOG Press; 2004. [www.rcog.org.uk/resources/Public/pdf/ewtd_and_maternity_services.pdf].


© 2006 Royal College of Obstetricians and Gynaecologists
