Variations in medical school graduating examinations in the United Kingdom: Are clinical competence standards comparable? (Med Teach, 2009)

Peter McCrorie & Katharine A.M. Boursicot







Introduction


Unlike many other countries, the UK has no national licensing examination for doctors; every school has its own graduating examinations and assessment system. Graduating from medical school and holding a medical degree leads automatically (barring exceptional circumstances) to a licence to practise from the GMC. The underlying assumption is that the examinations at all UK medical schools guarantee the same minimum level of clinical and professional competence required for licensure and practice.

Unlike many other countries (Medical Council of Canada 2007; USMLE 2007), in the UK there is no national licensing examination for the medical profession. Every School decides on its own systems of assessment, including its graduating examinations. Graduation from Medical School and the conferment of a University degree in medicine leads automatically (unless there are exceptional circumstances) to the granting of a licence to practise by the General Medical Council (GMC), the medical professional regulatory body (GMC 2007a, b). The underlying assumption here is that the examinations at all the UK Medical Schools guarantee the same levels of minimum clinical and professional competence required for licensure and practice.


UK medical school examinations are not recorded and monitored in a central data bank. There are, however, two inspection systems currently in operation.

There is no formal documentation process whereby all the examinations of UK Medical Schools are recorded and monitored routinely in a central data bank. There are, however, two systems of inspection currently in place relating to Medical School examinations:

  • the Quality Assurance Agency’s ‘external examiner’ system (Quality Assurance Agency for Higher Education 2000) and

  • the GMC’s Quality Assurance of Basic Undergraduate Education (QABME) process (General Medical Council 2006).

The external examiner system is a peculiarly British QA system that applies to all UK higher education. How it operates, and its problems:

The external examiner system is a peculiarly British system of quality assurance which applies to all the UK Higher Education Institutions.

  • This system involves the visit of academics/clinicians from one Medical School to other similar institutions to review examinations, both paper-based and clinical.

  • The course documentation is usually sent to the external examiners prior to their visit, as are examination papers.

  • The extent to which examiners comment on the papers varies widely: some examiners produce detailed, lengthy comments; others simply send brief anodyne email messages.

  • The choice of external examiners is usually left to individuals within each Medical School, who may invite their friends or people they know and who will generally share their views.

  • The actual functions of external examiners are variable across institutions – in some cases, external examiners are required to act as examiners in clinical examinations while in others the externals simply observe the process. There is no requirement for them to observe any part of the process – merely to review the results of the assessments.

  • External examiners are required to complete reports, the quality of which varies across institutions. Some examiners send in detailed comments on the assessments, others simply tick boxes on a standard comments form (the format of which varies from institution to institution).

  • The hosting institutions can choose to modify their examinations according to the report if they wish, but they can also choose to over-ride the recommendations.

  • The system cannot be described as robust, as the processes are so variable and do not provide formal quantitative comparison of standards of graduates across Medical Schools.


The second inspection system is the GMC's QABME.

The second system of inspection is the GMC’s Quality Assurance of Basic Undergraduate Education (QABME)

  • which was started in 2005 and involves a general inspection of every UK Medical School’s curriculum, teaching, facilities, student support as well as assessment.

  • This system involves a series of visits to a Medical School over a 6-month period and each School is visited once every 5–7 years.

  • New Medical Schools, or those introducing radical changes, are visited several times each year starting at least a year in advance of the commencement of the new course and continuing until the first cohort of students has graduated, 6 or 7 years later.

  • Preparation for such inspections is time-consuming and each School is validated against the GMC’s published recommendations, Tomorrow’s Doctors (General Medical Council 2003).

  • The outcome of the process is a detailed document, published on the GMC website, which summarizes the findings of the QABME process and includes a series of requirements and recommendations.

  • Any requirements imposed on a Medical School are followed up by the GMC in the next academic year.


These two processes examine different things.

The two processes are looking at different things.

  • The external examiner system is designed to evaluate the quality of the assessments run by each Medical School, examiners usually being required to inspect a set of examinations in a single year of the course.

  • The QABME process takes a broader perspective and looks at staffing structure, course management, staff development, curriculum content and design, student support, quality assurance processes as well as assessment design, management and implementation.

 

Both systems are qualitative and do not quantitatively compare students' competence; both assume that every visiting assessor holds the same internal standards. There is, however, evidence that clinical examiners' standards differ across medical schools.

Both systems are qualitative in nature and do not actually compare the professional and clinical competence of students in a quantitative manner. The two systems rely entirely on the assumption that all visitors have the same internal standards. However, there has been some evidence that standards for clinical examinations do vary significantly across Medical Schools at graduation level (Boursicot et al. 2006, 2007b).





Methods


Conducting the survey

A questionnaire was constructed to collect relevant information about graduation level examinations from Medical Schools and sent to all members of the Association for the Study of Medical Education (ASME) Education Research Group for validation and scoping of opinion. The questionnaire was modified according to comments received and was then sent out to the ASME Council representative of each of the 32 Medical Schools in the UK.

 

This included a number of paired Schools who have joint Finals examinations (Leicester/Warwick; Liverpool/Lancaster; Manchester/Keele; Newcastle/Durham; Nottingham/Derby) and so the final number of respondents was 27 (100% response rate).


 

Content covered by the questionnaire

The 13-page questionnaire was in several parts and covered the following:

  • (1) Nature of written Final assessments;

  • (2) Nature of clinical Final assessments; 

  • (3) Nature of other final year assessments (e.g. portfolios, logbooks, attachment reports, RITAs, appraisals, video assessments, audit reports, Special Study Module assessments); 

  • (4) Timing of Finals assessments; 

  • (5) Standard setting;

  • (6) Examiners and examiner training; 

  • (7) Cost; 

  • (8) Views on national examinations and shared assessments.




Results



Written assessments


At the time of carrying out the survey (November 2006), most of the UK Medical Schools used one or more of three formats to assess knowledge in Finals –

  • Extended Matching Questions, EMQs (23),

  • Single Best Answer Multiple Choice Questions, SBAs (16),

  • Short Answer Questions, SAQs/Modified Essay Questions, MEQs (14).

A few Schools still used True/False MCQs (5) or Essays (4).

 

A wide range of formats

Within this partial uniformity, there is a wide range of test formats.

  • There was variation in the number of papers (1–6),

  • the length of each paper (1–3 h),

  • the total length of the written papers (2–12 h),

  • the time allocated per MCQ item (0.86–1.5 min per item),

  • the timing of written Finals (June of the penultimate year until June of the Final Year) and

  • the mix of formats used (e.g. from a single MCQ paper to SBAs, EMQs, MEQs and SAQs).

 

Standard setting for pass/fail decisions

  • Twenty-two Schools reported that they set passing standards for written papers using either the Angoff (Angoff 1971) or Ebel (Norcini 2003) methods.

  • Three Schools reported using the Hofstee (Norcini 2003) method (an illustrative sketch of the Angoff and Hofstee calculations follows this list).

  • Two Schools had fixed pass marks of 50% and one of 60%.
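
As a rough illustration only (not any School's actual procedure), the Python sketch below shows how an Angoff cut score and a Hofstee compromise cut score might be computed. The judge estimates, candidate scores and acceptable bounds are invented for the example.

```python
# Illustrative sketch of two judge-based standard-setting calculations.
# All numbers below are invented; they are not taken from the survey.

from statistics import mean

def angoff_cut_score(judge_estimates):
    """Angoff: each judge estimates, per item, the probability that a
    borderline (minimally competent) candidate answers correctly.
    The cut score is the sum over items of the mean judge estimate."""
    n_items = len(judge_estimates[0])
    return sum(mean(judge[i] for judge in judge_estimates) for i in range(n_items))

def hofstee_cut_score(scores, min_cut, max_cut, min_fail, max_fail, step=0.5):
    """Hofstee compromise: find the cut score where the observed fail-rate
    curve comes closest to the line joining (min_cut, max_fail) and
    (max_cut, min_fail), the bounds the judges regard as acceptable."""
    best, best_gap = min_cut, float("inf")
    cut = min_cut
    while cut <= max_cut:
        observed = sum(s < cut for s in scores) / len(scores)
        on_line = max_fail + (min_fail - max_fail) * (cut - min_cut) / (max_cut - min_cut)
        gap = abs(observed - on_line)
        if gap < best_gap:
            best, best_gap = cut, gap
        cut += step
    return best

if __name__ == "__main__":
    # Three hypothetical judges rating a 5-item paper (probabilities 0-1).
    judges = [
        [0.6, 0.7, 0.5, 0.8, 0.6],
        [0.5, 0.6, 0.6, 0.7, 0.7],
        [0.7, 0.6, 0.5, 0.9, 0.6],
    ]
    print("Angoff cut score (out of 5 items):", round(angoff_cut_score(judges), 2))

    # Hypothetical exam scores out of 100 and judge-agreed bounds.
    scores = [48, 52, 55, 58, 60, 63, 65, 68, 72, 80]
    print("Hofstee cut score:", hofstee_cut_score(scores, 45, 65, 0.05, 0.30))
```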




Clinical assessments


Most schools use an OSCE, but with many differences.

The vast majority of Schools assess the clinical competence of their students using an Objective Structured Clinical Examination, OSCE (25 Schools; Table 2). However, there is a wide range in

  • the number of stations (6–48),

  • the length of each station (4, 5, 6, 7, 7.5, 10, 12, 15, 30 or 45 min),

  • the style of the OSCE [traditional circuit format, static observation of video projections or Practical Assessment of Clinical Examination Skills (PACES)-style examination] (MRCP 2008), and

  • the timing of the OSCEs.

 

Other differences

There was also variation in:

  • the extent to which stations were changed between circuits,

  • what constituted a pass (achieving a set pass mark, passing a minimum number of stations, or both),

  • the style of marksheet (checklist or global),

  • the use of real and/or simulated patients and

  • the range and domains of skills tested.


Other clinical assessment tools in use

Other clinical assessments in use include different formats developed for clinical performance assessment (Boursicot et al. 2007a):

  • DOPS (Direct Observation of Procedural Skills) – eight Schools, requiring demonstration of competence in 3–38 skills;

  • the mini Clinical Evaluation exercise (mini-CEX) (Norcini et al. 2003) – eight Schools, still largely in embryonic development, but averaging at around six mini-CEXs per year;

  • Objective Structured Long Examination Record (OSLER) – seven Schools, the number of assessments ranging from 1 to 10, each 20–60 min long; and

  • Long Cases – five Schools, competence to be demonstrated in only two long cases.



Definitions of what constituted a mini-CEX, an OSLER and a long case varied widely, but broadly speaking they can be regarded as much the same form of assessment.

There was much variation in the definition of what constituted a mini-CEX, an OSLER and a long case; indeed, broadly speaking, they could be considered to be much the same form of assessment.


Some schools use sequential testing: students who do not pass the initial component receive double the testing time, which increases the reliability of the decision.

Four Schools operated sequential testing whereby students who performed well in an OSLER, an OSCE or a series of mini-CEXs were exempt from further testing. Around two-thirds of students usually fell into this category, the remainder having double the amount of testing time, thereby increasing the reliability of the pass/fail decision.
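
The reliability gain from doubling the testing time for the weaker group can be illustrated with the classical Spearman-Brown prophecy formula. The sketch below is illustrative only; the paper reports no reliability figures, so the starting value of 0.70 is an assumption made purely for the example.

```python
# A minimal sketch of why doubling testing time can improve reliability,
# using the Spearman-Brown prophecy formula. The starting reliability of
# 0.70 is invented; the survey gives no such figures.

def spearman_brown(reliability, length_factor):
    """Predicted reliability when test length is multiplied by length_factor."""
    return (length_factor * reliability) / (1 + (length_factor - 1) * reliability)

if __name__ == "__main__":
    base = 0.70  # hypothetical reliability of the initial OSCE/OSLER/mini-CEX set
    print("Reliability with double testing time:", round(spearman_brown(base, 2), 2))
    # -> 0.82: the remaining third of students get a more dependable pass/fail decision
```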


Most schools set the passing standard with the borderline group or borderline regression method.

Most Schools reported using either the borderline group or borderline regression method to set the passing standard for OSCEs (17). Other methods reported were a modified Angoff procedure (4) and Contrasting Groups (1). Three Schools used grade descriptors, one used global ratings and one used norm referencing at the level of each station with a minimum permitted number of stations failed.
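
As a rough, illustrative sketch of the two most commonly reported OSCE standard-setting approaches, the code below computes a single station's pass mark with both the borderline group and the borderline regression methods. The station data are invented and do not come from the survey.

```python
# Illustrative station-level standard setting with invented data.
# Each tuple: (examiner global rating on a 1-5 scale, checklist score out of 20)
# Global ratings: 1 = clear fail, 2 = borderline, 3 = pass, 4 = good, 5 = excellent

from statistics import mean

station_results = [
    (1, 6), (2, 9), (2, 11), (3, 12), (3, 14),
    (4, 15), (4, 17), (5, 18), (2, 10), (3, 13),
]

BORDERLINE = 2  # the global rating treated as "borderline"

def borderline_group_pass_mark(results):
    """Borderline group: pass mark = mean checklist score of candidates rated borderline."""
    return mean(score for rating, score in results if rating == BORDERLINE)

def borderline_regression_pass_mark(results):
    """Borderline regression: regress checklist score on global rating (least squares)
    and read off the predicted checklist score at the borderline rating."""
    ratings = [r for r, _ in results]
    scores = [s for _, s in results]
    x_bar, y_bar = mean(ratings), mean(scores)
    sxy = sum((x - x_bar) * (y - y_bar) for x, y in results)
    sxx = sum((x - x_bar) ** 2 for x in ratings)
    slope = sxy / sxx
    intercept = y_bar - slope * x_bar
    return intercept + slope * BORDERLINE

if __name__ == "__main__":
    print("Borderline group pass mark:", round(borderline_group_pass_mark(station_results), 1))
    print("Borderline regression pass mark:", round(borderline_regression_pass_mark(station_results), 1))
    # Summing (or averaging) station-level pass marks would give the whole-OSCE passing standard.
```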




Portfolio assessments


How portfolios are used varies widely.

Increasingly, Medical Schools were using portfolios in assessment. How they were used varied widely; some were in the style of a record of achievement (a collection of evidence), while others could be described as a reflective diary. Some Schools declared the portfolio a formative assessment; others summative. In two Schools, the portfolio was the only assessment in the final year of the medical course (Davis et al. 2001).




Quality assurance


The number and expertise of External Examiners at the different Schools was variable. Numbers ranged from 3 to over 30. Most were discipline-specific, although seven Schools stated one of their examiners was a medical educationalist (or a clinician with an understanding of assessment in medical education). Most schools stated that their internal examiners were trained for clinical assessments but rarely for written assessments; external examiners were generally assumed not to need training.






 


 

Discussion


The question, ultimately, is whether we accept that all these assessment approaches guarantee the same minimum standard of clinical and professional competence, or whether a national licensing examination should be introduced. Boursicot et al. showed that when the modified Angoff method was applied to identical Finals OSCE stations at five medical schools, the pass marks differed enough that a student who would pass at one school would fail at another.

An issue is whether we accept that all these different assessments guarantee the same minimum standards of clinical and professional competence at graduation from Medical School, or whether we should introduce a national licensing examination which all graduates have to pass in order to be licensed to practise. Boursicot et al. (2006) found that, in five Medical Schools, pass marks for six identical Finals OSCE stations, derived using a modified Angoff method, varied sufficiently such that students who would have passed at one Medical School would have failed at some of the others.



A 2008 study of MRCP (UK) examination performance showed large differences between graduates of different schools.

In 2008, a paper was published (McManus et al. 2008), which showed that graduates from different UK medical schools perform significantly differently in all parts of the MRCP (UK) examinations. One interpretation is that this is a consequence of the variation in standards required by different medical schools across the UK at graduation.


Schuwirth argued that it is dangerous to make judgements based on a single test.

Schuwirth (2007) argued that it is dangerous to make judgements based on single-shot assessments.


 

Schuwirth held that while a single test can check competence at a particular point in time, certifying a student's fitness to practise requires a longitudinal element that captures the student's development. This is especially true for higher-order skills that are hard to assess reliably (professionalism, scholarliness and critical thinking).

He believes that, while single-shot assessments can be used to check competencies at particular points in time, certification of a medical student’s fitness to practise needs to take into account some element of longitudinal evaluation of a student’s progress. This is particularly true for those higher-order skills which are so hard to assess reliably – professionalism, scholarliness and critical thinking.


There is merit in using a combination of assessment approaches.

There is some merit in arguing in favour of a combination of approaches to assessment – perhaps a national licensing examination, and/or a School-based Finals examination, plus some form of in-School measurement of continuous progress, such as a portfolio, or progress tests.


The GMC ran a consultation on introducing a national licensing examination, and the Tooke Report suggested that a knowledge-based national examination at the end of FY1 could be a reasonable compromise.

The GMC set up a consultation on the introduction of a national licensing examination (GMC 2007a, b) in the UK and decided not to proceed along this path. The Tooke Report suggested the introduction of a national examination for knowledge, but not clinical skills, at the end of the first Foundation Year (1 year after graduation): this may be a reasonable compromise (Tooke 2007).



Tooke J. 2007. Aspiring to excellence: Independent inquiry into modernising medical careers. London: Universities UK.


 



Med Teach. 2009 Mar;31(3):223-9. doi: 10.1080/01421590802574581.

Variations in medical school graduating examinations in the United Kingdom: are clinical competence standards comparable?

Author information

  • Centre for Medical and Healthcare Education, St George's, University of London, 4th Floor Hunter Wing, Cranmer Terrace, London SW17 0RE, UK. Email: mccrorie@sgul.ac.uk

Abstract

BACKGROUND:

While all graduates from medical schools in the UK are granted the same licence to practise by the medical professional regulatory body, the General Medical Council, individual institutions set their own graduating examination systems. Previous studies have suggested that the equivalence of passing standards across different medical schools cannot be guaranteed.

AIMS:

To explore and formally document the graduating examinations being used in the UK Medical Schools and to evaluate whether it is possible to make plausible comparisons in relation to the standard of clinical competence of graduates.

METHODS:

A questionnaire survey of all the UK medical schools was conducted, asking for details of graduating examination systems, including the format and content of tests, testing time and standard setting procedures.

RESULTS:

Graduating assessment systems vary widely across institutions in the UK, in terms of format, length, content and standard setting procedures.

CONCLUSIONS:

We question whether it is possible to make plausible comparisons in relation to the equivalence of standards of graduates from the different UK medical schools, as current quality assurance systems do not allow for formal quantitative comparisons of the clinical competence of graduates from different schools. We suggest that national qualifying level examinations should be considered in the UK.


