Ongoing programmatic evaluation of a medical education curriculum (Penn State University College of Medicine)

How we conduct ongoing programmatic evaluation of our medical education curriculum

KELLY KARPA & CATHERINE S. ABENDROTH Penn State University College of Medicine, USA






ABSTRACT



Evaluation of undergraduate medical education programs is necessary to meet accreditation standards; however, implementation and maintenance of an adequate evaluation process is challenging. A curriculum evaluation committee (CEC) was established at the Penn State University College of Medicine in 2000 to complement the already established activities of the Office of the Vice Dean for Medical Education and the Committee on Undergraduate Medical Education. Herein, we describe the methodology used by the CEC at our academic medical center and outcomes attributable to the curriculum evaluation process that was enacted. Strengths of our process include ongoing, regular assessments that guarantee a course is reviewed at least every two years and a feedback loop whereby course directors are held accountable for implementing changes when necessary. Our evaluative process has proven effective, sustainable, and has identified additional areas for curricular improvements.





Introduction


In medical education, it is hard to find descriptions of comprehensive programmatic evaluation processes.

Medical education is continually evolving, necessitating ongoing programmatic evaluations to determine the worth of programmatic components and whether overall goals and objectives are being attained (Fitzpatrick et al. 2004). More than 35 types of evaluative theoretical frameworks (needs, cost/benefit, formative, summative, goal-based, outcomes-based, etc.) have been developed for virtually every type of social endeavor; however, within medical education it is unusual to find comprehensive programmatic evaluation processes described (Musick 2006; Durning et al. 2007; Gibson et al. 2008; Fetterman et al. 2010; Vassar et al. 2010).


Medical schools must conduct evaluations that satisfy the accreditation standards set by the LCME. Meeting accreditation requirements entails the following.

Medical schools must establish approaches to evaluation that satisfy the accreditation standards set by the Liaison Committee on Medical Education. To meet accreditation requirements, medical schools are mandated to:

      • consider student feedback;
      • perform ongoing monitoring of curriculum content, teaching, and assessment of student progress;
      • use the results of program evaluations for course development; and
      • evaluate medical school coursework in terms of postgraduate performance.


Here we describe the curriculum evaluation process used at PSUCOM.

Despite these clear directives, medical schools have struggled with identifying procedures and personnel best suited to accomplish these tasks, as exemplified recently by the probation of one United States medical school (Mulder 2012). Herein, we describe the systematic curriculum evaluation process that has been used at the Penn State University College of Medicine (PSUCOM) since 2000. 

      • Our programmatic evaluation process draws from both the Tylerian approach, which places substantial emphasis on objectives (Tyler 1942), 
      • and the empowerment evaluation model, which depends upon a culture of evidence, a critical friend, a cycle of reflection and action, and reflective educational practitioners (Fitzpatrick et al. 2004).



How we started


Launch of the CEC and its charge

A decade ago, our institution formed a Curriculum Evaluation Committee (CEC), an independent evaluative body that made recommendations for consideration by the Committee on Undergraduate Medical Education (CUMED), the authoritative body. The CEC was charged with:

(a) evaluating the performance, outcomes, and impact of the medical school curriculum; 

(b) judging the value and worth of the instructional methods; and 

(c) working with other institutional committees to improve curriculum and course performance. 

To accomplish these directives, the appointed CEC members operate at a granular level, comprehensively reviewing each pre-clinical course and clinical clerkship at least every two years.


Points considered when evaluating a course

The four components the CEC considers when reviewing a course are:

      • goals and objectives,
      • content,
      • examination/assessment, and
      • student opinion.



Two courses are discussed at each monthly (1 h) CEC meeting. In advance, a committee member is assigned to review one of the four components for a single course. Members are provided with course-specific documentation and on-line access to course materials via our electronic course management system. Overall, these biennial assessments allow good performance to be rewarded by departmental Chairs and problematic curricular areas to be identified and remedied, and they foster a shared awareness of curricular issues at all levels, from the Dean to individual instructors.


Goals and objectives

The CEC member evaluating course goals and objectives checks them for alignment with institutional goals and objectives. Each educational session (e.g., lecture, laboratory, small-group session) is expected to have specific learning objectives, written in behavioral terms and addressing varying cognitive levels, that describe the knowledge, skills, and attitudes students should demonstrate as a result of the session.


Content

Course content is compared to course goals and the stated learning objectives, United States Medical Licensing Examination (USMLE) content outlines, and/or curricular content published by national associations. Omissions, redundancy, organization and integration of content, pedagogy, quality of lecture slides, and ease of navigation of electronic course management tools are assessed. The CEC's ability to assess horizontal and vertical integration of curricular content is facilitated by use of the Curricular Management and Information Tool developed by the Association of American Medical Colleges. Evaluations of clinical clerkships also assess parameters such as inpatient and outpatient experiences, opportunity to perform procedures, and comparability of student experiences across clerkship sites.


Examination/Assessment

Multiple-choice examinations are judged in terms of: test reliability; alignment of questions with learning objectives and course content; the cognitive level being tested; item difficulty; item discrimination; grade distribution; and the format and quality of questions. Rubrics and inter-observer validation are expected for subjective assessments, e.g., essays.
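The item statistics named above have standard classical-test-theory definitions: difficulty is the proportion of examinees answering an item correctly, and discrimination is commonly the point-biserial correlation between item scores and total test scores. A minimal illustrative sketch of these definitions (the function names are hypothetical; this is not the CEC's actual tooling):

```python
import math

def item_difficulty(item_responses):
    """Classical item difficulty: proportion of examinees answering correctly.

    item_responses: list of 0/1 scores, one per examinee.
    """
    return sum(item_responses) / len(item_responses)

def item_discrimination(item_responses, total_scores):
    """Item discrimination as the point-biserial correlation between
    an item's 0/1 scores and the examinees' total test scores."""
    n = len(item_responses)
    mx = sum(item_responses) / n
    my = sum(total_scores) / n
    cov = sum((x - mx) * (y - my)
              for x, y in zip(item_responses, total_scores))
    sx = math.sqrt(sum((x - mx) ** 2 for x in item_responses))
    sy = math.sqrt(sum((y - my) ** 2 for y in total_scores))
    return cov / (sx * sy)

# Five examinees: the three highest scorers answered this item correctly,
# so the item separates strong from weak examinees well.
item = [1, 1, 1, 0, 0]
totals = [90, 85, 80, 60, 55]
print(item_difficulty(item))                          # 0.6
print(round(item_discrimination(item, totals), 2))    # 0.97
```

In a review like the one described, items with discrimination near zero or negative, or with extreme difficulty values, would be the ones flagged for revision.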


Student opinion

End-of-course evaluations by students are semi-quantitative (Likert scale) with the opportunity for anonymous free-text comments. CEC members who review student evaluations look for themes pertaining to: quality of lectures; course organization; clarity of learning objectives; quality of problem-based learning sessions, on-line exercises, and small-group sessions; topics requiring more or less emphasis; clarity and content of examination questions; and the level of knowledge required by examinations in light of course and lecture objectives. Student focus-group feedback is reviewed as well.



Putting it all together: Information flow



Results from CEC involvement

Areas identified as deficient

To date, curricular deficits identified during CEC reviews have generally been addressed by adding didactic sessions. Since 2002, the share of year 1 instructional hours devoted to didactic teaching has increased from 66.5% to 74.8% (p = 0.02). Even in year 2, where problem-based learning plays a larger role, didactic classroom time has increased by 11% (p < 0.001). Although the curriculum is now more complete, this outcome is not necessarily desirable given the current trend toward more student-directed learning.


Areas identified as improved

As a result of CEC feedback to course directors, the number of lectures that state learning objectives has been steadily increasing since 2002; furthermore, the number of lectures with appropriately written behavioral learning objectives has increased in both year 1 and year 2 courses (p < 0.001). Another facet of the curriculum that has experienced improvements as a direct result of CEC involvement pertains to examination questions written in National Board of Medical Examiners-approved format. Since the 2002–2003 academic year, significantly more questions are written appropriately (p < 0.001). These are areas that the CEC has heavily championed, and this committee is largely responsible for the improvements that are noted. In addition, the average percent of our students passing USMLE Step 1 between 2000 and 2005 was 90%. In comparison, within the last five years, the average pass rate has increased to 95% (without any change in student admission requirements). Although this may be multifactorial, we believe the curriculum evaluation process has played a significant role.



A subjective measure of the CEC's effectiveness is the manner in which course/clerkship directors now welcome and engage in the review and improvement process. The CEC is generally viewed as a “critical friend” that provides constructive criticism rather than as the “educational police.” This perception likely accounts for the cooperation we receive from faculty in response to our reviews. Some course directors have even requested that the CEC review their course annually, rather than on the normal biennial cycle, illustrating the value they place on CEC feedback.






2012;34(10):783-6. doi: 10.3109/0142159X.2012.699113. Epub 2012 Jul 20.

PMID: 22816980 [PubMed - indexed for MEDLINE]

