School-by-School Variability in MSPE Rankings: A Very Poor Student, Yet "Good" (Acad Med, 2016)

Ranking Practice Variability in the Medical Student Performance Evaluation: So Bad, It’s “Good”

Megan Boysen Osborn, MD, MHPE, James Mattson, Justin Yanuck, MS, Craig Anderson, PhD, MPH, Ara Tekian, PhD, MHPE, John Christian Fox, MD, and Ilene B. Harris, PhD




The medical student performance evaluation (MSPE), formerly called the “dean’s letter,” is an important component in a medical student’s application for residency training.


The Association of American Medical Colleges (AAMC) guidelines for preparing the document were revised in 2002.1 In a review of MSPEs submitted three years later, however, Shea and colleagues2 found that there was still great variability from institution to institution.


The 2002 MSPE guidelines state that the summary section should include "a summative assessment … of the student's comparative performance in medical school, relative to his/her peers, including information about any school-specific categories used in differentiating among levels of student performance."1


Notwithstanding this recommendation, analyses of MSPEs have demonstrated that ranking systems vary among schools.2–5

  • Some medical schools provide numerical ranks for their students,

  • while others group their students into quartiles or quintiles.

  • Many medical schools group their students into categories, with descriptors such as “outstanding” or “very good.”

  • Some schools do not use any type of ranking system or do not clearly define one.



Schools’ categorical ranking systems vary widely in the terminology used and in the size of the category groups.

  • For example, many schools use the term “excellent” to describe high-achieving students, while other schools use the same term to describe average students.5

  • On the other hand, schools typically use the term “good” to describe students in the bottom 50% of their graduating class.3


To date, however, no authors have quantified the variability in ranking practices among U.S. medical schools.






Method


Data collection 1

We extracted the MSPE from each application to the University of California, Irvine (UC Irvine) emergency medicine (EM) residency program during the 2012–2013 and 2014–2015 application cycles. We included any applicant from a U.S. MD-granting medical school.


Data collection 2

For any schools still missing from our sample, we contacted the school’s associate dean of student affairs and requested a deidentified MSPE.


Research questions

We answered the following questions for each school’s MSPE:

  • 1. Does the school use a defined ranking system?

  • 2. What type of ranking system is used?

  • 3. Into how many categories are the students divided? What are the most common category descriptors for each group? What percentage of students is in each category?

  • 4. Where is the student’s rank provided in the MSPE? Where is the legend for the ranking system?
  • 5. Do nonranking schools (i.e., schools that do not rank students) use similar language to schools that rank students?



Results


In 2015, there were 136 U.S. MD-granting medical schools that had graduating classes, while in 2013 there were 132.6


 

We had at least one MSPE from 134 (99%) of the 136 U.S. MD-granting medical schools with graduates in 2015. We had data for both 2013 and 2015 for 114 (85%) of these 134 medical schools.


Of the 134 schools, 101 (75%) provided ranks for their medical students in the MSPE, and 33 (25%) did not.


  • Sixty-three (62%) of the ranking schools used named category groups, such as “outstanding” for their top group and “good” for their lowest group.

  • Twenty-four (24%) of the ranking schools broke the class into segments such as tertiles (thirds), quartiles (fourths), or quintiles (fifths), without other category descriptors.


Students were most commonly divided into four groups (Table 1).

 


Group sizes (number of students per group)

  • Six (10%) schools did not provide the sizes for each of their groups, and

  • another 6 (10%) provided size distributions for only their top group(s).

  • When the negative terms "marginal," "below average," or "recommended with reservation" were used for the lowest-ranked students, these groups included on average 1% of students (range: 0%–2%).

 

Commonly used terms

The most common terms used by the 63 schools with named category groups to describe student performance, regardless of class position, were

  • “excellent” (n = 53; 84%),

  • “outstanding” (n = 52; 83%),

  • “very good” (n = 51; 81%), and

  • “good” (n = 42; 67%) (Table 2).

 

Among these schools,

  • “excellent” was used to describe students ranging from the 1st to the 95th percentiles;

  • “outstanding” was used to describe students who were in the 33rd to 99th percentiles;

  • “very good” was used to describe students who were in the 1st to 80th percentiles; and

  • “good” was used to describe students who were in the 1st to 57th percentiles.

 



Information on the student's rank within the MSPE

Among the 101 schools with formal ranking systems, there was variability in where the reader could locate the student’s rank in the MSPE.

  • The majority (n = 79; 78%) of the schools identified the individual student’s rank in the summary section.

  • Other locations included the appendices (n = 14; 14%) or another section within the MSPE (n = 8; 8%).

  • Many schools (n = 51; 50%) made an effort to highlight the rank by bolding, capitalizing, or underlining it. 


Location of the ranking system legend

The location for the ranking system legend also varied.

  • Thirty-six (36%) of the 101 ranking schools described their system in Appendix D, as suggested by the AAMC guidelines.1

  • Forty-one (41%) described their ranking system in another appendix, including the medical school information page.

  • Some included the legend in a cover letter (n = 6; 6%) or

  • within the body of the MSPE (n = 5; 5%).

  • Thirteen (13%) did not fully describe their ranking system anywhere, but the use of a ranking system was inferred by their giving a numerical or quantile rank to the student at some location in the MSPE.


Characteristics of nonranking schools

Of the 33 schools that did not rank their students,

  • 21 (64%) included a statement somewhere in the MSPE that they “do not rank” their students.

 

Analysis of 330 MSPEs from the 33 nonranking schools

We examined the summary sections of 330 additional MSPEs from the 33 nonranking schools (average = 10 MSPEs/school; range: 1–25, IQR: 5–14).

  • We did not find any language suggestive of rank for 9 (27%) of the schools.

  • We found that 15 (45%) of the schools included statements in their summary paragraphs that were suggestive of rank. For example, one school's MSPEs stated, "We recommend him/her as a [descriptor] candidate for graduate medical education." The descriptors this school used were "outstanding," "excellent," and "very good."



Comparing descriptors with academic performance

When we compared students' academic performance against the "descriptors" used by these 15 schools (independently ranked by our reviewers), we found that 8 of the schools had strong correlations between the descriptors and students' academic performance, as measured by clinical clerkship grades (Spearman rank coefficient: r = 0.71–1.0 for 8 schools; P < .05 for 6 of these schools).
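
Spearman's rank correlation, as used in this comparison, is simply Pearson's correlation computed on average ranks. A minimal self-contained sketch; the function names, the ordinal encoding of descriptors, and the example data below are ours for illustration, not the study's actual data:

```python
def ranks(values):
    """Average ranks (1-based); tied values share the mean of their ranks."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1  # extend over a run of tied values
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical illustration: descriptor level (3 = "outstanding",
# 2 = "excellent", 1 = "very good") vs. number of clerkship honors.
print(round(spearman([3, 3, 2, 2, 1], [6, 5, 4, 3, 1]), 2))  # → 0.95
```

A strong monotonic association between descriptor level and clerkship performance, as the authors report for 8 schools, would show up here as r near 1.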


A higher proportion of nonranking schools were among the 2015 U.S. News and World Report top 20 medical schools

We found that a higher percentage of nonranking schools (n = 9 of 33 [27%]; 95% confidence interval [CI], 12%–42%) were among the 2015 U.S. News and World Report top 20 medical schools7 compared with the ranking schools (n = 11 of 101 [11%]; 95% CI, 5%–17%; chi-square test for independence: P = .02).
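
This comparison can be checked from the reported counts (9 of 33 nonranking vs. 11 of 101 ranking schools in the top 20). A sketch assuming a Pearson chi-square test without continuity correction, which the paper does not specify; the function name is ours (with Yates' correction the statistic is smaller but P remains below .05):

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic and two-sided P for the 2x2 table
    [[a, b], [c, d]], without continuity correction (df = 1)."""
    n = a + b + c + d
    r1, r2 = a + b, c + d          # row totals
    c1, c2 = a + c, b + d          # column totals
    stat = 0.0
    for obs, exp in [(a, r1 * c1 / n), (b, r1 * c2 / n),
                     (c, r2 * c1 / n), (d, r2 * c2 / n)]:
        stat += (obs - exp) ** 2 / exp
    # For df = 1, the chi-square survival function is erfc(sqrt(x / 2)).
    p = math.erfc(math.sqrt(stat / 2))
    return stat, p

# Nonranking: 9 of 33 in the top 20; ranking: 11 of 101 in the top 20.
stat, p = chi2_2x2(9, 33 - 9, 11, 101 - 11)
print(round(stat, 2), round(p, 3))  # chi-square ≈ 5.3, P ≈ .02 as reported
```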


Changes in use of ranking, number of category groups, and descriptors

  • Seven (6%) of the 114 schools for which we had two years of data changed from nonranking schools in 2013 to ranking schools in 2015.

  • None of the schools changed from ranking to nonranking.

  • Five (4%) schools decreased the number of category groups they used.

  • Two (2%) schools changed their descriptor terms.

  • Many schools changed the size of their category groups (n = 36; 32%).

A Wilcoxon signed rank test detected no statistical difference in the percentage of students in the first, second, third, or last category between 2013 and 2015 in these 36 schools.
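
The Wilcoxon signed-rank test compares paired measurements (here, a category's size percentage in 2013 vs. 2015 at the same school). A minimal sketch using the normal approximation, with no tie or zero-difference corrections; the function name and the example data are ours, not the study's:

```python
import math

def wilcoxon_signed_rank(x, y):
    """Wilcoxon signed-rank test for paired samples.
    Returns (W, two-sided P), where W is the sum of ranks of positive
    differences. Normal approximation; zero differences are dropped and
    tied |difference| magnitudes are not corrected for."""
    d = [a - b for a, b in zip(x, y) if a != b]
    n = len(d)
    order = sorted(range(n), key=lambda i: abs(d[i]))
    rank = [0] * n
    for r, i in enumerate(order, start=1):
        rank[i] = r                      # rank of |d_i|, 1-based
    w = sum(rank[i] for i in range(n) if d[i] > 0)
    mean = n * (n + 1) / 4
    sd = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    z = (w - mean) / sd
    p = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail
    return w, p

# Hypothetical paired data: a category's size (%) at 5 schools, 2015 vs. 2013.
w, p = wilcoxon_signed_rank([11, 8, 13, 6, 15], [10, 10, 10, 10, 10])
```

A P value well above .05, as in the paper's 36-school comparison, would indicate no detectable systematic shift in category sizes between the two years.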



Discussion


 

Our study clearly demonstrates inconsistency in the student ranking systems used in MSPEs among U.S. MD-granting medical schools.

 

Variability makes MSPEs difficult to interpret. On average, a residency program receives 856 applications. The recommended MSPE length is 2–3 pages, but MSPEs are usually longer.

Variability in the format of MSPEs may contribute to difficulty in interpreting them. On average, residency programs receive 856 applications per year.8 Although the recommended length of the MSPE is two to three pages, it is usually longer.1,2


Locating a student's class rank, however, is a cumbersome process: the program director must first find the rank category in the student's MSPE, then find the legend describing the ranking system. Given this burden, it is unsurprising that program directors do not use the MSPE for rank-order decisions and that the MSPE is not a top factor in most specialties.

Locating a student’s class rank can be a particularly cumbersome process, however. The program director must first identify the sentence in the MSPE that contains the student’s rank category and then locate the legend that describes the ranking system—both items with variable locations and presence. Therefore, it is not surprising that one-third of program directors do not use the MSPE to make rank-order decisions and that the MSPE is not a top factor in making selection decisions for most specialties.10,11


 

The devaluation of the MSPE is concerning, because...

The lower value placed on the MSPE by program directors is concerning because...

  • the MSPE is the only comprehensive assessment of a student’s medical school performance.

  • Medical schools put significant time and resources into the production of MSPEs each year.12,13

  • Issues in the readability and usability of the MSPE undermine the efforts and resources invested by all parties involved and

  • may ultimately take emphasis away from assessment of global medical school performance in a student’s application.


The same terms are used differently from school to school.

  • For example, “good” was usually used to refer to students in the bottom half of their graduating class, but its use ranged above the median in at least one school. Such variability may lead to errors of interpretation.

  • A program director may assume, for example, that an “outstanding” student is ranked in the school’s top category, when in reality the student is ranked in the bottom half of the class or the school has no ranking system.


The students who lose the most are the high achievers.

High-achieving students have the most to lose from the variability in ranking systems. First, they are indirectly affected if a program director incorrectly assigns their lower-achieving counterparts to a higher quartile than is deserved.


More than 10 years after the recommendations, many schools still do not provide transparent, comparative performance information; some provide ranks for their top students only.

Our study also demonstrates that, more than 10 years after the AAMC recommendations on the MSPE were released,1 one-quarter of medical schools do not provide transparent, comparative performance information for their students. Some of these nonranking schools provided rank information for their top students only, despite explicitly stating that they "do not rank."


It is unclear why some schools do not provide ranks. Given the high proportion of nonranking schools among the U.S. News and World Report top 20 medical schools, one can infer that these schools regard all of their students as exceptional and do not want to divide them into categories. Inherently, there is a conflict between the MSPE writer, who advocates for the student, and the MSPE reader, who uses the MSPE to select students.

It is not clear why some schools choose not to rank their students. A larger proportion of nonranking schools than ranking schools were among the U.S. News and World Report top 20 medical schools in 2015,7 which suggests that top medical schools may be of the mind-set that all of their students are exceptional and may not want to place them into categories. Inherently, there is a conflict between the needs of the MSPE writer and the MSPE reader.15 The writer may feel compelled to act as an advocate for each student,13 whereas the reader is attempting to select the best candidates for his or her residency program.


The best ranking practice is...

It is our opinion that the best ranking practice is a consistent ranking practice that

  • highlights high achievers,

  • identifies problematic students, but

  • does not punish lower achievers.

 

The authors' recommendation

We believe that the following categorical ranking system would achieve these goals:

  • "outstanding" (80th–99th percentiles),

  • "excellent" (50th–79th percentiles),

  • "very good" (4th–49th percentiles), and

  • "satisfactory" (3rd percentile and below),

    • with the "satisfactory" group size adjusted to include only students with serious academic performance issues (e.g., course failures).

A large third group avoids punishing lower-ranking students, while a small lowest group provides an opportunity for schools to identify very-low-achieving students.
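
The proposed cutoffs amount to a simple percentile-to-descriptor mapping. A sketch assuming integer percentiles from 0 to 99, higher meaning better; the function name and input convention are ours:

```python
def mspe_descriptor(percentile):
    """Map a class-rank percentile (0-99, higher = better) to the
    categorical descriptor proposed by the authors."""
    if not 0 <= percentile <= 99:
        raise ValueError("percentile must be in [0, 99]")
    if percentile >= 80:
        return "outstanding"   # 80th-99th percentiles
    if percentile >= 50:
        return "excellent"     # 50th-79th percentiles
    if percentile >= 4:
        return "very good"     # 4th-49th percentiles
    return "satisfactory"      # 3rd percentile and below
```

Under this scheme a median student is "excellent," and "satisfactory" is reserved for the small group with serious academic issues.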


Limitations

 

There were a few notable limitations to this study.
  • For ranking schools, we examined only one MSPE per application cycle and assumed that other MSPEs from the same school would have the same format; however, we minimized this potential issue by analyzing documents from two different years when available.

  • Our data abstractors were not blinded to the study purpose.

  • We excluded schools that did not give sizes for their category groups from the group size calculations.

  • For our analysis of nonranking schools, we reviewed 1 to 25 MSPEs from each school. The nonranking schools for which we had fewer MSPEs may have had a "hidden" rank system that we did not discover because of insufficient variability in the performance of the students sampled.


 

 


161107_AAMC_MSPE_Journal_Club.pdf



Acad Med. 2016 Nov;91(11):1540-1545.

Ranking Practice Variability in the Medical Student Performance Evaluation: So Bad, It's "Good".

Author information

  • 1M. Boysen Osborn is assistant professor and residency program director, Department of Emergency Medicine, University of California, Irvine, Orange, California. J. Mattson is a fourth-year medical student, University of California, Irvine, School of Medicine, Irvine, California. J. Yanuck is a fourth-year medical student, University of California, Irvine, School of Medicine, Irvine, California. C. Anderson is research specialist, Department of Emergency Medicine, University of California, Irvine, Orange, California. A. Tekian is professor, Department of Medical Education, University of Illinois, Chicago, Chicago, Illinois. J.C. Fox is professor and assistant dean of student affairs, Department of Emergency Medicine, University of California, Irvine, Orange, California. I.B. Harris is professor and head and director of graduate studies, Department of Medical Education, University of Illinois, Chicago, Chicago, Illinois.

Abstract

PURPOSE:

To examine the variability among medical schools in ranking systems used in medical student performance evaluations (MSPEs).

METHOD:

The authors reviewed MSPEs from U.S. MD-granting medical schools received by the University of California, Irvine emergency medicine and internal medicine residency programs during 2012-2013 and 2014-2015. They recorded whether the school used a ranking system, the type of ranking system used, the size and description of student categories, the location of the ranking statement and category legend, and whether nonranking schools used language suggestive of rank.

RESULTS:

Of the 134 medical schools in the study sample, the majority (n = 101; 75%) provided ranks for students in the MSPE. Most of the ranking schools (n = 63; 62%) placed students into named category groups, but the number and size of groups varied. The most common descriptors used for these 63 schools' top, second, third, and lowest groups were "outstanding," "excellent," "very good," and "good," respectively, but each of these terms was used across a broad range of percentile ranks. Student ranks and school category legends were found in various locations. Many of the 33 schools that did not rank students included language suggestive of rank.

CONCLUSIONS:

There is extensive variation in ranking systems used in MSPEs. Program directors may find it difficult to use MSPEs to compare applicants, which may diminish the MSPE's value in the residency application process and negatively affect high-achieving students. A consistent approach to ranking students would benefit program directors, students, and student affairs officers.

PMID: 27075499
DOI: 10.1097/ACM.0000000000001180

