
MD’s Hope for a Swipe Right on their Careers

CaRMS (The Canadian Resident Matching Service) is the nightmarish process to which we subject senior medical students in order to match them to their residency programs. CaRMS determines a medical trainee’s future for the next 2–5 years of their career. This makes it an extremely stressful process for applicants.

The procedure is relatively straightforward: Students submit online applications to prospective programs. Applications are reviewed by each program, resulting in a shortlist of candidates to be interviewed. After the interviews, students and programs rank each other in order of preference. Then the “CaRMS algorithm” matches students to programs.
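For readers curious what the “CaRMS algorithm” actually does: the match is commonly described as an applicant-proposing deferred-acceptance (“stable matching”) procedure, in which students apply down their rank lists and programs provisionally hold their most-preferred applicants until capacity is full. The sketch below is an illustration only, not CaRMS’s actual implementation (the real algorithm also handles couples and other special cases), and all names and rank lists are invented:

```python
def deferred_acceptance(student_ranks, program_ranks, capacity):
    """Applicant-proposing deferred acceptance with program capacities.

    student_ranks: {student: [programs, most preferred first]}
    program_ranks: {program: [students, most preferred first]}
    capacity:      {program: number of residency positions}
    Returns {student: program} for matched students only.
    """
    # Lower index = more preferred by the program.
    pref = {p: {s: i for i, s in enumerate(rank)}
            for p, rank in program_ranks.items()}
    next_choice = {s: 0 for s in student_ranks}   # next program s will apply to
    holds = {p: [] for p in program_ranks}        # provisional acceptances
    free = list(student_ranks)                    # students not yet held anywhere

    while free:
        s = free.pop()
        if next_choice[s] >= len(student_ranks[s]):
            continue                              # rank list exhausted: unmatched
        p = student_ranks[s][next_choice[s]]
        next_choice[s] += 1
        if s not in pref[p]:
            free.append(s)                        # program did not rank s
            continue
        holds[p].append(s)
        holds[p].sort(key=lambda x: pref[p][x])   # keep the program's favourites
        if len(holds[p]) > capacity[p]:
            free.append(holds[p].pop())           # least-preferred loses the spot

    return {s: p for p, held in holds.items() for s in held}


# Hypothetical example: two one-spot programs, three applicants.
match = deferred_acceptance(
    {"Avery": ["UBC", "UofT"], "Blake": ["UBC", "UofT"], "Casey": ["UofT", "UBC"]},
    {"UBC": ["Blake", "Avery"], "UofT": ["Avery", "Casey", "Blake"]},
    {"UBC": 1, "UofT": 1},
)
# match == {"Blake": "UBC", "Avery": "UofT"}; Casey goes unmatched.
```

A key property of this family of algorithms is that the result is “stable”: no student and program who ranked each other would both prefer to abandon their assigned match for one another.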

Ostensibly, CaRMS exists to efficiently allocate students to residency spots, minimizing both vacant residency positions and unmatched students. According to its mission statement, the goal is to “facilitate a fair and transparent match”.

However, in emergency medicine, CaRMS is the furthest thing from transparent, and it reeks of unfairness.

From the outside, the selection process is a black box. Only members of the selection committee know the specific criteria used to construct the final rank list. It isn’t even clear if programs standardize how they evaluate applications, or if it is entirely subjective based on the overall gestalt of the reviewer. Only one Canadian EM program (congratulations to the University of Saskatchewan) has made its selection process transparent. But even within their seemingly rigorous system, applicants are assessed on how well they will “fit” into the program culture – an entirely subjective measure susceptible to bias and discrimination.

Furthermore, it is becoming progressively more difficult for students to differentiate themselves. There are almost no objective indicators of students’ abilities left. Canadian medical education has used a pass-fail grading system for years – so programs do not have access to historical measures of performance for most students. More recently, the amount of time medical students can spend on electives in a given specialty has been restricted. This makes it even more difficult to identify the most committed applicants.

Without objective indicators of student performance, programs look to letters of reference to evaluate applicants. This is another problematic subjective assessment. Royal College emergency medicine is a small specialty, and a few big names have a lot of sway. It is commonly felt that a reference letter’s author is as important as, or perhaps more important than, its content. Unfortunately, the ability to secure a letter from a big-name emergency physician is out of the control of most students: their access to staff physicians is controlled by the administrator who creates the ED learner schedule. Even if this were not the case, we are determining a student’s career based on another physician’s writing skills.

Curricula vitae are also difficult to evaluate because students begin medical school at different stages of their careers. One applicant may have previously completed a Ph.D., have 30 publications, and therefore an extremely impressive CV. Another may have started medical school before completing their undergraduate degree, resulting in a less remarkable CV. Assuming their medical education was equivalent, in this example, their CVs would have no bearing on their quality as residents.

In the absence of objective data upon which to evaluate students, or any information on how assessors score applications, the review process is entirely opaque. To an outsider, it appears to rely almost wholly upon who you know – making connections and impressing the right people. In my 4th-year experience, developing clinical skills was secondary to doing electives in academic emergency departments and meeting program directors.

Emergency medicine is a very competitive specialty. Applicants are already under tremendous stress. Our failure to have a transparent and fair method to evaluate applicants further contributes to this stress. Our senior medical students deserve better, and we can do better than this.

How would you improve the residency matching system to make it more fair and transparent? Do you have any memories of CaRMS (good or bad)? Comment below. 


Disclaimer: I have never participated in a CaRMS selection committee. This is an opinion – not an evidence-based review.


Disclaimer: The views and opinions expressed in this blog post are those of the authors and do not necessarily reflect the official policy or position of the BC Emergency Medicine Network.
