[BLOG] Level Up! Moving up the medical ladder

With the constant assessments and examinations of medical school and junior doctor life, Joel Cunningham questions whether we should be taking more control over our development by self-assessing our abilities.

July and August have been and gone. The season of exam results and new doctors. The time when, throughout the country, medical students and junior doctors progress on to the next step of the medical ladder. Have you just embarked on your first year of clinical medical studies? Your first foundation year job? Your core training? Your registrar specialty programme?

I have recently traversed the MRCP assault course. Starting my second year of core medical training with Royal College membership, I am now qualified to ‘act up’ as a medical registrar in certain situations [1]. The medical registrar is the doctor who runs the acute admissions take, manages unwell patients in the hospital, and supervises the more junior doctors in the on-call team. Does this mean that I have seen enough patients to be competent in this role? Can I now run a medical take and safely steward a hospital through a whole shift? The thought is daunting.

The medical profession hangs a lot on exams and training progression, and understanding the hierarchy relies on a comprehension of complex qualifications and training grades. Does this represent the whole picture, though? Are we becoming complacent in the assessment of our own abilities, in a world of relentless examinations and online e-portfolio tick-boxes where we merely have to be ‘sufficient’ and ‘satisfactory’ in a set number of domains? What else is there to allow us to judge our abilities?

Several years ago, a medical school in the US proposed a new system for providing feedback to medical students: the ‘RIME’ tool [2]. In my opinion, acronyms in medical education are usually reserved for complex frameworks and hypothesised models which are rarely applicable to real-life practice. RIME, however, is a simple and elegant description of the stages of personal development within medical practice; an alternative to the linear ladder we know so well. Here is my interpretation of their descriptions:

R is for Reporter. This is the diligent new clinical medical student who will spend an hour with the patient, collect a wide array of information from history and examination, and return to present that information back to their supervisor. This is not without skill: accurate and thorough history taking, and ordered, concise presentation of facts to a medical senior, can be daunting. It takes a great deal of practice! However, reporters cannot process the information beyond its face value. They present facts, not opinions.
I is for Interpreter: able to synthesise the patient’s history and examination findings with their own medical knowledge in order to present a list of problems and possible differential diagnoses. This takes further practice and requires the ability to sift through the plethora of symptoms and signs to find those which have true value and diagnostic significance.
M is for Manager, the clinician who is able to take control of a medical issue, gathering the information they require, constructing a problem list and then a management plan. A manager is essentially independent in practice, though this does not preclude the healthy practice of discussing cases with colleagues and taking advice.
Finally, E is for Educator: capable not only of efficiently managing patients’ medical issues, but also of simultaneously supervising and educating their team of doctors and students; able to impart advice on the management of patients, and also to sit down with more junior members of the team (students or doctors) to systematically discuss a topic and impart knowledge. This is your favourite medical registrar, your wise professor, or even your experienced SHO.

Who judges your performance in this model? Several US groups have studied the use of ‘RIME’ by supervisors when providing feedback to medical students. Some have shown that it correlates with more formal, classical assessment outcomes, and could even be predictive of summative assessments [3,4].

However, this is such a simple model that we could use it to self-assess at all stages of medical school, training and practice. Reflection, despite some issues*, should be part of everyday practice. How simple it becomes when you think back on the events of the day and just consider whether you were mostly acting as a reporter, interpreter, manager or educator. Why did you not reach the next level, and what can you do tomorrow to try to achieve this? Was it a deficiency in knowledge? Or simply a lack of confidence to take the next step and offer a differential diagnosis and management plan?

This is not to suggest that we throw away our name badges and begin to introduce ourselves with “Hello my name is...., I’m the Interpreter looking after you today”. Not only would that lead to a great deal of patient confusion (!), but it would perhaps miss the point. These descriptions are fluid, and apply to your performance and skill in each individual case. Confidence with respiratory medicine might allow you to be an educator when a patient presents with an exacerbation of COPD, but unfamiliarity with acute stroke could leave you stumped, running to your senior and reporting the facts straight away. Our performance can be affected by knowledge, experience, and confidence in each situation.

I feel that medical students and doctors of all grades should have insight into their performance which extends beyond their list of exam achievements and their training grade. RIME is a simple tool which puts some common sense into self-appraisal. I hope that it is a tool that catches on in the UK and provides a step away from the culture of satisfying tick-boxes and sitting exams.

* Coming to a blog post soon!

 

References/Further Reading

  1. Guidance on Core Medical Trainees acting up as a Medical Registrar. Jt. R. Coll. Physicians Train. Board (2015).
  2. Pangaro, L. A new vocabulary and other innovations for improving descriptive in-training evaluations. Acad. Med. 74, (1999).
  3. Ander, D. S., Wallenstein, J., Abramson, J. L., Click, L. & Shayne, P. Reporter-Interpreter-Manager-Educator (RIME) descriptive ratings as an evaluation tool in an emergency medicine clerkship. J. Emerg. Med. 43, 720–727 (2012).
  4. Espey, E. et al. To the point: medical education review of the RIME method for the evaluation of medical student clinical performance. Am. J. Obstet. Gynecol. 197, 123–133 (2007).


written by: joel_cunningham, first posted on: 16/10/15; 11:37
