November 2017 was the month of Programmatic Assessment Masterclasses in Australia for ModMed (GPEx). Five workshops, hosted by five different organisations in four cities, saw over 400 people attend the Masterclasses. They were led by Prof Cees Van der Vleuten from Maastricht University, Prof Lambert Schuwirth from the Prideaux Centre at Flinders University, and Ms Christine Cook from ModMed/GPEx.

The first event was hosted by the Australian Medical Council and was attended by over 100 people, with every medical specialty represented. It was an excellent day, offering an opportunity to hear about the theory behind programmatic assessment and the problems it addresses. Case studies were used to review the use of programmatic assessment around the world, including a case study of GP365, the programmatic assessment for learning general practice training program developed by ModMed and the Prideaux Centre.

The following workshops at RACP, RACGP and our very own partners WAGPET and GPEx considered the theory and benefits, with a focus on what programmatic assessment actually looks like in practice and the change management process for implementation. Each event was highly interactive, and it was great to see the Colleges engaging in discussions about how assessment can be used to further learning and to make robust decisions on progress.

The Programmatic Assessment Masterclasses were a fantastic opportunity to challenge traditional thinking, consider the evidence around assessment, and reflect on how that evidence should inform the way assessment is used in the twenty-first century. Many great lessons were learnt and questions raised; here are a few:

Firstly, how are we as a college/hospital/organisation assessing the full spectrum of competencies?

Although there are individual differences in competencies for each organisation, all competency frameworks acknowledge that the competencies of a doctor extend beyond medical knowledge or “medical expert”.

How are we assessing these other competencies?

Many organisations have a robust assessment framework that was traditionally developed for assessing the “medical expert”. Many participants were left reflecting on the emphasis their respective organisations place on the communication and professionalism competencies.

In regard to methods of assessment, the “toolbox is quite full”. We have long cases, MCQs, Mini-CEX, OSCEs, oral exams, video assessments, incognito standardised patients and so on. As Van der Vleuten and Schuwirth demonstrated in their paper published in 2005, all of these assessment methods become reliable provided there is adequate sampling. However, within the limited testing time available in many training programs, the reliability of any single method is limited.

Perhaps we should focus less on discovering the optimum assessment tool and more on designing our assessment programs to optimise reliability.

No good talk is complete without cartoons, and the one shared rang true with all present.

This led to a discussion on how assessment can drive learning, and how we as organisations can maximise this.

Prof Van der Vleuten and Prof Schuwirth demonstrated that to ensure validity, a multitude of assessment methods is needed; that to ensure reliability, lots of combined information is needed; and that the impact of assessment on learning should not be forgotten, with assessment providing meaningful information for learning. Programmatic assessment combines all of these things.

Prof Schuwirth reminded us that we live in a changing world. Trainees have endless resources at their fingertips and are no longer reliant on teachers for information. The definition of a safe and independent doctor is also changing. No longer can trainees learn all they need to know for their career by their fellowship exam. Lifelong learning, and the ability to make meaning of all the available resources, is an essential skill for doctors today. Medicine is a humanistic discipline, and doctors need complex adaptive systems to make meaning of information, apply it to clinical situations, adapt as needed to different variables, show empathy and reflection, and work in different environments with different types of people. A doctor’s ability to be a safe, independent practitioner can’t be assessed by only one type of assessment.

Christine Cook’s presentation about the implementation journey was well received, as many of the organisations represented have commenced, or are planning, the introduction of programmatic assessment principles within their education and assessment frameworks. Many were grappling with the practicalities of such a transition and the inevitable impact that the change process would have upon stakeholders. Christine spoke about the implementation journey through a congruence model of change, and about the importance of a reliable, intuitive platform to ensure the greatest uptake and change success.

There were many challenges posed for us as supervisors, teachers, organisations and individual doctors. It was a wonderful opportunity to discuss these, learn from each other and resolve to continue to strive for excellence in medical education. If you’d like to know more about programmatic assessment or how it is implemented through GP365, please feel welcome to contact us at GPEx; we would be happy to share the insights and expertise we have gained along the way.

If you are interested in finding out more about programmatic assessment, check out the following video and articles…

About Programmatic Assessment:

Schuwirth LWT, Van der Vleuten CPM. Programmatic assessment: from assessment of learning to assessment for learning. Med Teach 2011; 33(6): 478-485.
Van der Vleuten CPM, Schuwirth LWT, Driessen EW et al. A model for programmatic assessment fit for purpose. Med Teach 2012; 34(3): 205-214.
Van der Vleuten CPM, Schuwirth LWT. Assessing professional competence: from methods to programmes. Med Educ 2005; 39(3): 309-317.

Implementation of Programmatic Assessment:

Schuwirth L, Valentine N, Dilena P. An application of programmatic assessment for learning (PAL) system for general practice training. GMS J Med Educ. 2017; 34(5): Doc56.
Bok H, Teunissen P, Favier RP et al. Programmatic assessment of competency-based workplace learning: when theory meets practice. BMC Med Educ 2013; 13(1): 123.
Van der Vleuten CPM, Schuwirth LWT, Driessen EW, Govaerts MJB, Heeneman S. Twelve tips for programmatic assessment. Med Teach 2015; 37: 641-646.