Assessing science students’ practical skills – can we learn from healthcare education?

By Miranda Prynne, 12 July, 2022
Healthcare educators have developed effective ways to assess a range of practical skills in their learners. What can other science educators learn from their example?

As a science educator, you will likely have experienced the struggles of developing practical lab classes that are engaging, novel, affordable, relevant, sustainable and effective. The days of lengthy practical classes are gone at many institutions, particularly where budgets are tight, technical support staff are minimal, space is limited and student numbers are increasing. There is also the question of whether many practical lab classes are fit for purpose: what skills do they actually assess? Is it the lab techniques we want students to learn, or are we more concerned with the completion of a formulaic lab report or series of questions that gives us something familiar to grade? We need to ask ourselves whether we are assessing only technical competencies in these lab classes, rather than the “softer” graduate skills that might enhance our students’ employability and effectiveness as future professionals. Maybe it’s time to try something new.

Practical skills assessment in healthcare programmes

If we look beyond the science disciplines, there may be lessons we can learn from colleagues in the healthcare professions. Many clinical programmes use the Objective Structured Clinical or Practical Examination (OSCE/OSPE) format to assess their students’ skills, attributes and competencies. Such assessments focus on skills or attributes rather than just the tasks the student performs. They involve multiple, timed stations at which a student must undertake a task in order to demonstrate specific skills or competencies. Performance is assessed against clearly specified, consistent grading criteria for each station. We don’t tend to use this method of assessment much in science, but it is worth trying when aiming to develop more authentic assessments.

Where to begin?

When considering the introduction of an OSPE-style assessment, the key thing to ask is: what skills do we actually want to assess? It is easy for educators to focus solely on a lab procedure or technique rather than the underlying skills – “We’ll just get them to do a Western blot” – without being clear in their minds what it is about running a Western blot that is important and worth assessing. The skills you want to assess will vary, but they may include some of the following:

  • Problem-solving
  • Numerical skills
  • Handling difficult situations
  • Health and safety
  • Leadership skills
  • Entrepreneurship
  • Time management
  • Interview skills
  • Ethics/professionalism
  • Interpretation of text, graphs, images, data
  • Language skills

Once you have an idea of what skills you want to assess, tasks can be developed that serve as the platform to assess these competencies. If you have healthcare colleagues in your institution, it might be worth seeking their advice and learning from their experiences. They can act as critical friends during the development process.

How do learners and educators benefit from an OSPE-style assessment?

An OSPE assessment can help learners reflect on what they do well and gives them direction as to which skills they need to develop or enhance. You can contextualise the OSPE content to reflect real situations in employment, which may be useful when aiming for more authentic assessment. Students are motivated to succeed, assessment is more objective, and a broad range of skills can be tested. OSPE performance may also provide better evidence to draw on when commenting on specific skills and attributes in references, or when providing targeted academic support to learners.

Top tips for planning an OSPE

1. Your students. Who are they? How many of them are there? Do they have specific needs, requirements, study modes, etc?

2. Why are you doing this? What skills or attributes do you need or want to assess? Are there resources or colleagues who can help you with this in your institution, or do you need the help or views of others?

3. Logistics. Location and timing – wet laboratory? Clinical area? Sports facility? Multipurpose area? Remote setting, eg fieldwork? Computer lab? How many stations? How long will each station last?

4. Resources. Staff – how many academic staff, technicians, examiners? Do you need demonstration or practice days, examination days, marking time? Equipment – availability, costs, quick turnaround for lab or equipment needed?

5. Planning for problems. What will you do if there are spillages, breakages, power cuts, fire alarms, IT issues and so on during your OSPE?

6. Illness or absence. What are the arrangements if staff or students are ill and cannot attend?

7. Nature of the assessment. What should students do to demonstrate that they have gained the required skills? This should be clear to all students and staff, and criteria should be published in advance. Will you use technology or paper-based submissions?

8. What if it works? How do you scale up, or expand to different disciplines, groups of students or staff, or other locations? How will you measure and disseminate the success of the OSPE?

These ideas will hopefully help you develop OSPEs that address the needs of your learners. Parts of an OSPE will work well on your first attempt, and other elements will need enhancement – it’s an iterative process. But you may find that taking a chance on this assessment methodology is beneficial for both you and your students.

Derek Scott is a professor of physiology and pharmacology education at the University of Aberdeen.

This advice is based on a presentation given at a HUBS-funded workshop, Fundamental Biosciences, hosted by the University of East Anglia.


