
Self-reported change in practice: a monitoring tool for health partnerships

25 February 2014

In this blog, THET's Evaluation & Learning Officer, Emily Burn, explains how introducing a self-assessment tool can be key to demonstrating the impact of health worker training.

Trainee nurses at Kambia District Hospital, Sierra Leone. Photo by Timur Bekir

 

Imagine this situation: a team of doctors was trained in palliative care by their UK partners on a one-week course. There were 30 trainees in the group, and the partnership plans to train another cohort of the same size. The trainee doctors received a certificate of competence at the end of the course if they performed well enough in the final written assessment.

Once they completed the course, the doctors returned to their places of work in hospitals and clinics in both peri-urban and urban areas. Many of the doctors will be the most senior, or the only, clinician at their place of work, and the only one responsible for delivering palliative care.

What can the partnership do to gather data on the doctors’ practice in palliative care once they have returned to their places of work? In this scenario, the trainees are spread widely geographically, making it difficult for those responsible for M&E to visit each of them.

The doctors do not have an appraisal system or a senior colleague who can provide objective feedback on their practice. They do keep practice logbooks, but it may not be feasible to retrieve data from all of those logbooks.

The partnership does not have enough funds or supervisors available to visit each doctor in their place of work to assess their skills, and in any case, how thorough an assessment could be made in a single visit? The partnership therefore faces various obstacles to data collection, but it still needs to find ways of gathering data on practice that are reasonable for the context it works in.

The case for self-assessment

This scenario is typical of many health partnerships’ experience of trying to monitor change rigorously: they have limited staff, time and funds available to monitor and evaluate each health worker’s performance, yet they still need to gather evidence that can demonstrate the impact of the training on the health worker’s practice. Given the context that health partnerships work in, there is a strong case for using a self-assessment tool, such as an online or paper questionnaire.

The obvious issue is the lack of objectivity or external verification of the claims made in a questionnaire, and the aim should indeed be to combine it with other sources of information (e.g. clinical records). Nevertheless, a questionnaire can overcome some of the difficulties health partnerships face in gathering data, and it can provide valuable insights into trainees’ experiences in their own places of work.

How can you design a questionnaire for use in your own work? I asked THET’s community of practice for examples of self-assessment tools that they use in their projects.

Interestingly, I received several examples of workshop evaluations, which looked at things like knowledge gained, course relevance, and overall satisfaction with the training, all gathered at the end of the course.

I received far fewer examples of questionnaires that trainees complete further down the line, on whether, and how, they have applied their training. Although any questionnaire must be tailored to the specific techniques taught, the following generic topic areas are a useful starting point for designing your self-reporting tool:

Evaluation of the training

  • This is to understand whether the training targeted the right cadre of candidates, so the questions should ask how relevant the trainee has found the training since returning to their institution.

Knowledge retention

  • A set of multiple choice questions (MCQs) can directly test how much knowledge the trainee has retained since the training. You could compare the results of the final assessment at the training with this later MCQ assessment.

Confidence to practise

  • How confident does the trainee feel carrying out the procedures or new techniques learnt on the training course? A Likert scale (e.g. Very confident – Fairly confident – Not really confident – Not at all confident) is the commonest format for this type of question. Include a comment box so trainees can give the rationale for their answer. It could be interesting to compare confidence across three points in time: pre-training, immediately post-training, and 6–12 months after training. The comment box can help to explore the reasons for notable dips or peaks in confidence.

Change in practice

  • How frequently does the trainee now use a given set of techniques, bearing in mind what is reasonable in their context? Always provide comment boxes for these questions, as it is important to understand any barriers to practice, such as a lack of equipment or adequate supervision.
  • Include text boxes to gather narrative examples of practice, such as cases where the trainee has used the skills gained in training; if appropriate, ask for information on the outcome for the patient.

Lastly, what response rate would you be satisfied with? It is unlikely that 100% of trainees will complete and return the questionnaire, so make it as user-friendly as possible (test it out on some colleagues before you distribute it). Consider creating it online if local bandwidth allows; THET uses the online tool Survey Monkey because it is easy to use and has useful reporting functions. Finally, review your M&E plans to identify other data-gathering tools so that you are not reliant on the questionnaire alone for data on change in practice.

If you have an example questionnaire that you would be happy to share with us, please send it to emilyburn@thet.org or post it on the community of practice. I am also interested to hear about people’s experiences of using self-reporting to gather evidence.

Useful Links

http://betterevaluation.org/

http://www.tools4dev.org/resources/online-based-survey-software-5-pointers-to-look-out-for/

This post was written by:

Emily Burn - Evaluation & Learning Officer, THET
