EngageU: Challenges of assessment & public engagement

Funding bodies increasingly require evidence of impact for higher education outreach and public engagement, yet measuring this impact is challenging. A review of current practice, combined with interviews with public engagement experts in the UK, underscored the degree to which the outcomes of public engagement and outreach efforts are often not immediately visible, but rather diffuse and develop over time.

This challenge in measuring impact was a central point of conversation on Friday, when the OII hosted the judging panel for our first European Competition for Best Innovations in University Outreach and Public Engagement, supported by the European Commission as part of the ULab project. The panel consisted of public engagement experts from the ULab partner countries. Judges represented a range of backgrounds, including funding officers, policymakers, and practitioners of award-winning outreach.

When reporting impact, the majority of entries cited audience size, whether measured by attendance, web traffic, web clicks, or distribution of printed materials. Yet this reporting usually did not directly measure progress on the projects' stated aims. For example, if a project aims to change perceptions or increase engagement with science, simple attendance at a fair or visits to a website cannot show whether these objectives were met. Tracking whether a participant then attends another science event, enrolls in a science class, or undertakes a degree in science may provide a stronger measure, but even then the extent of a particular event's influence is unknown. The problem, of course, is that indicators of input are gathered far more easily and economically than indicators of impact, which require additional, costly activity and may still not yield definitive answers.

However, some entries used participant surveys to gather more details about impact and experience. These ranged in depth from like/dislike buttons to short-answer questions about learning. Some projects engaged their participants in focus group discussions aimed at improving their offerings and better meeting the needs of their target audience. A few looked at impacts over time, showing shifts in community engagement, or increases in student enrollment.

The question of depth versus breadth of impact was debated among our judging panel, showing that even experts hold diverse opinions about what constitutes good practice in public engagement. Entries from a number of European countries focused on the number of people reached as a measure of success. For example, events such as the Researchers' Nights hosted throughout Europe aim to engage large numbers of community members in short events, and therefore measure success by attendance levels. By contrast, several entries from the UK placed an emphasis on depth of engagement. In other words, a project may have fewer than 20 participants, but involve weeks or months of interaction that leads to a change in behavior, for example a demonstrated interest in science or a better understanding of a different social perspective. Some judges placed more emphasis on breadth, while others focused on depth.

Overall, even in top-ranked entries, the judges found that impact measures were not always ideal or fully developed, and recommended that projects develop stronger evidence of impact. One of the winning entries, Active Science, demonstrated a multi-faceted approach to impact measurement. In addition to reporting audience attendance, the programme demonstrated its success by tracing its expansion from a local to a nationwide project. To assess whether it had shifted perceptions of science, participants were given pre- and post-activity questionnaires measuring changes in their attitudes.
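For readers wondering what such a pre/post comparison looks like in practice, the following is a minimal, hypothetical sketch (not drawn from the Active Science entry itself). It assumes Likert-style attitude scores (1–5) collected from the same participants before and after an activity; the data and wording of the survey item are invented for illustration.

```python
# Hypothetical sketch: comparing attitude scores before and after an outreach activity.
# Assumes paired Likert-scale responses (1-5) from the same participants; data are invented.
from scipy import stats

pre_scores = [2, 3, 3, 2, 4, 3, 2, 3, 3, 2]   # e.g. "I find science interesting", before the event
post_scores = [3, 4, 3, 3, 5, 4, 3, 4, 4, 3]  # the same participants, after the event

# A paired t-test asks whether the mean within-person change differs from zero.
t_stat, p_value = stats.ttest_rel(post_scores, pre_scores)

mean_change = sum(b - a for a, b in zip(pre_scores, post_scores)) / len(pre_scores)
print(f"Mean attitude change: {mean_change:.2f} points (t = {t_stat:.2f}, p = {p_value:.3f})")
```

Even a simple paired comparison of this kind says more about a project's stated aims than attendance counts alone, though it still cannot rule out other influences on participants' attitudes.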

Matching clearly defined objectives to appropriate measures may improve the reporting and evaluation of impact. The wealth of activities and approaches reported in the EngageU entries provides a repository of strong examples of impact measures, as well as a means of comparing how projects with similar objectives measure their impact.

Of course, you may have very different views on the centrality of impact assessment for outreach. We would appreciate hearing from you.
