One element of our original approach to measuring the impact of the activity was to ask students about their attitudes to science, and to studying science, both before and after their participation. It was no surprise to find that afterwards students were more positive about science and intended to study it at a higher level. Even better, we were able to see that the more a student participated, the greater the improvement in attitude and intention.
But people tend to be positive about an activity immediately after participating, and it is difficult to know whether that reported change in attitude will result in a change of behaviour. Will those students who say they will study science at the next level actually do so because of their participation in our activity? It is often suggested that a long-term longitudinal study would answer the question of impact. Quite apart from the logistical challenges, research on educational choice highlights that the impact of a single activity is strongly influenced by what happens before and after it, and that choice is determined by many interwoven factors, including peers, parents and teachers.
Change the activity?
One option that some practitioners take is to extend their activity and work with a smaller number of schools over a longer period. It is an attractive approach: strategic, planned, cohesive activity, rather than spontaneous and short-term work, increases the possibility of impact. However, such activity can be challenging to implement, both for schools and for others (such as scientists). Our approach, although short-term, works with the demanding schedules of scientists, teachers and students, and the demand from these groups reflects that. It also allows us to deliver our national activity efficiently, reaching schools equitably across the country.
The other option was to change our evaluation approach. If we can’t measure the long-term impact of the activity, can we instead evidence its impact on something that we know has long-term impact? Step forward the concept of science capital and the Science Capital Teaching Approach (SCTA).
In short, students with higher levels of science capital are more likely to feel that science is for them, more likely to study science, and more likely to see STEM careers as an option. If we could evidence that participation in I’m a Scientist likely supported a student’s science capital, then we could show that we are likely to have an impact on the outcomes predicted by the better-resourced and better-evidenced science capital concept.
Link to science capital
We decided to link to science capital for a number of reasons. It is based on robust, long-term, large-scale research conducted by a well-respected academic team, which explains its findings in a clear, well-presented report designed for practitioner audiences, audiences that include most of the people to whom we need to demonstrate the impact of our work. The UK science engagement funding sector has also, to a large degree, accepted science capital as a valid explanation for the sector’s limited success in broadening participation in science.
In May 2018 we began speaking with Dr Jen DeWitt, a member of the academic team that developed the science capital concept, to plan an evaluation programme that would try to evidence the link between I’m a Scientist and science capital.
Fieldwork started in November 2018 and continued through our March 2019 event. Jen observed live chats from the perspective of the classroom, conducted focus groups with students in the weeks after their participation, and interviewed their teachers. Transcripts of the live chats, and the questions students asked, were also analysed.
Over the spring and early summer of 2019 Jen wrote her report. We also produced a summary and presentation.
The findings were clear:
“The study indicates that all the elements of I’m a Scientist together (CHAT, ASK, VOTE and so forth) form an integrated whole, through which the SCTA is enacted and which contribute to building students’ science capital.”
The research has so far been very useful. Not only has it clearly evidenced that I’m a Scientist can support science capital in participating students, it also describes why it does so. Evaluation often focuses on impact without explaining the mechanism; understanding why provides additional confidence in the findings and could help others learn from our activity.
The data for this stage of the research came from a small but, we think, typical sample of participating students. The next step, for November 2019, is to see whether the majority of students report similar experiences of the activity, which would indicate that what we found in the qualitative work may hold for most participating students.