Written by Dr Alison Hill
I was an inaugural Exeter Education Incubator fellow in 2017/18, looking at ways to address the mathematics gap in my classes. I developed bespoke resources for my medicinal chemistry module, and a Smart Worksheet (in conjunction with LearnSci) for one of the second-year biochemistry classes. In both cases I had discovered that performance on these modules was linked to post-16 maths qualifications. The new online tools broke that connection and confirmed to me the importance of targeted interventions where students can see the direct relevance to their subject. The interventions led to increased confidence and competence in data handling for all of the students, and they gave me the confidence, and the evidence, to use digital online tools to facilitate and enhance student learning.
This meant that when the pandemic hit and all teaching moved online, I did not simply move my lectures online and hope for the best. I realised that the online environment was different and that education materials needed to be targeted and accessible. I was also concerned about ensuring students submitted their own work rather than colluding or getting others to do it for them (e.g. peers, or helper sites such as Chegg). It was equally important to me that students felt supported, so that they had the confidence and skills to complete the assessments themselves.
Assessments such as mine, where every student is expected to arrive at the same answer, are at high risk of collusion and cheating. It is very easy for answers to be shared through social media (such as WhatsApp) or obtained from helper sites. Students are more likely to cheat if they think “everyone else is doing it” and that there is a low chance of getting caught.1
In my Education Incubator project, I had used a random number generator, setting upper and lower limits for variables, so that one question could be turned into an effectively unlimited number of related questions. I realised that this strategy could be deployed for my second-year biochemistry class and their assessments. Working with my colleague Prof. Nic Harmer, and using historical student-derived data, we built computer models of our experiments and generated unique data sets for both the coursework and the exam. Students were provided with their own set of data (both numerical and images), which they downloaded alongside the exam paper/coursework instructions. We also produced associated staff answer sets, which included plotted data and all answers (intermediate and final).
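The published paper describes our actual models; purely as an illustrative sketch of the general idea, a per-student seeded random generator can turn one question template into a different but equivalent question for each student, while letting staff regenerate the exact same numbers (and the marking answer) on demand. The function name, seed scheme, and the example "substrate concentration" variable below are my own illustrative assumptions, not our production code:

```python
import random

def make_student_question(student_id: str, low: float, high: float,
                          seed_base: str = "BIO-EXAM-2021") -> dict:
    """Generate one student's unique question parameters and staff answer.

    Seeding the generator with the student ID makes the data set
    reproducible: the same student always receives the same numbers,
    and staff can regenerate them later for marking.
    """
    rng = random.Random(f"{seed_base}:{student_id}")
    # Draw the variable between the upper and lower limits set by staff,
    # e.g. an initial substrate concentration in mM.
    substrate_mM = round(rng.uniform(low, high), 2)
    # The staff answer set stores the expected result alongside the data;
    # here a placeholder calculation stands in for the real model.
    expected_answer = round(substrate_mM * 2.5, 2)
    return {"substrate_mM": substrate_mM, "expected_answer": expected_answer}

# Each student downloads different numbers; regeneration is deterministic.
alice = make_student_question("A123", 0.5, 5.0)
bob = make_student_question("B456", 0.5, 5.0)
assert alice == make_student_question("A123", 0.5, 5.0)  # reproducible for marking
```

Because every student's answer differs, copying a peer's final number is immediately detectable, while the method and workload remain identical for everyone.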
To support the students, we used Padlet to collate questions and answers, and held dedicated Q&A sessions. The Smart Worksheet was used exclusively online with historical data sets, as the students were unable to collect their own. The immediate feedback and ‘solve’ function provided by the Smart Worksheet, together with an instructional video I had prepared, enabled the students to process data and prepare for the exam.
We knew that students preferred a 24-hour exam format to a timed one. However, this gives even more opportunity for students to ‘check their answers’ with each other or to outsource their work. The use of unique data sets is a disincentive to collusion: each student has a unique answer, so the benefits of working together are reduced. We ‘google-proofed’ the rest of the exam paper so that higher-order skills were tested rather than recall of lecture material. The students all used their own data sets, there was no grade inflation in exam or module performance, and we could not detect any evidence of collusion or cheating. With exams remaining online this academic year, we have continued to use this format and approach. With the reopening of the teaching laboratory, students were once again able to collect their own data, but our method was used to provide data to those who were unable to attend the assessed practical.
‘The use of individual data sets for assessments was an excellent way of ensuring the work was fair for all students.’ Student 2021
Nic and I spent last summer writing up our results, and in November 2021 our paper was published in the Journal of Chemical Education.2 The reception of the paper has been tremendous, with over 1700 downloads and an Altmetric score of 64 (which puts it in the top 5% of all research outputs scored by Altmetric). It has led to our work being featured on the university website news, in an interview in the Times Higher Education3 and in a Times Higher Education Campus4 piece where we gave tips to others who may wish to use this approach. Our work has been mentioned on numerous specialist HE websites5-9 and we received an Honourable Mention in the Learning Science Teaching Innovation Awards 2022. We have also published two papers in the JISC digital culture collection10,11 that were curated by fellow Incubator Fellow, Prof. Lisa Harris, and Incubator Director, Prof. Sarah Dyer. Nic and I were also invited to speak at the Westminster Higher Education Forum policy conference on ‘Next Steps for Tackling Cheating in Higher Education’, where our work was identified as a highlight of the meeting.
Our innovations have continued, and this year we have introduced a gamification session, building on what we had learned to produce a ‘whodunnit’ game for our students to consolidate analytical biochemistry techniques in a fun way.
The Education Incubator has enabled me to connect with education-focussed staff across all parts of the University, and I have learned from them and about their diverse projects. This community is a highly supportive environment and a sounding board for new ideas and innovations. I would never have had the confidence to tackle the unique data sets or the gamification sessions without the Education Incubator, and my involvement has been transformative.
1. Schultz, M. and Callahan, D. L. (2022) ‘Perils and Promise of Online Exams’, Nat. Rev. Chem., DOI: 10.1038/s41570-022-00385-7.
2. Harmer, N. J. and Hill, A. M. (2021) ‘Unique Data Sets and Bespoke Laboratory Videos: Teaching and Assessing of Experimental Methods and Data Analysis in a Pandemic’, J. Chem. Educ., 98: 4094-4100; DOI: 10.1021/acs.jchemed.1c00853.
3. Grove, J. (2022) ‘Bespoke Robot-Written Exams to Curb Student Cheating’, Times Higher Education, https://www.timeshighereducation.com/news/bespoke-robot-written-exams-curb-student-cheating (date accessed 28 April 2022).
4. Harmer, N. J. and Hill, A. M. (2022) ‘Online Exams are Growing in Popularity, How can they be Fair and Robust?’, Times Higher Education Campus, https://www.timeshighereducation.com/campus/online-exams-are-growing-popularity-how-can-they-be-fair-and-robust (date accessed 28 April 2022).
5. Grove, J. (2022) ‘Could Custom Exams Prevent Cheating?’, Inside Higher Ed, https://www.insidehighered.com/news/2022/01/06/british-university-tries-custom-exam-reduce-cheating (date accessed 28 April 2022).
6. Mirage News, https://www.miragenews.com/unique-data-creates-fair-and-robust-online-exams-693956/ (date accessed 28 April 2022).
7. Brogan, T., Education Technology, https://edtechnology.co.uk/e-learning/digital-learning-pioneers-develop-fair-and-robust-online-exam-method/ (date accessed 28 April 2022).
8. Phys.org, https://phys.org/news/2021-12-unique-fair-robust-online-exams.html (date accessed 28 April 2022).
9. Warren, C., ‘Fixing a pain in the ass-essment’, University Business, https://universitybusiness.co.uk/features/fixing-a-pain-in-the-ass-essment (date accessed 28 April 2022).
10. Hill, A. M. and Harmer, N. J. (2022) ‘Robust and Fair Online Exams’, JISC Digital Culture collection, https://digitalculture.jiscinvolve.org/wp/2022/03/09/robust-and-fair-online-exams/.
11. Harmer, N. J. and Hill, A. M. (2022) ‘Teaching and Assessing Laboratory Sessions Online’, JISC Digital Culture collection, https://digitalculture.jiscinvolve.org/wp/2022/03/09/teaching-and-assessing-laboratory-sessions-online/.