Innovation and quality assurance in assessment
Projects that explore the impact on students and examination practices of the move to online examinations and the development of powerful AI tools.
One of the most dramatic educational side effects of Covid-19 was the replacement of in-person invigilated examinations by remote online exams. Associated challenges include verification of examinees' identities, supervision and security of examination data, and maintenance of academic integrity.
The development of AI large language models such as ChatGPT, which can competently answer essay-style and other traditional forms of examination questions, has further challenged examination and assessment policies and practices.
The projects in this section explore the impact on students and examination practices of the move to online examinations and the development of powerful AI tools.
The kinds of questions they address are:
- How are students affected by the transition to online assessments?
- How can student behaviours be developed to support academic integrity?
- How can plagiarism be designed out of online assessments?
- What opportunities do new AI tools create for the design of online assessments?
Innovation and quality assurance in assessment projects
About the project
In each of the summers of 2020 and 2021, more than 30,000 University of London students undertook over 110,000 online timed assessments in place of conventional exam-hall examinations, which were precluded by the Covid-19 pandemic.
The University of London commissioned the then Centre for Online and Distance Education to undertake a detailed evaluation of the move to online assessment in 2020, and the evaluation was repeated in 2021. The project team was led by Linda Amrane-Cooper, with CODE Fellows Stylianos Hatzipanagos and Alan Tait and University of London colleagues James Berry, Huw Morgan-Jones, Amardeep Sanghera and Mike Sawyer. In 2020 only, the team also included Gwyneth Hughes (CODE) and Elsie Lauchlan (from the data analysis company Shift Insight). Two CODE Student Research Fellows also formed part of the evaluation team.
The project aimed to collect data about, and generate understanding of, this transition to online assessment, primarily from the perspective of the students affected, but also from that of other key stakeholders (examiners, technical staff and programme directors).
A key objective of the evaluation was that its output would support the planning of assessments for summer 2021 and summer 2022, both at the University and within the Member Institutions (MIs), and inform planning and preparation for 2023 and beyond.
The project has since been converted to a longitudinal study.
Funding
This project was funded through the University of London Centre for Online and Distance Education (CODE).
Project team
Dr Linda Amrane-Cooper (Director of the CODE)
James Berry (Associate Director: Student Affairs)
Professor Stylianos Hatzipanagos (CODE Fellow)
Ellen Hauff (CODE Student Fellow)
Huw Morgan-Jones (Head of Surveys and Student Voice)
Amardeep Sanghera (Student Affairs)
Michael Sawyer (Student Services)
Hannah Dorothy Mary Shekhawat (CODE Student Fellow)
Professor Alan Tait (CODE)
Time frame
May 2020 – March 2024
Outputs and resources
The report provides an overview of the evaluation of the online examinations in the summers of 2020 and 2021.
It was important to ensure that, with appropriate oversight, the findings and recommendations from the evaluation informed planning and development of online timed assessments. To that end, reports and summaries of the findings were presented to a significant number of UoL governance committees, the Student Voice Group and programme leader forums throughout 2020–2022.
As in the three previous years, the Centre for Online and Distance Education (CODE), working with a University of London team, has undertaken an extensive evaluation of the online timed assessments (OTAs) taken by students in May and June 2023.
Papers, conference presentations and webinars
2022
Amrane-Cooper, L., Hatzipanagos, S. and Tait, A. (2022) ‘Measuring the impact of the move to online assessment in the University of London international programmes 2020–2022’ [Paper presentation]. Innovating Higher Education conference 2022: Digital Reset: European Universities Transforming for a Changing World. EADTU: Athens, Greece.
Amrane-Cooper, L., Hatzipanagos, S. and Tait, A. (2022) Developing student behaviours that support academic integrity in distance learning. Open Praxis, 13(4), pp. 378–384. DOI: https://doi.org/10.55982/openpraxis.13.4.461
Hatzipanagos, S. and Tait, A. (2022) Designing plagiarism out of assessment. Supporting Student Success online workshop, January 28, 2022. Centre for Distance Education.
Hughes, G., Hatzipanagos, S., Amrane-Cooper, L. and Tait, A. (2022, June 22–24) Using the disruption of the pandemic to enhance assessment design in distance learning programmes [Paper presentation]. International Assessment in Higher Education Conference 2022, Manchester, UK.
2021
Amrane-Cooper, L., Hatzipanagos, S. and Tait, A. (2021) Developing Student Behaviours that Support Academic Integrity in Distance Learning. ICDE Virtual Global Conference Week, October 2021.
Amrane-Cooper, L., Hatzipanagos, S. and Tait, A. (2021) Moving Assessment Online at Scale. RIDE 2021 conference, June 18, 2021, Centre for Distance Education.
Amrane-Cooper, L. and Sanghera, A. (2021) Inclusive Practice: learning from our students. RIDE 2021 conference, June 18, 2021, Centre for Distance Education.
Hughes, G., Amrane-Cooper, L., Hatzipanagos, S. and Tait, A. (2021) Using the disruption of the pandemic to enhance assessment design in distance learning programmes. Academic Practice and Technology (APT) 2021 Conference, July 2021, UCL.
Tait, A. and Hatzipanagos, S. (2021) What Will Assessment Look Like in 2021. Webinar in the Experiences in Digital Learning monthly webinar series from CDE, Goldsmiths and the University of London Institute in Paris. February 4, 2021.
2020
Hatzipanagos, S., Tait, A. and Amrane-Cooper, L. (2020) Towards A Post Covid-19 Digital Authentic Assessment Practice: When Radical Changes Enhance the Student Experience. In Enhancing the Human Experience of Learning with Technology: New challenges for research into digital, open, distance & networked education, Proceedings 2020 Research Workshop. European Distance and E-Learning Network (EDEN).
Hatzipanagos, S. (2020) Covid-19 Silver Linings: Transition to Digital Assessment Practice to Enhance the Student Experience. Webinar, Computers and Learning Research Group, November 26, 2020. The Open University.
Next steps
The University of London requested a similar review of the online timed assessments taking place in summer 2022.
About the project
The scope of the project is to investigate practices that are not in keeping with the values of, and commitment to, academic integrity (collusion, plagiarism, contract cheating and impersonation) and to examine how higher education institutions can, through learning, teaching and assessment design, develop assessment plans that are resistant to these academic offences. It will achieve this by establishing current approaches to the management of academic integrity in online assessment and by producing a mapping of good practice and recommendations.
Funding
This project is funded through a grant from the University of London Centre for Online and Distance Education (CODE).
Project team
Prof. Stylianos Hatzipanagos (CODE Fellow)
Prof. Michele Milner (CODE Fellow)
Prof. Alan Tait (CODE Fellow)
Prof. Steven Warburton (CODE Fellow)
Cynthia Belen Portalewski (CODE Student Fellow)
Time frame
October 2022 – April 2023
Outputs and resources
The project has recently started and there are no outputs or resources yet in the public domain.
About the project
The project originally aimed to:
- Review guidance on assessment for learning available to UoL programme teams
- Identify recent (past three years) assessment innovations in 5–10 UoL online programmes that move towards assessment for learning rather than assessment solely as a judgement of achievement
- Evaluate the potential of the innovation to improve student learning, achievement and retention
- Disseminate assessment innovation to other programme teams at UoL and make recommendations for assessment enhancement strategies.
When the pivot to online teaching took place in 2020 in response to the pandemic, we adapted the study to explore programme teams’ responses to the shift to online exams and any evidence of shifts in views on assessment. The findings indicated some changes in thinking about the purposes of assessment, as well as practical changes such as the use of open-book exams, the removal of recall questions and greater consideration of coursework. While some programme directors wished to return to in-person exams when possible, others were interested in continuing with online exams and further innovation. The project raises questions about investment in distance learning assessment and the need to design out plagiarism.
Funding
This project is funded through a 2019/2020 Teaching and Research Award from the University of London Centre for Online and Distance Education (CODE).
Project team
Alan Tait (CODE Fellow)
Gwyneth Hughes (CODE Fellow)
Time frame
February 2020 – February 2021. This project is complete.
Outputs and resources
Read the project
Download the project final report.
Get in touch to learn more
If you want to learn more about our projects or how CODE can support you, drop us a line.