UNIwise


Digital assessment during the coronavirus: what our data shows about how higher education institutions handled the pandemic

Data from WISEflow demonstrates how digital assessment has grown over the past seven years

The use of digital assessment across European higher education institutions has been growing gradually over the past decade as the technology matured, but for many institutions, transitioning to digital assessment was part of a five-year plan, something to be tackled slowly and at some point in the future. Then the COVID-19 pandemic hit and upended these plans: universities had to move fast, compressing what might have taken two or three years into two or three months.

For our part, in response to the pandemic and to support assessment integrity, UNIwise introduced remote proctoring within WISEflow, using AI and images of participants. Our data shows that more than 250,000 students were assessed through remote-proctored exams during April, May and June, which is staggering given the functionality was only introduced at the beginning of April. We also expanded our team to ensure we could support existing and new customers during this time – in total, the number of assessments using WISEflow increased by over 51% during the pandemic compared to the same period in 2019 – and we began working with 14 new customers.

To proctor or not to proctor

During the lockdown of Spring 2020, higher education institutions were faced with the same choice: cancel summative assessments or run them online. Most institutions chose the latter, either changing as little as possible and transposing closed-book assessments from pen and paper to digital, or redesigning assessments so that they took an alternative form.

Our data suggests that which option an institution chose depended on several factors. The first is the institution's previous level of digitalisation: the more digitised and experienced with digital assessment, the less inclined an institution seemed to be to use closed-book or proctored exams. This becomes clear when we look at the data for our Danish and Norwegian customers. In Denmark, the number of exams and the flowtypes used were very similar in 2019 and 2020, which indicates that the pandemic did not change which assessment types were used – which is not to say, of course, that it was business as usual for staff and students. Norwegian users have traditionally had the strongest focus on invigilation and assessment security, with over 40,000 flows sat using our lockdown browser in 2019. By contrast, in 2020, Norwegian institutions moved towards home-based assessments, an extraordinary change that signals they used their many years' experience with digital assessment, together with the prompt of the pandemic, to redesign their assessments rather than replicate the pen-and-paper experience. On the other hand, our data shows that institutions with a higher previous reliance on paper-based assessments were more inclined to transpose them into a similar format online.

The second factor that influenced how institutions chose to run their assessments was the level of institutional autonomy: did they have legislative freedom, and were their assessments monitored by professional and/or regulatory bodies? Here we can look to data from the UK and Continental Europe for illustration. In the UK, structured multiple-choice tests were the predominant way of assessing students. There was also extensive use of remote proctoring in the UK; this can partially be attributed to some of our new customers, which used proctoring extensively, but the fact remains that more than 50% of all flows in the UK between January and the end of July 2020 used remote proctoring. Many of our new customers in 2020 were from Central Europe, where institutions had previously used pen-and-paper assessment and, therefore, transposed their exams online during the pandemic in a way that resembled pen-and-paper assessment as closely as possible. Almost all assessments in Germany, Belgium and France were done using remote proctoring. National legislation in those countries, especially in Germany, also factored into the kinds of assessment used.

Data demonstrates how the 2020 pandemic affected which flowtypes institutions used for their assessments

Lessons learned

Large-scale rollouts at institutions in Germany, France and Belgium proved that what previously took years can be managed in weeks, with some institutions, such as IÉSEG, having just three weeks between signing the contract and running their first exam. It seems necessity truly is the mother of invention!

During the pandemic, there was, understandably, a renewed focus on security, with concerns about the degree of surveillance remote proctoring entailed – what is surveilled, who is doing the watching and where is the data stored? In the early phase, institutions had to deal with very outspoken and concerned students, particularly in Germany and France. After the assessment period, at those institutions that opted not to use remote proctoring, there were also concerns about the ease of cheating in some assessment types, with some students saying they felt they had to cheat because they suspected everyone else was doing so during open-book exams. This remains a challenge for institutions going forward.

Furthermore, there were the legal aspects of the processing, storage and use of sensitive data to consider, and local data protection agencies showed an increased interest in institutional procedures. To support institutions, we engaged with experts to produce a Data Protection Impact Assessment (DPIA) as an integrated part of the onboarding process. We also updated the Data Processor Agreement (DPA) and Data Retention Policy (DRP) as a direct consequence of processing biometric and sensitive data.

Finally, the Spring also revitalised and amplified discussions on the importance of feedback – we saw more annotations, comments and rubrics being used this past semester – probably because so many students were unable to take part in day-to-day activities on campus. This is a good reminder for us all that assessment represents a key point in a student's learning: not just to test what they know, but to offer feedback that continues and further develops that learning.