Forum condemns surveillance software for assessment

Schools went remote last March with little preparation or warning, and students and professors alike were thrown into the disarray of online classes. As a consequence, many institutions have been forced to substitute in-person invigilators with exam surveillance software such as the Respondus LockDown Browser. The faculty of education hosted an online forum titled “Surveillance software is not the answer: Evaluation in higher education” on Feb. 24 to discuss the issue.

The forum addressed the ethical and practical issues surrounding the use of testing software. Shannon Moore, an assistant professor in the faculty of education, moderated the forum.

“The tools we use to assess should not encourage a culture of distrust and erode the relationships between teachers and students,” she said.

“The tools we use should not place student privacy at risk, subject students to commodification and data surveillance, penalize students for socioeconomic factors, perpetuate racism or regulate exceptionalities.”

Kristin Smith, UMSU vice-president advocacy, started the forum by talking about student concerns regarding surveillance software.

“When [we] use e-proctoring, students’ futures, students’ well-being and their relationship with faculty are at stake,” she said.

Surveillance software like the Respondus LockDown Browser acts as a virtual invigilator, “flagging” any suspicious activity, which can include something as minor as a student briefly looking away from the screen.

“My mother walks in the room and some of the sound cuts out. How can we be certain she wasn’t assisting me on a question?” Smith gave as an example.

“That not only provides a sense of anxiety to the student but additional layers for the professor to have to go look at and analyze.”

The use of this type of software brings long-standing questions about the validity, fairness and equity of assessment practices into sharper focus.

Smith also compared an in-person environment to the current virtual environment.

“Students aren’t looking for an easy way out,” she said.

“There are actual issues that are unique to the e-proctoring environment that simply don’t manifest when we look at the in-person environment […] Their professors will have the full context with regards to the testing environment, so false flags become very unlikely [in person].”

Smith said some statistics have shown false positive rates “upwards of 40 per cent” and emphasized the extra stress these risks place on students.

“Students are consistently mentally preoccupied with keeping their body movements and environment consistent to avoid these ‘flags,’” she said.

She also spoke about how potential false positives are handled differently in in-person and remote settings.

“The standard used in these cases is a balance of probabilities which tends to work decently well when you’re looking at in-person evidence and testimonials, but when you’re basing your assumption on a flag that might be false and you’re already operating in an environment where academic dishonesty is up, the likelihood that a student might be found falsely guilty, so to speak, goes up,” she said.

Neil McArthur, acting head of the U of M’s philosophy department, added, “This software is a lot better at recognizing the behaviour of white people and sometimes it just won’t simply recognize people with dark skin.”

“It won’t allow them to write the exam and so I think that can amplify, understandably, the anxiety of students.”

He added that surveillance software often privileges students with better remote learning conditions.

“It isn’t just there’s going to be a lot of false flags, the fact is the false flags are going to come up with, overwhelmingly, students who already have vulnerabilities, who are already for various reasons perhaps uncomfortable in the educational environment, who are already facing challenges,” McArthur said.

“So, I think that’s really important to emphasize. It also would differentially impact students who don’t have the fancy technology setups, students who are under-resourced, students who don’t have private rooms.”

McArthur also touched on privacy concerns.

“These are technology corporations and technology corporations tell us that they’re always going to protect our privacy and we’ve just seen again and again that they don’t,” he said.

Fred Zinn, associate director of digital learning at University of Massachusetts Amherst’s college of education, spoke more about the issues stemming from programmers themselves.

“Biases by the people who create these tools feeds into the false flags,” he said.

“So, it’s not even just about false flags, it’s the bias of the person who designed the test.”

Like Smith, he discussed the stress students can feel due to the possibility of false flags.

“There’s something called stereotype threat, which means if you’re from a marginalized group […] you are aware of the false flags you can give off and you are more likely to give off those false flags when you know you’re being observed,” Zinn said.

He raised an important question about the practices that surveillance software in education promotes.

“You’re literally training students to comply, training students not to question,” Zinn said.

“Is that really what we want to teach?”