At the beginning of the fall 2020 semester, Shiu-Kai Chin gave his class a choice most Syracuse University students don’t have: how he should manage their data.
Chin, a professor of electrical engineering who specializes in cybersecurity, wanted to know how his class would like him to store the assignments uploaded to their shared Google Drive, including to whom the assignments are visible and how long they would remain online.
“The (settings) are set now so that only the class can see it, which is what the class wanted,” Chin said. “Of course, that’s only as good as the person who has the authority to change the settings, which is me.”
The discussion between Chin and his students represents a broader concern among SU students and faculty, as well as cybersecurity and education technology experts, about how the shift to online learning amid the coronavirus pandemic threatens students’ digital privacy.
Since the onset of the pandemic, SU — like colleges and universities across the world — has scrambled to rework in-person classes to a virtual format. As a result, professors have employed a patchwork of software, such as video conferencing apps and remote proctoring services, to hold classes and administer tests at a time when packed lecture halls are a public health hazard.
Between grades, personal information and hours of video and audio from classes and exams, SU students are generating a significant amount of data during the pandemic, some of it sensitive. The problem, some experts say, is that there’s little oversight of how that data is handled, either by universities or software vendors.
In a statement to The Daily Orange, SU’s Information Technology Services said that the university protects all student data through an information security program that uses industry-accepted controls, technologies and policies. ITS’ information security department and information technology staff in SU’s various schools and colleges oversee the program.
But according to Shea Swauger, a librarian at the University of Colorado Denver and critical data studies researcher, universities don’t have the tools or motivation to police abuses of student data by the faculty and staff who already have access to it. One of the greatest threats to students’ privacy during the pandemic stems from the act of appearing on camera, either during class or while taking exams, he said.
Students taking classes from their homes may expose aspects of their identity, such as their financial status or religion, to their professor, Swauger said. With that exposure comes the potential for implicit bias or other forms of misconduct against students.
“We’ve all had some professor that made us uncomfortable,” Swauger said. “Now we’ve given that professor access to recorded videos of all the students that they have, that they can then download.”
Privacy laws place steep penalties on university employees who spread or misuse students’ personal data. The problem, Swauger said, is that not all misuse of student data is easy to police.
At SU, Chin said professors can manage video and audio collected for their own classes.
“Is it possible for instructors or anybody who has access to them to widely distribute them? The answer is yes,” Chin said. “That’s where you’re relying on professional standards.”
Remote proctoring services are a particularly serious threat to student privacy due to the volume of video and biometric information they collect, Swauger said. At SU’s College of Law, students rallied against the implementation of Proctortrack, one such service that had a record of security flaws and discrimination against Black students, transgender students and students with disabilities.
It’s not just universities that can misuse students’ data. There are also considerable financial incentives for software companies to collect and sell user data that provides insight into consumers’ habits and behavior, said Mark Pollitt, an adjunct professor at SU and a former FBI special agent who investigated computer crime.
For many businesses and other parties, data can paint startlingly accurate pictures of their audience and develop new ways to reach it, Pollitt said.
“Data is very valuable, and it’s valuable for a lot of different reasons,” he said. “All of these things go together to build a picture that is altogether pretty granular.”
It can also be difficult for universities to vet all the software faculty use to ensure students’ data isn’t being sold. The only way to know what data a piece of software is collecting is to examine the source code, which is often closely guarded by the companies that make it, Pollitt said.
And in the time it takes to fully inspect a piece of software for security flaws or covert data collection, that software could have become obsolete or been updated.
“There’s a point where you’re actually having to take somebody’s word for it,” Pollitt said. “In essence you’re trusting them to do only and exactly what they promised to do.”
In the absence of surefire vetting methods, one of the best ways to protect students’ privacy during the pandemic is to reduce the amount of data collected about them, Pollitt said. For Swauger, that includes eliminating the use of remote proctoring software and the high-stakes testing that necessitates it.
In their place, he suggested faculty use open-note, untimed tests, or instead assess students based on long-term projects that are impossible to cheat on.
“Right now, most schools care more about cheating than they do about subjecting their students to some kind of discrimination,” Swauger said.
When administering exams in his classes, Chin has followed the guidance of staff at SU’s Center for Teaching and Learning Excellence by reducing the weight of tests in his students’ final grades.
“The challenge for me was to get it down below 50%, which I did,” he said. “A lot of the assumed incentives for why people would cheat, I think, disappeared.”
Students can also take some steps to protect their personal data, Pollitt said. He recommended that students use separate devices for personal and academic purposes, if they are able to.
At the end of the day, though, protecting data privacy in an increasingly virtual world can be so difficult that it’s sometimes easier to just go with it, Pollitt said.
“One of the problems from a security practitioner’s perspective is that we have people who have gotten so accustomed to giving up little pieces of their security,” he said. “They don’t realize the cumulative effect.”