This piece was collaboratively written by BCTLC members and shared by the 2020-22 co-Chairs, Dr. Anne Sommerfeld (UNBC) and Dr. Paula Hayden (College of New Caledonia).

This past year has brought numerous changes to teaching in higher education. One rapid change was the use of academic surveillance technology to monitor our students, adopted as an emergency strategy in a crisis situation. Post-COVID instruction may also see an increase in online, distance, or blended course offerings that depend on this technology.

The BC Teaching and Learning Council (BCTLC) does not support the use of academic surveillance technology, such as online proctoring software, in higher education unless absolutely necessary. Our position is based on the following ethical and pedagogical concerns.

Concerns: 

  • Lack of evidence indicating surveillance technology reduces cheating
    • Surveillance increases cognitive load, so students may perform worse (studies tend to treat poor performance as evidence that a student did not cheat; however, high stress and cognitive load can also produce poor performance)
    • There remain many ways for students to cheat, even when proctored
    • Employing non-proctoring security measures (academic integrity statements, warnings of consequences, time-limited exams, synchronous testing, paraphrased test-bank questions, randomized question order, randomized options for MCQs, one question per page, and higher-order thinking questions) can eliminate any difference in performance between proctored and unproctored exams (Feinman, 2018; Golden & Kohlbeck, 2020).
  • Privacy
    • Online proctoring requires an invasion of students’ homes or living spaces, which many share with others
    • Many students rely on public wifi access and cannot perform a 360° scan of their space
  • Mental health
    • Students with anxiety perform worse on exams with academic surveillance technology (Woldeab & Brothen, 2019; Butler-Henderson & Crawford, 2020). 
  • Algorithms that perpetuate racism
    • There are well-documented issues with online proctoring: students with darker skin tones may require extra lighting to be recognized and are more easily flagged for suspicious behaviour (Swauger, 2020).
  • Equity and access
    • Online proctoring requires all students to have up-to-date hardware (e.g., computers, webcams), software (e.g., operating systems, modern browsers), and reliable high-speed internet access, which is inequitable and ableist
    • The requirements of online proctoring (e.g., to remain in the video frame throughout an exam, avoid wandering eye gazes, remove background noises, etc.) are inherently ableist.
  • Proctored exams tend to favour lower-level, memory-based questioning. Is this the best way to assess learning? 
  • Instructor workload
    • The use of online proctoring can add significantly to instructor workload, because instructors must manually review the verbal and non-verbal behaviour(s) that the software’s algorithm has flagged as suspicious.
  • Risk of legal exposure with litigious educational technology vendors
    • When a UBC staff member shared publicly accessible “unlisted” video guides published by an online proctoring company, the company filed a vexatious lawsuit against the staff member.

We strongly support ongoing conversations about academic integrity, learning assessment, and changing assessment practices to best serve our students. This technology comes at a high cost and offers limited value for the public funds entrusted to our institutions.

 

References

  • Butler-Henderson, K., & Crawford, J. (2020). A systematic review of online examinations: A pedagogical innovation for scalable authentication and integrity. Computers & Education, 159, 1-12.
  • Feinman, Y. (2018). Security mechanisms on web-based exams in introductory statistics community college courses. Journal of Social, Behavioral, and Health Sciences, 12(1), 11.
  • Hope, A. (2018). Creep: The growing surveillance of students’ online activities. Education and Society, 36(1), 55-72.
  • Ladyshewsky, R. K. (2015). Post-graduate student performance in ‘supervised in-class’ vs. ‘unsupervised online’ multiple choice tests: Implications for cheating and test security. Assessment & Evaluation in Higher Education, 40(7), 883-897.
  • Swauger, S. (2020). Software that monitors students during tests perpetuates inequality and violates their privacy. MIT Technology Review. Retrieved from: https://www.technologyreview.com/2020/08/07/1006132/software-algorithms-proctoring-online-tests-ai-ethics/
  • Teclehaimanot, B., You, J., Franz, D. R., Xiao, M., & Hochberg, S. A. (2018). Ensuring academic integrity in online courses: A case analysis in three testing environments. The Quarterly Review of Distance Education, 12(1), 47-52.
  • Woldeab, D., & Brothen, T. (2019). 21st century assessment: Online proctoring, test anxiety, and student performance. International Journal of E-Learning and Distance Education, 34(1), 1-10.