by Saanya Pherwani
Proctoring services are experiencing a surge in demand as universities look for new ways to monitor online tests.[1] Proctorio, an online proctoring service that previously added about 100 clients each year, has increased business by 900% in a matter of months.[2] Large public universities that would ordinarily spend months negotiating contracts rushed to sign proctoring deals within days.[3] Amid the urgency to re-create in-person schooling during the pandemic, educational authorities disregarded the absence of a federal privacy law regulating data collection. Currently, the Federal Trade Commission Act of 1914 prohibits companies from engaging in “unfair or deceptive acts or practices.”[4] However, the Act does not impose stringent privacy-compliance requirements on proctoring services.[5] A new federal privacy law could establish a standard for how proctoring services treat their consumers’ personal information and instill greater consumer confidence in the services. Furthermore, the law could address the racial and economic discrimination caused by algorithmic bias.
Proctorio and other similar services monitor students’ webcams, mouse clicks, screen activity, and eye and speech movements to flag suspicious behavior.[6] Some programs even require students to scan their IDs and perform a 360-degree scan of their rooms before beginning their exams. Proctorio collects all of these types of data to report students’ suspicion levels to professors. Once testing ends, however, the companies are left to their own discretion in deciding how to use the information gathered from students’ computers and bedrooms. By requiring ID credentials and monitoring eye movement, services like Proctorio access and disclose students’ personal, physical, and biometric identifiers, including their home addresses, work details, parental and citizenship status, medical records, height, weight, fingerprints, and retina scans, to unspecified third parties.[7] Legal scholars widely recognize that privacy in proctoring services remains largely unregulated.
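Proctorio does not publish its scoring logic, so the Python sketch below is purely hypothetical: it shows how the monitored signals described above (off-screen gaze, background noise, leaving the webcam frame) might be weighted and aggregated into the kind of per-student “suspicion level” reported to professors. Every event name and weight is an invented assumption, and that is precisely the problem: without disclosure, no one outside the company can audit them.

```python
# Hypothetical illustration only: Proctorio does not disclose its algorithm.
# This sketch aggregates weighted event durations into a 0-100 "suspicion
# level" of the kind reported to professors. All names and weights are
# assumptions for illustration.
from dataclasses import dataclass

@dataclass
class ExamEvent:
    kind: str           # e.g., "gaze_offscreen", "background_noise", "left_frame"
    duration_sec: float

# Assumed weights; a real service would tune these on historical data,
# which is exactly where undisclosed bias can enter.
EVENT_WEIGHTS = {
    "gaze_offscreen": 0.5,
    "background_noise": 0.3,
    "left_frame": 1.0,
}

def suspicion_level(events: list[ExamEvent]) -> float:
    """Sum weighted event durations, capped at 100."""
    raw = sum(EVENT_WEIGHTS.get(e.kind, 0.0) * e.duration_sec for e in events)
    return min(100.0, raw)

if __name__ == "__main__":
    session = [ExamEvent("gaze_offscreen", 12.0), ExamEvent("left_frame", 45.0)]
    print(f"Suspicion level: {suspicion_level(session):.1f}/100")
```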
Furthermore, students have accused proctoring services’ AI algorithms of being discriminatory, putting female, Black, brown, and transgender students at a disadvantage.[8] In an op-ed for the MIT Technology Review,[9] librarian Shea Swauger explained how proctoring services have a long history of reproducing societal inequalities. For instance, many transgender individuals find it difficult to obtain ID credentials that accurately reflect their gender.[10] Additionally, one of Swauger’s students, a Black woman, was prompted to shine more light on her face when using proctoring software.[11] Swauger also explained how tracking eye movements and flagging students for leaving their desks can penalize students with different living situations. A few of her students had children to care for or lived in unstable households with frequent disruptions. The proctoring service instantly flagged these students for background noise or for leaving the desk to care for a child. Title IX protects parenting students from educational discrimination, and default settings that flag students for performing daily parental duties run counter to these protections.[12] On a larger level, there has been widespread debate about the inevitability of algorithmic bias in everyday technology.[13] Bias is even more insidious in autonomous systems like Proctorio because such systems are often presumed to be neutral or impartial.[14]
While students agree to the terms and conditions, they have no meaningful power to withhold consent to being proctored. No federal privacy law currently requires software companies to disclose how data is collected;[15] however, several smaller-scale strides have been made to improve disclosure. For example, the Gramm-Leach-Bliley Act subjects financial institutions to disclosure requirements designed to safeguard sensitive data.[16] On the state level, the California Consumer Privacy Act (CCPA) gives consumers the right to know what personal information a business collects about them and how it is used.[17] Consumers can then choose to opt out of the sale of their personal information. In adherence to the CCPA, services like Proctorio should publicize the historical data they use to detect suspicious behavior, since public access to such data has proven critical to detecting and eliminating algorithmic bias. For example, Amazon was forced to scrap its resume-screening tool when the tool was found to place female applicants at a disadvantage.[18] When designing the algorithm, software engineers at Amazon trained it on years of hiring data that, unbeknownst to them, consisted mostly of resumes from men. Requiring data disclosure is the first step in empowering courts to analyze proctoring services’ data and settle disputes over whether their algorithms discriminate.
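The Amazon episode reduces to a minimal sketch of how a skewed historical corpus reproduces bias. The data below is invented and the scorer is deliberately crude; it does not reflect Amazon’s actual system. Because every past hire in the training corpus is a man, the model has never seen the word “women’s” and silently ranks any resume containing it lower.

```python
# Minimal sketch, with invented data, of bias inherited from training data.
# A term scorer "trained" only on past hires (all men) penalizes terms it
# has never seen, such as "women's".
from collections import Counter

historical_hires = [
    "software engineer java backend men's chess club",
    "java developer distributed systems",
    "backend engineer java men's rugby team",
]

def train_term_scores(corpus: list[str]) -> Counter:
    """Score each term by how often it appears among past hires."""
    scores = Counter()
    for resume in corpus:
        scores.update(resume.split())
    return scores

def score_resume(resume: str, scores: Counter) -> int:
    # Unseen terms contribute zero, so resumes from the absent group lose points.
    return sum(scores[term] for term in resume.split())

scores = train_term_scores(historical_hires)
print(score_resume("java backend engineer men's chess club", scores))    # higher
print(score_resume("java backend engineer women's chess club", scores))  # lower
```

This is why the paragraph above argues for disclosure: the gap between the two scores is invisible unless someone can inspect the training corpus itself.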
In addition, disclosed data can reveal cultural bias. To establish a metric of suspicion, a program must first define a norm for comparison. Setting that norm ensures the service is not entirely autonomous: humans can correct flaws that arise from the software’s reliance on historical data. Programs should flag behavior as suspicious only when it clearly deviates from that norm, as the sketch following this paragraph illustrates. This would keep programs from wrongfully judging students’ behavior, especially given that students are regularly flagged for rolling their eyes, stretching their arms, or solving a problem on paper.[19] Yet this process of comparison is not neutral: certain facial expressions considered suspicious in one culture may not be considered suspicious in another.[20] Still, with proper disclosure of the types of behavior a service considers suspicious, students will know what to expect from an exam. This disclosure requirement should extend to universities as well; students can choose whether to take a class if the syllabus explicitly states the type of exam proctoring used. In creating course syllabi, professors should weigh the benefits of internal versus external proctoring. With internal proctoring, professors monitor student activity on exams themselves rather than employing an intermediary service, and students may be more comfortable being monitored by a familiar authority while taking an online exam.
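No proctoring service discloses how such a norm would be set, so the following Python sketch is a hypothetical illustration of the idea, using invented numbers: a published baseline of ordinary exam behavior (here, off-screen glances per hour across past honest sessions) against which behavior is flagged only when it clearly deviates, with the final judgment left to human review rather than an autonomous default.

```python
# Hypothetical norm-based flagging with invented baseline data. Behavior is
# flagged only when it exceeds a disclosed norm by k standard deviations,
# and a flag triggers human review, not an automatic penalty.
from statistics import mean, stdev

# An invented, publicly disclosed baseline: off-screen glances per hour
# observed across past honest exam sessions.
baseline_glances_per_hour = [6, 9, 4, 7, 11, 5, 8, 10]

def is_flagged(glances: int, k: float = 2.0) -> bool:
    """Flag only behavior well outside the disclosed norm."""
    mu, sigma = mean(baseline_glances_per_hour), stdev(baseline_glances_per_hour)
    return glances > mu + k * sigma

print(is_flagged(9))   # False: within the norm (e.g., eye-rolling, stretching)
print(is_flagged(40))  # True: a human proctor would then review the recording
```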
The unforeseen overreliance on technology during the pandemic has unveiled structural problems in data monitoring and the need for federal privacy laws that regulate services like Proctorio. These laws should universalize certain privacy-accountability requirements, for example, by requiring all technology companies to provide transparent information about their data practices. However, a blanket accountability approach across all kinds of technology would be ineffective, as technology constantly advances; proctoring software itself did not exist a decade ago. These services, and other rapidly innovating companies, will always find loopholes in laws enacted decades ago. Privacy law therefore differs from laws governing relatively stable industries: it must constantly adapt to changing circumstances.
Unfortunately, this technological dependence has also revealed that academic institutions favor surveillance over trust. To elicit change, higher education authorities should start by giving students meaningful decision-making power and openly communicating course expectations. As Jesse Stommel, a senior lecturer and co-founder of the Digital Pedagogy Lab, explains, “cheating is a pedagogical issue, not a technological one, and there are no easy solutions.”[21] Lecturers’ work does not begin with “an app or a license for remote proctoring,” but by talking “openly to students about when and how learning happens,” so they can take ownership of their education.[22]
[1] Shea Swauger, Software that monitors students during tests perpetuates inequality and violates their privacy, MIT Tᴇᴄʜ. Rᴇᴠ. (Aug. 7, 2020), https://www.technologyreview.com/2020/08/07/1006132/software-algorithms-proctoring-online-tests-ai-ethics/.
[2] Drew Harwell, Mass school closures in the wake of the coronavirus are driving a new wave of student surveillance, Wᴀsʜ. Pᴏsᴛ (Apr. 1, 2020), https://www.washingtonpost.com/technology/2020/04/01/online-proctoring-college-exams-coronavirus/.
[3] Id.
[4] See 15 U.S.C. § 45(a)(1). See also A Brief Overview of the Federal Trade Commission's Investigative, Law Enforcement, and Rulemaking Authority, Fᴇᴅᴇʀᴀʟ Tʀᴀᴅᴇ Cᴏᴍᴍɪssɪᴏɴ (2019), https://www.ftc.gov/about-ftc/what-we-do/enforcement-authority.
[5] Andy Green, Complete Guide to Privacy Laws in the US, Iɴsɪᴅᴇ Oᴜᴛ Sᴇᴄᴜʀɪᴛʏ (2020), https://www.varonis.com/blog/us-privacy-laws/.
[6] Proctorio, The World's First Learning Integrity Platform, Pʀᴏᴄᴛᴏʀɪᴏ Iɴᴄ. (Oct. 25, 2020), https://proctorio.com/about.
[7] See ProctorU CCPA Privacy Policy, PʀᴏᴄᴛᴏʀU Iɴᴄ. (Oct. 25, 2020), https://www.proctoru.com/ca-privacy-policy. Many schools have expressed concerns about this data collection. See, e.g., Letter from UCSB Faculty Association Board to Chancellor Henry Yang and Executive Vice Chancellor David Marshall (Mar. 13, 2020), https://cucfa.org/wp-content/uploads/2020/03/ProctorU_2020-1.pdf.
[8] Noah Zeitlen, Universities need to condemn the use of problematic online proctor services, Tʜᴇ Jᴜsᴛɪᴄᴇ (Sept. 15, 2020), https://www.thejustice.org/article/2020/09/universities-need-to-condemn-the-use-of-problematic-online-proctor-services. See also Swauger, supra note 1 (citing multiple reports and studies showing that AI is often “racist, sexist, and transphobic”).
[9] Swauger, supra note 1.
[10] See id.
[11] See id. (“The software couldn’t validate her identity and she was denied access to tests so often that she had to go to her professor to make other arrangements. Her white peers never had this problem.”).
[12] Know Your Rights: Pregnant or Parenting? Title IX Protects You From Discrimination At School, U.S. Dᴇᴘᴀʀᴛᴍᴇɴᴛ ᴏғ Eᴅᴜᴄᴀᴛɪᴏɴ (2020), https://www2.ed.gov/about/offices/list/ocr/docs/dcl-know-rights-201306-title-ix.html.
[13] See, e.g., David Danks & Alex John London, Algorithmic Bias in Autonomous Systems, Pʀᴏᴄᴇᴇᴅɪɴɢs ᴏғ ᴛʜᴇ Tᴡᴇɴᴛʏ-Sɪxᴛʜ Iɴᴛᴇʀɴᴀᴛɪᴏɴᴀʟ Jᴏɪɴᴛ Cᴏɴғᴇʀᴇɴᴄᴇ ᴏɴ Aʀᴛɪғɪᴄɪᴀʟ Iɴᴛᴇʟʟɪɢᴇɴᴄᴇ (2017).
[14] See id.
[15] Daniel Castro & Ashley Johnson, Why Can't Congress Pass Federal Data Privacy Legislation? Blame California, Iɴɴᴏᴠᴀᴛɪᴏɴ Fɪʟᴇs (Dec. 13, 2019), https://itif.org/publications/2019/12/13/why-cant-congress-pass-federal-data-privacy-legislation-blame-california.
[16] Pub. L. No. 106-102, 113 Stat. 1338 (1999) (codified as amended in scattered sections of 12 U.S.C. and 15 U.S.C.).
[17] See generally California Consumer Privacy Act of 2018, Cᴀʟ. Cɪᴠ. Cᴏᴅᴇ § 1798.100 (2018). The CCPA gives consumers in the state of California “more control over the personal information that businesses collect about them.” See California Consumer Privacy Act (CCPA), Sᴛᴀᴛᴇ ᴏғ Cᴀʟɪғᴏʀɴɪᴀ Dᴇᴘᴀʀᴛᴍᴇɴᴛ ᴏғ Jᴜsᴛɪᴄᴇ (2020), https://oag.ca.gov/privacy/ccpa.
[18] Jeffrey Dastin, Amazon scraps secret AI recruiting tool that showed bias against women, Rᴇᴜᴛᴇʀs (Oct. 10, 2018), https://www.reuters.com/article/us-amazon-com-jobs-automation-insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G.
[19] Fahad Diwan, Concordia University is undermining the privacy rights of its students and this must stop, Tʜᴇ Sᴛᴀʀᴛᴜᴘ (Apr. 7, 2020), https://medium.com/swlh/concordia-university-is-undermining-its-students-privacy-rights-by-using-proctorio-5e1ff03ecaab.
[20] Rachael E. Jack et al., Facial expressions of emotion are not culturally universal, Pʀᴏᴄᴇᴇᴅɪɴɢs ᴏғ ᴛʜᴇ Nᴀᴛɪᴏɴᴀʟ Aᴄᴀᴅᴇᴍʏ ᴏғ Sᴄɪᴇɴᴄᴇs ᴏғ ᴛʜᴇ Uɴɪᴛᴇᴅ Sᴛᴀᴛᴇs ᴏғ Aᴍᴇʀɪᴄᴀ (Mar. 19, 2012), https://www.pnas.org/content/109/19/7241.
[21] Jesse Stommel, Why I Don’t Grade, Jᴇssᴇ Sᴛᴏᴍᴍᴇʟ (Oct. 27, 2017), https://www.jessestommel.com/why-i-dont-grade/.
[22] See id.