Technological innovation, including the expansion of surveillance technology, is often promoted as a sign of progress. The COVID-19 pandemic has lent further legitimacy to this narrative, with techno-capitalists suggesting that surveillance technology is our most promising tool in the fight against the virus. Even before the pandemic created an increased need for educational technology and more efficient ways of managing community health, college and university campuses were being primed as the next venues for new surveillance technologies. Whether it be algorithmic test proctoring services, which surveil students in their own homes, or the use of facial recognition software for campus safety, campus communities have been subject to increasing levels of oversight. It is now widely accepted that surveillance cannot properly tackle issues related to welfare and equity. However, surveillance in higher education deserves further analysis. Campuses are becoming increasingly diverse social spaces, making them peculiar targets for mass surveillance projects. Universities and colleges are sites not only of education and training but also of social life, including activism. Surveillance threatens the emergence and continuance of this activism and other cultural components of campus life. Surveillance of these spaces also sets a precedent with impacts beyond universities.
This paper argues that the use and expansion of surveillance technologies on university and college campuses does not adequately serve the campus community members who are being surveilled. In uniting against invasive surveillance, campuses must mobilize behind their collective privacy rights. It is this emphasis on collective harms that allows communities to recognize and dismantle the systems of discrimination that create and uphold dangerous surveillance tools.
* * *
Technological advancement is often closely associated with sociocultural progress. Manufacturing this sentiment and capitalizing on it, techno-capitalists have asserted themselves and their inventions as solutions to the various social and economic problems we currently face.
Although new in form — utilizing the latest advancements in machine learning and artificial intelligence — at their heart, many of these technologies have an age-old purpose: surveillance. Even before the COVID-19 crisis created a need for digital education tools and efficient means of tracking virus outbreaks, the use of surveillance technologies had been expanding on university and college campuses. Whether students are learning remotely or living directly on campus, now more than ever they are under surveillance. Technologies like algorithmic test proctoring, automated license plate readers (ALPR), and cameras equipped with facial recognition technology are being used on campuses across the United States. Although presented as solutions to issues of campus safety and academic integrity, in practice these technologies often end up invading students’ privacy while offering no substantial benefit to their wellbeing or educational experience.
This paper argues that the use and expansion of surveillance technologies on university and college campuses does not adequately serve the campus community members who are being surveilled. Campus community members have their individual as well as collective privacy rights violated by these technologies. In uniting against invasive surveillance, campuses must mobilize behind their collective privacy rights. It is this emphasis on collective harms that allows communities to recognize and dismantle the systems of discrimination that create and uphold these dangerous surveillance tools. I begin by discussing surveillance, privacy, and new public management in higher education. I then address the types of surveillance found on university and college campuses and the harms they present. I finish with a discussion of student-led anti-surveillance campaigns.
Surveillance, at its most fundamental level, is an attention or watchfulness directed at a particular individual or group. Surveillance is found everywhere there is human sociality, though it is not always harmful or invasive. Non-strategic surveillance refers to the routine, instinctual, semi-conscious awareness we have of our surroundings. Hearing an ambulance outside a bedroom window or people-watching while waiting for the bus are examples. Strategic surveillance — the form we commonly associate with the term — involves a “conscious strategy” usually directed towards the collection of information.
Strategic surveillance can be divided into two subcategories: traditional surveillance and new surveillance. Traditional surveillance is associated with pre-industrial societies. It is often limited, relying on little more than the senses of the observer, and typically does not involve an organized system of information storage. New surveillance, the primary interest of contemporary surveillance studies, involves targeted observation and scrutiny of particular individuals and/or groups. New surveillance utilizes advanced technologies and tools to observe beyond the capacity of human senses. New surveillance is the foundation of a surveillance society — the phenomenon in which surveillance plays an increasingly significant role in the organization of social, political, and economic relations. This paper is concerned with new surveillance, particularly as it relates to privacy.
One conceptualization of privacy asserts that it is “the enforcement of the more general right of the individual to be let alone.” By protecting this more general right, privacy enables individuals to more freely explore and express their thoughts and beliefs. Privacy is not an end in and of itself but rather a precondition for other values, including autonomy and freedom of expression. In its capacity to serve the individual, privacy is also sometimes imagined as an obstacle to the pursuit of society’s collective interests such as community safety. However, the conception of privacy as a collective value challenges this notion.
Privacy as a collective value asserts that privacy is a communal right and experience before it is an individual one. All individuals have a constitutional right to privacy; however, in practice, some groups have more access to privacy than others. Historically marginalized communities are subjected to greater oversight. Whether it be over-policing in lower-income neighborhoods or caseworkers checking their clients’ purchase history via their Electronic Benefit Transfer cards, a lack of resources is often accompanied by heightened surveillance.
Big data contributes to disparities in the surveillance experience. Although an often-vague term, big data includes any information available in traceable digital form. Electronic financial transactions, social media interactions, and web searches are some notable examples. The existence of big data is followed by various data mining projects that seek to identify patterns and trends within databases for some specific purpose, like identifying and predicting consumer trends.
In many instances, data mining projects are not interested in identifying specific people, but rather identifying the commonalities that exist within larger groups. As Tobias Matzner states:
Data miners do not want to know who you are, but what you like. They want to know that you can be characterized as being similar to this or that group of people regarding a particular aspect — like the notorious people on Amazon that also bought what you buy.
Individuals are afforded increasingly few opportunities to opt out of big data surveillance. Regardless of whether someone is an avid social media user or has never owned a smartphone in their life, individuals are brought under the surveillance regime by virtue of being members of a society where like individuals are being surveilled and where knowledge produced through that surveillance informs important decisions.
The ubiquity of modern surveillance makes it increasingly difficult for individuals and communities to control privacy and privacy norms — this is by design. Upon stepping down as Chief Executive Officer of Amazon, Jeff Bezos remarked in a farewell email to employees:
Amazon is what it is because of invention. We do crazy things together and then make them normal… If you do it right, a few years after a surprising invention the new thing has become normal. People yawn. And that yawn is the greatest compliment an inventor can receive.
The innovation Bezos describes depends on the tech industry’s ability to influence our perception of what constitutes progress. This includes, as Bezos remarks, making “crazy things” seem normal, including the violation of our privacy. As stewards of the burgeoning digital age, defined by big data, big tech, and even bigger surveillance, the tech industry (which is often also in the business of surveillance) demands the prioritization of growth and innovation over privacy concerns. This has serious implications.
First, technological advancement often exceeds the pace of discourse. Usually, only after surveillance technologies have done their damage does society recognize that they should not have existed in the first place. Moreover, by the time this recognition occurs, the technology may already have been accepted, with calls for its removal portrayed as regressive or radical.
Second, people experience different degrees of vulnerability when it comes to surveillance. For some, surveillance can be a source of convenience, for others it can give way to further discrimination and invasions of privacy. Addressing the ethical issues created by surveillance demands embracing the conception of privacy as a collective value. This conception promotes a civil rights approach to surveillance that recognizes that one’s access to privacy is determined less so by their personal choices and more so by their social location. This conception also demands a critique of surveillance as a means of knowledge production and governance. We can apply this understanding to the use of surveillance technology on university and college campuses.
Neoliberalism and Higher Education
Although often considered one of the institutions most resistant to capitalist influence, higher education resembles other neoliberal institutions. This resemblance impacts how and why students are surveilled.
The 1980s marked the beginning of major shifts in how higher education is provided. Reagan, Thatcher, and other world leaders aggressively pushed neoliberal policy reforms that gutted funding for public services and welfare programs. For the public sector, neoliberalism ushered in new policies and organizational principles defined broadly as New Public Management (NPM): “a combination of free market rhetoric and intensive managerial control practices.”
For higher education, NPM fosters a pursuit of efficiency over the preservation of high educational standards, wellbeing, and accessibility. Examples include a rapid decrease in faculty positions and well-funded graduate programs, and an increase in precarious work as tenure and long-term teaching positions are replaced by short-term contracts.
The impacts of NPM are felt by undergraduate students, who receive a lower standard of education: for example, larger class sizes, overextended teaching assistants, impersonal experiences with administrative staff, and higher tuition costs. A shared sentiment among students in higher education is that administrations tend to treat them in an impersonal, bureaucratic manner — in other words, like numbers. This is a product of NPM. Neoliberalism turns higher education into a commodity, wherein students are no longer just students; they are customers purchasing a service from their university. Administrators are tasked with minimizing the university’s cost and maximizing its return. As Chris Lorenz states:
neoliberalism simultaneously shifts its focus from rights to risks; it represents “risk society,” job insecurity, and ‘flexibility’ to be the normal, present-day ‘global’ condition. Neoliberalism thus silently uncouples the globalized individual from fundamental rights formerly connected to national citizenship, like the right to schooling and welfare. It trades all these civil rights for one new right: the right to buy services on the privatized service market.
Surveillance is one of the mechanisms by which risk is monitored and reduced. Rather than providing students with greater educational and welfare supports, surveillance technology offers administrators an effective and relatively cheap means of managing academic integrity and student conduct. Consequently, campus communities organizing against surveillance must not only mobilize against invasive technologies but also contest NPM and neoliberalism more broadly.
Part II: Surveillance of University and College Campuses
The surveillance of university and college campuses is not particularly new. Since the 1970s, faculty and students have called for limitations on the surveillance of the academic community. Modern organizations like Fight for the Future continue this campaign, organizing against the use of surveillance technology on college campuses. As unique sites where people from various social locations study, work, and live, the surveillance of university campuses should concern us all.
Types of Surveillance on Campuses
For the purpose of this essay, I will consider two types of surveillance found on university and college campuses: general surveillance and performance surveillance.
General surveillance monitors real or perceived threats to community safety as well as general campus goings-on. Biometric data collection is one of the most controversial forms of general surveillance used against students. Biometric data can be described as any information derived from the calculation or measurement of human characteristics. These characteristics can be behavioral as well as physical. Facial recognition and fingerprinting are examples.
In 2020, the University of California, Los Angeles announced plans to use facial recognition technology on campus. The plans were later cancelled after mass mobilization by the student body, who called for an outright ban of the technology. Fight for the Future maintains a scorecard tracking which U.S. universities have facial recognition software in use.
In addition, there has been a rise in technologies designed for non-surveillance purposes being repurposed for surveillance. For example, schools like the University of Connecticut use automated license plate readers (ALPR) to manage parking permits; however, the collected data is also used for public safety purposes. Towson University uses similar technology, sharing its data with the Maryland Police via the Maryland Coordination and Analysis fusion center. Florida State University also shares its ALPR data with law enforcement.
Performance surveillance is concerned with how surveillance subjects adhere to rules and perform deliverables. In the case of higher education, performance surveillance is used to monitor student conduct, academic integrity, and performance. Like general surveillance, performance surveillance has become increasingly tied to digital technology. Examples of performance surveillance include algorithmic test proctoring services that monitor and collect students’ biometric data; phone apps, like the University of Missouri’s SpotterEDU, which uses the campus Wi-Fi network to track classroom attendance; and even social media monitoring, as in the case of the University of North Carolina campus police using the software Social Sentinel to monitor the social media activity of protestors at an anti-Confederate memorial demonstration on campus.
In the age of remote learning, COVID-19, and major advances in digital infrastructure, the market for invasive surveillance tech has grown. Students and faculty have raised valid concerns, and these concerns are warranted: both general surveillance and performance surveillance present serious threats to privacy rights and student wellbeing.
Harms to Collective Privacy by Campus Surveillance
There is a growing consciousness among campus communities that surveillance is threatening important privacy rights and norms. For example, students and educators have rallied in online spaces like Twitter to share their negative experiences with invasive online proctoring. Students report being flagged for silently mouthing questions to themselves or, in extreme cases, having to urinate on camera in order to avoid being disqualified from exams. Alongside complaints about discriminatory design and policies, there is a growing sense that these forms of surveillance are not merely faulty but inherently wrong, carrying with them dangerous precedents that threaten privacy norms and student wellbeing. The concerns raised by campus communities are correct. The issue is not merely that these technologies fail to surveil equitably; it is the normalization of privacy invasion and the unbalanced power that new and existing surveillance technologies provide to oppressive systems and entities.
Surveillance creates “vertical realities” in which the same surveillance practices that produce benefits and conveniences for one group marginalize another. In the case of algorithmic test proctoring, in addition to directly violating students’ privacy, algorithms operate in accordance with the “eugenic gaze,” coding some behavior as normal and ideal and other behavior as suspicious and problematic. These categories are then used to train anti-cheating algorithms, which have obvious discriminatory outcomes, creating and reinforcing ideals of what makes an acceptable student.
The eugenic gaze aims to identify non-conformity. Algorithms, although presented as solutions to the biases of human judgement, are rather an extension, and in many instances an amplification, of the racist, sexist, and ableist conditions under which they were created. Surveillance does not merely extract knowledge; it also produces it. Even if algorithms were redesigned to address their discriminatory shortcomings, better recognizing neurodivergent behavior, darker skin, and other attributes considered abnormal under the eugenic gaze, discriminatory outcomes might be lessened; ultimately, however, the collection of this data in and of itself gives way to harmful projects. The same argument can be raised against general surveillance technologies. Wi-Fi-based attendance tracking apps like SpotterEDU, body-worn cameras and drones used by campus police departments, the implementation of facial recognition technology on campuses, and even automated license plate readers hold the potential for discriminatory outcomes and the deepening of unequal power dynamics. Examples include increasing the ease and efficiency with which administrations can identify, track, and target specific individuals, and information sharing with outside entities like local and state police. One outcome of campus surveillance we should be particularly wary of is the suppression of activism on campuses.
Campuses have a long history of activism. Across schools, students, staff, faculty, and other community members mobilize in the form of marches, rallies, sit-ins, strikes, and more to express grievances related to both campus life and larger society. In many instances, protests have incredible utility, especially when used to contest a school’s administration. Many labor victories, cancellations of tuition hikes, and accountability measures have been secured through protest. Surveillance presents a barrier to this work.
Organizing and engaging in protests, especially when a protest involves civil disobedience, inevitably becomes more difficult and risk-laden when demonstrators are subjected to a high degree of surveillance. For example, knowing that facial recognition cameras or body-worn cameras could identify protesters could easily dissuade people from participating in demonstrations. This is especially the case for those in precarious situations, like students studying on a scholarship or those who are undocumented. The ubiquity of surveillance plays a large role in this chilling effect. How much more difficult does it become for campus communities to effectively critique and mobilize against their administration if they do not know the extent to which they are being surveilled?
Many have pushed back against the claim that surveillance is required to keep students safe and maintain educational standards. For example, instead of algorithmic proctoring, many have called for different forms of assessment that challenge traditional modes of examination. To provide such non-surveillance-driven alternatives, however, universities would have to invest in additional educational resources, for example, by better compensating teaching assistants. A reluctance to invest in student wellbeing and educational staff prevents these reforms from coming to fruition.
We should also be concerned about the norms surveillance technologies create. Campuses are test grounds for dangerous surveillance projects. To the same extent, however, they are also test grounds for resistance. Student mobilization against surveillance holds important possibilities not only for future students but for society at large. Campus campaigns against surveillance should embrace (and have been embracing) what Ruha Benjamin describes as an “abolitionist consciousness.” This consciousness calls for the eventual elimination of invasive surveillance and its carceral approach to the governance of student life. Students should aim to disempower surveillance and those who control it, transferring this power back to campus communities in the form of better educational resources, worker rights, and more democratic governance structures.
Objections to the Abolitionist Approach
There are a number of important objections to an abolitionist approach to surveillance that must be addressed. First, there are concerns that surveillance abolitionists overemphasize surveillance’s harms while ignoring its benefits, such as providing campus safety and compliance with academic integrity. Abolitionists would agree that both campus safety and academic integrity are important to student wellbeing. However, they would also reject the assumption that surveillance has done a satisfactory job of providing those things.
Evidence has failed to establish a strong correlation between the expansion of general surveillance technologies, like CCTV cameras, and crime prevention. Under our current carceral systems of public safety management, those who are racialized, experiencing homelessness, or otherwise socio-economically disadvantaged are placed under greater scrutiny by surveillance expansion, while the root causes of safety concerns (i.e., lack of community resources) are left unaddressed. This is especially the case when these technologies are controlled by historically oppressive entities like police forces. When technologies are used within the context of existing discriminatory systems and organizations, they (intentionally or not) enable the further perpetuation of mythologies about society’s most vulnerable: in effect, stifling rather than protecting important progress made in the arenas of human rights, democracy, and racial justice.
For example, in March 2009, a Virginia fusion center report identified historically Black colleges and universities in the state as “nodes for radicalization,” encouraging law enforcement to monitor activities protected under the First Amendment. The coding of Black students as potential threats to national security reflects the ways in which surveillance is not neutral. Under systems of racial injustice, surveillance’s gaze is most firmly directed at those who have been historically imagined as threats. Consequently, the abolition of these dangerous technologies, rather than their reform, provides the best path forward.
A second objection to the abolitionist approach is that it denies technological progress, particularly as it relates to providing a better, more modern educational experience for students. Objectors are right to acknowledge the opportunities technology provides for helping students and educators. For example, the quick shift to remote learning at the start of the COVID-19 pandemic was made possible in large part by ed-tech.
Abolitionists once again agree that these technologies provide important benefits; however, they take issue with the conditions under which these technologies were created. Technology companies are businesses before they are educators. Yet their business model demands that they be trusted as if they were the latter. This is often done by equating technological progress with social progress. For example, Proctorio’s website reads, “our organization is dedicated to expanding human potential and lifting individual achievement.” ProctorU, a similar company, describes its software as “technology that crushes brick and mortar test centers.” Implicitly, the rhetoric of ProctorU, Proctorio, and similar companies puts forward the belief that progress demands innovation, and that such innovation demands techno-determinism, wherein new technologies are seen as fully capable of solving all problems and replacing existing faulty infrastructure. Abolitionists contest this narrative, arguing against an adherence to techno-determinism in which administrators outsource educational problems to technology companies.
An abolitionist student movement against surveillance demands a process of sousveillance. The word surveillance derives from the French for “watching over.” Sousveillance is the reverse: “watching from below.” Students should engage in an ongoing sousveillance that critiques tech companies and administrators. Students should also educate each other about the harms of surveillance, especially in light of the failure of university administrations and technology companies to do so.
Across the United States, for example, students have launched petitions calling for their universities to terminate their contracts with proctoring services. Many of these petitions have been successful. Professors at the City University of New York can now opt out of algorithmic proctoring, and the University of London has decided to no longer work with a third-party proctoring service.
Movements should also engage in a constructive campaign, advocating for alternatives to surveillance. In doing so, the movement expands to include other important issues such as a need for greater student involvement in university governance, the labor conditions of graduate students, concerns about NPM, and more. Above all, these movements must be ongoing. They must commit themselves to creating a culture of sousveillance, not independent moments of it. It is through this that consciousness is built and abolitionist objectives are achieved.
Despite being advertised as effective solutions to problems like remote learning and campus safety, surveillance technologies are creating more problems than they solve. In addition to individual privacy concerns, there are collective privacy risks associated with the surveillance of university and college campuses. In mobilizing against invasive surveillance, campuses must put collective harms at the center of their resistance, emphasizing that individual choice-making alone cannot remedy or prevent the harms associated with large-scale surveillance initiatives. As already demonstrated by current mobilization across campus communities, this campaign entails a critique not only of the technologies themselves but of the conditions and systems that build and sustain them.
“About – Proctorio.” Accessed April 4, 2021. https://proctorio.com/about.
American Civil Liberties Union. “Fusion Center Declares Nation’s Oldest Universities Possible Terrorist Threat.” Accessed April 15, 2021. https://www.aclu.org/press-releases/fusion-center-declares-nations-oldest-universities-possible-terrorist-threat.
Ball, Kirstie, Kevin Haggerty, and David Lyon, eds. Routledge Handbook of Surveillance Studies. Routledge, 2012. https://doi.org/10.4324/9780203814949.
Benjamin, Ruha. “Catching Our Breath: Critical Race STS and the Carceral Imagination.” Engaging Science, Technology, and Society. Accessed February 6, 2021. https://estsjournal.org/index.php/ests/article/view/70.
Benjamin, Ruha. Race after Technology: Abolitionist Tools for the New Jim Code. Medford, MA: Polity, 2019.
Bizouati-Kennedy, Yaël. “Amazon Founder Jeff Bezos Stepping Down as CEO.” Accessed April 13, 2021. https://finance.yahoo.com/news/amazon-founder-jeff-bezos-stepping-220842749.html.
Bergstrom, Carl T. “We can transition from testing students at lower levels of Bloom’s taxonomy (whatever you may think of its specifics, it captures the general notion I’m trying to convey), to testing their ability to analyze, evaluate, and create.” Tweet. @CT_Bergstrom, October 31, 2020. https://twitter.com/CT_Bergstrom/status/1322371863796871168.
Carlton, Genevieve. “A History of Student Activism and Protests | BestColleges.” BestColleges.com, May 18, 2020. https://www.bestcolleges.com/blog/history-student-activism-in-college/
Coccia, Mario. “Bureaucratization in Public Research Institutions.” Minerva 47, no. 1 (March 1, 2009): 31–50. https://doi.org/10.1007/s11024-008-9113-z.
Dubal, Veena. “Surveillance Is Not a Social Good: Technocapital, Public Health, and the Pandemic.” C4eJ 35 (2020). https://c4ejournal.net/2020/06/05/veena-dubal-surveillance-is-not-a-social-good-technocapital-public-health-and-the-pandemic-2020-c4ej-35/.
Errick, Kirsten. “Students Sue Online Exam Proctoring Service ProctorU for Biometrics Violations Following Data Breach.” Law Street Media (blog), March 15, 2021. https://lawstreetmedia.com/tech/students-sue-online-exam-proctoring-service-proctoru-for-biometrics-violations-following-data-breach/.
Eubanks, Virginia. “How Big Data Could Undo Our Civil-Rights Laws.” The American Prospect. Accessed April 13, 2021. https://prospect.org/justice/big-data-undo-civil-rights-laws/.
Eubanks, Virginia. “Want to Predict the Future of Surveillance? Ask Poor Communities.” The American Prospect. Accessed April 13, 2021. https://prospect.org/power/want-predict-future-surveillance-ask-poor-communities/.
Fight for the Future. “Stop Facial Recognition on Campus.” Accessed August 1, 2021. https://www.banfacialrecognition.com/campus/.
Fight for the Future. “Backlash Forces UCLA to Abandon Plans for Facial Recognition Surveillance on Campus.” Medium, February 20, 2020. https://fightfortheftr.medium.com/backlash-forces-ucla-to-abandon-plans-for-facial-recognition-surveillance-on-campus-ebe005e3f715.
Gellman, Barton, and Sam Adler-Bell. “The Disparate Impact of Surveillance.” The Century Foundation. Accessed March 19, 2021. https://tcf.org/content/report/disparate-impact-surveillance/?agreed=1&agreed=1#easy-footnote-bottom-10.
Harris, Margot. “A Student Says Test Proctoring AI Flagged Her as Cheating When She Read a Question out Loud. Others Say the Software Could Have More Dire Consequences.” Insider. Accessed July 31, 2021. https://www.insider.com/viral-tiktok-student-fails-exam-after-ai-software-flags-cheating-2020-10.
Hoffman, Jascha. “Sousveillance.” The New York Times, December 10, 2006, sec. Magazine. https://www.nytimes.com/2006/12/10/magazine/10section3b.t-3.html.
Thakkar et al. v. ProctorU, Inc., 2:21-cv-02051, No. 1 (C.D. Ill. Mar. 12, 2021). Docket Alarm. Accessed April 14, 2021.
Jung, Chaelin. “Big Ed-Tech Is Watching You: Privacy, Prejudice, and Pedagogy in Online Proctoring.” Brown Political Review (blog), December 7, 2020. https://brownpoliticalreview.org/2020/12/big-ed-tech-is-watching-you-privacy-prejudice-and-pedagogy-in-online-proctoring/.
Korolov, Maria. “What Is Biometrics? 10 Physical and Behavioral Identifiers.” CSO Online, February 12, 2019. https://www.csoonline.com/article/3339565/what-is-biometrics-and-why-collecting-biometric-data-is-risky.html.
Lorenz, Chris. “If You’re So Smart, Why Are You under Surveillance? Universities, Neoliberalism, and New Public Management.” Critical Inquiry 38, no. 3 (2012): 599–629. https://doi.org/10.1086/664553.
Maass, Dave. “Scholars Under Surveillance: How Campus Police Use High Tech to Spy on Students.” Electronic Frontier Foundation, March 9, 2021. https://www.eff.org/deeplinks/2021/03/scholars-under-surveillance-how-campus-police-use-high-tech-spy-students.
Matzner, Tobias, and Carsten Ochs. “Privacy.” Internet Policy Review 8, no. 4 (2019). https://doi.org/10.14763/2019.4.1427.
Matzner, Tobias. “Why Privacy Is Not Enough Privacy in the Context of ‘Ubiquitous Computing’ and ‘Big Data.’” Journal of Information, Communication & Ethics in Society (Online) 12, no. 2 (2014): 93–106. https://doi.org/10.1108/JICES-08-2013-0030.
Piza, Eric L., Brandon C. Welsh, David P. Farrington, and Amanda L. Thomas. “CCTV Surveillance for Crime Prevention.” Criminology & Public Policy 18, no. 1 (2019): 135–59. https://doi.org/10.1111/1745-9133.12419.
“ProctorU – The Leading Proctoring Solution for Online Exams.” Accessed April 15, 2021. https://www.proctoru.com/.
Roessler, Beate, and Dorota Mokrosinska. “Privacy and Social Interaction.” Philosophy & Social Criticism 39, no. 8 (October 1, 2013): 771–91. https://doi.org/10.1177/0191453713494968.
Swauger, Shea. “Remote Testing Monitored by AI Is Failing the Students Forced to Undergo It.” NBC News. Accessed April 14, 2021. https://www.nbcnews.com/think/opinion/remote-testing-monitored-ai-failing-students-forced-undergo-it-ncna1246769.
Swauger, Shea. “Software That Monitors Students during Tests Perpetuates Inequality and Violates Their Privacy.” MIT Technology Review. Accessed April 14, 2021. https://www.technologyreview.com/2020/08/07/1006132/software-algorithms-proctoring-online-tests-ai-ethics/.
Swauger, Shea. “Our Bodies Encoded: Algorithmic Test Proctoring in Higher Education.” Hybrid Pedagogy, April 2, 2020. https://hybridpedagogy.org/our-bodies-encoded-algorithmic-test-proctoring-in-higher-education/.
“The CIA and Academic Freedom.” The Harvard Crimson, Opinion. Accessed April 14, 2021. https://www.thecrimson.com/article/1980/4/3/the-cia-and-academic-freedom-pithis/.
University of Missouri. “Attendance App Is Optional for Students Participating in Pilot Project.” Accessed April 14, 2021. https://showme.missouri.edu/2020/attendance-app-is-optional-for-students-participating-in-pilot-project/.
Warren, Samuel D., and Louis D. Brandeis. “The Right to Privacy.” Harvard Law Review 4, no. 5 (1890): 193–220. https://doi.org/10.2307/1321160.
Wrenn, John. “Wrenn GS: The Surveillance School.” Brown Daily Herald, March 28, 2021. https://www.browndailyherald.com/2021/03/28/wrenn-gs-surveillance-school/.
* Tsitsi Macherera is a recent graduate of the University of Toronto. Their research interests include black feminist thought, urban planning, and, more recently, surveillance studies. In her free time, she enjoys film photography, jump rope, and finding new music.
 Veena Dubal, Surveillance Is Not a Social Good: Technocapital, Public Health, and the Pandemic [2020 C4eJ 35], 2020, https://c4ejournal.net/veena-dubal-surveillance-is-not-a-social-good-technocapital-public-health-and-the-pandemic-2020-c4ej-35/.
 Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code, accessed January 23, 2021, https://www.wiley.com/en-ca/Race+After+Technology%3A+Abolitionist+Tools+for+the+New+Jim+Code-p-9781509526437; Virginia Eubanks, “Want to Predict the Future of Surveillance? Ask Poor Communities,” The American Prospect, accessed April 13, 2021, https://prospect.org/power/want-predict-future-surveillance-ask-poor-communities./.
 Dubal, supra note 1.
 Dave Maass, “Scholars Under Surveillance: How Campus Police Use High Tech to Spy on Students,” Electronic Frontier Foundation, March 9, 2021, https://www.eff.org/deeplinks/2021/03/scholars-under-surveillance-how-campus-police-use-high-tech-spy-students.
 Virginia Eubanks, “Want to Predict the Future of Surveillance? Ask Poor Communities,” The American Prospect, accessed April 13, 2021, https://prospect.org/power/want-predict-future-surveillance-ask-poor-communities./.
 Barton Gellman and Sam Adler-Bell, “The Disparate Impact of Surveillance,” The Century Foundation, accessed March 19, 2021, https://tcf.org/content/report/disparate-impact-surveillance/?agreed=1&agreed=1#easy-footnote-bottom-10.
 Eubanks, supra note 17.
 Gellman and Adler-Bell, supra note 18.
 Tobias Matzner, “Why Privacy Is Not Enough Privacy in the Context of ‘Ubiquitous Computing’ and ‘Big Data,’” Journal of Information, Communication & Ethics in Society (Online) 12, no. 2 (2014): 93–106, https://doi.org/10.1108/JICES-08-2013-0030.
 Ibid., 98.
 Ibid.; Virginia Eubanks, “How Big Data Could Undo Our Civil-Rights Laws,” The American Prospect, accessed April 13, 2021, https://prospect.org/justice/big-data-undo-civil-rights-laws/.
 Yaël Bizouati-Kennedy, “Amazon Founder Jeff Bezos Stepping Down as CEO,” Yahoo Finance, accessed April 13, 2021, https://finance.yahoo.com/news/amazon-founder-jeff-bezos-stepping-220842749.html.
 Benjamin, supra note 2.
 Eubanks, supra note 27.
 Chris Lorenz, “If You’re So Smart, Why Are You under Surveillance? Universities, Neoliberalism, and New Public Management,” Critical Inquiry 38, no. 3 (2012): 599–629, https://doi.org/10.1086/664553.
 Ibid., 600.
 Lorenz, supra note 31, 602.
 No Writer Attributed, “The CIA and Academic Freedom,” The Harvard Crimson, accessed April 14, 2021, https://www.thecrimson.com/article/1980/4/3/the-cia-and-academic-freedom-pithis/.
 Maria Korolov, “What Is Biometrics? 10 Physical and Behavioral Identifiers,” CSO Online, February 12, 2019, https://www.csoonline.com/article/3339565/what-is-biometrics-and-why-collecting-biometric-data-is-risky.html.
 Fight for the Future, “Backlash Forces UCLA to Abandon Plans for Facial Recognition Surveillance on Campus,” Medium, February 20, 2020, https://fightfortheftr.medium.com/backlash-forces-ucla-to-abandon-plans-for-facial-recognition-surveillance-on-campus-ebe005e3f715.
 Fight for the Future, supra note 37.
 Maass, supra note 4.
 Kirsten Errick, “Students Sue Online Exam Proctoring Service ProctorU for Biometrics Violations Following Data Breach – Tech,” Law Street Media (blog), March 15, 2021, https://lawstreetmedia.com/tech/students-sue-online-exam-proctoring-service-proctoru-for-biometrics-violations-following-data-breach/.
 University of Missouri, “Attendance App Is Optional for Students Participating in Pilot Project,” Show Me Mizzou, accessed April 14, 2021, https://showme.missouri.edu/2020/attendance-app-is-optional-for-students-participating-in-pilot-project/.
 Maass, supra note 4.
 Margot Harris, “A Student Says Test Proctoring AI Flagged Her as Cheating When She Read a Question Out Loud. Others Say the Software Could Have More Dire Consequences.” Insider, accessed July 31, 2021, https://www.insider.com/viral-tiktok-student-fails-exam-after-ai-software-flags-cheating-2020-10.
 Ibid; Shea Swauger, “Remote Testing Monitored by AI Is Failing the Students Forced to Undergo It,” NBC News, accessed April 14, 2021, https://www.nbcnews.com/think/opinion/remote-testing-monitored-ai-failing-students-forced-undergo-it-ncna1246769.
 Benjamin, supra note 2.
 Shea Swauger, “Our Bodies Encoded: Algorithmic Test Proctoring in Higher Education,” Hybrid Pedagogy, April 2, 2020, https://hybridpedagogy.org/our-bodies-encoded-algorithmic-test-proctoring-in-higher-education/.
 Benjamin, supra note 2.
 Genevieve Carlton, “A History of Student Activism and Protests,” BestColleges.com, May 18, 2020, https://www.bestcolleges.com/blog/history-student-activism-in-college/.
 John Wrenn, “Wrenn GS: The Surveillance School,” Brown Daily Herald, March 28, 2021, https://www.browndailyherald.com/2021/03/28/wrenn-gs-surveillance-school/.
 Chaelin Jung, “Big Ed-Tech Is Watching You: Privacy, Prejudice, and Pedagogy in Online Proctoring,” Brown Political Review (blog), December 7, 2020, https://brownpoliticalreview.org/2020/12/big-ed-tech-is-watching-you-privacy-prejudice-and-pedagogy-in-online-proctoring/; Swauger, supra note 48.
 Carl T. Bergstrom, “28. We Can Transition from Testing Students at Lower Levels of Bloom’s Taxonomy (Whatever You May Think of Its Specifics, It Captures the General Notion I’m Trying to Convey), to Testing Their Ability to Analyze, Evaluate, and Create,” Tweet, @CT_Bergstrom (blog), October 31, 2020, https://twitter.com/CT_Bergstrom/status/1322371863796871168.
 Benjamin, supra note 57.
 American Civil Liberties Union, “Fusion Center Declares Nation’s Oldest Universities Possible Terrorist Threat,” accessed April 15, 2021, https://www.aclu.org/press-releases/fusion-center-declares-nations-oldest-universities-possible-terrorist-threat.
 Benjamin, supra note 57.
 Wrenn, supra note 54.
 Jascha Hoffman, “Sousveillance,” The New York Times, December 10, 2006, sec. Magazine, https://www.nytimes.com/2006/12/10/magazine/10section3b.t-3.html.
 Maass, supra note 4.