Writing for this blog in 2013, Dr. John Jones warned, “[D]igital technologies are increasingly important not just to learning, but to all facets of students’ lives[,] and reminders of the breaches of privacy made possible by these technologies are nearly ubiquitous.” Jones described how education technologies subject students to the invasive gaze of corporate actors who might use information gained from students for commercial purposes (e.g., micro-targeting ads to students or aggregating profiles across platforms). Like many scholars and technologists, Jones recognized that to deliver vital services (including education) via commercial platforms is to reconfigure the basic terms of collective life, a mingling of public duties and private benefit that does not sit easily with accepted norms of privacy.
Researchers have described the complex phenomenon of privacy in many ways, including Professor Helen Nissenbaum’s influential formulation of contextual integrity—the idea that people reasonably expect information divulged in one situation or relationship to stay there1—and legal scholar Daniel Solove’s social privacy—the idea that privacy is not simply an individual concern, but a shared attribute of a functional group or community.2 Student privacy is an important site of ongoing privacy research, including the design of privacy-enhancing technologies;3 qualitative exploration of the attitudes of teachers, students, and administrators;4 and legal and policy analysis.5 And while this research is rich, well documented, and ongoing, the language of privacy has proved wholly inadequate to keep pace with the constant expansion of edtech of all shapes and flavors. Moreover, a focus on privacy can prevent scholars from identifying real harms associated with education technology that have everything to do with the terms of public life.
Despite decades of findings that call for greater attention to privacy, students continue to be made vulnerable to many forms of harm as a result of everyday activities that involve learning technologies. In light of the rapid and largely unregulated expansion of education technology into all kinds of schools as a result of the COVID-19 pandemic, cautions about privacy seem tragically quaint. The ubiquity of commercial apps and platforms in teaching; the ever-growing number of voluntary and involuntary users of these tools; the commercial value of the edtech market: as 2023 approaches, the ongoing datafication of education has succeeded at a scale that would have surprised edtech optimists at the start of the decade.6 At the same time, the vocabulary of privacy has been unable to grasp the free-for-all capture of data that describes students and their families by many kinds of private and state actors. Given their role in designing, selecting, and vetting software for students, responsible educators and technologists interested in student privacy should engage with research on surveillance.
Surveillance has proliferated across many domains of life such as work, play, and romance, but in schools, the growth of many forms of surveillance is especially stark. In addition to the allure of edtech, pressure on schools to demonstrate desired learning outcomes, mold student behavior, and mitigate security risks has fueled a boom in wildly varied forms of surveillance. Alongside school-based police units, closed-circuit cameras, metal detectors, and physical inspections, various forms of data collection and analysis have also significantly increased surveillance in schools.7 As Dr. Ben Williamson writes, data aggregation and analysis have taken on a life of their own, displacing conventional ideas about what learning is for. Public policy has started to play catch-up with the more visible threats to student privacy through laws that prohibit some commercial uses of student data, but these laws, because they rely on legal notions of privacy, may fail to apprehend the depth, reach, and growth of surveillance.
Surveillance has been a topic of heightened scholarly interest since the 1990s, when sociologists like Mark Poster8 (echoing Foucault and others) predicted that the electronic traces of everyday life would give rise to new forms of power and control. Canonical works on surveillance have defined contemporary surveillance as “watching, monitoring, tracking, or data analyzing for purposes of control.”9 Data captured, aggregated, and analyzed by contemporary computational systems (from social media to electronic card readers to search engines) produce new opportunities to sort people, a power that rests exclusively in the hands of technologically sophisticated corporate and state actors. Surveillance research has also frequently incorporated studies of visual art, fiction, film, and other kinds of cultural production to flesh out the prevalence of surveillance and the ways that contemporary technologies embody relations of power and authority: who must be watched at all times, who must do the watching, who must remain unseen. Three accessible ideas from surveillance studies are immediately applicable to the design and use of technologies deployed in formal or informal learning:
**Surveillance is a vector of power for both private and state agents.** Traditional distinctions between data captured by private actors and by the state blur in edtech surveillance. For example, law enforcement agencies have sometimes re-used data collected by schools in criminal investigations. Seemingly inconsequential data captured in the context of learning can harm a student or a student’s family when aggregated or analyzed. Fusion centers and other new organizational arrangements exist to aggregate data from different sources (some public, some private) and to sell it. In some cases, police or other agents of the state have purchased data collected by commercial platforms, data that they themselves would be legally barred from collecting. A surveillance approach would direct us to reconstruct the “assemblages” that accomplish such extra-legal surveillance in order to describe and resist them.10
**Surveillance of all kinds, including data capture, harms vulnerable populations.** Predictably, the gaze of powerful authorities turns most often on those groups already marked by interlocking forms of difference, such as race, gender, class, sexuality, or citizenship.11 Dr. Simone Browne has linked contemporary surveillance to long histories of legal discrimination and state violence directed at Black people. Thinking about contemporary digital technologies in terms of surveillance forces us to recognize that “race continues to structure surveillance practices.”12 Surveillance points to the largely unspoken rules about what kinds of people are allowed to be in public and what kinds of people are always suspect, subject to the invasive, controlling gaze of the state and its agents. As researchers at the Center for Democracy and Technology write, “Students from low-income families, Black students, and Hispanic students are at greater risk of harm” related to edtech and surveillance. They likewise caution that LGBTQ+ students are especially vulnerable to harms related to education technology and surveillance.
**Surveillance is not inevitable.** Despite popular fictional representations such as George Orwell’s 1984 and Netflix’s Black Mirror, surveillance regimes are neither omniscient nor omnipotent. As public intellectual and surveillance scholar Dr. Christopher Gilliard writes, technologists and other kinds of experts treat surveillance as if it cannot be stopped, but powerful community-based coalitions dedicated to pushing back against surveillance already exist. Likewise, legislative action at the state level has produced some promising developments. While it is tempting to imagine that surveillance is someone else’s problem, meaningful change requires multiple forms of activism and contestation. There is nothing inevitable about the design of education technologies, no app, platform, or system that cannot be changed, updated, or, if needed, abolished. The shape and uses of technology are decided by powerful corporations and the state, but also by students, teachers, and activists working to combat surveillance abuse in their communities through collective action. As in other forms of struggle, individual choices are no substitute for organized collective action led by those most impacted by systems of power and privilege.
1. Nissenbaum, H. (2010). Privacy in context: Technology, policy, and the integrity of social life. Stanford Law Books.
2. Solove, D. J. (2004). The digital person: Technology and privacy in the information age. NYU Press.
3. Amo, D., Prinsloo, P., Alier, M., Fonseca, D., Torres Kompen, R., Canaleta, X., & Herrero-Martín, J. (2021). Local technology to enhance data privacy and security in educational technology. International Journal of Interactive Multimedia and Artificial Intelligence, 7(2), 262. https://doi.org/10.9781/ijimai.2021.11.006
4. Rubel, A., & Jones, K. M. L. (2016). Student privacy in learning analytics: An information ethics perspective. The Information Society, 32(2), 143–159. https://doi.org/10.1080/01972243.2016.1130502
5. Peterson, D. (2016). Edtech and student privacy: California law as a model. Berkeley Technology Law Journal, 31, 961.
6. Lupton, D., & Williamson, B. (2017). The datafied child: The dataveillance of children and implications for their rights. New Media & Society, 19(5), 780–794. https://doi.org/10.1177/1461444816686328
7. Crooks, R. (2019). Cat-and-mouse games: Dataveillance and performativity in urban schools. Surveillance & Society, 17(3/4), 484–498. https://doi.org/10.24908/ss.v17i3/4.7098
8. Poster, M. (1994). The mode of information and postmodernity. In D. Crowley & D. Mitchell (Eds.), Communication theory today (pp. 173–192). Stanford University Press.
9. Monahan, T., & Torres, R. D. (Eds.). (2010). Schools under surveillance: Cultures of control in public education. Rutgers University Press.
10. Haggerty, K. D., & Ericson, R. V. (2000). The surveillant assemblage. British Journal of Sociology, 51(4), 605–622.
11. Eubanks, V. (2017). Automating inequality: How high-tech tools profile, police, and punish the poor. St. Martin’s Press.
12. Browne, S. (2015). Dark matters: On the surveillance of blackness. Duke University Press, p. 11.
Blog post by Roderic Crooks, Assistant Professor of Informatics, UCI