We are happy to announce a new collection of research papers and opinion pieces on the topic of Algorithmic Rights and Protections of Children, as an MIT Press Works in Progress collection on the open access PubPub Platform. Here we offer an excerpt from our editors’ Introduction.
If you’d like to offer comment or criticism on this volume, the open review period for this title will end on October 5, 2021.
To comment, simply create a PubPub account, and sign in. As you’re reading, you can highlight the text you want to comment on, and an icon will pop up for you to write your comment inline. You can also make general comments at the end of each chapter.
One in three internet users worldwide is a child, and what children see and experience online is increasingly shaped by algorithms. Yet the dominant platforms of the online world have not been constructed with the needs and interests of children in mind. Children represent an especially marginalized and vulnerable population exposed to high levels of poverty and inequality, while being dependent on adults to advocate for their interests and structure their experiences. In 2021, as we emerge from a pandemic that has made us even more reliant on digital platforms, society is struggling to rein in the power of big tech and elevate the needs of marginalized groups. This tension is particularly acute when it comes to balancing opportunities and risks for children in online spaces.
Despite the important role that children’s protections and rights play in debates over the social impacts and responsibilities of tech platforms, issues unique to children have not been a significant focus of debates over AI and ethics. Some notable exceptions include UNICEF’s AI for Children project, the work of organizations such as the Family Online Safety Institute, Common Sense Media, and the 5Rights Foundation, and the UN Committee on the Rights of the Child’s General Comment 25, outlining children’s rights in digital spaces. A small but growing body of work on digital parenting and children’s experiences with algorithms seeks to inform this debate (see, for example, Livingstone & Blum-Ross, 2020; Barassi, 2020; Lenhart & Owens, 2021; Livingstone et al., 2018). This collection of essays builds on this momentum, providing perspectives, frameworks, and research for understanding diverse children’s evolving relationships with algorithms, and how caregivers, educators, policy-makers, and other adult stakeholders might shape these relationships in productive ways. We introduce the collection by outlining three cross-cutting concerns: (1) the relationship between algorithms, culture, and society; (2) the unique needs and positionality of children; and (3) inequality in children’s risks and opportunities.
Algorithms, Culture, and Society
Despite the often novel nature of algorithms, big data, and AI, our existing frameworks for understanding the relationship between technology, culture, and society are as relevant as ever. Science and technology studies scholars have insisted that we look at how technologies are shaped by our existing cultural biases and institutionalized practices, and how they in turn shape or “impact” culture and society (see, for example, Hine, 2016; MacKenzie & Wajcman, 1985; Bijker et al., 2012). The time is ripe for critical scrutiny of how algorithms are shaped by and reflect historic inequities, problematic assumptions, and institutionalized power. We also need solution-oriented scholarship and design thinking that considers how these technologies can be shaped to be more equitable and serve the needs and interests of children. This volume includes work that critically analyzes how algorithms reflect existing structures and biases, as well as work centered on designing and reshaping technology to serve children.
Children’s Perspectives and Needs
AI challenges our assumptions, most obviously, about what counts as intelligence and about the boundaries between humans and machines. Perhaps less obviously, AI also challenges us to reconsider assumptions about childhood culture, what is “age appropriate,” and the balance between rights and protections for children. Negotiations over media and technology have long been a site of intergenerational struggle. Whether it was novels, television, video games, or today’s social media, the “new” media of the day have offered an arena for young people to exercise agency and develop new cultural forms, often provoking concern from parents and moral panics writ large (Livingstone & Blum-Ross, 2020; Ito et al., 2019; Jenkins, 1998; Seiter, 1995). The rapid incursion of digital, interactive, mobile, and networked media into young people’s lives since the nineties has been a particularly complex and fraught arena for navigating the tension between rights and protections for children. Media and tech companies, and the algorithms that pervade online spaces, are now powerful players in the everyday negotiations over even young children’s engagement with knowledge, media, and social networks. Many of the essays in this volume are centered on children’s voices and viewpoints, suggesting ways of shaping our algorithmic futures based on these perspectives.
The unequal power dynamics between children and adults are critical factors in considering algorithmic rights and protections for children; inequality between different populations of children is equally important. Safiya Noble (2020) opens her book, Algorithms of Oppression, with her experience of googling “black girls,” in hopes of finding interesting content for her stepdaughter and nieces, only to discover pornography featuring black girls as the first search result. Algorithmic biases and inequalities that pervade the adult world are doubly damaging for marginalized children. We now have a growing literature on the harm that AI and algorithms can cause when they reproduce the assumptions and structural inequalities of the dominant culture (e.g., Brayne, 2020; Benjamin, 2019), but still relatively little work that looks at the impacts on unequal childhoods. Too often, research and public discourse make generalizations about the experiences of “kids these days” that ignore the experiences of less privileged and marginalized youth. Essays in this volume build on a budding body of research that examines how social media, digital games, and learning technologies reflect and reinforce unequal childhoods. This includes work on how inequality in children’s experiences with technology differs across national contexts, as well as within them.
Understanding children’s algorithmic rights and protections requires multidisciplinary and cross-sector viewpoints and synthesis, given the range of institutional settings where children encounter algorithms and the unique forms of inequality and risk that children encounter as they grow up. This collection of essays represents a variety of viewpoints, fields, and disciplinary voices in two genres. “Perspectives” are shorter conceptual pieces that share a unique viewpoint or apply a framework from a particular field or discipline to the topic at hand. “Research Papers” are longer contributions that report on empirical or design research. The essays offer critical and provocative analysis, frameworks for understanding, and practical approaches for productively engaging with emerging technologies as designers, educators, and parents. We hope that this range of voices and contributions will foster more dialog, creative thinking, and coalition building at this unique but critical nexus of children, algorithms, care, and social justice. We invite engagement, critique, and dialog through the MIT Open online forum.