
Emerging Safety Technologies in Schools: Addressing Privacy and Equity Concerns to Ensure a Safe In-Person School

by CDT intern Cecy Sanchez

Schools that have resumed in-person classes are trying to ensure a safe environment for students and staff, a challenge exacerbated by the ongoing pandemic. In addition to protecting against threats such as weapons and unauthorized visitors, schools now must also enforce health guidelines, like requiring students to wear face coverings. Some schools are deploying emerging AI-driven systems with the goal of enforcing safety protocols and protecting those attending school in person.

Student safety technologies (software or applications, often using artificial intelligence, that are intended to keep students safe in schools) include tools like all-in-one security platforms and visitor management systems. These purport to protect against a variety of threats by, for example, screening visitors (e.g., contractors, guardians, and volunteers) for unauthorized entry and identifying guns and suspicious activity. They can also perform other tasks, like marking students as tardy. In response to the COVID-19 pandemic, some student safety technologies now also incorporate components that aim to evaluate compliance with health measures, such as maintaining social distancing among people.
Other software is designed specifically for enforcing health guidelines: mask compliance software that notifies school authorities when a person is not wearing a face covering or is wearing one incorrectly; wearable devices that trace contacts of students who have contracted COVID-19 and measure compliance with social distancing; and thermal cameras that scan a person's temperature upon entering a building.

Student Privacy and Equity Considerations and Recommendations

Student safety technologies are often deployed with good intentions, but their use in education contexts raises concerns: these tools may ultimately undermine student privacy, diminish trust in schools, and disproportionately impact students from historically marginalized populations.

Transparency

One challenge is that companies developing safety software are often not transparent about their data management practices (e.g., where and how data is stored, how long it will be retained, and what technical steps the company has taken to secure it). Deploying a technology without disclosing to stakeholders how their data will be managed, or without having security procedures in place, can threaten student privacy and undermine families' trust in the school's administration.

To address this, school authorities should ensure that vendors have adequate security policies and procedures, have plans for safeguarding data (including retention and deletion procedures), and disclose to students and their families how information will be managed and protected. This is especially important for systems with multiple components (e.g., all-in-one management software), where it can be difficult to understand how data is shared across features.
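To make "retention and deletion procedures" concrete, here is a minimal sketch of how a school's IT staff might check records against a retention schedule. The record categories, field names, and retention periods below are hypothetical assumptions chosen for illustration, not a description of any actual vendor's system or any school's policy.

```python
# A minimal sketch of a retention-and-deletion check, assuming each record
# carries a collection timestamp and a category. The categories, field
# names, and retention periods are illustrative assumptions only.
from datetime import datetime, timedelta, timezone

RETENTION_PERIODS = {
    "visitor_log": timedelta(days=30),       # assumed policy
    "temperature_scan": timedelta(days=1),   # assumed policy
    "contact_trace": timedelta(days=14),     # assumed policy
}

def records_due_for_deletion(records, now=None):
    """Return the records held longer than their category's retention period."""
    now = now or datetime.now(timezone.utc)
    return [
        r for r in records
        if now - r["collected_at"] > RETENTION_PERIODS[r["category"]]
    ]

# Example usage with hypothetical records:
records = [
    {"category": "temperature_scan",
     "collected_at": datetime.now(timezone.utc) - timedelta(days=3)},
    {"category": "visitor_log",
     "collected_at": datetime.now(timezone.utc) - timedelta(days=5)},
]
print(records_due_for_deletion(records))  # only the 3-day-old scan is overdue
```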

Consent and Responsible Data Practices

Transparency should be accompanied by responsible data practices (e.g., minimizing harm to individuals and supporting the public good) and, where possible, alternative options for those who are not comfortable using or interacting with the technology.

Schools should take steps to gather only necessary data (in accordance with data minimization principles) and evaluate whether a given feature, such as facial recognition, is actually necessary or whether it can be disabled. Monitoring devices like thermal cameras and wearables can feel intrusive to students and create a sense of constant surveillance.

Efficacy 

Another concern involves the misleading promotion of AI-driven tools: software solutions that claim to improve student well-being and safety but are not proven to do so. Gun detection systems that assert they reduce crime in schools and ensure a safe environment, for instance, might lack the evidence to support those claims. Similarly, systems not specifically created for schools might not be calibrated for that environment. For example, mask compliance software designed for workplaces could fail to recognize children's faces and thus perform poorly in a school context.

Schools should request information from vendors about how they designed or adjusted their systems for a school context, and how they validate that their system is effective. If vendors are unable or unwilling to provide the school with evidence of the system’s effectiveness, the school should consider dropping the system or finding another vendor.

Disproportionate Impact

Emerging safety technologies can disproportionately impact different groups of people, raising equity concerns that must be considered. For instance, technologies might exhibit different false-positive rates for different populations because they fail to capture certain nuances. Tools that enforce social distancing, for example, might disproportionately flag students who require physical assistance or accompaniment.

Likewise, threat-detection systems that wrongly identify a “threat” and notify security personnel or local law enforcement can result in unwarranted encounters. Such encounters disproportionately affect historically marginalized groups, like Black and Hispanic students, and can undermine a student's well-being and negatively affect their development.

Schools should ask vendors how they audit their systems for disproportionate impact, and request data about how the system performs across different populations. If the company does not perform such testing, cannot provide the data, or if the system performs markedly differently for different groups, the school should not proceed with using it.
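To illustrate the kind of data worth requesting, the sketch below shows one way a reviewer might compare false-positive rates across groups from vendor-supplied event logs. The field names ("group", "flagged", "actual_threat") and sample records are hypothetical assumptions, not a real vendor's reporting format.

```python
# A minimal sketch of a disproportionate-impact check: compare the
# false-positive rate (non-threat events wrongly flagged) across groups.
# Field names and sample data are hypothetical assumptions.
from collections import defaultdict

def false_positive_rates(events):
    """Return {group: false-positive rate} computed over non-threat events."""
    false_positives = defaultdict(int)
    non_threats = defaultdict(int)
    for e in events:
        if not e["actual_threat"]:
            non_threats[e["group"]] += 1
            if e["flagged"]:
                false_positives[e["group"]] += 1
    return {g: false_positives[g] / n for g, n in non_threats.items()}

# Hypothetical vendor log: a large gap between groups is a red flag.
events = [
    {"group": "A", "flagged": True,  "actual_threat": False},
    {"group": "A", "flagged": False, "actual_threat": False},
    {"group": "B", "flagged": False, "actual_threat": False},
    {"group": "B", "flagged": False, "actual_threat": False},
]
print(false_positive_rates(events))  # {'A': 0.5, 'B': 0.0}
```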

Conclusion

To ensure a safe return to in-person instruction, schools are considering deploying student safety technologies. Although these tools are deployed with the expectation that they will safeguard student well-being, they can produce the opposite result by undermining student privacy and equity. Schools should be aware of the concerns that AI-driven decision-making systems can raise, evaluate whether employing them is the best option, and understand the best practices for using them successfully.