{"id":91237,"date":"2021-08-31T14:05:44","date_gmt":"2021-08-31T18:05:44","guid":{"rendered":"https:\/\/cdt.org\/?post_type=insight&p=91237"},"modified":"2021-08-31T14:18:17","modified_gmt":"2021-08-31T18:18:17","slug":"protecting-student-privacy-and-ensuring-equitable-algorithmic-systems-in-education","status":"publish","type":"insight","link":"https:\/\/cdt.org\/insights\/protecting-student-privacy-and-ensuring-equitable-algorithmic-systems-in-education\/","title":{"rendered":"Protecting Student Privacy and Ensuring Equitable Algorithmic Systems in Education"},"content":{"rendered":"\n

Using student data responsibly<\/a> is about more than adhering to legal requirements \u2014 it also requires schools and their partners to use data in ways that help students and to guard against using technology to discriminate against, stigmatize, or otherwise harm them and their families. <\/p>\n\n\n\n

As the technology and data used in education continue to evolve, discrimination and bias have taken on new forms, including in the use of algorithms. This summer, the Center for Democracy & Technology (CDT<\/a>) submitted two sets of comments (here<\/a> and here<\/a>) to the U.S. Department of Education (ED), asking it to address the discriminatory effects of some algorithmic systems on marginalized groups of students.<\/p>\n\n\n\n

An algorithm<\/a> is a process performed by a computer to answer a question or carry out a task, such as sorting students into schools, analyzing social media posts, or flagging students at risk of dropping out. Algorithms, however, are not neutral decision-makers. Subjective human judgments dictate an algorithm's purpose, design, and function, and thereby shape its outcomes. Moreover, the data used to train algorithms may itself implicitly embed biases.<\/p>\n\n\n\n

Algorithmic bias has very real effects on students, especially those in marginalized groups such as Black and LGBTQ students and students with disabilities. Consider:<\/p>\n\n\n\n